NiaPy.algorithms
¶
Module with implementations of basic and hybrid algorithms.
-
class
NiaPy.algorithms.
Algorithm
(**kwargs)[source]¶ Bases:
object
Class for implementing algorithms.
- Date:
2018
- Author
Klemen Berkovič
- License:
MIT
- Variables
Rand (mtrand.RandomState) – Random generator.
InitPopFunc (Callable[[int, Task, mtrand.RandomState, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]) – Individual initialization function.
itype (Individual) – Type of individuals used in population, default value is None for Numpy arrays.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
InitPopFunc
(NP, rnd=numpy.random, **kwargs)¶ Initialize a starting population represented as a numpy.ndarray with shape (NP, task.D).
- Parameters
- Returns
New population with shape (NP, task.D).
New population function/fitness values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
-
NP
= 50¶
-
Name
= ['Algorithm', 'AAA']¶
-
Rand
= RandomState(MT19937)¶
-
__init__
(**kwargs)[source]¶ Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
bad_run
()[source]¶ Check if any exceptions were thrown while the algorithm was running.
- Returns
True if errors were detected at runtime of the algorithm, otherwise False.
- Return type
bool
-
getBest
(X, X_f, xb=None, xb_f=inf)[source]¶ Get the best individual for population.
- Parameters
X (numpy.ndarray) – Current population.
X_f (numpy.ndarray) – Current population's fitness/function values, aligned with the individuals.
xb (numpy.ndarray) – Best individual.
xb_f (float) – Fitness value of best individual.
- Returns
Coordinates of the best solution.
Best fitness/function value.
- Return type
Tuple[numpy.ndarray, float]
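Conceptually, getBest is an argmin over the population's fitness values compared against the incumbent best. A minimal sketch in plain NumPy (function and variable names are illustrative, not the library's own):

```python
import numpy as np

def get_best(X, X_f, xb=None, xb_f=np.inf):
    """Return the best individual and its fitness, assuming minimization."""
    ib = np.argmin(X_f)                       # index of the best candidate
    if X_f[ib] < xb_f:                        # candidate beats the incumbent
        xb, xb_f = X[ib].copy(), float(X_f[ib])
    return xb, xb_f

# Population of three 2-D individuals and their fitness values.
pop = np.array([[1.0, 2.0], [0.5, 0.5], [3.0, 1.0]])
fits = np.array([5.0, 0.5, 10.0])
best, best_f = get_best(pop, fits)
```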
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
-
initPopulation
(task)[source]¶ Initialize starting population of optimization algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
New population.
New population fitness values.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
-
itype
= None¶
-
normal
(loc, scale, D=None)[source]¶ Get a normally distributed random value or array of shape D with mean loc and standard deviation scale.
-
randint
(Nmax, D=1, Nmin=0, skip=None)[source]¶ Get discrete uniform (integer) random values of shape D in the range from Nmin to Nmax.
- Parameters
- Returns
Random generated integer number.
- Return type
-
run
(task)[source]¶ Start the optimization.
- Parameters
task (Task) – Optimization task.
- Returns
Best individual's components found in the optimization process.
Best fitness value found in optimization process.
- Return type
Tuple[numpy.ndarray, float]
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core functionality of algorithm.
This function is called on every algorithm iteration.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population coordinates.
fpop (numpy.ndarray) – Current population fitness value.
xb (numpy.ndarray) – Current generation best individual's coordinates.
fxb (float) – Current generation best individual's fitness value.
**dparams (Dict[str, Any]) – Additional arguments for algorithms.
- Returns
New population coordinates.
New population fitness values.
New global best position/solution.
New global best fitness/objective value.
Additional arguments of the algorithm.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
runTask
(task)[source]¶ Start the optimization.
- Parameters
task (Task) – Task with bounds and objective function for optimization.
- Returns
Best individual's components found in the optimization process.
Best fitness value found in optimization process.
- Return type
Tuple[numpy.ndarray, float]
-
runYield
(task)[source]¶ Run the algorithm for a single iteration and return the best solution.
- Parameters
task (Task) – Task with bounds and objective function for optimization.
- Returns
Generator getting new/old optimal global values.
- Return type
Generator[Tuple[numpy.ndarray, float], None, None]
- Yields
Tuple[numpy.ndarray, float] – 1. New population best individuals coordinates. 2. Fitness value of the best solution.
-
setParameters
(NP=50, InitPopFunc=<function defaultNumPyInit>, itype=None, **kwargs)[source]¶ Set the parameters/arguments of the algorithm.
- Parameters
NP (Optional[int]) – Number of individuals in population \(\in [1, \infty]\).
InitPopFunc (Optional[Callable[[int, Task, mtrand.RandomState, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]]) – Function used to initialize the starting population.
itype (Optional[Any]) – Individual type used in population, default is Numpy array.
**kwargs (Dict[str, Any]) – Additional arguments.
-
static
typeParameters
()[source]¶ Return functions for checking values of parameters.
- Returns
NP (Callable[[int], bool]): Check if number of individuals is \(\in [0, \infty]\).
- Return type
Dict[str, Callable]
-
class
NiaPy.algorithms.
AlgorithmUtility
[source]¶ Bases:
object
Base class with string mappings to algorithms.
Initialize the algorithms.
-
class
NiaPy.algorithms.
Individual
(x=None, task=None, e=True, rnd=numpy.random, **kwargs)[source]¶ Bases:
object
Class that represents one solution in population of solutions.
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Initialize new individual.
- Parameters
-
__eq__
(other)[source]¶ Compare two individuals for equality.
- Parameters
other (Union[Any, numpy.ndarray]) – Object that we want to compare this object to.
- Returns
True if equal, otherwise False.
- Return type
bool
-
__getitem__
(i)[source]¶ Get the value of i-th component of the solution.
- Parameters
i (int) – Position of the solution component.
- Returns
Value of the i-th component.
- Return type
Any
-
__init__
(x=None, task=None, e=True, rnd=numpy.random, **kwargs)[source]¶ Initialize new individual.
- Parameters
-
__len__
()[source]¶ Get the length of the solution or the number of components.
- Returns
Number of components.
- Return type
int
-
__setitem__
(i, v)[source]¶ Set the value of i-th component of the solution to v value.
- Parameters
i (int) – Position of the solution component.
v (Any) – Value to set to i-th component.
-
__str__
()[source]¶ Return a string representation of the individual with its solution and objective value.
- Returns
String representation of self.
- Return type
str
-
copy
()[source]¶ Return a copy of self.
Method returns a copy of this object, so it is safe to edit.
- Returns
Copy of self.
- Return type
-
evaluate
(task, rnd=numpy.random)[source]¶ Evaluate the solution.
Evaluate the solution self.x with the help of task. The task is used for repairing the solution and then evaluating it.
- Parameters
task (Task) – Objective function object.
rnd (Optional[mtrand.RandomState]) – Random generator.
See also
NiaPy.util.Task.repair()
-
f
= inf¶
-
generateSolution
(task, rnd=numpy.random)[source]¶ Generate new solution.
Generate a new solution for this individual and set it to self.x. This method uses rnd and task for generating the random components.
- Parameters
task (Task) – Optimization task.
rnd (Optional[mtrand.RandomState]) – Random numbers generator object.
-
x
= None¶
-
NiaPy.algorithms.
defaultIndividualInit
(task, NP, rnd=numpy.random, itype=None, **kwargs)[source]¶ Initialize NP individuals of type itype.
- Parameters
task (Task) – Optimization task.
NP (int) – Number of individuals in population.
rnd (Optional[mtrand.RandomState]) – Random number generator.
itype (Optional[Individual]) – Class of individual in population.
kwargs (Dict[str, Any]) – Additional arguments.
- Returns
Initialized individuals.
Initialized individuals function/fitness values.
- Return type
Tuple[numpy.ndarray[Individual], numpy.ndarray[float]]
-
NiaPy.algorithms.
defaultNumPyInit
(task, NP, rnd=numpy.random, **kwargs)[source]¶ Initialize a starting population represented as a numpy.ndarray with shape (NP, task.D).
- Parameters
- Returns
New population with shape (NP, task.D).
New population function/fitness values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
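The default NumPy initialization amounts to drawing NP points uniformly inside the task's box bounds and evaluating them. A hedged sketch, with plain arrays standing in for the Task object and a sphere function as a stand-in objective (names are illustrative):

```python
import numpy as np

def default_numpy_init(lower, upper, NP, rnd=np.random):
    """Draw NP individuals uniformly inside box bounds [lower, upper].
    `lower`/`upper` stand in for the Task's bounds; the real function
    takes a Task object."""
    lower, upper = np.asarray(lower), np.asarray(upper)
    pop = lower + rnd.uniform(size=(NP, len(lower))) * (upper - lower)
    fpop = np.sum(pop ** 2, axis=1)   # placeholder fitness: sphere function
    return pop, fpop

pop, fpop = default_numpy_init([-5.0, -5.0], [5.0, 5.0], NP=10)
```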
NiaPy.algorithms.basic
¶
Implementation of basic nature-inspired algorithms.
-
class
NiaPy.algorithms.basic.
AgingNpDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Differential evolution algorithm with aging individuals.
- Algorithm:
Differential evolution algorithm with dynamic population size that is defined by the quality of population
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – list of strings representing algorithm names.
Lt_min (int) – Minimal age of individual.
Lt_max (int) – Maximal age of individual.
delta_np (float) – Proportion of how many individuals shall die.
omega (float) – Acceptance rate for individuals to die.
mu (int) – Mean of individual max and min age.
age (Callable[[int, int, float, float, float, float, float], int]) – Function for calculation of age for individual.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['AgingNpDifferentialEvolution', 'ANpDE']¶
-
aging
(task, pop)[source]¶ Apply aging to individuals.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray[Individual]) – Current population.
- Returns
New population.
- Return type
numpy.ndarray[Individual]
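The aging step increments each individual's age and removes those that exceed the maximal lifetime. A simplified sketch of the core filtering only (the real operator also protects the best individual and involves Lt_min, omega and delta_np; names here are illustrative):

```python
import numpy as np

def apply_aging(ages, fitness, Lt_max=12):
    """Increment every individual's age and keep only those whose age
    does not exceed Lt_max. Returns the survivors' ages and fitness."""
    ages = ages + 1
    keep = ages <= Lt_max
    return ages[keep], fitness[keep]

ages = np.array([3, 12, 20])
fits = np.array([1.0, 2.0, 3.0])
new_ages, new_fits = apply_aging(ages, fits)
```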
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
-
popDecrement
(pop, task)[source]¶ Decrement population.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
- Returns
Decreased population.
- Return type
numpy.ndarray[Individual]
-
popIncrement
(pop, task)[source]¶ Increment population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
task (Task) – Optimization task.
- Returns
Increased population.
- Return type
numpy.ndarray[Individual]
-
postSelection
(pop, task, xb, fxb, **kwargs)[source]¶ Post selection operator.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (Individual) – Global best individual.
fxb (float) – Global best individual's fitness/objective value.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New global best solution
New global best solutions fitness/objective value
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
-
selection
(pop, npop, xb, fxb, task, **kwargs)[source]¶ Select operator for individuals with aging.
- Parameters
- Returns
New population of individuals.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
-
class
NiaPy.algorithms.basic.
AgingNpMultiMutationDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.AgingNpDifferentialEvolution
,NiaPy.algorithms.basic.de.MultiStrategyDifferentialEvolution
Implementation of Differential evolution algorithm with aging individuals.
- Algorithm:
Differential evolution algorithm with dynamic population size that is defined by the quality of population
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['AgingNpMultiMutationDifferentialEvolution', 'ANpMSDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
str
-
evolve
(pop, xb, task, **kwargs)[source]¶ Evolve current population.
- Parameters
pop (numpy.ndarray) – Current population.
xb (numpy.ndarray) – Global best individual.
task (Task) – Optimization task.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New population of individuals.
- Return type
numpy.ndarray
-
class
NiaPy.algorithms.basic.
ArtificialBeeColonyAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Artificial Bee Colony algorithm.
- Algorithm:
Artificial Bee Colony algorithm
- Date:
2018
- Author:
Uros Mlakar and Klemen Berkovič
- License:
MIT
- Reference paper:
Karaboga, D., and Bahriye B. “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm.” Journal of global optimization 39.3 (2007): 459-471.
- Arguments
Name (List[str]): List containing strings that represent algorithm names.
Limit (Union[float, numpy.ndarray[float]]): Limit.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
CalculateProbs
(Foods, Probs)[source]¶ Calculate the selection probabilities.
- Parameters
Foods (numpy.ndarray) – TODO
Probs (numpy.ndarray) – TODO
- Returns
TODO
- Return type
numpy.ndarray
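In the standard ABC formulation, food sources are selected with probability proportional to their fitness. A hedged sketch of such a computation (the exact mapping used by this implementation is not documented; the function name and cost-to-fitness mapping below are assumptions following Karaboga's paper):

```python
import numpy as np

def calculate_probs(food_costs):
    """Fitness-proportional selection probabilities as in ABC.
    Costs are mapped to a 'higher is better' fitness before normalizing."""
    cost = np.asarray(food_costs, dtype=float)
    fit = np.where(cost >= 0, 1.0 / (1.0 + cost), 1.0 + np.abs(cost))
    return fit / fit.sum()

probs = calculate_probs([0.0, 1.0, 3.0])
```

Lower-cost food sources end up with higher selection probability, which drives the onlooker-bee phase.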
-
Name
= ['ArtificialBeeColonyAlgorithm', 'ABC']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
str
-
runIteration
(task, Foods, fpop, xb, fxb, Probs, Trial, **dparams)[source]¶ Core function of the algorithm.
- Parameters
task (Task) – Optimization task
Foods (numpy.ndarray) – Current population
fpop (numpy.ndarray[float]) – Function/fitness values of current population
xb (numpy.ndarray) – Current best individual
fxb (float) – Current best individual fitness/function value
Probs (numpy.ndarray) – TODO
Trial (numpy.ndarray) – TODO
dparams (Dict[str, Any]) – Additional parameters
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Additional arguments:
Probs (numpy.ndarray): TODO
Trial (numpy.ndarray): TODO
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
BacterialForagingOptimizationAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of the Bacterial foraging optimization algorithm.
- Date:
2021
- Author:
Žiga Stupan
- License:
MIT
- Reference paper:
Passino, “Biomimicry of bacterial foraging for distributed optimization and control,” in IEEE Control Systems Magazine, vol. 22, no. 3, pp. 52-67, June 2002, doi: 10.1109/MCS.2002.1004010.
Initialize algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['BacterialForagingOptimizationAlgorithm', 'BFOA', 'BFO']¶
-
__init__
(**kwargs)[source]¶ Initialize algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
-
initPopulation
(task)[source]¶ Initialize starting population of optimization algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
New population.
New population fitness values.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
-
interaction
(cell, population)[source]¶ Compute cell to cell interaction J_cc.
- Parameters
cell (Cell) – Cell to compute interaction for.
population (numpy.ndarray[Cell]) – Population
- Returns
Cell to cell interaction J_cc.
- Return type
float
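Passino (2002) defines J_cc as the sum of an attractant and a repellant term over all cells in the population. A sketch using the parameter defaults from setParameters below (the function name and array-based Cell representation are illustrative):

```python
import numpy as np

def interaction(cell, population, d_attract=0.1, w_attract=0.2,
                h_repel=0.1, w_repel=10.0):
    """Cell-to-cell interaction J_cc: attractant plus repellant terms
    over squared distances to every cell in the population."""
    diffs = np.sum((population - cell) ** 2, axis=1)  # squared distances
    attract = -d_attract * np.exp(-w_attract * diffs)
    repel = h_repel * np.exp(-w_repel * diffs)
    return float(np.sum(attract + repel))

pop = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
j_cc = interaction(pop[0], pop)
```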
-
random_direction
(dimension)[source]¶ Generate a random direction vector.
- Parameters
dimension (int) – Problem dimension
- Returns
Normalised random direction vector
- Return type
numpy.ndarray
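A normalised random direction can be sketched as a uniform draw scaled to unit length (the exact distribution used by the implementation is an assumption here):

```python
import numpy as np

def random_direction(dimension, rnd=np.random):
    """Draw a vector uniformly in [-1, 1]^D and normalise it to unit length."""
    v = rnd.uniform(-1.0, 1.0, dimension)
    return v / np.linalg.norm(v)

d = random_direction(5)
```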
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of Bacterial Foraging Optimization algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current populations fitness/function values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individuals function/fitness value.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations function/fitness values.
New global best solution,
New global best solutions fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(NP=50, n_chemotactic=100, n_swim=4, n_reproduction=4, n_elimination=2, prob_elimination=0.25, step_size=0.1, d_attract=0.1, w_attract=0.2, h_repel=0.1, w_repel=10.0, **kwargs)[source]¶ Set the parameters/arguments of the algorithm.
- Parameters
NP (Optional[int]) – Number of individuals in population \(\in [1, \infty]\).
n_chemotactic (Optional[int]) – Number of chemotactic steps.
n_swim (Optional[int]) – Number of swim steps.
n_reproduction (Optional[int]) – Number of reproduction steps.
n_elimination (Optional[int]) – Number of elimination and dispersal steps.
prob_elimination (Optional[float]) – Probability of a bacterium being eliminated and a new one being created at a random location in the search space.
step_size (Optional[float]) – Size of a chemotactic step.
d_attract (Optional[float]) – Depth of the attractant released by the cell (a quantification of how much attractant is released).
w_attract (Optional[float]) – Width of the attractant signal (a quantification of the diffusion rate of the chemical).
h_repel (Optional[float]) – Height of the repellant effect (magnitude of its effect).
w_repel (Optional[float]) – Width of the repellant.
**kwargs (Dict[str, Any]) – Additional arguments.
-
class
NiaPy.algorithms.basic.
BareBonesFireworksAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Bare Bones Fireworks Algorithm.
- Algorithm:
Bare Bones Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S1568494617306609
- Reference paper:
Junzhi Li, Ying Tan, The bare bones fireworks algorithm: A minimalist global optimizer, Applied Soft Computing, Volume 62, 2018, Pages 454-462, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2017.10.046.
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['BareBonesFireworksAlgorithm', 'BBFWA']¶
-
static
algorithmInfo
()[source]¶ Get default information of algorithm.
- Returns
Basic information.
- Return type
str
-
runIteration
(task, x, x_fit, xb, fxb, A, **dparams)[source]¶ Core function of Bare Bones Fireworks Algorithm.
- Parameters
task (Task) – Optimization task.
x (numpy.ndarray) – Current solution.
x_fit (float) – Current solution fitness/function value.
xb (numpy.ndarray) – Current best solution.
fxb (float) – Current best solution fitness/function value.
A (numpy.ndarray) – Search range.
dparams (Dict[str, Any]) – Additional parameters.
- Returns
New solution.
New solution fitness/function value.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
A (numpy.ndarray): Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
BatAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Bat algorithm.
- Algorithm:
Bat algorithm
- Date:
2015
- Authors:
Iztok Fister Jr., Marko Burjek and Klemen Berkovič
- License:
MIT
- Reference paper:
Yang, Xin-She. “A new metaheuristic bat-inspired algorithm.” Nature inspired cooperative strategies for optimization (NICSO 2010). Springer, Berlin, Heidelberg, 2010. 65-74.
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['BatAlgorithm', 'BA']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
str
-
initPopulation
(task)[source]¶ Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
S (numpy.ndarray): Solutions
Q (numpy.ndarray[float]): Frequencies
v (numpy.ndarray[float]): Velocities
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]
-
localSearch
(best, task, **kwargs)[source]¶ Improve the best solution according to Yang (2010).
- Parameters
best (numpy.ndarray) – Global best individual.
task (Task) – Optimization task.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New solution based on global best individual.
- Return type
numpy.ndarray
-
runIteration
(task, Sol, Fitness, xb, fxb, S, Q, v, **dparams)[source]¶ Core function of Bat Algorithm.
- Parameters
task (Task) – Optimization task.
Sol (numpy.ndarray) – Current population
Fitness (numpy.ndarray[float]) – Current population fitness/function values
xb (numpy.ndarray) – Current best individual
fxb (float) – Current best individual's function/fitness value
S (numpy.ndarray) – Solutions
Q (numpy.ndarray) – Frequencies
v (numpy.ndarray) – Velocities
best – Global best used by the algorithm
f_min – Global best fitness value used by the algorithm
dparams (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Additional arguments:
S (numpy.ndarray): Solutions
Q (numpy.ndarray): Frequencies
v (numpy.ndarray): Velocities
best (numpy.ndarray): Global best
f_min (float): Global best fitness
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
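The per-bat update in Yang (2010) draws a frequency in [Qmin, Qmax], moves the velocity toward the global best, and steps the position. A sketch of one such update (parameter defaults mirror setParameters below; the function name is illustrative, and the loudness/pulse-rate acceptance step is omitted):

```python
import numpy as np

def bat_step(x, v, best, Qmin=0.0, Qmax=2.0, rnd=np.random):
    """One bat position/velocity update. `x`, `v` are a single bat's
    position and velocity; `best` is the global best position."""
    Q = Qmin + (Qmax - Qmin) * rnd.uniform()   # random frequency
    v_new = v + (x - best) * Q                 # pull toward the global best
    return x + v_new, v_new

x, v = np.array([1.0, 2.0]), np.zeros(2)
best = np.zeros(2)
x_new, v_new = bat_step(x, v, best)
```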
-
setParameters
(NP=40, A=0.5, r=0.5, Qmin=0.0, Qmax=2.0, **ukwargs)[source]¶ Set the parameters of the algorithm.
-
static
typeParameters
()[source]¶ Return a dict where each key is a parameter name and each value is a function for checking that parameter's value.
- Returns
A (Callable[[Union[float, int]], bool]): Loudness.
r (Callable[[Union[float, int]], bool]): Pulse rate.
Qmin (Callable[[Union[float, int]], bool]): Minimum frequency.
Qmax (Callable[[Union[float, int]], bool]): Maximum frequency.
- Return type
Dict[str, Callable]
-
class
NiaPy.algorithms.basic.
BeesAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Bees algorithm.
- Algorithm:
The Bees algorithm
- Date:
2019
- Authors:
Rok Potočnik
- License:
MIT
- Reference paper:
DT Pham, A Ghanbarzadeh, E Koc, S Otri, S Rahim, and M Zaidi. The bees algorithm-a novel tool for complex optimisation problems. In Proceedings of the 2nd Virtual International Conference on Intelligent Production Machines and Systems (IPROMS 2006), pages 454–459, 2006
- Variables
NP (Optional[int]) – Number of scout bees parameter.
m (Optional[int]) – Number of sites selected out of n visited sites parameter.
e (Optional[int]) – Number of best sites out of m selected sites parameter.
nep (Optional[int]) – Number of bees recruited for best e sites parameter.
nsp (Optional[int]) – Number of bees recruited for the other selected sites parameter.
ngh (Optional[float]) – Initial size of patches parameter.
ukwargs (Dict[str, Any]) – Additional arguments.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['BeesAlgorithm', 'BEA']¶
-
static
algorithmInfo
()[source]¶ Get information about algorithm.
- Returns
Algorithm information
- Return type
str
-
initPopulation
(task)[source]¶ Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
-
repair
(x, lower, upper)[source]¶ Truncate exceeded dimensions to the limits.
- Parameters
x (numpy.ndarray) – Individual to repair.
lower (numpy.ndarray) – Lower limits for dimensions.
upper (numpy.ndarray) – Upper limits for dimensions.
- Returns
Repaired individual.
- Return type
numpy.ndarray
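The repair operator described above is equivalent to a component-wise clip to the box bounds; a minimal sketch:

```python
import numpy as np

def repair(x, lower, upper):
    """Truncate out-of-bounds components to the box limits."""
    return np.clip(x, lower, upper)

fixed = repair(np.array([-7.0, 0.5, 9.0]),
               np.array([-5.0, -5.0, -5.0]),
               np.array([5.0, 5.0, 5.0]))
```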
-
runIteration
(task, BeesPosition, BeesCost, xb, fxb, ngh, **dparams)[source]¶ Core function of the Bees Algorithm.
- Parameters
task (Task) – Optimization task.
BeesPosition (numpy.ndarray[float]) – Current population.
BeesCost (numpy.ndarray[float]) – Current population function/fitness values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individual fitness/function value.
ngh (float) – A small value used for patches.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best fitness/objective value.
- Additional arguments:
ngh (float): A small value used for patches.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(NP=40, m=5, e=4, ngh=1, nep=4, nsp=2, **ukwargs)[source]¶ Set the parameters of the algorithm.
- Parameters
NP (Optional[int]) – Number of scout bees parameter.
m (Optional[int]) – Number of sites selected out of n visited sites parameter.
e (Optional[int]) – Number of best sites out of m selected sites parameter.
nep (Optional[int]) – Number of bees recruited for best e sites parameter.
nsp (Optional[int]) – Number of bees recruited for the other selected sites parameter.
ngh (Optional[float]) – Initial size of patches parameter.
ukwargs (Dict[str, Any]) – Additional arguments.
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
NP (Callable[[int], bool]): Checks if number of bees parameter has a proper value.
m (Callable[[int], bool]): Checks if number of selected sites parameter has a proper value.
e (Callable[[int], bool]): Checks if number of elite selected sites parameter has a proper value.
nep (Callable[[int], bool]): Checks if number of elite bees parameter has a proper value.
nsp (Callable[[int], bool]): Checks if number of other bees parameter has a proper value.
ngh (Callable[[float], bool]): Checks if size of patches parameter has a proper value.
- Return type
Dict[str, Callable]
See also
NiaPy.algorithms.algorithm.Algorithm.typeParameters()
-
class
NiaPy.algorithms.basic.
CamelAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Camel traveling behavior.
- Algorithm:
Camel algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Ali, Ramzy. (2016). Novel Optimization Algorithm Inspired by Camel Traveling Behavior. Iraq J. Electrical and Electronic Engineering. 12. 167-177.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['CamelAlgorithm', 'CA']¶
-
static
algorithmInfo
()[source]¶ Get information about algorithm.
- Returns
Algorithm information
- Return type
str
-
initPop
(task, NP, rnd, itype, **kwargs)[source]¶ Initialize starting population.
- Parameters
task (Task) – Optimization task.
NP (int) – Number of camels in population.
rnd (mtrand.RandomState) – Random number generator.
itype (Individual) – Individual type.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
Initialized population of camels.
Initialized populations function/fitness values.
- Return type
Tuple[numpy.ndarray[Camel], numpy.ndarray[float]]
-
lifeCycle
(c, mu, task)[source]¶ Apply life cycle to Camel.
- Parameters
c (Camel) – Camel to apply life cycle.
mu (float) – Vision range of camel.
task (Task) – Optimization task.
- Returns
Camel with life cycle applied to it.
- Return type
Camel
-
runIteration
(task, caravan, fcaravan, cb, fcb, **dparams)[source]¶ Core function of Camel Algorithm.
- Parameters
task (Task) – Optimization task.
caravan (numpy.ndarray[Camel]) – Current population of Camels.
fcaravan (numpy.ndarray[float]) – Current population fitness/function values.
cb (Camel) – Current best Camel.
fcb (float) – Current best Camel fitness/function value.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population
New population function/fitness value
New global best solution
New global best fitness/objective value
Additional arguments
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(NP=50, omega=0.25, mu=0.5, alpha=0.5, S_init=10, E_init=10, T_min=-10, T_max=10, **ukwargs)[source]¶ Set the arguments of an algorithm.
- Parameters
NP (Optional[int]) – Population size \(\in [1, \infty)\).
T_min (Optional[float]) – Minimum temperature, must be true \(T_{min} < T_{max}\).
T_max (Optional[float]) – Maximum temperature, must be true \(T_{min} < T_{max}\).
omega (Optional[float]) – Burden factor \(\in [0, 1]\).
mu (Optional[float]) – Dying rate \(\in [0, 1]\).
S_init (Optional[float]) – Initial supply \(\in (0, \infty)\).
E_init (Optional[float]) – Initial endurance \(\in (0, \infty)\).
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
omega (Callable[[Union[int, float]], bool])
mu (Callable[[float], bool])
alpha (Callable[[float], bool])
S_init (Callable[[Union[float, int]], bool])
E_init (Callable[[Union[float, int]], bool])
T_min (Callable[[Union[float, int]], bool])
T_max (Callable[[Union[float, int]], bool])
- Return type
Dict[str, Callable]
-
class
NiaPy.algorithms.basic.
CatSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Cat swarm optimization algorithm.
- Algorithm:
Cat swarm optimization
- Date:
2019
- Author:
Mihael Baketarić
- License:
MIT
- Reference paper:
Chu, S. C., Tsai, P. W., & Pan, J. S. (2006). Cat swarm optimization. In Pacific Rim International Conference on Artificial Intelligence (pp. 854-858). Springer, Berlin, Heidelberg.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['CatSwarmOptimization', 'CSO']¶
-
static
algorithmInfo
()[source]¶ Get algorithm information.
- Returns
Algorithm information.
- Return type
str
-
randomSeekTrace
()[source]¶ Set cats into seeking/tracing mode.
- Returns
Array of ones and zeros: one means tracing mode, zero means seeking mode. The length of the array equals NP.
- Return type
numpy.ndarray
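Mode assignment follows the mixture ratio MR from setParameters: each cat enters tracing mode with probability MR and seeking mode otherwise. A sketch (the function name and sampling scheme are assumptions):

```python
import numpy as np

def random_seek_trace(NP=30, MR=0.1, rnd=np.random):
    """Assign each of the NP cats to tracing mode (1) with probability
    MR, else seeking mode (0)."""
    return (rnd.uniform(size=NP) < MR).astype(int)

modes = random_seek_trace(NP=100, MR=0.1)
```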
-
repair
(x, l, u)[source]¶ Repair array to range.
- Parameters
x (numpy.ndarray) – Array to repair.
l (numpy.ndarray) – Lower limit of allowed range.
u (numpy.ndarray) – Upper limit of allowed range.
- Returns
Repaired array.
- Return type
numpy.ndarray
-
runIteration
(task, pop, fpop, xb, fxb, velocities, modes, **dparams)[source]¶ Core function of Cat Swarm Optimization algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current population fitness/function values.
xb (numpy.ndarray) – Current best individual.
fxb (float) – Current best cat fitness/function value.
velocities (numpy.ndarray) – Velocities of individuals.
modes (numpy.ndarray) – Flag of each individual.
**dparams (Dict[str, Any]) – Additional function arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
Dictionary of modes (seek or trace) and velocities for each cat.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
seekingMode
(task, cat, fcat, pop, fpop, fxb)[source]¶ Seeking mode.
- Parameters
task (Task) – Optimization task.
cat (numpy.ndarray) – Individual from population.
fcat (float) – Current individual’s fitness/function value.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current population fitness/function values.
fxb (float) – Current best cat fitness/function value.
- Returns
Updated individual’s position
Updated individual’s fitness/function value
Updated global best position
Updated global best fitness/function value
- Return type
-
setParameters
(NP=30, MR=0.1, C1=2.05, SMP=3, SPC=True, CDC=0.85, SRD=0.2, vMax=1.9, **ukwargs)[source]¶ Set the algorithm parameters.
- Parameters
NP (int) – Number of individuals in population.
MR (float) – Mixture ratio.
C1 (float) – Constant in tracing mode.
SMP (int) – Seeking memory pool.
SPC (bool) – Self-position considering.
CDC (float) – Decides how many dimensions will be varied.
SRD (float) – Seeking range of the selected dimension.
vMax (float) – Maximal velocity.
See also
-
tracingMode
(task, cat, velocity, xb)[source]¶ Tracing mode.
- Parameters
task (Task) – Optimization task.
cat (numpy.ndarray) – Individual from population.
velocity (numpy.ndarray) – Velocity of individual.
xb (numpy.ndarray) – Current best individual.
- Returns
Updated individual’s position
Updated individual’s fitness/function value
Updated individual’s velocity vector
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray]
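In tracing mode each cat accelerates toward the global best. A hedged sketch of the velocity and position update from the reference paper (the function and variable names are assumptions, not the class internals; the fitness evaluation is omitted):

```python
import numpy as np

def tracing_step(cat, velocity, xb, C1, vMax, rnd=np.random):
    # v <- v + r * C1 * (xb - cat), clamped to [-vMax, vMax]; then move the cat
    r = rnd.uniform(0, 1, len(cat))
    velocity = np.clip(velocity + r * C1 * (xb - cat), -vMax, vMax)
    return cat + velocity, velocity
```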
-
class
NiaPy.algorithms.basic.
CenterParticleSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Center Particle Swarm Optimization.
- Algorithm:
Center Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
H.-C. Tsai, Predicting strengths of concrete-type specimens using hybrid multilayer perceptrons with center-Unified particle swarm optimization, Adv. Eng. Softw. 37 (2010) 1104–1112.
See also
NiaPy.algorithms.basic.WeightedVelocityClampingParticleSwarmAlgorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['CenterParticleSwarmOptimization', 'CPSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population of particles.
fpop (numpy.ndarray) – Current particles function/fitness values.
xb (numpy.ndarray) – Current global best particle.
fxb (numpy.ndarray) – Current global best particles function/fitness value.
**dparams – Additional arguments.
- Returns
New population of particles.
New populations function/fitness values.
New global best particle.
New global best particle function/fitness value.
Additional arguments.
- Return type
See also
NiaPy.algorithm.basic.WeightedVelocityClampingParticleSwarmAlgorithm.runIteration()
-
class
NiaPy.algorithms.basic.
ComprehensiveLearningParticleSwarmOptimizer
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Comprehensive Learning Particle Swarm Optimizer.
- Algorithm:
Comprehensive Learning Particle Swarm Optimizer
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Liang, A. K. Qin, P. N. Suganthan and S. Baskar, “Comprehensive learning particle swarm optimizer for global optimization of multimodal functions,” in IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, June 2006. doi: 10.1109/TEVC.2005.857610
- Reference URL:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1637688&isnumber=34326
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['ComprehensiveLearningParticleSwarmOptimizer', 'CLPSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
init
(task)[source]¶ Initialize dynamic arguments of Particle Swarm Optimization algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
vMin: Minimal velocity.
vMax: Maximal velocity.
V: Initial velocity of particle.
flag: Refresh gap counter.
- Return type
Dict[str, np.ndarray]
-
runIteration
(task, pop, fpop, xb, fxb, popb, fpopb, vMin, vMax, V, flag, Pc, **dparams)[source]¶ Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current populations.
fpop (numpy.ndarray) – Current population fitness/function values.
xb (numpy.ndarray) – Current best particle.
fxb (float) – Current best particle fitness/function value.
popb (numpy.ndarray) – Particles best position.
fpopb (numpy.ndarray) – Particles best positions fitness/function values.
vMin (numpy.ndarray) – Minimal velocity.
vMax (numpy.ndarray) – Maximal velocity.
V (numpy.ndarray) – Velocity of particles.
flag (numpy.ndarray) – Refresh rate counter.
Pc (numpy.ndarray) – Learning rate.
**dparams (Dict[str, Any]) – Additional function arguments.
- Returns
New population.
New population fitness/function values.
New global best position.
New global best positions function/fitness value.
- Additional arguments:
popb: Particles best population.
fpopb: Particles best positions function/fitness value.
vMin: Minimal velocity.
vMax: Maximal velocity.
V: Initial velocity of particle.
flag: Refresh gap counter.
Pc: Learning rate.
- Return type
Tuple[np.ndarray, np.ndarray, np.ndarray, dict]
-
setParameters
(m=10, w0=0.9, w1=0.4, C=1.49445, **ukwargs)[source]¶ Set Particle Swarm Algorithm main parameters.
-
updateVelocityCL
(V, p, pb, w, vMin, vMax, task, **kwargs)[source]¶ Update particle velocity.
- Parameters
V (numpy.ndarray) – Current velocity of particle.
p (numpy.ndarray) – Current position of particle.
pb (numpy.ndarray) – Personal best position of particle.
w (numpy.ndarray) – Weights for velocity adjustment.
vMin (numpy.ndarray) – Minimal velocity allowed.
vMax (numpy.ndarray) – Maximal velocity allowed.
task (Task) – Optimization task.
kwargs – Additional arguments.
- Returns
Updated velocity of particle.
- Return type
numpy.ndarray
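The comprehensive-learning velocity update can be sketched as follows. This is a simplified stand-alone version under assumed names: pb stands for the dimension-wise exemplar assembled from other particles' personal bests, and C is the acceleration constant from the paper.

```python
import numpy as np

def update_velocity_cl(V, p, pb, w, C, vMin, vMax, rnd=np.random):
    # V <- w * V + C * r * (pb - p), element-wise clamped to [vMin, vMax]
    r = rnd.uniform(0, 1, len(V))
    return np.clip(w * V + C * r * (pb - p), vMin, vMax)
```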
-
class
NiaPy.algorithms.basic.
CoralReefsOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Coral Reefs Optimization Algorithm.
- Algorithm:
Coral Reefs Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference Paper:
Salcedo-Sanz, J. Del Ser, I. Landa-Torres, S. Gil-López, and J. A. Portilla-Figueras, “The Coral Reefs Optimization Algorithm: A Novel Metaheuristic for Efficiently Solving Optimization Problems,” The Scientific World Journal, vol. 2014, Article ID 739768, 15 pages, 2014.
- Reference URL:
- Variables
Name (List[str]) – List of strings representing algorithm name.
phi (float) – Range of neighborhood.
Fa (int) – Number of corals used in asexsual reproduction.
Fb (int) – Number of corals used in brooding.
Fd (int) – Number of corals used in depredation.
k (int) – Number of tries for larva settling.
P_F (float) – Mutation variable \(\in [0, \infty]\).
P_Cr (float) – Crossover rate in [0, 1].
Distance (Callable[[numpy.ndarray, numpy.ndarray], float]) – Function for calculating distance between corals.
SexualCrossover (Callable[[numpy.ndarray, float, Task, mtrand.RandomState, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray[float]]]) – Crossover function.
Brooding (Callable[[numpy.ndarray, float, Task, mtrand.RandomState, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Brooding function.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['CoralReefsOptimization', 'CRO']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
asexualReprodution
(Reef, Reef_f, xb, fxb, task)[source]¶ Asexual reproduction of corals.
- Parameters
Reef (numpy.ndarray) – Current population of reefs.
Reef_f (numpy.ndarray) – Current populations function/fitness values.
task (Task) – Optimization task.
- Returns
New population.
New population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray]
See also
NiaPy.algorithms.basic.BroodingSimple()
-
depredation
(Reef, Reef_f)[source]¶ Depredation operator for reefs.
- Parameters
Reef (numpy.ndarray) – Current reefs.
Reef_f (numpy.ndarray) – Current reefs function/fitness values.
- Returns
Best individual
Best individual fitness/function value
- Return type
Tuple[numpy.ndarray, numpy.ndarray]
-
getParameters
()[source]¶ Get parameters values of the algorithm.
- Returns
TODO.
- Return type
Dict[str, Any]
-
runIteration
(task, Reef, Reef_f, xb, fxb, **dparams)[source]¶ Core function of Coral Reefs Optimization algorithm.
- Parameters
task (Task) – Optimization task.
Reef (numpy.ndarray) – Current population.
Reef_f (numpy.ndarray) – Current population fitness/function value.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solution fitness/function value.
**dparams – Additional arguments
- Returns
New population.
New population fitness/function values.
New global best solution
New global best solutions fitness/objective value
Additional arguments:
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
See also
NiaPy.algorithms.basic.CoralReefsOptimization.SexualCrossover()
NiaPy.algorithms.basic.CoralReefsOptimization.Brooding()
-
setParameters
(N=25, phi=0.4, Fa=0.5, Fb=0.5, Fd=0.3, k=25, P_Cr=0.5, P_F=0.36, SexualCrossover=<function SexualCrossoverSimple>, Brooding=<function BroodingSimple>, Distance=<function euclidean>, **ukwargs)[source]¶ Set the parameters of the algorithm.
- Parameters
N (int) – population size for population initialization.
phi (int) – TODO.
Fa (float) – Value $\in [0, 1]$ for asexual reproduction size.
Fb (float) – Value $\in [0, 1]$ for brooding size.
Fd (float) – Value $\in [0, 1]$ for depredation size.
k (int) – Number of tries for larvae settling.
SexualCrossover (Callable[[numpy.ndarray, float, Task, mtrand.RandomState, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Crossover function.
P_Cr (float) – Crossover rate $\in [0, 1]$.
Brooding (Callable[[numpy.ndarray, float, Task, mtrand.RandomState, Dict[str, Any]], Tuple[numpy.ndarray, numpy.ndarray]]) – Brooding function.
P_F (float) – Mutation variable $\in [0, \infty]$.
Distance (Callable[[numpy.ndarray, numpy.ndarray], float]) – Function for calculating distance between corals.
-
setting
(X, X_f, Xn, Xn_f, xb, fxb, task)[source]¶ Operator for setting reefs.
New reefs try to settle at a selected position in the search space. A new reef settles successfully if its fitness value is better than that of the reef occupying the position, or if the position is unoccupied.
- Parameters
X (numpy.ndarray) – Current population of reefs.
X_f (numpy.ndarray) – Current populations function/fitness values.
Xn (numpy.ndarray) – New population of reefs.
Xn_f (array of float) – New populations function/fitness values.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New settled population.
New settled population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float]
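The settling operator can be sketched as follows (a simplified minimization-only version with assumed names; the actual operator also tracks the global best solution):

```python
import numpy as np

def settle(X, X_f, Xn, Xn_f, k, rnd=np.random):
    # each new larva gets k tries to claim a slot; it succeeds if it is fitter
    for larva, f in zip(Xn, Xn_f):
        for _ in range(k):
            i = rnd.randint(len(X))
            if f < X_f[i]:
                X[i], X_f[i] = larva, f
                break
    return X, X_f
```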
-
class
NiaPy.algorithms.basic.
CovarianceMatrixAdaptionEvolutionStrategy
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of (mu, lambda) evolution strategy algorithm. The algorithm is suited for dynamic environments. Mu individuals create lambda children; only the best mu children advance to the new generation, and the mu parents are discarded.
- Algorithm:
(\(\mu + \lambda\)) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Hansen, Nikolaus. “The CMA evolution strategy: A tutorial.” arXiv preprint arXiv:1604.00772 (2016).
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['CovarianceMatrixAdaptionEvolutionStrategy', 'CMA-ES', 'CMAES']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
epsilon
= 1e-20¶
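At each generation CMA-ES samples candidates from a multivariate normal distribution whose covariance matrix adapts over the run; a minimal sketch of the sampling step only (the names and the full update rules are not shown here and are assumptions):

```python
import numpy as np

rnd = np.random.RandomState(1)
D, lam, sigma = 5, 10, 0.3          # dimension, offspring count, step size
mean, C = np.zeros(D), np.eye(D)    # distribution mean and covariance
# candidate solutions: mean + sigma * N(0, C)
pop = mean + sigma * rnd.multivariate_normal(np.zeros(D), C, lam)
```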
-
class
NiaPy.algorithms.basic.
CrowdingDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Differential evolution algorithm with multiple mutation strategies.
- Algorithm:
Implementation of Differential evolution algorithm with multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['CrowdingDifferentialEvolution', 'CDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
class
NiaPy.algorithms.basic.
CuckooSearch
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of cuckoo behaviour and Lévy flights.
- Algorithm:
Cuckoo Search
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference:
Yang, Xin-She, and Suash Deb. “Cuckoo search via Lévy flights.” Nature & Biologically Inspired Computing, 2009. NaBIC 2009. World Congress on. IEEE, 2009.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['CuckooSearch', 'CS']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
-
runIteration
(task, pop, fpop, xb, fxb, pa_v, **dparams)[source]¶ Core function of CuckooSearch algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current populations fitness/function values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individual function/fitness values.
pa_v (float) – TODO
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution
New global best solutions fitness/objective value
- Additional arguments:
pa_v (float): TODO
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
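New cuckoo positions are generated with Lévy flights; a hedged sketch of one Lévy step using Mantegna's algorithm (beta near 1.5 is the usual choice in the reference paper, and the function name is an assumption):

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(D, beta=1.5, rnd=np.random):
    # Mantegna's algorithm: step = u / |v|^(1/beta), u ~ N(0, sigma^2), v ~ N(0, 1)
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rnd.normal(0, sigma, D)
    v = rnd.normal(0, 1, D)
    return u / np.abs(v) ** (1 / beta)
```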
-
class
NiaPy.algorithms.basic.
DifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Differential evolution algorithm.
- Algorithm:
Differential evolution algorithm
- Date:
2018
- Author:
Uros Mlakar and Klemen Berkovič
- License:
MIT
- Reference paper:
Storn, Rainer, and Kenneth Price. “Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces.” Journal of global optimization 11.4 (1997): 341-359.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DifferentialEvolution', 'DE']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
evolve
(pop, xb, task, **kwargs)[source]¶ Evolve population.
- Parameters
pop (numpy.ndarray) – Current population.
xb (Individual) – Current best individual.
task (Task) – Optimization task.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New evolved populations.
- Return type
numpy.ndarray
-
getParameters
()[source]¶ Get parameters values of the algorithm.
- Returns
TODO
- Return type
Dict[str, Any]
-
postSelection
(pop, task, xb, fxb, **kwargs)[source]¶ Apply additional operation after selection.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (numpy.ndarray) – Global best solution.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of Differential Evolution algorithm.
-
selection
(pop, npop, xb, fxb, task, **kwargs)[source]¶ Operator for selection.
- Parameters
- Returns
New selected individuals.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
-
setParameters
(NP=50, F=1, CR=0.8, CrossMutt=<function CrossRand1>, **ukwargs)[source]¶ Set the algorithm parameters.
- Parameters
NP (Optional[int]) – Population size.
F (Optional[float]) – Scaling factor.
CR (Optional[float]) – Crossover rate.
CrossMutt (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, list], numpy.ndarray]]) – Crossover and mutation strategy.
ukwargs (Dict[str, Any]) – Additional arguments.
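The default CrossRand1 strategy corresponds to the classic DE/rand/1/bin scheme; a self-contained sketch that is assumed to mirror, not reproduce, the library's implementation:

```python
import numpy as np

def de_rand1_bin(pop, i, F, CR, rnd=np.random):
    # DE/rand/1 mutation plus binomial crossover for target index i
    idxs = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = pop[rnd.choice(idxs, 3, replace=False)]
    v = r1 + F * (r2 - r3)                       # mutant vector
    mask = rnd.uniform(0, 1, pop.shape[1]) < CR  # binomial crossover mask
    mask[rnd.randint(pop.shape[1])] = True       # force at least one mutant gene
    return np.where(mask, v, pop[i])
```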
-
class
NiaPy.algorithms.basic.
DynNpDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Dynamic population size Differential evolution algorithm.
- Algorithm:
Dynamic population size Differential evolution algorithm
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpDifferentialEvolution', 'dynNpDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
postSelection
(pop, task, xb, fxb, **kwargs)[source]¶ Post selection operator.
In this algorithm the post selection operator decrements the population at specific iterations/generations.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
kwargs (Dict[str, Any]) – Additional arguments.
- Returns
Changed current population.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
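The population decrement can be sketched as a pairwise halving that keeps the better of each pair. This is an illustrative assumption about the reduction scheme, triggered only at specific generations in the actual algorithm:

```python
import numpy as np

def halve_population(pop, fpop):
    # keep the better individual of each (i, i + half) pair
    half = len(pop) // 2
    keep = [i if fpop[i] <= fpop[i + half] else i + half for i in range(half)]
    return pop[keep], fpop[keep]
```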
-
class
NiaPy.algorithms.basic.
DynNpMultiStrategyDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.MultiStrategyDifferentialEvolution
,NiaPy.algorithms.basic.de.DynNpDifferentialEvolution
Implementation of Dynamic population size Differential evolution algorithm with dynamic population size that is defined by the quality of population.
- Algorithm:
Dynamic population size Differential evolution algorithm with dynamic population size that is defined by the quality of population
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpMultiStrategyDifferentialEvolution', 'dynNpMsDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
evolve
(pop, xb, task, **kwargs)[source]¶ Evolve the current population.
- Parameters
pop (numpy.ndarray) – Current population.
xb (numpy.ndarray) – Global best solution.
task (Task) – Optimization task.
**kwargs (dict) – Additional arguments.
- Returns
Evolved new population.
- Return type
numpy.ndarray
-
class
NiaPy.algorithms.basic.
DynamicFireworksAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.fwa.DynamicFireworksAlgorithmGauss
Implementation of dynamic fireworks algorithm.
- Algorithm:
Dynamic Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6900485&isnumber=6900223
- Reference paper:
Zheng, A. Janecek, J. Li and Y. Tan, “Dynamic search in fireworks algorithm,” 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, 2014, pp. 3222-3229. doi: 10.1109/CEC.2014.6900485
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynamicFireworksAlgorithm', 'dynFWA']¶
-
static
algorithmInfo
()[source]¶ Get default information of algorithm.
- Returns
Basic information.
- Return type
-
class
NiaPy.algorithms.basic.
DynamicFireworksAlgorithmGauss
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.fwa.EnhancedFireworksAlgorithm
Implementation of dynamic fireworks algorithm.
- Algorithm:
Dynamic Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6900485&isnumber=6900223
- Reference paper:
Zheng, A. Janecek, J. Li and Y. Tan, “Dynamic search in fireworks algorithm,” 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, 2014, pp. 3222-3229. doi: 10.1109/CEC.2014.6900485
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Mapping
(x, task)[source]¶ Fix out of bound solution/individual.
- Parameters
x (numpy.ndarray) – Individual.
task (Task) – Optimization task.
- Returns
Fixed individual.
- Return type
numpy.ndarray
-
Name
= ['DynamicFireworksAlgorithmGauss', 'dynFWAG']¶
-
NextGeneration
(FW, FW_f, FWn, task)[source]¶ TODO.
- Parameters
FW (numpy.ndarray) – Current population.
FW_f (numpy.ndarray[float]) – Current populations function/fitness values.
FWn (numpy.ndarray) – New population.
task (Task) – Optimization task.
- Returns
New population.
New populations function/fitness values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
-
static
algorithmInfo
()[source]¶ Get default information of algorithm.
- Returns
Basic information.
- Return type
-
initAmplitude
(task)[source]¶ Initialize amplitude.
- Parameters
task (Task) – Optimization task.
- Returns
Initial amplitudes.
Amplitude for best spark.
- Return type
Tuple[numpy.ndarray, numpy.ndarray]
-
repair
(x, d, epsilon)[source]¶ Repair solution.
- Parameters
x (numpy.ndarray) – Individual.
d (numpy.ndarray) – Default value.
epsilon (float) – Limiting value.
- Returns
Fixed solution.
- Return type
numpy.ndarray
-
runIteration
(task, FW, FW_f, xb, fxb, Ah, Ab, **dparams)[source]¶ Core function of DynamicFireworksAlgorithmGauss algorithm.
- Parameters
task (Task) – Optimization task.
FW (numpy.ndarray) – Current population.
FW_f (numpy.ndarray) – Current populations function/fitness values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best fitness/function value.
Ah (Union[numpy.ndarray, float]) – TODO
Ab (Union[numpy.ndarray, float]) – TODO
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
Ah (Union[numpy.ndarray, float]): TODO
Ab (Union[numpy.ndarray, float]): TODO
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(A_cf=20, C_a=1.2, C_r=0.9, epsilon=1e-08, **ukwargs)[source]¶ Set core arguments of DynamicFireworksAlgorithmGauss.
- Parameters
See also
-
class
NiaPy.algorithms.basic.
EnhancedFireworksAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.fwa.FireworksAlgorithm
Implementation of enhanced fireworks algorithm.
- Algorithm:
Enhanced Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Zheng, A. Janecek and Y. Tan, “Enhanced Fireworks Algorithm,” 2013 IEEE Congress on Evolutionary Computation, Cancun, 2013, pp. 2069-2077. doi: 10.1109/CEC.2013.6557813
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
GaussianSpark
(x, xb, task)[source]¶ Create new individual.
- Parameters
x (numpy.ndarray) –
xb (numpy.ndarray) –
task (Task) – Optimization task.
- Returns
New individual generated by gaussian noise.
- Return type
numpy.ndarray
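The Gaussian spark pulls an individual toward the best solution with Gaussian noise; a minimal sketch following the EFWA paper's coefficient form (the function name and the omitted bound handling are assumptions):

```python
import numpy as np

def gaussian_spark(x, xb, rnd=np.random):
    # pull x toward the best individual xb, scaled by noise e ~ N(0, 1)
    e = rnd.normal(0, 1, len(x))
    return x + (xb - x) * e
```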
-
Name
= ['EnhancedFireworksAlgorithm', 'EFWA']¶
-
NextGeneration
(FW, FW_f, FWn, task)[source]¶ Generate new population.
- Parameters
FW (numpy.ndarray) – Current population.
FW_f (numpy.ndarray[float]) – Current populations fitness/function values.
FWn (numpy.ndarray) – New population.
task (Task) – Optimization task.
- Returns
New population.
New populations fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
-
static
algorithmInfo
()[source]¶ Get default information of algorithm.
- Returns
Basic information.
- Return type
-
initPopulation
(task)[source]¶ Initialize population.
- Parameters
task (Task) – Optimization task.
- Returns
Initial population.
Initial populations fitness/function values.
- Additional arguments:
Ainit (numpy.ndarray): Initial amplitude values.
Afinal (numpy.ndarray): Final amplitude values.
A_min (numpy.ndarray): Minimal amplitude values.
- Return type
See also
-
runIteration
(task, FW, FW_f, xb, fxb, Ah, Ainit, Afinal, A_min, **dparams)[source]¶ Core function of EnhancedFireworksAlgorithm algorithm.
- Parameters
task (Task) – Optimization task.
FW (numpy.ndarray) – Current population.
FW_f (numpy.ndarray[float]) – Current populations fitness/function values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individuals function/fitness value.
Ah (numpy.ndarray[float]) – Current amplitude.
Ainit (numpy.ndarray[float]) – Initial amplitude.
Afinal (numpy.ndarray[float]) – Final amplitude values.
A_min (numpy.ndarray[float]) – Minimal amplitude values.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
Ainit (numpy.ndarray): Initial amplitude values.
Afinal (numpy.ndarray): Final amplitude values.
A_min (numpy.ndarray): Minimal amplitude values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(Ainit=20, Afinal=5, **ukwargs)[source]¶ Set EnhancedFireworksAlgorithm algorithms core parameters.
- Parameters
See also
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
Ainit (Callable[[Union[int, float]], bool]): TODO
Afinal (Callable[[Union[int, float]], bool]): TODO
- Return type
Dict[str, Callable]
See also
-
class
NiaPy.algorithms.basic.
EvolutionStrategy1p1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of (1 + 1) evolution strategy algorithm. Uses just one individual.
- Algorithm:
(1 + 1) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['EvolutionStrategy1p1', 'EvolutionStrategy(1+1)', 'ES(1+1)']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
initPopulation
(task)[source]¶ Initialize starting individual.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized individual.
Initialized individual fitness/function value.
- Additional arguments:
ki (int): Number of successful rho updates.
- Return type
Tuple[Individual, float, Dict[str, Any]]
-
mutate
(x, rho)[source]¶ Mutate individual.
- Parameters
x (Individual) – Current individual.
rho (float) – Current standard deviation.
- Returns
Mutated individual.
- Return type
-
runIteration
(task, c, fpop, xb, fxb, ki, **dparams)[source]¶ Core function of EvolutionStrategy(1+1) algorithm.
- Parameters
task (Task) – Optimization task.
pop (Individual) – Current position.
fpop (float) – Current position function/fitness value.
xb (Individual) – Global best position.
fxb (float) – Global best function/fitness value.
ki (int) – Number of successful updates before rho update.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
Initialized individual.
Initialized individual fitness/function value.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
ki (int): Number of successful rho updates.
- Return type
Tuple[Individual, float, Individual, float, Dict[str, Any]]
-
setParameters
(mu=1, k=10, c_a=1.1, c_r=0.5, epsilon=1e-20, **ukwargs)[source]¶ Set the arguments of an algorithm.
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
mu (Callable[[int], bool])
k (Callable[[int], bool])
c_a (Callable[[Union[float, int]], bool])
c_r (Callable[[Union[float, int]], bool])
epsilon (Callable[[float], bool])
- Return type
Dict[str, Callable]
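The parameters above drive Rechenberg's 1/5 success rule: after k trials the mutation strength rho grows by c_a when more than a fifth of the mutations succeeded and shrinks by c_r otherwise. A sketch under those assumptions:

```python
def update_rho(rho, ki, k, c_a=1.1, c_r=0.5):
    # Rechenberg's 1/5 success rule: ki successful mutations out of k trials
    ratio = ki / k
    if ratio > 0.2:
        return rho * c_a  # too many successes: widen the search
    if ratio < 0.2:
        return rho * c_r  # too few successes: narrow the search
    return rho
```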
-
class
NiaPy.algorithms.basic.
EvolutionStrategyML
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.es.EvolutionStrategyMpL
Implementation of (mu, lambda) evolution strategy algorithm. The algorithm is suited for dynamic environments. Mu individuals create lambda children; only the best mu children advance to the new generation, and the mu parents are discarded.
- Algorithm:
(\(\mu, \lambda\)) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
See also
NiaPy.algorithm.basic.es.EvolutionStrategyMpL
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['EvolutionStrategyML', 'EvolutionStrategy(mu,lambda)', 'ES(m,l)']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
initPopulation
(task)[source]¶ Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations fitness/function values.
Additional arguments.
- Return type
Tuple[numpy.ndarray[Individual], numpy.ndarray[float], Dict[str, Any]]
See also
NiaPy.algorithm.basic.es.EvolutionStrategyMpL.initPopulation()
-
newPop
(pop)[source]¶ Return new population.
- Parameters
pop (numpy.ndarray) – Current population.
- Returns
New population.
- Return type
numpy.ndarray
-
runIteration
(task, c, fpop, xb, fxb, **dparams)[source]¶ Core function of EvolutionStrategyML algorithm.
- Parameters
task (Task) – Optimization task.
c (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current population fitness/function values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individuals fitness/function value.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
EvolutionStrategyMp1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.es.EvolutionStrategy1p1
Implementation of (mu + 1) evolution strategy algorithm. The algorithm creates mu mutants, but only one individual advances to the new generation.
- Algorithm:
(\(\mu + 1\)) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['EvolutionStrategyMp1', 'EvolutionStrategy(mu+1)', 'ES(m+1)']¶
-
class
NiaPy.algorithms.basic.
EvolutionStrategyMpL
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.es.EvolutionStrategy1p1
Implementation of (mu + lambda) evolution strategy algorithm. Mutation creates lambda individuals. The lambda individuals compete with the mu individuals for survival, so only mu individuals advance to the new generation.
- Algorithm:
(\(\mu + \lambda\)) Evolution Strategy Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['EvolutionStrategyMpL', 'EvolutionStrategy(mu+lambda)', 'ES(m+l)']¶
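The (mu + lambda) survivor selection described above can be sketched as follows. This is a minimal illustration of the selection rule, not NiaPy's implementation; the function name and array layout are assumptions.

```python
import numpy as np

def mu_plus_lambda_select(parents, parents_f, offspring, offspring_f, mu):
    """Merge parents and offspring, then keep the mu fittest individuals.

    Lambda mutants compete with the mu parents; only the mu solutions
    with the lowest fitness (minimization) survive into the new generation.
    """
    pool = np.concatenate((parents, offspring))
    pool_f = np.concatenate((parents_f, offspring_f))
    order = np.argsort(pool_f)[:mu]  # indices of the mu best solutions
    return pool[order], pool_f[order]
```

Note that the pool is sorted jointly, so a parent survives only if fewer than mu offspring beat it.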
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
changeCount
(c, cn)[source]¶ Update number of successful mutations for population.
- Parameters
c (numpy.ndarray[Individual]) – Current population.
cn (numpy.ndarray[Individual]) – New population.
- Returns
Number of successful mutations.
- Return type
-
initPopulation
(task)[source]¶ Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness values.
- Additional arguments:
ki (int): Number of successful mutations.
- Return type
Tuple[numpy.ndarray[Individual], numpy.ndarray[float], Dict[str, Any]]
See also
NiaPy.algorithms.algorithm.Algorithm.initPopulation()
-
mutateRand
(pop, task)[source]¶ Mutate a random individual from the population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
task (Task) – Optimization task.
- Returns
Random individual from population that was mutated.
- Return type
numpy.ndarray
-
runIteration
(task, c, fpop, xb, fxb, ki, **dparams)[source]¶ Core function of EvolutionStrategyMpL algorithm.
- Parameters
task (Task) – Optimization task.
c (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current populations fitness/function values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individuals fitness/function value.
ki (int) – Number of successful mutations.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations function/fitness values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
ki (int): Number of successful mutations.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(lam=45, **ukwargs)[source]¶ Set the arguments of an algorithm.
- Parameters
lam (int) – Number of new individuals generated by mutation.
See also
NiaPy.algorithms.basic.es.EvolutionStrategy1p1.setParameters()
-
static
typeParameters
()[source]¶ TODO.
- Returns
lam (Callable[[int], bool]): TODO.
- Return type
Dict[str, Any]
-
updateRho
(pop, k)[source]¶ Update standard deviation for population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
k (int) – Number of successful mutations.
-
class
NiaPy.algorithms.basic.
FireflyAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Firefly algorithm.
- Algorithm:
Firefly algorithm
- Date:
2016
- Authors:
Iztok Fister Jr, Iztok Fister and Klemen Berkovič
- License:
MIT
- Reference paper:
Fister, I., Fister Jr, I., Yang, X. S., & Brest, J. (2013). A comprehensive review of firefly algorithms. Swarm and Evolutionary Computation, 13, 34-46.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['FireflyAlgorithm', 'FA']¶
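The firefly movement rule from the referenced paper can be sketched as below. This is an illustrative single-firefly step under the standard FA update, not the library's `move_ffa` signature; parameter defaults are assumptions.

```python
import numpy as np

def move_firefly(xi, xj, alpha=0.5, beta0=1.0, gamma=1.0, rnd=np.random):
    """Move firefly xi toward the brighter firefly xj.

    Attraction decays with the squared distance, beta0 * exp(-gamma * r^2),
    and a random step scaled by alpha keeps the search stochastic.
    """
    r2 = np.sum((xi - xj) ** 2)
    beta = beta0 * np.exp(-gamma * r2)
    return xi + beta * (xj - xi) + alpha * (rnd.random_sample(len(xi)) - 0.5)
```

With gamma = 0 and alpha = 0 the move degenerates to jumping directly onto the brighter firefly, which makes the attraction term easy to verify.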
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
move_ffa
(i, Fireflies, Intensity, oFireflies, alpha, task)[source]¶ Move fireflies.
- Parameters
- Returns
New individual
True
if individual was moved,False
if individual was not moved
- Return type
Tuple[numpy.ndarray, bool]
-
runIteration
(task, Fireflies, Intensity, xb, fxb, alpha, **dparams)[source]¶ Core function of Firefly Algorithm.
- Parameters
task (Task) – Optimization task.
Fireflies (numpy.ndarray) – Current population.
Intensity (numpy.ndarray) – Current population function/fitness values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individual fitness/function value.
alpha (float) – TODO.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution
New global best solutions fitness/objective value
- Additional arguments:
alpha (float): TODO
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
FireworksAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of fireworks algorithm.
- Algorithm:
Fireworks Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Tan, Ying. “Fireworks algorithm.” Heidelberg, Germany: Springer 10 (2015): 978-3
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
ExplodeSpark
(x, A, task)[source]¶ Explode a spark.
- Parameters
x (numpy.ndarray) – Individual creating the spark.
A (numpy.ndarray) – Amplitude of spark.
task (Task) – Optimization task.
- Returns
Sparks exploded with the specified amplitude.
- Return type
numpy.ndarray
-
GaussianSpark
(x, task)[source]¶ Create gaussian spark.
- Parameters
x (numpy.ndarray) – Individual creating a spark.
task (Task) – Optimization task.
- Returns
Spark exploded based on gaussian amplitude.
- Return type
numpy.ndarray
-
Mapping
(x, task)[source]¶ Fix value to bounds.
- Parameters
x (numpy.ndarray) – Individual to fix.
task (Task) – Optimization task.
- Returns
Individual in search range.
- Return type
numpy.ndarray
-
Name
= ['FireworksAlgorithm', 'FWA']¶
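The explosion-and-mapping behaviour of `ExplodeSpark` and `Mapping` can be sketched as one helper. This is an assumption-laden illustration of the common FWA spark rule (perturb a random subset of dimensions, map out-of-range coordinates back by random reinitialization), not the library's exact code.

```python
import numpy as np

def explode_spark(x, A, lower, upper, rnd=np.random):
    """Generate one explosion spark around firework x.

    A random subset of dimensions is shifted by a uniform offset within
    amplitude A; coordinates that leave the search range are mapped back
    into bounds by random reinitialization (one common mapping rule).
    """
    y = x.copy()
    mask = rnd.random_sample(len(x)) < 0.5  # dimensions to perturb
    y[mask] += A * (2 * rnd.random_sample(np.count_nonzero(mask)) - 1)
    out = (y < lower) | (y > upper)
    y[out] = lower[out] + rnd.random_sample(np.count_nonzero(out)) * (upper[out] - lower[out])
    return y
```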
-
NextGeneration
(FW, FW_f, FWn, task)[source]¶ Generate new generation of individuals.
- Parameters
FW (numpy.ndarray) – Current population.
FW_f (numpy.ndarray[float]) – Current populations fitness/function values.
FWn (numpy.ndarray) – New population.
task (Task) – Optimization task.
- Returns
New population.
New populations fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
-
R
(x, FW)[source]¶ Calculate ranges.
- Parameters
x (numpy.ndarray) – Individual in population.
FW (numpy.ndarray) – Current population.
- Returns
Ranges values.
- Return type
numpy.ndarray[float]
-
static
algorithmInfo
()[source]¶ Get default information of algorithm.
- Returns
Basic information.
- Return type
-
initAmplitude
(task)[source]¶ Initialize amplitudes for dimensions.
- Parameters
task (Task) – Optimization task.
- Returns
Starting amplitudes.
- Return type
numpy.ndarray[float]
-
initPopulation
(task)[source]¶ Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness values.
- Additional arguments:
Ah (numpy.ndarray): Initialized amplitudes.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.initPopulation()
-
runIteration
(task, FW, FW_f, xb, fxb, Ah, **dparams)[source]¶ Core function of Fireworks algorithm.
- Parameters
task (Task) – Optimization task.
FW (numpy.ndarray) – Current population.
FW_f (numpy.ndarray[float]) – Current populations function/fitness values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individuals fitness/function value.
Ah (numpy.ndarray) – Current amplitudes.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
Initialized population.
Initialized populations function/fitness values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
Ah (numpy.ndarray): Initialized amplitudes.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
FishSchoolSearch
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Fish School Search algorithm.
- Algorithm:
Fish School Search algorithm
- Date:
2019
- Authors:
Clodomir Santana Jr, Elliackin Figueredo, Mariana Maceds, Pedro Santos. Ported to NiaPy with small changes by Kristian Järvenpää (2018). Ported to the NiaPy 2.0 by Klemen Berkovič (2019).
- License:
MIT
- Reference paper:
Bastos Filho, Lima Neto, Lins, D. O. Nascimento and P. Lima, “A novel search algorithm based on fish school behavior,” in 2008 IEEE International Conference on Systems, Man and Cybernetics, Oct 2008, pp. 2646–2651.
- Variables
Name (List[str]) – List of strings representing algorithm name.
SI_init (int) – Length of initial individual step.
SI_final (int) – Length of final individual step.
SV_init (int) – Length of initial volatile step.
SV_final (int) – Length of final volatile step.
min_w (float) – Minimum weight of a fish.
w_scale (float) – Maximum weight of a fish.
See also
NiaPy.algorithms.algorithm.Algorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['FSS', 'FishSchoolSearch']¶
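The `feeding` operator can be sketched as below: fish gain weight in proportion to their fitness improvement, normalized by the largest improvement in the school. The function name and clipping behaviour are illustrative assumptions based on the reference paper, not NiaPy's exact code.

```python
import numpy as np

def feed_school(weights, delta_f, min_w, w_scale):
    """Feeding operator: each fish's weight changes in proportion to its
    fitness improvement, normalized by the largest improvement in the
    school.  Weights are clipped to [min_w, w_scale].
    """
    max_delta = np.max(np.abs(delta_f))
    if max_delta == 0:
        return weights.copy()  # no fish improved this iteration
    return np.clip(weights + delta_f / max_delta, min_w, w_scale)
```

A fish that achieved the largest improvement gains a full unit of weight (before clipping), while fish that worsened lose weight.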
-
static
algorithmInfo
()[source]¶ Get default information of algorithm.
- Returns
Basic information.
- Return type
-
calculate_barycenter
(school, task)[source]¶ Calculate barycenter of fish school.
- Parameters
school (numpy.ndarray) – Current school fish.
task (Task) – Optimization task.
- Returns
TODO.
- Return type
numpy.ndarray
-
collective_instinctive_movement
(school, task)[source]¶ Perform collective instinctive movement.
- Parameters
school (numpy.ndarray) – Current population.
task (Task) – Optimization task.
- Returns
New population.
- Return type
numpy.ndarray
-
collective_volitive_movement
(school, curr_step_volitive, prev_weight_school, curr_weight_school, xb, fxb, task)[source]¶ Perform collective volitive movement.
- Parameters
school (numpy.ndarray) –
curr_step_volitive –
prev_weight_school –
curr_weight_school –
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New population.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
-
feeding
(school)[source]¶ Feed all fishes.
- Parameters
school (numpy.ndarray) – Current school fish population.
- Returns
New school fish population.
- Return type
numpy.ndarray
-
generate_uniform_coordinates
(task)[source]¶ Return Numpy array with uniform distribution.
- Parameters
task (Task) – Optimization task.
- Returns
Array with uniform distribution.
- Return type
numpy.ndarray
-
individual_movement
(school, curr_step_individual, xb, fxb, task)[source]¶ Perform individual movement for each fish.
- Parameters
school (numpy.ndarray) – School fish population.
curr_step_individual (numpy.ndarray) – TODO
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions fitness/objective value.
task (Task) – Optimization task.
- Returns
New school of fishes.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
-
initPopulation
(task)[source]¶ Initialize the school.
- Parameters
task (Task) – Optimization task.
- Returns
TODO.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, dict]
-
init_fish
(pos, task)[source]¶ Create a new fish at a given position.
- Parameters
pos –
task (Task) –
Returns:
-
max_delta_cost
(school)[source]¶ Find maximum delta cost; return 0 if none of the fish moved.
- Parameters
school (numpy.ndarray) –
Returns:
-
runIteration
(task, school, fschool, xb, fxb, curr_step_individual, curr_step_volitive, curr_weight_school, prev_weight_school, **dparams)[source]¶ Core function of algorithm.
- Parameters
task (Task) –
school (numpy.ndarray) –
fschool (numpy.ndarray) –
xb (numpy.ndarray) –
fxb (float) –
curr_step_individual –
curr_step_volitive –
curr_weight_school –
prev_weight_school –
**dparams –
- Returns
TODO.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, dict]
-
setParameters
(NP=25, SI_init=3, SI_final=10, SV_init=3, SV_final=13, min_w=0.3, w_scale=0.7, **ukwargs)[source]¶ Set core arguments of FishSchoolSearch algorithm.
- Parameters
NP (Optional[int]) – Number of fishes in school.
SI_init (Optional[int]) – Length of initial individual step.
SI_final (Optional[int]) – Length of final individual step.
SV_init (Optional[int]) – Length of initial volatile step.
SV_final (Optional[int]) – Length of final volatile step.
min_w (Optional[float]) – Minimum weight of a fish.
w_scale (Optional[float]) – Maximum weight of a fish.
-
total_school_weight
(school, prev_weight_school, curr_weight_school)[source]¶ Calculate and update current weight of fish school.
- Parameters
school (numpy.ndarray) –
prev_weight_school (numpy.ndarray) –
curr_weight_school (numpy.ndarray) –
- Returns
TODO.
- Return type
Tuple[numpy.ndarray, numpy.ndarray]
-
class
NiaPy.algorithms.basic.
FlowerPollinationAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Flower Pollination algorithm.
- Algorithm:
Flower Pollination algorithm
- Date:
2018
- Authors:
Dusan Fister, Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Yang, Xin-She. “Flower pollination algorithm for global optimization.” International conference on unconventional computing and natural computation. Springer, Berlin, Heidelberg, 2012.
- References URL:
Implementation is based on the following MATLAB code: https://www.mathworks.com/matlabcentral/fileexchange/45112-flower-pollination-algorithm?requestedDomain=true
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['FlowerPollinationAlgorithm', 'FPA']¶
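The pollination step for a single flower can be sketched as below. This is an illustration under stated assumptions: the paper uses a Lévy-distributed step length for global pollination, which is approximated here by a standard normal draw; the function name and switch probability default are not NiaPy's API.

```python
import numpy as np

def pollinate(x, xb, pop, p=0.8, rnd=np.random):
    """One pollination step for flower x (a sketch, not the library's
    exact update).  With probability p a global step toward the best
    flower xb is taken (Levy step length in the paper, approximated here
    by a standard normal); otherwise a local step mixes two random
    flowers from the population.
    """
    if rnd.random_sample() < p:
        L = rnd.standard_normal(len(x))  # stand-in for a Levy draw
        return x + L * (xb - x)
    j, k = rnd.randint(len(pop)), rnd.randint(len(pop))
    return x + rnd.random_sample() * (pop[j] - pop[k])
```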
-
static
algorithmInfo
()[source]¶ Get default information of algorithm.
- Returns
Basic information.
- Return type
-
initPopulation
(task)[source]¶ Initialize starting population of optimization algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
New population.
New population fitness values.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
-
repair
(x, task)[source]¶ Repair solution to search space.
- Parameters
x (numpy.ndarray) – Solution to fix.
task (Task) – Optimization task.
- Returns
Fixed solution.
- Return type
numpy.ndarray
-
runIteration
(task, Sol, Sol_f, xb, fxb, S, **dparams)[source]¶ Core function of FlowerPollinationAlgorithm algorithm.
-
class
NiaPy.algorithms.basic.
ForestOptimizationAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Forest Optimization Algorithm.
- Algorithm:
Forest Optimization Algorithm
- Date:
2019
- Authors:
Luka Pečnik
- License:
MIT
- Reference paper:
Manizheh Ghaemi, Mohammad-Reza Feizi-Derakhshi, Forest Optimization Algorithm, Expert Systems with Applications, Volume 41, Issue 15, 2014, Pages 6676-6687, ISSN 0957-4174, https://doi.org/10.1016/j.eswa.2014.05.009.
- References URL:
Implementation is based on the following MATLAB code: https://github.com/cominsys/FOA
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['ForestOptimizationAlgorithm', 'FOA']¶
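The `localSeeding` stage can be sketched as follows: each zero-age tree spawns `lsc` neighbours, each differing in one randomly chosen dimension. The offset range of a tenth of the search range per dimension is an assumption for illustration (the paper leaves the local-seeding step size as a tunable parameter), and the function name is not NiaPy's API.

```python
import numpy as np

def local_seeding(trees, lower, upper, lsc=1, rnd=np.random):
    """Local seeding: every zero-age tree spawns lsc neighbours, each
    differing in one randomly chosen dimension by a small uniform offset.
    The offset range (a tenth of the per-dimension search range) is an
    illustrative assumption.
    """
    dx = (upper - lower) * 0.1
    seeds = []
    for tree in trees:
        for _ in range(lsc):
            s = tree.copy()
            d = rnd.randint(len(tree))  # dimension to perturb
            s[d] = np.clip(s[d] + rnd.uniform(-dx[d], dx[d]), lower[d], upper[d])
            seeds.append(s)
    return np.asarray(seeds)
```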
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
getParameters
()[source]¶ Get parameters values of the algorithm.
- Returns
TODO.
- Return type
Dict[str, Any]
-
globalSeeding
(task, candidates, size)[source]¶ Global optimum search stage that should prevent getting stuck in a local optimum.
- Parameters
task (Task) – Optimization task.
candidates (numpy.ndarray) – Candidate population for global seeding.
size (int) – Number of trees to produce.
- Returns
Resulting trees.
- Return type
numpy.ndarray
-
localSeeding
(task, trees)[source]¶ Local optimum search stage.
- Parameters
task (Task) – Optimization task.
trees (numpy.ndarray) – Zero age trees for local seeding.
- Returns
Resulting zero age trees.
- Return type
numpy.ndarray
-
removeLifeTimeExceeded
(trees, candidates, age)[source]¶ Remove dead trees.
- Parameters
trees (numpy.ndarray) – Population to test.
candidates (numpy.ndarray) – Candidate population array to be updated.
age (numpy.ndarray[int32]) – Age of trees.
- Returns
Alive trees.
New candidate population.
Age of trees.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray[int32]]
-
runIteration
(task, Trees, Evaluations, xb, fxb, age, **dparams)[source]¶ Core function of Forest Optimization Algorithm.
- Parameters
task (Task) – Optimization task.
Trees (numpy.ndarray) – Current population.
Evaluations (numpy.ndarray[float]) – Current population function/fitness values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individual fitness/function value.
age (numpy.ndarray[int32]) – Age of trees.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
- Additional arguments:
age (numpy.ndarray[int32]): Age of trees.
- Return type
-
setParameters
(NP=10, lt=3, al=10, lsc=1, gsc=1, tr=0.3, **ukwargs)[source]¶ Set the parameters of the algorithm.
- Parameters
NP (Optional[int]) – Population size.
lt (Optional[int]) – Life time parameter.
al (Optional[int]) – Area limit parameter.
lsc (Optional[int]) – Local seeding changes parameter.
gsc (Optional[int]) – Global seeding changes parameter.
tr (Optional[float]) – Transfer rate parameter.
ukwargs (Dict[str, Any]) – Additional arguments.
-
survivalOfTheFittest
(task, trees, candidates, age)[source]¶ Evaluate and filter current population.
- Parameters
task (Task) – Optimization task.
trees (numpy.ndarray) – Population to evaluate.
candidates (numpy.ndarray) – Candidate population array to be updated.
age (numpy.ndarray[int32]) – Age of trees.
- Returns
Trees sorted by fitness value.
Updated candidate population.
Population fitness values.
Age of trees
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray[float], numpy.ndarray[int32]]
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
lt (Callable[[int], bool]): Checks if life time parameter has a proper value.
al (Callable[[int], bool]): Checks if area limit parameter has a proper value.
lsc (Callable[[int], bool]): Checks if local seeding changes parameter has a proper value.
gsc (Callable[[int], bool]): Checks if global seeding changes parameter has a proper value.
tr (Callable[[float], bool]): Checks if transfer rate parameter has a proper value.
- Return type
Dict[str, Callable]
See also
NiaPy.algorithms.algorithm.Algorithm.typeParameters()
-
class
NiaPy.algorithms.basic.
GeneticAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Genetic Algorithm.
- Algorithm:
Genetic algorithm
- Date:
2018
- Author:
Klemen Berkovič
- Reference paper:
Goldberg, David (1989). Genetic Algorithms in Search, Optimization and Machine Learning. Reading, MA: Addison-Wesley Professional.
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm name.
Ts (int) – Tournament size.
Mr (float) – Mutation rate.
Cr (float) – Crossover rate.
Selection (Callable[[numpy.ndarray[Individual], int, int, Individual, mtrand.RandomState], Individual]) – Selection operator.
Crossover (Callable[[numpy.ndarray[Individual], int, float, mtrand.RandomState], Individual]) – Crossover operator.
Mutation (Callable[[numpy.ndarray[Individual], int, float, Task, mtrand.RandomState], Individual]) – Mutation operator.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['GeneticAlgorithm', 'GA']¶
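The default operators listed above (tournament selection and uniform crossover) can be sketched as below. These are minimal illustrations of the operators' logic under a minimization convention, not the signatures of NiaPy's `TournamentSelection` and `UniformCrossover`.

```python
import numpy as np

def tournament_selection(pop, fpop, ts, rnd=np.random):
    """Pick ts distinct random contestants and return a copy of the
    fittest one (lowest fitness, minimization)."""
    ids = rnd.choice(len(pop), ts, replace=False)
    return pop[ids[np.argmin(fpop[ids])]].copy()

def uniform_crossover(a, b, cr, rnd=np.random):
    """Build a child gene by gene: take each gene from parent b with
    probability cr, otherwise from parent a."""
    mask = rnd.random_sample(len(a)) < cr
    return np.where(mask, b, a)
```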
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of GeneticAlgorithm algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current populations fitness/function values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individuals function/fitness value.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations function/fitness values.
New global best solution
New global best solutions fitness/objective value
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(NP=25, Ts=5, Mr=0.25, Cr=0.25, Selection=<function TournamentSelection>, Crossover=<function UniformCrossover>, Mutation=<function UniformMutation>, **ukwargs)[source]¶ Set the parameters of the algorithm.
- Parameters
NP (Optional[int]) – Population size.
Ts (Optional[int]) – Tournament selection.
Mr (Optional[int]) – Mutation rate.
Cr (Optional[float]) – Crossover rate.
Selection (Optional[Callable[[numpy.ndarray[Individual], int, int, Individual, mtrand.RandomState], Individual]]) – Selection operator.
Crossover (Optional[Callable[[numpy.ndarray[Individual], int, float, mtrand.RandomState], Individual]]) – Crossover operator.
Mutation (Optional[Callable[[numpy.ndarray[Individual], int, float, Task, mtrand.RandomState], Individual]]) – Mutation operator.
See also
- Selection:
NiaPy.algorithms.basic.TournamentSelection()
NiaPy.algorithms.basic.RouletteSelection()
- Crossover:
NiaPy.algorithms.basic.UniformCrossover()
NiaPy.algorithms.basic.TwoPointCrossover()
NiaPy.algorithms.basic.MultiPointCrossover()
NiaPy.algorithms.basic.CrossoverUros()
- Mutations:
NiaPy.algorithms.basic.UniformMutation()
NiaPy.algorithms.basic.CreepMutation()
NiaPy.algorithms.basic.MutationUros()
-
class
NiaPy.algorithms.basic.
GlowwormSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
- Variables
Name (List[str]) – List of strings representing algorithm name.
l0 (float) – Initial luciferin quantity for each glowworm.
nt (float) –
rs (float) – Maximum sensing range.
rho (float) – Luciferin decay constant.
gamma (float) – Luciferin enhancement constant.
beta (float) –
s (float) –
Distance (Callable[[numpy.ndarray, numpy.ndarray], float]) – Measure distance between two individuals.
See also
NiaPy.algorithms.algorithm.Algorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['GlowwormSwarmOptimization', 'GSO']¶
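The luciferin dynamics behind the rho and gamma parameters can be sketched in one line: old luciferin decays by rho and is replenished in proportion (gamma) to the fitness of the glowworm's current position. A minimal illustration (function name assumed, not NiaPy's API):

```python
import numpy as np

def update_luciferin(L, fitness, rho=0.4, gamma=0.6):
    """Luciferin update: each glowworm's luciferin decays by factor rho
    and is replenished by gamma times the fitness of its current
    position (per the referenced GSO formulation)."""
    return (1 - rho) * L + gamma * fitness
```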
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information.
- Return type
-
initPopulation
(task)[source]¶ Initialize population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population of glowworms.
Initialized populations function/fitness values.
- Additional arguments:
L (numpy.ndarray): TODO.
R (numpy.ndarray): TODO.
rs (numpy.ndarray): TODO.
- Return type
-
runIteration
(task, GS, GS_f, xb, fxb, L, R, rs, **dparams)[source]¶ Core function of GlowwormSwarmOptimization algorithm.
- Parameters
task (Task) – Optimization task.
GS (numpy.ndarray) – Current population.
GS_f (numpy.ndarray) – Current populations fitness/function values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individuals function/fitness value.
L (numpy.ndarray) –
R (numpy.ndarray) –
rs (numpy.ndarray) –
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
Initialized population of glowworms.
Initialized populations function/fitness values.
New global best solution
New global best solutions fitness/objective value.
- Additional arguments:
L (numpy.ndarray): TODO.
R (numpy.ndarray): TODO.
rs (numpy.ndarray): TODO.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(n=25, l0=5, nt=5, rho=0.4, gamma=0.6, beta=0.08, s=0.03, Distance=<function euclidean>, **ukwargs)[source]¶ Set the arguments of an algorithm.
- Parameters
n (Optional[int]) – Number of glowworms in population.
l0 (Optional[float]) – Initial luciferin quantity for each glowworm.
nt (Optional[float]) –
rs (Optional[float]) – Maximum sensing range.
rho (Optional[float]) – Luciferin decay constant.
gamma (Optional[float]) – Luciferin enhancement constant.
beta (Optional[float]) –
s (Optional[float]) –
Distance (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – Measure distance between two individuals.
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
n (Callable[[int], bool])
l0 (Callable[[Union[float, int]], bool])
nt (Callable[[Union[float, int]], bool])
rho (Callable[[Union[float, int]], bool])
gamma (Callable[[float], bool])
beta (Callable[[float], bool])
s (Callable[[float], bool])
- Return type
Dict[str, Callable]
-
class
NiaPy.algorithms.basic.
GlowwormSwarmOptimizationV1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.gso.GlowwormSwarmOptimization
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['GlowwormSwarmOptimizationV1', 'GSOv1']¶
-
class
NiaPy.algorithms.basic.
GlowwormSwarmOptimizationV2
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.gso.GlowwormSwarmOptimization
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['GlowwormSwarmOptimizationV2', 'GSOv2']¶
-
class
NiaPy.algorithms.basic.
GlowwormSwarmOptimizationV3
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.gso.GlowwormSwarmOptimization
Implementation of glowworm swarm optimization.
- Algorithm:
Glowworm Swarm Optimization Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['GlowwormSwarmOptimizationV3', 'GSOv3']¶
-
class
NiaPy.algorithms.basic.
GravitationalSearchAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Gravitational Search Algorithm.
- Algorithm:
Gravitational Search Algorithm
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Esmat Rashedi, Hossein Nezamabadi-pour, Saeid Saryazdi, GSA: A Gravitational Search Algorithm, Information Sciences, Volume 179, Issue 13, 2009, Pages 2232-2248, ISSN 0020-0255
See also
NiaPy.algorithms.algorithm.Algorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['GravitationalSearchAlgorithm', 'GSA']¶
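The mass assignment at the heart of GSA can be sketched as below: masses are derived from fitness so that the best agent is heaviest and the worst has mass zero, then normalized to sum to one. This is an illustration of the rule from the referenced paper (minimization convention); the function name is an assumption, not NiaPy's API.

```python
import numpy as np

def masses(fitness):
    """Normalized GSA masses from fitness values (minimization): the
    best agent receives the largest mass, the worst receives zero, and
    masses are normalized to sum to one."""
    best, worst = np.min(fitness), np.max(fitness)
    if best == worst:
        return np.full(len(fitness), 1.0 / len(fitness))  # all equal
    m = (fitness - worst) / (best - worst)
    return m / np.sum(m)
```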
-
static
algorithmInfo
()[source]¶ Get algorithm information.
- Returns
Algorithm information.
- Return type
-
getParameters
()[source]¶ Get algorithm parameters values.
- Returns
TODO.
- Return type
Dict[str, Any]
See also
NiaPy.algorithms.algorithm.Algorithm.getParameters()
-
initPopulation
(task)[source]¶ Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations fitness/function values.
- Additional arguments:
v (numpy.ndarray[float]): TODO
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.initPopulation()
-
runIteration
(task, X, X_f, xb, fxb, v, **dparams)[source]¶ Core function of GravitationalSearchAlgorithm algorithm.
- Parameters
task (Task) – Optimization task.
X (numpy.ndarray) – Current population.
X_f (numpy.ndarray) – Current populations fitness/function values.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best fitness/function value.
v (numpy.ndarray) – TODO
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New populations fitness/function values.
New global best solution
New global best solutions fitness/objective value
- Additional arguments:
v (numpy.ndarray): TODO
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
GreyWolfOptimizer
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Grey wolf optimizer.
- Algorithm:
Grey wolf optimizer
- Date:
2018
- Author:
Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Mirjalili, Seyedali, Seyed Mohammad Mirjalili, and Andrew Lewis. “Grey wolf optimizer.” Advances in engineering software 69 (2014): 46-61.
Grey Wolf Optimizer (GWO) source code version 1.0 (MATLAB) from MathWorks
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['GreyWolfOptimizer', 'GWO']¶
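The GWO position update encircles the three leading wolves (alpha, beta, delta in the paper; A, B, D in the method signature below). A minimal sketch of the rule from the reference paper, with the control coefficient `a` passed in explicitly (in the paper it decreases linearly from 2 to 0 over the run); function name and layout are assumptions, not NiaPy's API.

```python
import numpy as np

def gwo_step(x, alpha, beta, delta, a, rnd=np.random):
    """Move a wolf toward the average of three position estimates, one
    per leader.  Coefficient a shifts the search from exploration
    (large a) to exploitation (a near 0)."""
    estimates = []
    for leader in (alpha, beta, delta):
        r1 = rnd.random_sample(len(x))
        r2 = rnd.random_sample(len(x))
        A, C = 2 * a * r1 - a, 2 * r2
        estimates.append(leader - A * np.abs(C * leader - x))
    return np.mean(estimates, axis=0)
```

With a = 0 each estimate collapses onto its leader, so the wolf jumps to the leaders' centroid.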
-
static
algorithmInfo
()[source]¶ Get algorithm information.
- Returns
Algorithm information.
- Return type
-
initPopulation
(task)[source]¶ Initialize population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations fitness/function values.
- Additional arguments:
A (): TODO
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
-
runIteration
(task, pop, fpop, xb, fxb, A, A_f, B, B_f, D, D_f, **dparams)[source]¶ Core function of GreyWolfOptimizer algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current populations function/fitness values.
xb (numpy.ndarray) –
fxb (float) –
A (numpy.ndarray) –
A_f (float) –
B (numpy.ndarray) –
B_f (float) –
D (numpy.ndarray) –
D_f (float) –
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population
New population fitness/function values
- Additional arguments:
A (): TODO
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
HarmonySearch
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of harmony search algorithm.
- Algorithm:
Harmony Search Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://link.springer.com/chapter/10.1007/978-3-642-00185-7_1
- Reference paper:
Geem, Z. W., Kim, J. H., & Loganathan, G. V. (2001). A new heuristic optimization algorithm: harmony search. Simulation, 76(2), 60-68.
- Variables
See also
NiaPy.algorithms.algorithm.Algorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['HarmonySearch', 'HS']¶
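The `improvize` step can be sketched as below: each component of the new harmony is either recalled from harmony memory, optionally pitch-adjusted within a bandwidth, or drawn at random. The parameter names (r_accept for HMCR, r_pa for PAR) and defaults are illustrative assumptions, not NiaPy's exact signature.

```python
import numpy as np

def improvise(HM, lower, upper, r_accept=0.7, r_pa=0.35, bw=0.02, rnd=np.random):
    """Improvise one new harmony: each component is recalled from
    harmony memory with rate r_accept (and then pitch-adjusted within
    bandwidth bw with rate r_pa), otherwise drawn uniformly at random
    from the search range."""
    D = HM.shape[1]
    x = np.empty(D)
    for d in range(D):
        if rnd.random_sample() < r_accept:
            x[d] = HM[rnd.randint(len(HM)), d]  # recall from memory
            if rnd.random_sample() < r_pa:
                x[d] += bw * rnd.uniform(-1, 1) * (upper[d] - lower[d])
        else:
            x[d] = rnd.uniform(lower[d], upper[d])  # random consideration
    return np.clip(x, lower, upper)
```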
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
bw
(task)[source]¶ Get bandwidth.
- Parameters
task (Task) – Optimization task.
- Returns
Bandwidth.
- Return type
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
-
improvize
(HM, task)[source]¶ Create new individual.
- Parameters
HM (numpy.ndarray) – Current population.
task (Task) – Optimization task.
- Returns
New individual.
- Return type
numpy.ndarray
-
initPopulation
(task)[source]¶ Initialize first population.
- Parameters
task (Task) – Optimization task.
- Returns
New harmony/population.
New population fitness/function values.
Additional parameters.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.initPopulation()
-
runIteration
(task, HM, HM_f, xb, fxb, **dparams)[source]¶ Core function of HarmonySearch algorithm.
-
class
NiaPy.algorithms.basic.
HarmonySearchV1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.hs.HarmonySearch
Implementation of harmony search algorithm.
- Algorithm:
Harmony Search Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://link.springer.com/chapter/10.1007/978-3-642-00185-7_1
- Reference paper:
Yang, Xin-She. “Harmony search as a metaheuristic algorithm.” Music-inspired harmony search algorithm. Springer, Berlin, Heidelberg, 2009. 1-14.
- Variables
See also
NiaPy.algorithms.basic.hs.HarmonySearch
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['HarmonySearchV1', 'HSv1']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
bw
(task)[source]¶ Get new bandwidth.
- Parameters
task (Task) – Optimization task.
- Returns
New bandwidth.
- Return type
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
-
class
NiaPy.algorithms.basic.
HarrisHawksOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Harris Hawks Optimization algorithm.
- Algorithm:
Harris Hawks Optimization
- Date:
2020
- Authors:
Francisco Jose Solis-Munoz
- License:
MIT
- Reference paper:
Heidari et al. “Harris hawks optimization: Algorithm and applications”. Future Generation Computer Systems. 2019. Vol. 97. 849-872.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['HarrisHawksOptimization', 'HHO']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
initPopulation
(task, rnd=<module 'numpy.random' from '/home/docs/.pyenv/versions/3.6.12/lib/python3.6/site-packages/numpy/random/__init__.py'>)[source]¶ Initialize the starting population.
-
levy_function
(dims, step=0.01, rnd=<module 'numpy.random' from '/home/docs/.pyenv/versions/3.6.12/lib/python3.6/site-packages/numpy/random/__init__.py'>)[source]¶ Calculate Levy function.
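The Levy step used by HHO-style algorithms is commonly generated with Mantegna's algorithm. The sketch below assumes `beta = 1.5`, the value used in the HHO reference paper, rather than a documented NiaPy default.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dims, step=0.01, beta=1.5, rnd=np.random):
    # Mantegna's algorithm: draw u ~ N(0, sigma^2) and v ~ N(0, 1),
    # then the step is proportional to u / |v|^(1/beta).
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rnd.normal(0, sigma, dims)
    v = rnd.normal(0, 1, dims)
    return step * u / np.abs(v) ** (1 / beta)

s = levy_step(5)
```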
-
runIteration
(task, Sol, Fitness, xb, fxb, **dparams)[source]¶ Core function of Harris Hawks Optimization.
- Parameters
task (Task) – Optimization task.
Sol (numpy.ndarray) – Current population.
Fitness (numpy.ndarray[float]) – Current population fitness/function values.
xb (numpy.ndarray) – Current best individual.
fxb (float) – Current best individual function/fitness value.
dparams (Dict[str, Any]) – Additional algorithm arguments.
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
KrillHerdV1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.kh.KrillHerd
Implementation of krill herd algorithm.
- Algorithm:
Krill Herd Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://www.sciencedirect.com/science/article/pii/S1007570412002171
- Reference paper:
Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.
See also
NiaPy.algorithms.basic.kh.KrillHerd
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['KrillHerdV1', 'KHv1']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
crossover
(x, xo, Cr)[source]¶ Perform a crossover operation on an individual.
- Parameters
x (numpy.ndarray) – Current individual.
xo (numpy.ndarray) – New individual.
Cr (float) – Crossover probability.
- Returns
Crossover individual.
- Return type
numpy.ndarray
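The crossover described above can be sketched as a component-wise uniform crossover; this is an illustrative version under that assumption, not the library's exact code.

```python
import numpy as np

def crossover(x, xo, Cr, rnd=np.random):
    # Take each component from the new individual xo with probability Cr,
    # otherwise keep the current individual's value.
    mask = rnd.uniform(0, 1, len(x)) < Cr
    return np.where(mask, xo, x)

x = np.zeros(4)
xo = np.ones(4)
child = crossover(x, xo, 0.5)
```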
-
class
NiaPy.algorithms.basic.
KrillHerdV11
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.kh.KrillHerd
Implementation of krill herd algorithm.
- Algorithm:
Krill Herd Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
ElitistSelection
(KH, KH_f, KHo, KHo_f)[source]¶ Select krills/individuals that are better than old krills.
- Parameters
- Returns
New herd/population.
New herd/populations function/fitness values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
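Elitist selection as described above keeps, for each position, whichever krill is better. A minimal sketch, assuming minimization, might look like this:

```python
import numpy as np

def elitist_selection(KH, KH_f, KHo, KHo_f):
    # Keep the candidate krill wherever its fitness is lower (better);
    # otherwise retain the current krill.
    better = KHo_f < KH_f
    KH = np.where(better[:, None], KHo, KH)
    KH_f = np.where(better, KHo_f, KH_f)
    return KH, KH_f

KH = np.array([[1.0, 1.0], [2.0, 2.0]])
KHo = np.array([[0.0, 0.0], [3.0, 3.0]])
new, new_f = elitist_selection(KH, np.array([1.0, 2.0]), KHo, np.array([0.5, 9.0]))
```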
-
Foraging
(KH, KH_f, KHo, KHo_f, W_f, F, KH_wf, KH_bf, x_food, x_food_f, task)[source]¶ Foraging operator.
- Parameters
KH (numpy.ndarray) – Current herd/population.
KH_f (numpy.ndarray[float]) – Current herd/population function/fitness values.
KHo (numpy.ndarray) – New herd/population.
KHo_f (numpy.ndarray[float]) – New herd/population function/fitness values.
W_f (numpy.ndarray) – Weights for foraging.
() (F) –
–
KH_wf (numpy.ndarray) – Worst krill in herd/population.
KH_bf (numpy.ndarray) – Best krill in herd/population.
x_food (numpy.ndarray) – Food's position.
x_food_f (float) – Food's function/fitness value.
task (Task) – Optimization task.
- Returns
–
- Return type
numpy.ndarray
-
Name
= ['KrillHerdV11', 'KHv11']¶
-
Neighbors
(i, KH, KH_f, iw, ib, N, W_n, task)[source]¶ Neighbors operator.
- Parameters
i (int) – Index of krill being applied with operator.
KH (numpy.ndarray) – Current herd/population.
KH_f (numpy.ndarray[float]) – Current herd/populations function/fitness values.
iw (int) – Index of worst krill/individual.
ib (int) – Index of best krill/individual.
() (N) –
–
W_n (numpy.ndarray) – Weights for neighbors operator.
task (Task) – Optimization task.
- Returns
–
- Return type
numpy.ndarray
-
runIteration
(task, KH, KH_f, xb, fxb, KHo, KHo_f, N, F, Dt, **dparams)[source]¶ Core function of KrillHerdV11 algorithm.
- Parameters
task (Task) – Optimization task.
KH (numpy.ndarray) – Current herd/population.
KH_f (numpy.ndarray[float]) – Current herd/populations function/fitness values.
xb (numpy.ndarray) – Global best krill.
fxb (float) – Global best krill function/fitness value.
() (Dt) –
() –
() –
() –
() –
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New herd/population.
New herd/populations function/fitness values.
Additional arguments:
- Return type
-
class
NiaPy.algorithms.basic.
KrillHerdV2
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.kh.KrillHerd
Implementation of krill herd algorithm.
- Algorithm:
Krill Herd Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://www.sciencedirect.com/science/article/pii/S1007570412002171
- Reference paper:
Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['KrillHerdV2', 'KHv2']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
class
NiaPy.algorithms.basic.
KrillHerdV3
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.kh.KrillHerd
Implementation of krill herd algorithm.
- Algorithm:
Krill Herd Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://www.sciencedirect.com/science/article/pii/S1007570412002171
- Reference paper:
Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['KrillHerdV3', 'KHv3']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
class
NiaPy.algorithms.basic.
KrillHerdV4
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.kh.KrillHerd
Implementation of krill herd algorithm.
- Algorithm:
Krill Herd Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
http://www.sciencedirect.com/science/article/pii/S1007570412002171
- Reference paper:
Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['KrillHerdV4', 'KHv4']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
setParameters
(NP=50, N_max=0.01, V_f=0.02, D_max=0.002, C_t=0.93, W_n=0.42, W_f=0.38, d_s=2.63, **ukwargs)[source]¶ Set algorithm core parameters.
- Parameters
NP (int) – Number of krills in the herd.
N_max (Optional[float]) – TODO
V_f (Optional[float]) – TODO
D_max (Optional[float]) – TODO
C_t (Optional[float]) – TODO
W_n (Optional[Union[int, float, numpy.ndarray, list]]) – Weights for neighborhood.
W_f (Optional[Union[int, float, numpy.ndarray, list]]) – Weights for foraging.
d_s (Optional[float]) – TODO
**ukwargs (Dict[str, Any]) – Additional arguments.
See also
NiaPy.algorithms.basic.kh.KrillHerd.setParameters()
-
class
NiaPy.algorithms.basic.
MonarchButterflyOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Monarch Butterfly Optimization.
- Algorithm:
Monarch Butterfly Optimization
- Date:
2019
- Authors:
Jan Banko
- License:
MIT
- Reference paper:
Wang, G. G., Deb, S., & Cui, Z. (2019). Monarch butterfly optimization. Neural computing and applications, 31(7), 1995-2014.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MonarchButterflyOptimization', 'MBO']¶
-
adjustingOperator
(t, max_t, D, NP1, NP2, Butterflies, best)[source]¶ Apply the adjusting operator.
- Parameters
- Returns
Adjusted butterfly population.
- Return type
numpy.ndarray
-
static
algorithmInfo
()[source]¶ Get information of the algorithm.
- Returns
Algorithm information.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.algorithmInfo()
-
evaluateAndSort
(task, Butterflies)[source]¶ Evaluate and sort the butterfly population.
- Parameters
task (Task) – Optimization task
Butterflies (numpy.ndarray) – Current butterfly population.
- Returns
- Tuple[numpy.ndarray, float, numpy.ndarray]:
Best butterfly according to the evaluation.
The best fitness value.
Butterfly population.
- Return type
numpy.ndarray
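The evaluate-and-sort step can be sketched with `numpy.argsort`; this is an illustrative version that assumes minimization and a fitness function applied row-wise, not the library's exact implementation.

```python
import numpy as np

def evaluate_and_sort(fitness_fn, butterflies):
    # Evaluate every butterfly, then sort ascending by fitness so the
    # best individual sits at index 0.
    fitness = np.apply_along_axis(fitness_fn, 1, butterflies)
    order = np.argsort(fitness)
    butterflies, fitness = butterflies[order], fitness[order]
    return butterflies[0], fitness[0], butterflies

pop = np.array([[2.0, 2.0], [0.0, 1.0], [3.0, 0.0]])
best, best_f, sorted_pop = evaluate_and_sort(lambda x: np.sum(x ** 2), pop)
```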
-
getParameters
()[source]¶ Get parameter values for the algorithm.
- Returns
TODO.
- Return type
Dict[str, Any]
-
repair
(x, lower, upper)[source]¶ Truncate dimensions that exceed the limits.
- Parameters
x (numpy.ndarray) – Individual to repair.
lower (numpy.ndarray) – Lower limits for dimensions.
upper (numpy.ndarray) – Upper limits for dimensions.
- Returns
Repaired individual.
- Return type
numpy.ndarray
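The repair operator described above is a straight box-bound truncation, which numpy expresses directly with `np.clip`; a minimal sketch:

```python
import numpy as np

def repair(x, lower, upper):
    # Truncate components that exceed the box limits back to the bounds.
    return np.clip(x, lower, upper)

fixed = repair(np.array([-3.0, 0.5, 7.0]), np.full(3, -1.0), np.full(3, 1.0))
```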
-
runIteration
(task, Butterflies, Evaluations, xb, fxb, tmp_best, **dparams)[source]¶ Core function of Monarch Butterfly Optimization algorithm.
- Parameters
task (Task) – Optimization task.
Butterflies (numpy.ndarray) – Current population.
Evaluations (numpy.ndarray[float]) – Current population function/fitness values.
xb (numpy.ndarray) – Global best individual.
fxb (float) – Global best individual fitness/function value.
tmp_best (numpy.ndarray) – Best individual currently.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
dx (float): A small value used in local seeding stage.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(NP=20, PAR=0.4166666666666667, PER=1.2, **ukwargs)[source]¶ Set the parameters of the algorithm.
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
PAR (Callable[[float], bool]): Checks if partition parameter has a proper value.
PER (Callable[[float], bool]): Checks if period parameter has a proper value.
- Return type
Dict[str, Callable]
See also
NiaPy.algorithms.algorithm.Algorithm.typeParameters()
-
class
NiaPy.algorithms.basic.
MonkeyKingEvolutionV1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of monkey king evolution algorithm version 1.
- Algorithm:
Monkey King Evolution version 1
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705116000198
- Reference paper:
Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.
- Variables
Name (List[str]) – List of strings representing algorithm names.
F (float) – Scale factor for normal particles.
R (float) – Proportion of new particles a Monkey King particle creates.
C (int) – Number of new particles generated by Monkey King particle.
FC (float) – Scale factor for Monkey King particles.
See also
NiaPy.algorithms.algorithm.Algorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MonkeyKingEvolutionV1', 'MKEv1']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information.
- Return type
-
moveMK
(x, task)[source]¶ Move Monkey King particle.
For moving Monkey King particles the algorithm uses the formula \(\mathbf{x} + \mathit{FC} \odot \mathbf{R} \odot \mathbf{x}\), where \(\mathbf{R}\) is a two-dimensional array with shape {C * D, D}. Components of this array are in the range [0, 1].
- Parameters
x (numpy.ndarray) – Monkey King particle position.
task (Task) – Optimization task.
- Returns
New particles generated by Monkey King particle.
- Return type
numpy.ndarray
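The Monkey King movement formula above can be sketched in a few lines of numpy; this is an illustration of the stated formula, not NiaPy's exact method.

```python
import numpy as np

def move_mk(x, FC=0.5, C=3, rnd=np.random):
    # R is a (C * D, D) array of values in [0, 1]; broadcasting x against R
    # yields C * D candidate particles from one Monkey King particle.
    D = len(x)
    R = rnd.uniform(0, 1, (C * D, D))
    return x + FC * R * x

cands = move_mk(np.ones(4), FC=0.5, C=3)
```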
-
moveMokeyKingPartice
(p, task)[source]¶ Move Monkey King particles.
- Parameters
p (MkeSolution) – Monkey King particle to apply this function on.
task (Task) – Optimization task.
-
moveP
(x, x_pb, x_b, task)[source]¶ Move normal particle in search space.
For moving particles the algorithm uses the formula \(\mathbf{x_{pb}} - \mathit{F} \odot \mathbf{r} \odot (\mathbf{x_b} - \mathbf{x})\), where \(\mathbf{r}\) is a one-dimensional array with D components. Components of this vector are in the range [0, 1].
- Parameters
x (numpy.ndarray) – Particle position.
x_pb (numpy.ndarray) – Particle best position.
x_b (numpy.ndarray) – Best particle position.
task (Task) – Optimization task.
- Returns
Particle new position.
- Return type
numpy.ndarray
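The normal-particle movement formula above translates directly into numpy; a minimal sketch of the stated formula:

```python
import numpy as np

def move_p(x, x_pb, x_b, F=0.7, rnd=np.random):
    # x_pb - F * r * (x_b - x), with r drawn uniformly from [0, 1]
    # per dimension.
    r = rnd.uniform(0, 1, len(x))
    return x_pb - F * r * (x_b - x)

pos = move_p(np.zeros(3), np.ones(3), np.full(3, 2.0))
```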
-
movePartice
(p, p_b, task)[source]¶ Move particles.
- Parameters
p (MkeSolution) – Monkey particle.
p_b (MkeSolution) – Population best particle.
task (Task) – Optimization task.
-
movePopulation
(pop, xb, task)[source]¶ Move population.
- Parameters
pop (numpy.ndarray[MkeSolution]) – Current population.
xb (MkeSolution) – Current best solution.
task (Task) – Optimization task.
- Returns
New particles.
- Return type
numpy.ndarray[MkeSolution]
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of Monkey King Evolution v1 algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray[MkeSolution]) – Current population.
fpop (numpy.ndarray[float]) – Current population fitness/function values.
xb (MkeSolution) – Current best solution.
fxb (float) – Current best solutions function/fitness value.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
Initialized solutions.
Fitness/function values of solution.
Additional arguments.
- Return type
Tuple[numpy.ndarray[MkeSolution], numpy.ndarray[float], Dict[str, Any]]
-
setParameters
(NP=40, F=0.7, R=0.3, C=3, FC=0.5, **ukwargs)[source]¶ Set Monkey King Evolution v1 algorithm's static parameters.
- Parameters
NP (int) – Population size.
F (float) – Scale factor for normal particle.
R (float) – Proportion of new particles a Monkey King particle creates. Value in range [0, 1].
C (int) – Number of new particles generated by Monkey King particle.
FC (float) – Scale factor for Monkey King particles.
**ukwargs (Dict[str, Any]) – Additional arguments.
See also
NiaPy.algorithms.algorithm.Algorithm.setParameters()
-
class
NiaPy.algorithms.basic.
MonkeyKingEvolutionV2
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.mke.MonkeyKingEvolutionV1
Implementation of monkey king evolution algorithm version 2.
- Algorithm:
Monkey King Evolution version 2
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705116000198
- Reference paper:
Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.
See also
NiaPy.algorithms.basic.mke.MonkeyKingEvolutionV1
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MonkeyKingEvolutionV2', 'MKEv2']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information.
- Return type
-
moveMK
(x, dx, task)[source]¶ Move Monkey King particle.
For moving particles the algorithm uses the formula \(\mathbf{x} - \mathit{FC} \odot \mathbf{dx}\).
- Parameters
x (numpy.ndarray) – Particle to apply movement on.
dx (numpy.ndarray) – Difference between two random particles in the population.
task (Task) – Optimization task.
- Returns
Moved particles.
- Return type
numpy.ndarray
-
class
NiaPy.algorithms.basic.
MonkeyKingEvolutionV3
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.mke.MonkeyKingEvolutionV1
Implementation of monkey king evolution algorithm version 3.
- Algorithm:
Monkey King Evolution version 3
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705116000198
- Reference paper:
Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.
See also
NiaPy.algorithms.basic.mke.MonkeyKingEvolutionV1
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MonkeyKingEvolutionV3', 'MKEv3']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information.
- Return type
-
initPopulation
(task)[source]¶ Initialize the population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized population function/fitness values.
- Additional arguments:
k (int): Starting number of rows to include from lower triangular matrix.
c (int): Constant.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.initPopulation()
-
runIteration
(task, X, X_f, xb, fxb, k, c, **dparams)[source]¶ Core function of Monkey King Evolution v3 algorithm.
- Parameters
task (Task) – Optimization task.
X (numpy.ndarray) – Current population.
X_f (numpy.ndarray[float]) – Current population fitness/function values.
xb (numpy.ndarray) – Current best individual.
fxb (float) – Current best individual function/fitness value.
k (int) – Starting number of rows to include from lower triangular matrix.
c (int) – Constant.
**dparams – Additional arguments
- Returns
Initialized population.
Initialized population function/fitness values.
- Additional arguments:
k (int): Starting number of rows to include from lower triangular matrix.
c (int): Constant.
- Return type
-
class
NiaPy.algorithms.basic.
MothFlameOptimizer
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Moth flame optimizer.
- Algorithm:
Moth flame optimizer
- Date:
2018
- Author:
Kivanc Guckiran and Klemen Berkovič
- License:
MIT
- Reference paper:
Mirjalili, Seyedali. “Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm.” Knowledge-Based Systems 89 (2015): 228-249.
See also
NiaPy.algorithms.algorithm.Algorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MothFlameOptimizer', 'MFO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information.
- Return type
-
initPopulation
(task)[source]¶ Initialize starting population.
- Parameters
task (Task) – Optimization task
- Returns
Initialized population
Initialized population function/fitness values
- Additional arguments:
best_flames (numpy.ndarray): Best individuals
best_flame_fitness (numpy.ndarray): Best individuals fitness/function values
previous_population (numpy.ndarray): Previous population
previous_fitness (numpy.ndarray[float]): Previous population fitness/function values
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.initPopulation()
-
runIteration
(task, moth_pos, moth_fitness, xb, fxb, best_flames, best_flame_fitness, previous_population, previous_fitness, **dparams)[source]¶ Core function of MothFlameOptimizer algorithm.
- Parameters
task (Task) – Optimization task.
moth_pos (numpy.ndarray) – Current population.
moth_fitness (numpy.ndarray) – Current population fitness/function values.
xb (numpy.ndarray) – Current population best individual.
fxb (float) – Current best individual.
best_flames (numpy.ndarray) – Best found individuals.
best_flame_fitness (numpy.ndarray) – Best found individuals fitness/function values.
previous_population (numpy.ndarray) – Previous population.
previous_fitness (numpy.ndarray) – Previous population fitness/function values.
**dparams (Dict[str, Any]) – Additional parameters
- Returns
New population.
New population fitness/function values.
New global best solution.
New global best fitness/objective value.
- Additional arguments:
best_flames (numpy.ndarray): Best individuals.
best_flame_fitness (numpy.ndarray): Best individuals fitness/function values.
previous_population (numpy.ndarray): Previous population.
previous_fitness (numpy.ndarray): Previous population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.basic.
MultiStrategyDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Differential evolution algorithm with multiple mutation strategies.
- Algorithm:
Implementation of Differential evolution algorithm with multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm names.
strategies (Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, mtrand.RandomState], numpy.ndarray[Individual]]]) – List of mutation strategies.
CrossMutt (Callable[[numpy.ndarray[Individual], int, Individual, float, float, Task, Individual, Iterable[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, Dict[str, Any]], Individual]]], Individual]) – Multi crossover and mutation combiner function.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MultiStrategyDifferentialEvolution', 'MsDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
evolve
(pop, xb, task, **kwargs)[source]¶ Evolve population with the help of multiple mutation strategies.
- Parameters
pop (numpy.ndarray) – Current population.
xb (numpy.ndarray) – Current best individual.
task (Task) – Optimization task.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New population of individuals.
- Return type
numpy.ndarray
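The multi-strategy evolution step amounts to picking one mutation strategy at random per individual. The sketch below is illustrative: the toy strategies `rand1` and `best1` and their `(pop, i, xb)` signature are assumptions for this example, not NiaPy's actual strategy interface.

```python
import numpy as np

def evolve_multi(pop, xb, strategies, rnd=np.random):
    # Apply a randomly chosen strategy to each individual in turn.
    return np.asarray([strategies[rnd.randint(len(strategies))](pop, i, xb)
                       for i in range(len(pop))])

def rand1(pop, i, xb):
    j, k = np.random.randint(len(pop)), np.random.randint(len(pop))
    return pop[i] + 0.5 * (pop[j] - pop[k])

def best1(pop, i, xb):
    j, k = np.random.randint(len(pop)), np.random.randint(len(pop))
    return xb + 0.5 * (pop[j] - pop[k])

pop = np.random.uniform(-1, 1, (8, 3))
trial = evolve_multi(pop, pop[0], [rand1, best1])
```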
-
getParameters
()[source]¶ Get parameter values of the algorithm.
- Returns
TODO.
- Return type
Dict[str, Any]
-
setParameters
(strategies=(<function CrossRand1>, <function CrossBest1>, <function CrossCurr2Best1>, <function CrossRand2>), **ukwargs)[source]¶ Set the arguments of the algorithm.
- Parameters
strategies (Optional[Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, mtrand.RandomState], numpy.ndarray[Individual]]]]) – List of mutation strategies.
CrossMutt (Optional[Callable[[numpy.ndarray[Individual], int, Individual, float, float, Task, Individual, Iterable[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, Dict[str, Any]], Individual]]], Individual]]) – Multi crossover and mutation combiner function.
-
class
NiaPy.algorithms.basic.
MutatedCenterParticleSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.pso.CenterParticleSwarmOptimization
Implementation of Mutated Particle Swarm Optimization.
- Algorithm:
Mutated Center Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
TODO find one
- Variables
nmutt (int) – Number of mutations of global best particle.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MutatedCenterParticleSwarmOptimization', 'MCPSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population of particles.
fpop (numpy.ndarray) – Current particles function/fitness values.
xb (numpy.ndarray) – Current global best particle.
fxb (float) – Current global best particle's function/fitness value.
**dparams – Additional arguments.
- Returns
New population of particles.
New populations function/fitness values.
New global best particle.
New global best particle function/fitness value.
Additional arguments.
- Return type
See also
NiaPy.algorithm.basic.WeightedVelocityClampingParticleSwarmAlgorithm.runIteration()
-
class
NiaPy.algorithms.basic.
MutatedCenterUnifiedParticleSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.pso.MutatedCenterParticleSwarmOptimization
Implementation of Mutated Particle Swarm Optimization.
- Algorithm:
Mutated Center Unified Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Tsai, Hsing-Chih. “Unified particle swarm delivers high efficiency to particle swarm optimization.” Applied Soft Computing 55 (2017): 371-383.
- Variables
nmutt (int) – Number of mutations of global best particle.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MutatedCenterUnifiedParticleSwarmOptimization', 'MCUPSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
setParameters
(**kwargs)[source]¶ Set core algorithm parameters.
- Parameters
**kwargs – Additional arguments.
See also
NiaPy.algorithm.basic.MutatedCenterParticleSwarmOptimization.setParameters()
-
updateVelocity
(V, p, pb, gb, w, vMin, vMax, task, **kwargs)[source]¶ Update particle velocity.
- Parameters
V (numpy.ndarray) – Current velocity of particle.
p (numpy.ndarray) – Current position of particle.
pb (numpy.ndarray) – Personal best position of particle.
gb (numpy.ndarray) – Global best position of particle.
w (numpy.ndarray) – Weights for velocity adjustment.
vMin (numpy.ndarray) – Minimal velocity allowed.
vMax (numpy.ndarray) – Maximal velocity allowed.
task (Task) – Optimization task.
kwargs – Additional arguments.
- Returns
Updated velocity of particle.
- Return type
numpy.ndarray
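The velocity update with clamping can be sketched as the standard inertia-weight PSO update followed by `np.clip`; the acceleration coefficients `C1` and `C2` here are assumptions taken from the OVCPSO defaults elsewhere in this module, and the mutated-center variant adds further terms on top of this basic form.

```python
import numpy as np

def update_velocity(V, p, pb, gb, w, vMin, vMax,
                    C1=1.49612, C2=1.49612, rnd=np.random):
    # Inertia term plus cognitive (pb) and social (gb) attraction,
    # then clamp each component into [vMin, vMax].
    r1 = rnd.uniform(0, 1, len(p))
    r2 = rnd.uniform(0, 1, len(p))
    V_new = w * V + C1 * r1 * (pb - p) + C2 * r2 * (gb - p)
    return np.clip(V_new, vMin, vMax)

V = update_velocity(np.zeros(3), np.zeros(3), np.ones(3), np.ones(3),
                    0.7, np.full(3, -0.5), np.full(3, 0.5))
```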
-
class
NiaPy.algorithms.basic.
MutatedParticleSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Mutated Particle Swarm Optimization.
- Algorithm:
Mutated Particle Swarm Optimization
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Wang, C. Li, Y. Liu, S. Zeng, A hybrid particle swarm algorithm with cauchy mutation, Proceedings of the 2007 IEEE Swarm Intelligence Symposium (2007) 356–360.
- Variables
nmutt (int) – Number of mutations of global best particle.
See also
NiaPy.algorithms.basic.WeightedVelocityClampingParticleSwarmAlgorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MutatedParticleSwarmOptimization', 'MPSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population of particles.
fpop (numpy.ndarray) – Current particles function/fitness values.
xb (numpy.ndarray) – Current global best particle.
fxb (float) – Current global best particles function/fitness value.
**dparams – Additional arguments.
- Returns
New population of particles.
New populations function/fitness values.
New global best particle.
New global best particle function/fitness value.
Additional arguments.
- Return type
See also
NiaPy.algorithm.basic.WeightedVelocityClampingParticleSwarmAlgorithm.runIteration()
-
class
NiaPy.algorithms.basic.
OppositionVelocityClampingParticleSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Opposition-Based Particle Swarm Optimization with Velocity Clamping.
- Algorithm:
Opposition-Based Particle Swarm Optimization with Velocity Clamping
- Date:
2019
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Shahzad, Farrukh, et al. “Opposition-based particle swarm optimization with velocity clamping (OVCPSO).” Advances in Computational Intelligence. Springer, Berlin, Heidelberg, 2009. 339-348
- Variables
p0 – Probability of opposite learning phase.
w_min – Minimum inertial weight.
w_max – Maximum inertial weight.
sigma – Velocity scaling factor.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['OppositionVelocityClampingParticleSwarmOptimization', 'OVCPSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
initPopulation
(task)[source]¶ Init starting population and dynamic parameters.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness values.
- Additional arguments:
popb (numpy.ndarray): particles best population.
fpopb (numpy.ndarray[float]): particles best positions function/fitness value.
vMin (numpy.ndarray): Minimal velocity.
vMax (numpy.ndarray): Maximal velocity.
V (numpy.ndarray): Initial velocity of particle.
S_u (numpy.ndarray): Upper bound for opposite learning.
S_l (numpy.ndarray): Lower bound for opposite learning.
- Return type
Tuple[np.ndarray, np.ndarray, dict]
-
oppositeLearning
(S_l, S_h, pop, fpop, task)[source]¶ Run opposite learning phase.
- Parameters
S_l (numpy.ndarray) – Lower limit of opposite particles.
S_h (numpy.ndarray) – Upper limit of opposite particles.
pop (numpy.ndarray) – Current populations positions.
fpop (numpy.ndarray) – Current populations functions/fitness values.
task (Task) – Optimization task.
- Returns
New particles positions.
New particles function/fitness values.
New best position of opposite learning phase.
New best function/fitness value of opposite learning phase.
- Return type
Tuple[np.ndarray, np.ndarray, np.ndarray, float]
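The opposite-learning phase above can be sketched in plain NumPy. This is a hedged illustration, not the library's actual implementation: the opposite point formula `S_l + S_h - x` follows the reference paper, while the `sphere` objective and the merge-and-keep-best selection are assumptions for the example.

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def opposite_learning(S_l, S_h, pop, fpop, evaluate):
    # Opposite of each particle within the current interval [S_l, S_h]:
    #   x_opp = S_l + S_h - x
    opp = S_l + S_h - pop
    fopp = np.apply_along_axis(evaluate, 1, opp)
    # Merge originals with their opposites and keep the better half
    merged = np.concatenate([pop, opp])
    fmerged = np.concatenate([fpop, fopp])
    keep = np.argsort(fmerged)[:len(pop)]
    new_pop, new_fpop = merged[keep], fmerged[keep]
    # Best position/value found during the opposite-learning phase
    return new_pop, new_fpop, new_pop[0].copy(), float(new_fpop[0])
```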
-
runIteration
(task, pop, fpop, xb, fxb, popb, fpopb, vMin, vMax, V, S_l, S_h, **dparams)[source]¶ Core function of Opposite-based Particle Swarm Optimization with velocity clamping algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current population.
fpop (numpy.ndarray) – Current populations function/fitness values.
xb (numpy.ndarray) – Current global best position.
fxb (float) – Current global best positions function/fitness value.
popb (numpy.ndarray) – Personal best position.
fpopb (numpy.ndarray) – Personal best positions function/fitness values.
vMin (numpy.ndarray) – Minimal allowed velocity.
vMax (numpy.ndarray) – Maximal allowed velocity.
V (numpy.ndarray) – Populations velocity.
S_l (numpy.ndarray) – Lower bound of opposite learning.
S_h (numpy.ndarray) – Upper bound of opposite learning.
**dparams – Additional arguments.
- Returns
New population.
New populations function/fitness values.
New global best position.
New global best positions function/fitness value.
- Additional arguments:
popb: particles best population.
fpopb: particles best positions function/fitness value.
vMin: Minimal velocity.
vMax: Maximal velocity.
V: Initial velocity of particle.
S_u: Upper bound for opposite learning.
S_l: Lower bound for opposite learning.
- Return type
-
setParameters
(p0=0.3, w_min=0.4, w_max=0.9, sigma=0.1, C1=1.49612, C2=1.49612, **kwargs)[source]¶ Set core algorithm parameters.
- Parameters
p0 (float) – Probability of running Opposite learning.
w_min (numpy.ndarray) – Minimal value of weights.
w_max (numpy.ndarray) – Maximum value of weights.
sigma (numpy.ndarray) – Velocity range factor.
**kwargs – Additional arguments.
See also
NiaPy.algorithms.basic.ParticleSwarmAlgorithm.setParameters()
-
class
NiaPy.algorithms.basic.
ParticleSwarmAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Particle Swarm Optimization algorithm.
- Algorithm:
Particle Swarm Optimization algorithm
- Date:
2018
- Authors:
Lucija Brezočnik, Grega Vrbančič, Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Kennedy, J. and Eberhart, R. “Particle Swarm Optimization”. Proceedings of IEEE International Conference on Neural Networks. IV. pp. 1942–1948, 1995.
- Variables
Name (List[str]) – List of strings representing algorithm names
C1 (float) – Cognitive component.
C2 (float) – Social component.
vMin (Union[float, numpy.ndarray[float]]) – Minimal velocity.
vMax (Union[float, numpy.ndarray[float]]) – Maximal velocity.
Repair (Callable[[numpy.ndarray, numpy.ndarray, numpy.ndarray, mtrnd.RandomState], numpy.ndarray]) – Repair method for velocity.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['WeightedVelocityClampingParticleSwarmAlgorithm', 'WVCPSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
initPopulation
(task)[source]¶ Initialize population and dynamic arguments of the Particle Swarm Optimization algorithm.
- Parameters
task – Optimization task.
- Returns
Initial population.
Initial population fitness/function values.
- Additional arguments:
popb (numpy.ndarray): particles best population.
fpopb (numpy.ndarray[float]): particles best positions function/fitness value.
w (numpy.ndarray): Inertial weight.
vMin (numpy.ndarray): Minimal velocity.
vMax (numpy.ndarray): Maximal velocity.
V (numpy.ndarray): Initial velocity of particle.
- Return type
Tuple[np.ndarray, np.ndarray, dict]
-
runIteration
(task, pop, fpop, xb, fxb, popb, fpopb, w, vMin, vMax, V, **dparams)[source]¶ Core function of Particle Swarm Optimization algorithm.
- Parameters
task (Task) – Optimization task.
pop (numpy.ndarray) – Current populations.
fpop (numpy.ndarray) – Current population fitness/function values.
xb (numpy.ndarray) – Current best particle.
fxb (float) – Current best particle fitness/function value.
popb (numpy.ndarray) – Particles best position.
fpopb (numpy.ndarray) – Particles best positions fitness/function values.
w (numpy.ndarray) – Inertial weights.
vMin (numpy.ndarray) – Minimal velocity.
vMax (numpy.ndarray) – Maximal velocity.
V (numpy.ndarray) – Velocity of particles.
**dparams – Additional function arguments.
- Returns
New population.
New population fitness/function values.
New global best position.
New global best positions function/fitness value.
- Additional arguments:
popb (numpy.ndarray): Particles best population.
fpopb (numpy.ndarray[float]): Particles best positions function/fitness value.
w (numpy.ndarray): Inertial weight.
vMin (numpy.ndarray): Minimal velocity.
vMax (numpy.ndarray): Maximal velocity.
V (numpy.ndarray): Initial velocity of particle.
- Return type
See also
NiaPy.algorithms.algorithm.runIteration
-
setParameters
(NP=25, C1=2.0, C2=2.0, w=0.7, vMin=-1.5, vMax=1.5, Repair=<function reflectRepair>, **ukwargs)[source]¶ Set Particle Swarm Algorithm main parameters.
- Parameters
NP (int) – Population size
C1 (float) – Cognitive component.
C2 (float) – Social component.
w (Union[float, numpy.ndarray]) – Inertial weight.
vMin (Union[float, numpy.ndarray]) – Minimal velocity.
vMax (Union[float, numpy.ndarray]) – Maximal velocity.
Repair (Callable[[np.ndarray, np.ndarray, np.ndarray, dict], np.ndarray]) – Repair method for velocity.
**ukwargs – Additional arguments
-
updateVelocity
(V, p, pb, gb, w, vMin, vMax, task, **kwargs)[source]¶ Update particle velocity.
- Parameters
V (numpy.ndarray) – Current velocity of particle.
p (numpy.ndarray) – Current position of particle.
pb (numpy.ndarray) – Personal best position of particle.
gb (numpy.ndarray) – Global best position of particle.
w (numpy.ndarray) – Weights for velocity adjustment.
vMin (numpy.ndarray) – Minimal velocity allowed.
vMax (numpy.ndarray) – Maximal velocity allowed.
task (Task) – Optimization task.
kwargs – Additional arguments.
- Returns
Updated velocity of particle.
- Return type
numpy.ndarray
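The velocity update above follows the canonical PSO rule with an inertia weight and component-wise clamping; a minimal sketch (the function name and the use of `np.clip` as the repair method are assumptions, the formula itself is the textbook one):

```python
import numpy as np

def update_velocity(V, p, pb, gb, w, C1, C2, vMin, vMax, rng):
    # Canonical PSO velocity rule with inertia weight:
    #   v <- w * v + C1 * r1 * (pbest - x) + C2 * r2 * (gbest - x)
    r1, r2 = rng.random(p.shape), rng.random(p.shape)
    V_new = w * V + C1 * r1 * (pb - p) + C2 * r2 * (gb - p)
    # Velocity clamping keeps every component inside [vMin, vMax]
    return np.clip(V_new, vMin, vMax)
```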
-
class
NiaPy.algorithms.basic.
ParticleSwarmOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.pso.ParticleSwarmAlgorithm
Implementation of Particle Swarm Optimization algorithm.
- Algorithm:
Particle Swarm Optimization algorithm
- Date:
2018
- Authors:
Lucija Brezočnik, Grega Vrbančič, Iztok Fister Jr. and Klemen Berkovič
- License:
MIT
- Reference paper:
Kennedy, J. and Eberhart, R. “Particle Swarm Optimization”. Proceedings of IEEE International Conference on Neural Networks. IV. pp. 1942–1948, 1995.
- Variables
See also
NiaPy.algorithms.basic.WeightedVelocityClampingParticleSwarmAlgorithm
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['ParticleSwarmAlgorithm', 'PSO']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
class
NiaPy.algorithms.basic.
SineCosineAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of sine cosine algorithm.
- Algorithm:
Sine Cosine Algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
https://www.sciencedirect.com/science/article/pii/S0950705115005043
- Reference paper:
Seyedali Mirjalili, SCA: A Sine Cosine Algorithm for solving optimization problems, Knowledge-Based Systems, Volume 96, 2016, Pages 120-133, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2015.12.022.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['SineCosineAlgorithm', 'SCA']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
getParameters
()[source]¶ Get algorithm parameters values.
- Returns
- Return type
Dict[str, Any]
See also
NiaPy.algorithms.algorithm.Algorithm.getParameters()
-
initPopulation
(task)[source]¶ Initialize the individuals.
- Parameters
task (Task) – Optimization task
- Returns
Initialized population of individuals
Function/fitness values for individuals
Additional arguments
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
-
nextPos
(x, x_b, r1, r2, r3, r4, task)[source]¶ Move individual to new position in search space.
- Parameters
x (numpy.ndarray) – Individual represented with components.
x_b (numpy.ndarray) – Best individual represented with components.
r1 (float) – Number dependent on algorithm iteration/generations.
r2 (float) – Random number in range of 0 and 2 * PI.
r3 (float) – Random number in range [Rmin, Rmax].
r4 (float) – Random number in range [0, 1].
task (Task) – Optimization task.
- Returns
New individual that is moved based on individual
x
.- Return type
numpy.ndarray
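The position update of nextPos can be sketched as follows (the r4 coin flip between the sine and cosine branches is from Mirjalili (2016); the `np.clip` bound repair is an assumption for the example):

```python
import numpy as np

def sca_next_pos(x, x_b, r1, r2, r3, r4, lower, upper):
    # SCA update (Mirjalili, 2016): r4 selects the sine or cosine branch;
    # r1 shrinks over iterations to move from exploration to exploitation.
    if r4 < 0.5:
        x_new = x + r1 * np.sin(r2) * np.abs(r3 * x_b - x)
    else:
        x_new = x + r1 * np.cos(r2) * np.abs(r3 * x_b - x)
    # Simple bound repair (an assumption, not necessarily what NiaPy uses)
    return np.clip(x_new, lower, upper)
```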
-
runIteration
(task, P, P_f, xb, fxb, **dparams)[source]¶ Core function of Sine Cosine Algorithm.
- Parameters
task (Task) – Optimization task.
P (numpy.ndarray) – Current population individuals.
P_f (numpy.ndarray[float]) – Current population individuals' function/fitness values.
xb (numpy.ndarray) – Current best solution to optimization task.
fxb (float) – Current best function/fitness value.
dparams (Dict[str, Any]) – Additional parameters.
- Returns
New population.
New populations fitness/function values.
New global best solution.
New global best fitness/objective value.
Additional arguments.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
NiaPy.algorithms.basic.
multiMutations
(pop, i, xb, F, CR, rnd, task, itype, strategies, **kwargs)[source]¶ Mutation strategy that takes more than one strategy and applies them to an individual.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
i (int) – Index of current individual.
xb (Individual) – Current best individual.
F (float) – Scale factor.
CR (float) – Crossover probability.
rnd (mtrand.RandomState) – Random generator.
task (Task) – Optimization task.
itype (Individual) – Individual type used in algorithm.
strategies (Iterable[Callable[[numpy.ndarray[Individual], int, Individual, float, float, mtrand.RandomState], numpy.ndarray[Individual]]]) – List of mutation strategies.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
Best individual from the applied mutation strategies.
- Return type
NiaPy.algorithms.modified
¶
Implementation of modified nature-inspired algorithms.
-
class
NiaPy.algorithms.modified.
AdaptiveArchiveDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Adaptive Differential Evolution With Optional External Archive algorithm.
- Algorithm:
Adaptive Differential Evolution With Optional External Archive
- Date:
2019
- Author:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Zhang, Jingqiao, and Arthur C. Sanderson. “JADE: adaptive differential evolution with optional external archive.” IEEE Transactions on evolutionary computation 13.5 (2009): 945-958.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['AdaptiveArchiveDifferentialEvolution', 'JADE']¶
-
static
algorithmInfo
()[source]¶ Get algorithm information.
- Returns
Algorithm information.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.algorithmInfo()
-
getParameters
()[source]¶ Get parameters values of the algorithm.
- Returns
TODO
- Return type
Dict[str, Any]
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of Differential Evolution algorithm.
-
setParameters
(**kwargs)[source]¶ Set the algorithm parameters.
- Parameters
NP (Optional[int]) – Population size.
F (Optional[float]) – Scaling factor.
CR (Optional[float]) – Crossover rate.
CrossMutt (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, list], numpy.ndarray]]) – Crossover and mutation strategy.
ukwargs (Dict[str, Any]) – Additional arguments.
-
class
NiaPy.algorithms.modified.
AdaptiveBatAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Adaptive bat algorithm.
- Algorithm:
Adaptive bat algorithm
- Date:
April 2019
- Authors:
Klemen Berkovič
- License:
MIT
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['AdaptiveBatAlgorithm', 'ABA']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
getParameters
()[source]¶ Get algorithm parameters.
- Returns
Arguments values.
- Return type
Dict[str, Any]
See also
NiaPy.algorithms.algorithm.Algorithm.getParameters()
-
initPopulation
(task)[source]¶ Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
A (float): Loudness.
S (numpy.ndarray): TODO
Q (numpy.ndarray[float]): TODO
v (numpy.ndarray[float]): TODO
- Return type
-
localSearch
(best, A, task, **kwargs)[source]¶ Improve the best solution according to Yang (2010).
-
runIteration
(task, Sol, Fitness, xb, fxb, A, S, Q, v, **dparams)[source]¶ Core function of Bat Algorithm.
- Parameters
task (Task) – Optimization task.
Sol (numpy.ndarray) – Current population
Fitness (numpy.ndarray[float]) – Current population fitness/function values
best (numpy.ndarray) – Current best individual
f_min (float) – Current best individual function/fitness value
S (numpy.ndarray) – TODO
Q (numpy.ndarray[float]) – TODO
v (numpy.ndarray[float]) – TODO
dparams (Dict[str, Any]) – Additional algorithm arguments
- Returns
New population
New population fitness/function values
- Additional arguments:
A (numpy.ndarray[float]): Loudness.
S (numpy.ndarray): TODO
Q (numpy.ndarray[float]): TODO
v (numpy.ndarray[float]): TODO
- Return type
-
setParameters
(NP=100, A=0.5, epsilon=0.001, alpha=1.0, r=0.5, Qmin=0.0, Qmax=2.0, **ukwargs)[source]¶ Set the parameters of the algorithm.
-
static
typeParameters
()[source]¶ Return a dict where keys represent parameter names and values are functions for checking the corresponding parameter.
- Returns
epsilon (Callable[[Union[float, int]], bool]): Scale factor.
alpha (Callable[[Union[float, int]], bool]): Constant for updating loudness.
r (Callable[[Union[float, int]], bool]): Pulse rate.
Qmin (Callable[[Union[float, int]], bool]): Minimum frequency.
Qmax (Callable[[Union[float, int]], bool]): Maximum frequency.
- Return type
Dict[str, Callable]
-
class
NiaPy.algorithms.modified.
AgingSelfAdaptiveDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.jde.SelfAdaptiveDifferentialEvolution
Implementation of Dynamic population size with aging self-adaptive differential evolution algorithm.
- Algorithm:
Dynamic population size with aging self-adaptive differential evolution algorithm
- Date:
2018
- Author:
Jan Popič and Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Brest, Janez, and Mirjam Sepesy Maučec. Population size reduction for the differential evolution algorithm. Applied Intelligence 29.3 (2008): 228-247.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['AgingSelfAdaptiveDifferentialEvolution', 'ANpjDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
setParameters
(LT_min=1, LT_max=7, age=<function proportional>, **ukwargs)[source]¶ Set core parameters of AgingSelfAdaptiveDifferentialEvolution algorithm.
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
F_l (Callable[[Union[float, int]], bool])
F_u (Callable[[Union[float, int]], bool])
Tao1 (Callable[[Union[float, int]], bool])
Tao2 (Callable[[Union[float, int]], bool])
- Return type
Dict[str, Callable]
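The F_l, F_u, Tao1 and Tao2 parameters above implement the jDE self-adaptation rule; a minimal sketch of how one individual regenerates its control parameters (a hedged illustration of Brest et al.'s rule, not NiaPy's actual code; the function name is an assumption):

```python
import numpy as np

def jde_adapt(F, CR, F_l=0.1, F_u=0.9, Tao1=0.1, Tao2=0.1, rng=None):
    # jDE rule (Brest et al.): with probability Tao1 draw a new F uniformly
    # from [F_l, F_l + F_u]; with probability Tao2 draw a new CR from [0, 1];
    # otherwise both parameters are inherited unchanged.
    rng = np.random.default_rng() if rng is None else rng
    F_new = F_l + rng.random() * F_u if rng.random() < Tao1 else F
    CR_new = rng.random() if rng.random() < Tao2 else CR
    return F_new, CR_new
```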
-
NiaPy.algorithms.modified.
CrossRandCurr2Pbest
(pop, ic, x_b, f, cr, p=0.2, arc=None, rnd=numpy.random, *args)[source]¶ Mutation strategy with crossover.
Mutation strategy uses two different random individuals from population to perform mutation.
- Mutation:
Name: DE/curr2pbest/1
- Parameters
pop (numpy.ndarray) – Current population.
ic (int) – Index of current individual.
x_b (numpy.ndarray) – Global best individual.
f (float) – Scale factor.
cr (float) – Crossover probability.
p (float) – Percentage of best individuals to use.
arc (numpy.ndarray) – Archived individuals.
rnd (mtrand.RandomState) – Random generator.
args (Dict[str, Any]) – Additional arguments.
- Returns
New position.
- Return type
numpy.ndarray
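The DE/curr2pbest/1 strategy with binomial crossover can be sketched as follows (a hedged JADE-style illustration; the random-index choices and the `j_rand` guarantee are standard DE conventions, but the function name, argument order, and fitness-based pbest selection are assumptions, not NiaPy's exact implementation):

```python
import numpy as np

def cross_rand_curr2pbest(pop, fpop, ic, f, cr, p=0.2, arc=None, rng=None):
    # DE/current-to-pbest/1 (JADE-style) mutation with binomial crossover.
    rng = np.random.default_rng() if rng is None else rng
    NP, D = pop.shape
    # "pbest": a random member of the best p * 100 % of the population
    n_best = max(1, int(round(p * NP)))
    x_pbest = pop[rng.choice(np.argsort(fpop)[:n_best])]
    # r1 from the population, r2 from the population united with the archive
    r1 = rng.integers(NP)
    union = pop if arc is None or len(arc) == 0 else np.concatenate([pop, arc])
    r2 = rng.integers(len(union))
    v = pop[ic] + f * (x_pbest - pop[ic]) + f * (pop[r1] - union[r2])
    # Binomial crossover; j_rand guarantees at least one mutated component
    j_rand = rng.integers(D)
    mask = rng.random(D) < cr
    mask[j_rand] = True
    return np.where(mask, v, pop[ic])
```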
-
class
NiaPy.algorithms.modified.
DifferentialEvolutionMTS
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
,NiaPy.algorithms.other.mts.MultipleTrajectorySearch
Implementation of Differential Evolution with MTS local searches.
- Algorithm:
Differential Evolution with MTS local searches
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
- Variables
Name (List[str]) – List of strings representing algorithm names.
LSs (Iterable[Callable[[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, Task, Dict[str, Any]], Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, int, numpy.ndarray]]]) – Local searches to use.
BONUS1 (int) – Bonus for improving global best solution.
BONUS2 (int) – Bonus for improving solution.
NoLsTests (int) – Number of test runs on local search algorithms.
NoLs (int) – Number of local search algorithm runs.
NoEnabled (int) – Number of best solution for testing.
See also
NiaPy.algorithms.basic.de.DifferentialEvolution
NiaPy.algorithms.other.mts.MultipleTrajectorySearch
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DifferentialEvolutionMTS', 'DEMTS']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
getParameters
()[source]¶ Get parameters values of the algorithm.
- Returns
TODO
- Return type
Dict[str, Any]
-
setParameters
(NoLsTests=1, NoLs=2, NoEnabled=2, BONUS1=10, BONUS2=2, LSs=(<function MTS_LS1>, <function MTS_LS2>, <function MTS_LS3>), **ukwargs)[source]¶ Set the algorithm parameters.
- Parameters
SR (numpy.ndarray) – Search range.
See also
NiaPy.algorithms.basic.de.DifferentialEvolution.setParameters()
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
NoLsTests (Callable[[int], bool]): TODO
NoLs (Callable[[int], bool]): TODO
NoEnabled (Callable[[int], bool]): TODO
- Return type
Dict[str, Callable]
See also
NiaPy.algorithms.basic.de.DifferentialEvolution.typeParameters()
-
class
NiaPy.algorithms.modified.
DifferentialEvolutionMTSv1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.hde.DifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches.
- Algorithm:
Differential Evolution with MTSv1 local searches
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DifferentialEvolutionMTSv1', 'DEMTSv1']¶
-
class
NiaPy.algorithms.modified.
DynNpDifferentialEvolutionMTS
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.hde.DifferentialEvolutionMTS
,NiaPy.algorithms.basic.de.DynNpDifferentialEvolution
Implementation of Differential Evolution with MTS local searches and dynamic population size.
- Algorithm:
Differential Evolution with MTS local searches and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
See also
NiaPy.algorithms.basic.de.DynNpDifferentialEvolution
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpDifferentialEvolutionMTS', 'dynNpDEMTS']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
class
NiaPy.algorithms.modified.
DynNpDifferentialEvolutionMTSv1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.hde.DynNpDifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches and dynamic population size.
- Algorithm:
Differential Evolution with MTSv1 local searches and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
See also
NiaPy.algorithms.modified.hde.DifferentialEvolutionMTS
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpDifferentialEvolutionMTSv1', 'dynNpDEMTSv1']¶
-
class
NiaPy.algorithms.modified.
DynNpMultiStrategyDifferentialEvolutionMTS
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.hde.MultiStrategyDifferentialEvolutionMTS
,NiaPy.algorithms.modified.hde.DynNpDifferentialEvolutionMTS
Implementation of Differential Evolution with MTS local searches, multiple mutation strategies and dynamic population size.
- Algorithm:
Differential Evolution with MTS local searches, multiple mutation strategies and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpMultiStrategyDifferentialEvolutionMTS', 'dynNpMSDEMTS']¶
-
class
NiaPy.algorithms.modified.
DynNpMultiStrategyDifferentialEvolutionMTSv1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.hde.DynNpMultiStrategyDifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches, multiple mutation strategies and dynamic population size.
- Algorithm:
Differential Evolution with MTSv1 local searches, multiple mutation strategies and dynamic population size
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
See also
NiaPy.algorithms.modified.DynNpMultiStrategyDifferentialEvolutionMTS
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpMultiStrategyDifferentialEvolutionMTSv1', 'dynNpMSDEMTSv1']¶
-
class
NiaPy.algorithms.modified.
DynNpMultiStrategySelfAdaptiveDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.jde.MultiStrategySelfAdaptiveDifferentialEvolution
,NiaPy.algorithms.modified.jde.DynNpSelfAdaptiveDifferentialEvolutionAlgorithm
Implementation of Dynamic population size self-adaptive differential evolution algorithm with multiple mutation strategies.
- Algorithm:
Dynamic population size self-adaptive differential evolution algorithm with multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpMultiStrategySelfAdaptiveDifferentialEvolution', 'dynNpMsjDE']¶
-
postSelection
(pop, task, **kwargs)[source]¶ Apply additional operation after selection.
- Parameters
pop (numpy.ndarray) – Current population.
task (Task) – Optimization task.
xb (numpy.ndarray) – Global best solution.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New population.
New global best solution.
New global best solutions fitness/objective value.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, float]
-
class
NiaPy.algorithms.modified.
DynNpSelfAdaptiveDifferentialEvolutionAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.jde.SelfAdaptiveDifferentialEvolution
,NiaPy.algorithms.basic.de.DynNpDifferentialEvolution
Implementation of Dynamic population size self-adaptive differential evolution algorithm.
- Algorithm:
Dynamic population size self-adaptive differential evolution algorithm
- Date:
2018
- Author:
Jan Popič and Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Brest, Janez, and Mirjam Sepesy Maučec. Population size reduction for the differential evolution algorithm. Applied Intelligence 29.3 (2008): 228-247.
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['DynNpSelfAdaptiveDifferentialEvolutionAlgorithm', 'dynNPjDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
postSelection
(pop, task, **kwargs)[source]¶ Post selection operator.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
task (Task) – Optimization task.
- Returns
New population.
- Return type
numpy.ndarray[Individual]
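The dynNP-style post-selection reduces the population size at predefined points; a minimal sketch of the pairwise halving step described in Brest and Maučec (2008) (a hedged illustration, the pairing of index i with i + NP/2 follows the paper but the function name is an assumption):

```python
import numpy as np

def halve_population(pop, fpop):
    # dynNP reduction: pair individual i with individual i + NP/2 and keep
    # the better (lower-fitness) of each pair, halving the population size.
    half = len(pop) // 2
    first, second = fpop[:half], fpop[half:2 * half]
    keep = np.where(first <= second, np.arange(half), np.arange(half) + half)
    return pop[keep], fpop[keep]
```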
-
class
NiaPy.algorithms.modified.
HybridBatAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.ba.BatAlgorithm
Implementation of Hybrid bat algorithm.
- Algorithm:
Hybrid bat algorithm
- Date:
2018
- Author:
Grega Vrbancic and Klemen Berkovič
- License:
MIT
- Reference paper:
Fister Jr., Iztok and Fister, Dusan and Yang, Xin-She. “A Hybrid Bat Algorithm”. Elektrotehniski vestnik, 2013. 1-7.
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['HybridBatAlgorithm', 'HBA']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
class
NiaPy.algorithms.modified.
HybridSelfAdaptiveBatAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.saba.SelfAdaptiveBatAlgorithm
Implementation of Hybrid self adaptive bat algorithm.
- Algorithm:
Hybrid self adaptive bat algorithm
- Date:
April 2019
- Author:
Klemen Berkovič
- License:
MIT
- Reference paper:
Fister, Iztok, Simon Fong, and Janez Brest. “A novel hybrid self-adaptive bat algorithm.” The Scientific World Journal 2014 (2014).
- Reference URL:
- Variables
Name (List[str]) – List of strings representing algorithm name.
F (float) – Scaling factor for local search.
CR (float) – Probability of crossover for local search.
CrossMutt (Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, Dict[str, Any]) – Local search method based of Differential evolution strategy.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['HybridSelfAdaptiveBatAlgorithm', 'HSABA']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameters of the algorithm.
- Return type
Dict[str, Any]
-
setParameters
(F=0.9, CR=0.85, CrossMutt=<function CrossBest1>, **ukwargs)[source]¶ Set core parameters of HybridBatAlgorithm algorithm.
- Parameters
F (Optional[float]) – Scaling factor for local search.
CR (Optional[float]) – Probability of crossover for local search.
CrossMutt (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, Dict[str, Any], numpy.ndarray]]) – Local search method based of Differential evolution strategy.
ukwargs (Dict[str, Any]) – Additional arguments.
-
class
NiaPy.algorithms.modified.
MultiStrategyDifferentialEvolutionMTS
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.hde.DifferentialEvolutionMTS
,NiaPy.algorithms.basic.de.MultiStrategyDifferentialEvolution
Implementation of Differential Evolution with MTS local searches and multiple mutation strategies.
- Algorithm:
Differential Evolution with MTS local searches and multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
See also
NiaPy.algorithms.modified.hde.DifferentialEvolutionMTS
NiaPy.algorithms.basic.de.MultiStrategyDifferentialEvolution
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MultiStrategyDifferentialEvolutionMTS', 'MSDEMTS']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
evolve
(pop, xb, task, **kwargs)[source]¶ Evolve population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population of individuals.
xb (Individual) – Global best individual.
task (Task) – Optimization task.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
Evolved population.
- Return type
numpy.ndarray[Individual]
-
class
NiaPy.algorithms.modified.
MultiStrategyDifferentialEvolutionMTSv1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.hde.MultiStrategyDifferentialEvolutionMTS
Implementation of Differential Evolution with MTSv1 local searches and multiple mutation strategies.
- Algorithm:
Differential Evolution with MTSv1 local searches and multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MultiStrategyDifferentialEvolutionMTSv1', 'MSDEMTSv1']¶
-
class
NiaPy.algorithms.modified.
MultiStrategySelfAdaptiveDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.jde.SelfAdaptiveDifferentialEvolution
Implementation of self-adaptive differential evolution algorithm with multiple mutation strategies.
- Algorithm:
Self-adaptive differential evolution algorithm with multiple mutation strategies
- Date:
2018
- Author:
Klemen Berkovič
- License:
MIT
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MultiStrategySelfAdaptiveDifferentialEvolution', 'MsjDE']¶
-
evolve
(pop, xb, task, **kwargs)[source]¶ Evolve population with the help multiple mutation strategies.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
xb (Individual) – Current best individual.
task (Task) – Optimization task.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New population of individuals.
- Return type
numpy.ndarray[Individual]
-
setParameters
(strategies=(<function CrossCurr2Rand1>, <function CrossCurr2Best1>, <function CrossRand1>, <function CrossBest1>, <function CrossBest2>), **kwargs)[source]¶ Set core parameters of MultiStrategySelfAdaptiveDifferentialEvolution algorithm.
- Parameters
strategies (Optional[Iterable[Callable]]) – Mutation strategies to use in algorithm.
**kwargs –
-
class
NiaPy.algorithms.modified.
ParameterFreeBatAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Parameter-free Bat algorithm.
- Algorithm:
Parameter-free Bat algorithm
- Date:
2020
- Authors:
Iztok Fister Jr. (this implementation is based on the basic BA implementation from NiaPy)
- License:
MIT
- Reference paper:
Iztok Fister Jr., Iztok Fister, Xin-She Yang. Towards the development of a parameter-free bat algorithm . In: FISTER Jr., Iztok (Ed.), BRODNIK, Andrej (Ed.). StuCoSReC : proceedings of the 2015 2nd Student Computer Science Research Conference. Koper: University of Primorska, 2015, pp. 31-34.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['ParameterFreeBatAlgorithm', 'PLBA']¶
-
static
algorithmInfo
()[source]¶ Get algorithms information.
- Returns
Algorithm information.
- Return type
-
initPopulation
(task)[source]¶ Initialize the initial population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
S (numpy.ndarray): Solutions
Q (numpy.ndarray[float]): Frequencies
v (numpy.ndarray[float]): Velocities
- Return type
-
localSearch
(best, task, **kwargs)[source]¶ Improve the best solution according to Yang (2010).
- Parameters
best (numpy.ndarray) – Global best individual.
task (Task) – Optimization task.
**kwargs (Dict[str, Any]) – Additional arguments.
- Returns
New solution based on global best individual.
- Return type
numpy.ndarray
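The local search described above can be sketched as a small random walk around the global best, in the spirit of Yang (2010). This is a minimal NumPy sketch, not NiaPy's code; the `scaling` argument (a fraction of the search-space width) and the 0.001 step factor are assumptions.

```python
import numpy as np

def local_search(best, scaling, rnd=np.random):
    # Random walk around the global best, scaled by a fraction of the
    # search-space width (hypothetical `scaling`); eps is drawn from [-1, 1].
    eps = rnd.uniform(-1.0, 1.0, size=best.shape)
    return best + 0.001 * eps * scaling
```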
-
runIteration
(task, Sol, Fitness, xb, fxb, S, Q, v, **dparams)[source]¶ Core function of Parameter-free Bat Algorithm.
- Parameters
task (Task) – Optimization task.
Sol (numpy.ndarray) – Current population.
Fitness (numpy.ndarray[float]) – Current population fitness/function values.
xb (numpy.ndarray) – Current best individual.
fxb (float) – Current best individual function/fitness value.
S (numpy.ndarray) – Solutions.
Q (numpy.ndarray) – Frequencies.
v (numpy.ndarray) – Velocities.
dparams (Dict[str, Any]) – Additional algorithm arguments.
- Returns
New population
New population fitness/function values
New global best solution
New global best fitness/objective value
- Additional arguments:
S (numpy.ndarray): Solutions
Q (numpy.ndarray): Frequencies
v (numpy.ndarray): Velocities
best (numpy.ndarray): Global best
f_min (float): Global best fitness
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
class
NiaPy.algorithms.modified.
SelfAdaptiveBatAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.modified.saba.AdaptiveBatAlgorithm
Implementation of Hybrid bat algorithm.
- Algorithm:
Hybrid bat algorithm
- Date:
April 2019
- Author:
Klemen Berkovič
- License:
MIT
- Reference paper:
Fister Jr., Iztok and Fister, Dusan and Yang, Xin-She. “A Hybrid Bat Algorithm”. Elektrotehniski vestnik, 2013. 1-7.
- Variables
Name (List[str]) – List of strings representing algorithm name.
A_l (Optional[float]) – Lower limit of loudness.
A_u (Optional[float]) – Upper limit of loudness.
r_l (Optional[float]) – Lower limit of pulse rate.
r_u (Optional[float]) – Upper limit of pulse rate.
tao_1 (Optional[float]) – Learning rate for loudness.
tao_2 (Optional[float]) – Learning rate for pulse rate.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['SelfAdaptiveBatAlgorithm', 'SABA']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameters of the algorithm.
- Return type
Dict[str, Any]
-
initPopulation
(task)[source]¶ Initialize the starting population.
- Parameters
task (Task) – Optimization task
- Returns
New population.
New population fitness/function values.
- Additional arguments:
A (float): Loudness.
S (numpy.ndarray): TODO
Q (numpy.ndarray[float]): TODO
v (numpy.ndarray[float]): TODO
- Return type
-
runIteration
(task, Sol, Fitness, xb, fxb, A, r, S, Q, v, **dparams)[source]¶ Core function of Bat Algorithm.
- Parameters
task (Task) – Optimization task.
Sol (numpy.ndarray) – Current population.
Fitness (numpy.ndarray[float]) – Current population fitness/function values.
xb (numpy.ndarray) – Current best individual.
fxb (float) – Current best individual function/fitness value.
A (numpy.ndarray[float]) – Loudness of individuals.
r (numpy.ndarray[float]) – Pulse rate of individuals.
S (numpy.ndarray) – TODO
Q (numpy.ndarray[float]) – TODO
v (numpy.ndarray[float]) – TODO
dparams (Dict[str, Any]) – Additional algorithm arguments.
- Returns
New population
New population fitness/function values
- Additional arguments:
A (numpy.ndarray[float]): Loudness.
r (numpy.ndarray[float]): Pulse rate.
S (numpy.ndarray): TODO
Q (numpy.ndarray[float]): TODO
v (numpy.ndarray[float]): TODO
- Return type
-
setParameters
(A_l=0.9, A_u=1.0, r_l=0.001, r_u=0.1, tao_1=0.1, tao_2=0.1, **ukwargs)[source]¶ Set core parameters of HybridBatAlgorithm algorithm.
- Parameters
A_l (Optional[float]) – Lower limit of loudness.
A_u (Optional[float]) – Upper limit of loudness.
r_l (Optional[float]) – Lower limit of pulse rate.
r_u (Optional[float]) – Upper limit of pulse rate.
tao_1 (Optional[float]) – Learning rate for loudness.
tao_2 (Optional[float]) – Learning rate for pulse rate.
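A plausible sketch of the self-adaptive update these learning rates drive, by analogy with jDE-style self-adaptation (the exact update in NiaPy may differ): with probability tao_1 the loudness is resampled from [A_l, A_u], and with probability tao_2 the pulse rate is resampled from [r_l, r_u].

```python
import numpy as np

def saba_adapt(A, r, A_l=0.9, A_u=1.0, r_l=0.001, r_u=0.1,
               tao_1=0.1, tao_2=0.1, rnd=np.random):
    # With probability tao_1, resample loudness uniformly from [A_l, A_u];
    # with probability tao_2, resample pulse rate from [r_l, r_u].
    # (Illustrative sketch; not NiaPy's exact implementation.)
    if rnd.rand() < tao_1:
        A = A_l + rnd.rand() * (A_u - A_l)
    if rnd.rand() < tao_2:
        r = r_l + rnd.rand() * (r_u - r_l)
    return A, r
```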
-
class
NiaPy.algorithms.modified.
SelfAdaptiveDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Self-adaptive differential evolution algorithm.
- Algorithm:
Self-adaptive differential evolution algorithm
- Date:
2018
- Author:
Uros Mlakar and Klemen Berkovič
- License:
MIT
- Reference paper:
Brest, J., Greiner, S., Boskovic, B., Mernik, M., Zumer, V. Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems. IEEE transactions on evolutionary computation, 10(6), 646-657, 2006.
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
AdaptiveGen
(x)[source]¶ Adaptively update the scale factor and crossover probability.
- Parameters
x (Individual) – Individual to apply function on.
- Returns
New individual with new parameters
- Return type
-
Name
= ['SelfAdaptiveDifferentialEvolution', 'jDE']¶
-
static
algorithmInfo
()[source]¶ Get algorithm information.
- Returns
Algorithm information.
- Return type
-
evolve
(pop, xb, task, **ukwargs)[source]¶ Evolve current population.
- Parameters
pop (numpy.ndarray[Individual]) – Current population.
xb (Individual) – Global best individual.
task (Task) – Optimization task.
ukwargs (Dict[str, Any]) – Additional arguments.
- Returns
New population.
- Return type
numpy.ndarray
-
setParameters
(F_l=0.0, F_u=1.0, Tao1=0.4, Tao2=0.2, **ukwargs)[source]¶ Set the parameters of an algorithm.
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
F_l (Callable[[Union[float, int]], bool])
F_u (Callable[[Union[float, int]], bool])
Tao1 (Callable[[Union[float, int]], bool])
Tao2 (Callable[[Union[float, int]], bool])
- Return type
Dict[str, Callable]
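The jDE rule of Brest et al. (2006) resamples each individual's F with probability Tao1 and CR with probability Tao2 before mutation, and keeps them otherwise. A minimal sketch of that update (the function name `jde_adapt` is illustrative, not NiaPy's API):

```python
import numpy as np

def jde_adapt(F, CR, F_l=0.0, F_u=1.0, tao_1=0.4, tao_2=0.2, rnd=np.random):
    # With probability tao_1, resample F uniformly from [F_l, F_u]; otherwise keep it.
    if rnd.rand() < tao_1:
        F = F_l + rnd.rand() * (F_u - F_l)
    # With probability tao_2, resample CR uniformly from [0, 1]; otherwise keep it.
    if rnd.rand() < tao_2:
        CR = rnd.rand()
    return F, CR
```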
-
class
NiaPy.algorithms.modified.
StrategyAdaptationDifferentialEvolution
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Differential Evolution Algorithm With Strategy Adaptation algorithm.
- Algorithm:
Differential Evolution Algorithm With StrategyAdaptation
- Date:
2019
- Author:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Qin, A. Kai, and Ponnuthurai N. Suganthan. “Self-adaptive differential evolution algorithm for numerical optimization.” 2005 IEEE congress on evolutionary computation. Vol. 2. IEEE, 2005.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['StrategyAdaptationDifferentialEvolution', 'SADE', 'SaDE']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.algorithmInfo()
-
getParameters
()[source]¶ Get parameters values of the algorithm.
- Returns
TODO
- Return type
Dict[str, Any]
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of Differential Evolution algorithm.
-
setParameters
(**kwargs)[source]¶ Set the algorithm parameters.
- Parameters
NP (Optional[int]) – Population size.
F (Optional[float]) – Scaling factor.
CR (Optional[float]) – Crossover rate.
CrossMutt (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, list], numpy.ndarray]]) – Crossover and mutation strategy.
ukwargs (Dict[str, Any]) – Additional arguments.
-
class
NiaPy.algorithms.modified.
StrategyAdaptationDifferentialEvolutionV1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.basic.de.DifferentialEvolution
Implementation of Differential Evolution Algorithm With Strategy Adaptation algorithm.
- Algorithm:
Differential Evolution Algorithm With StrategyAdaptation
- Date:
2019
- Author:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Reference paper:
Qin, A. Kai, Vicky Ling Huang, and Ponnuthurai N. Suganthan. “Differential evolution algorithm with strategy adaptation for global numerical optimization.” IEEE transactions on Evolutionary Computation 13.2 (2009): 398-417.
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['StrategyAdaptationDifferentialEvolutionV1', 'SADEV1', 'SaDEV1']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.algorithmInfo()
-
getParameters
()[source]¶ Get parameters values of the algorithm.
- Returns
TODO
- Return type
Dict[str, Any]
-
runIteration
(task, pop, fpop, xb, fxb, **dparams)[source]¶ Core function of Differential Evolution algorithm.
-
setParameters
(**kwargs)[source]¶ Set the algorithm parameters.
- Parameters
NP (Optional[int]) – Population size.
F (Optional[float]) – Scaling factor.
CR (Optional[float]) – Crossover rate.
CrossMutt (Optional[Callable[[numpy.ndarray, int, numpy.ndarray, float, float, mtrand.RandomState, list], numpy.ndarray]]) – Crossover and mutation strategy.
ukwargs (Dict[str, Any]) – Additional arguments.
NiaPy.algorithms.other
¶
Implementation of other algorithms.
-
class
NiaPy.algorithms.other.
AnarchicSocietyOptimization
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Anarchic Society Optimization algorithm.
- Algorithm:
Anarchic Society Optimization algorithm
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference paper:
Ahmadi-Javid, Amir. “Anarchic Society Optimization: A human-inspired method.” Evolutionary Computation (CEC), 2011 IEEE Congress on. IEEE, 2011.
- Variables
Name (list of str) – List of strings representing the algorithm name.
alpha (List[float]) – Factor for fickleness index function \(\in [0, 1]\).
gamma (List[float]) – Factor for external irregularity index function \(\in [0, \infty)\).
theta (List[float]) – Factor for internal irregularity index function \(\in [0, \infty)\).
d (Callable[[float, float], float]) – Function that takes two function values and calculates the distance between them.
dn (Callable[[numpy.ndarray, numpy.ndarray], float]) – Function that takes two points in the function landscape and calculates the distance between them.
nl (float) – Normalized range for neighborhood search \(\in (0, 1]\).
F (float) – Mutation parameter.
CR (float) – Crossover parameter \(\in [0, 1]\).
Combination (Callable[numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, float, float, float, float, float, float, Task, mtrand.RandomState]) – Function for combining individuals to get new position/individual.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['AnarchicSocietyOptimization', 'ASO']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.algorithmInfo()
-
getBestNeighbors
(i, X, X_f, rs)[source]¶ Get neighbors of individual.
The measure of distance for the neighborhood is defined by self.nl. The function for calculating distances is defined by self.dn.
- Parameters
- Returns
Indexes that represent individuals closest to i-th individual.
- Return type
numpy.ndarray[int]
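One plausible reading of this neighborhood test, assuming Euclidean distance (the default `dn`) and `rs` as the diameter of the search space; the helper name `best_neighbors` is illustrative, not NiaPy's API:

```python
import numpy as np

def best_neighbors(i, X, nl, rs):
    # Indexes of individuals within distance nl * rs of individual i,
    # using Euclidean distance between positions.
    d = np.linalg.norm(X - X[i], axis=1)
    return np.where(d <= nl * rs)[0]
```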
-
init
(task)[source]¶ Initialize dynamic parameters of algorithm.
- Parameters
task (Task) – Optimization task.
- Returns
- Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray]
Array of self.alpha propagated values
Array of self.gamma propagated values
Array of self.theta propagated values
-
initPopulation
(task)[source]¶ Initialize first population and additional arguments.
- Parameters
task (Task) – Optimization task
- Returns
Initialized population
Initialized population fitness/function values
- Dict[str, Any]:
Xpb (numpy.ndarray): Initialized populations best positions.
Xpb_f (numpy.ndarray): Initialized populations best positions function/fitness values.
alpha (numpy.ndarray):
gamma (numpy.ndarray):
theta (numpy.ndarray):
rs (float): Distance of search space.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, dict]
See also
NiaPy.algorithms.algorithm.Algorithm.initPopulation()
NiaPy.algorithms.other.aso.AnarchicSocietyOptimization.init()
-
runIteration
(task, X, X_f, xb, fxb, Xpb, Xpb_f, alpha, gamma, theta, rs, **dparams)[source]¶ Core function of AnarchicSocietyOptimization algorithm.
- Parameters
task (Task) – Optimization task.
X (numpy.ndarray) – Current populations positions.
X_f (numpy.ndarray) – Current populations function/fitness values.
xb (numpy.ndarray) – Current global best individuals position.
fxb (float) – Current global best individual function/fitness value.
Xpb (numpy.ndarray) – Current populations best positions.
Xpb_f (numpy.ndarray) – Current population best positions function/fitness values.
alpha (numpy.ndarray) – TODO.
gamma (numpy.ndarray) –
theta (numpy.ndarray) –
**dparams – Additional arguments.
- Returns
Initialized population
Initialized population fitness/function values
New global best solution
New global best solutions fitness/objective value
- Dict[str, Union[float, int, numpy.ndarray]:
Xpb (numpy.ndarray): Initialized populations best positions.
Xpb_f (numpy.ndarray): Initialized populations best positions function/fitness values.
alpha (numpy.ndarray):
gamma (numpy.ndarray):
theta (numpy.ndarray):
rs (float): Distance of search space.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, dict]
-
setParameters
(NP=43, alpha=(1, 0.83), gamma=(1.17, 0.56), theta=(0.932, 0.832), d=<function euclidean>, dn=<function euclidean>, nl=1, F=1.2, CR=0.25, Combination=<function Elitism>, **ukwargs)[source]¶ Set the parameters for the algorithm.
- Parameters
alpha (Optional[List[float]]) – Factor for fickleness index function \(\in [0, 1]\).
gamma (Optional[List[float]]) – Factor for external irregularity index function \(\in [0, \infty)\).
theta (Optional[List[float]]) – Factor for internal irregularity index function \(\in [0, \infty)\).
d (Optional[Callable[[float, float], float]]) – Function that takes two function values and calculates the distance between them.
dn (Optional[Callable[[numpy.ndarray, numpy.ndarray], float]]) – Function that takes two points in the function landscape and calculates the distance between them.
nl (Optional[float]) – Normalized range for neighborhood search \(\in (0, 1]\).
F (Optional[float]) – Mutation parameter.
CR (Optional[float]) – Crossover parameter \(\in [0, 1]\).
Combination (Optional[Callable[numpy.ndarray, numpy.ndarray, numpy.ndarray, numpy.ndarray, float, float, float, float, float, float, Task, mtrand.RandomState]]) – Function for combining individuals to get new position/individual.
See also
- Combination methods:
NiaPy.algorithms.other.Elitism()
NiaPy.algorithms.other.Crossover()
NiaPy.algorithms.other.Sequential()
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
alpha (Callable): TODO
gamma (Callable): TODO
theta (Callable): TODO
nl (Callable): TODO
F (Callable[[Union[float, int]], bool]): TODO
CR (Callable[[Union[float, int]], bool]): TODO
- Return type
Dict[str, Callable]
-
uBestAndPBest
(X, X_f, Xpb, Xpb_f)[source]¶ Update personal best solution of all individuals in population.
-
class
NiaPy.algorithms.other.
HillClimbAlgorithm
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of iterative hill climbing algorithm.
- Algorithm:
Hill Climbing Algorithm
- Date:
2018
- Authors:
Jan Popič
- License:
MIT
Reference URL:
Reference paper:
See also
- Variables
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['HillClimbAlgorithm', 'BBFA']¶
-
static
algorithmInfo
()[source]¶ Get basic information about the algorithm.
- Returns
Basic information.
- Return type
See also
NiaPy.algorithms.algorithm.Algorithm.algorithmInfo()
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
-
runIteration
(task, x, fx, xb, fxb, **dparams)[source]¶ Core function of HillClimbAlgorithm algorithm.
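The iterative hill-climbing loop can be sketched as: propose a random neighbour and move only when it improves the current solution. A standalone sketch with an assumed uniform neighbourhood of width `step`, not NiaPy's exact neighbourhood function:

```python
import numpy as np

def hill_climb(f, x, lower, upper, step=0.1, iters=100, rnd=np.random):
    # Iterated hill climbing: accept a random neighbour only if it improves
    # the current solution; candidates are clipped to the search bounds.
    fx = f(x)
    for _ in range(iters):
        cand = np.clip(x + rnd.uniform(-step, step, size=x.shape), lower, upper)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx
```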
-
NiaPy.algorithms.other.
MTS_LS1
(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, sr_fix=0.4, rnd=numpy.random, **ukwargs)[source]¶ Multiple trajectory local search one.
- Parameters
Xk (numpy.ndarray) – Current solution.
Xk_fit (float) – Current solutions fitness/function value.
Xb (numpy.ndarray) – Global best solution.
Xb_fit (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
SR (numpy.ndarray) – Search range.
task (Task) – Optimization task.
BONUS1 (int) – Bonus reward for improving global best solution.
BONUS2 (int) – Bonus reward for improving solution.
sr_fix (numpy.ndarray) – Fix when search range is too small.
rnd (mtrand.RandomState) – Random number generator.
**ukwargs (Dict[str, Any]) – Additional arguments.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
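MTS_LS1 searches dimension by dimension: each coordinate is first probed by subtracting the search range, and, if that does not help, by adding half of it. A simplified standalone sketch that omits the grading bonuses and the search-range shrinking of the full algorithm:

```python
import numpy as np

def mts_ls1_step(x, x_f, SR, f):
    # One pass of the first MTS local search: probe each dimension by -SR[i],
    # then by +0.5 * SR[i] if the first probe did not help; revert otherwise.
    # Note: x is modified in place.
    improve = False
    for i in range(len(x)):
        old = x[i]
        x[i] = old - SR[i]
        fn = f(x)
        if fn < x_f:
            x_f, improve = fn, True
        else:
            x[i] = old + 0.5 * SR[i]
            fn = f(x)
            if fn < x_f:
                x_f, improve = fn, True
            else:
                x[i] = old
    return x, x_f, improve
```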
-
NiaPy.algorithms.other.
MTS_LS1v1
(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, sr_fix=0.4, rnd=numpy.random, **ukwargs)[source]¶ Multiple trajectory local search one version two.
- Parameters
Xk (numpy.ndarray) – Current solution.
Xk_fit (float) – Current solutions fitness/function value.
Xb (numpy.ndarray) – Global best solution.
Xb_fit (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
SR (numpy.ndarray) – Search range.
task (Task) – Optimization task.
BONUS1 (int) – Bonus reward for improving global best solution.
BONUS2 (int) – Bonus reward for improving solution.
sr_fix (numpy.ndarray) – Fix when search range is too small.
rnd (mtrand.RandomState) – Random number generator.
**ukwargs (Dict[str, Any]) – Additional arguments.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
-
NiaPy.algorithms.other.
MTS_LS2
(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, sr_fix=0.4, rnd=numpy.random, **ukwargs)[source]¶ Multiple trajectory local search two.
- Parameters
Xk (numpy.ndarray) – Current solution.
Xk_fit (float) – Current solutions fitness/function value.
Xb (numpy.ndarray) – Global best solution.
Xb_fit (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
SR (numpy.ndarray) – Search range.
task (Task) – Optimization task.
BONUS1 (int) – Bonus reward for improving global best solution.
BONUS2 (int) – Bonus reward for improving solution.
sr_fix (numpy.ndarray) – Fix when search range is too small.
rnd (mtrand.RandomState) – Random number generator.
**ukwargs (Dict[str, Any]) – Additional arguments.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
See also
NiaPy.algorithms.other.genNewX()
-
NiaPy.algorithms.other.
MTS_LS3
(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, rnd=numpy.random, **ukwargs)[source]¶ Multiple trajectory local search three.
- Parameters
Xk (numpy.ndarray) – Current solution.
Xk_fit (float) – Current solutions fitness/function value.
Xb (numpy.ndarray) – Global best solution.
Xb_fit (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
SR (numpy.ndarray) – Search range.
task (Task) – Optimization task.
BONUS1 (int) – Bonus reward for improving global best solution.
BONUS2 (int) – Bonus reward for improving solution.
rnd (mtrand.RandomState) – Random number generator.
**ukwargs (Dict[str, Any]) – Additional arguments.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
-
NiaPy.algorithms.other.
MTS_LS3v1
(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, phi=3, BONUS1=10, BONUS2=1, rnd=numpy.random, **ukwargs)[source]¶ Multiple trajectory local search three version one.
- Parameters
Xk (numpy.ndarray) – Current solution.
Xk_fit (float) – Current solutions fitness/function value.
Xb (numpy.ndarray) – Global best solution.
Xb_fit (float) – Global best solutions fitness/function value.
improve (bool) – Has the solution been improved.
SR (numpy.ndarray) – Search range.
task (Task) – Optimization task.
phi (int) – Number of new generated positions.
BONUS1 (int) – Bonus reward for improving global best solution.
BONUS2 (int) – Bonus reward for improving solution.
rnd (mtrand.RandomState) – Random number generator.
**ukwargs (Dict[str, Any]) – Additional arguments.
- Returns
New solution.
New solutions fitness/function value.
Global best if found else old global best.
Global bests function/fitness value.
If solution has improved.
Search range.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray]
-
class
NiaPy.algorithms.other.
MultipleTrajectorySearch
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Multiple trajectory search.
- Algorithm:
Multiple trajectory search
- Date:
2018
- Authors:
Klemen Berkovic
- License:
MIT
- Reference URL:
- Reference paper:
Lin-Yu Tseng and Chun Chen, “Multiple trajectory search for Large Scale Global Optimization,” 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, 2008, pp. 3052-3059. doi: 10.1109/CEC.2008.4631210
- Variables
Name (List[str]) – List of strings representing algorithm name.
LSs (Iterable[Callable[[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, Task, Dict[str, Any]], Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, int, numpy.ndarray]]]) – Local searches to use.
BONUS1 (int) – Bonus for improving global best solution.
BONUS2 (int) – Bonus for improving solution.
NoLsTests (int) – Number of test runs on local search algorithms.
NoLs (int) – Number of local search algorithm runs.
NoLsBest (int) – Number of local search algorithm runs on the best solution.
NoEnabled (int) – Number of best solutions used for testing.
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
GradingRun
(x, x_f, xb, fxb, improve, SR, task)[source]¶ Run local search for getting scores of local searches.
- Parameters
x (numpy.ndarray) – Solution for grading.
x_f (float) – Solutions fitness/function value.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions function/fitness value.
improve (bool) – Info if solution has improved.
SR (numpy.ndarray) – Search range.
task (Task) – Optimization task.
- Returns
New solution.
New solutions function/fitness value.
Global best solution.
Global best solutions fitness/function value.
- Return type
-
LsRun
(k, x, x_f, xb, fxb, improve, SR, g, task)[source]¶ Run a selected local search.
- Parameters
k (int) – Index of local search.
x (numpy.ndarray) – Current solution.
x_f (float) – Current solutions function/fitness value.
xb (numpy.ndarray) – Global best solution.
fxb (float) – Global best solutions fitness/function value.
improve (bool) – If the solution has improved.
SR (numpy.ndarray) – Search range.
g (int) – Grade.
task (Task) – Optimization task.
- Returns
New best solution found.
New best solutions found function/fitness value.
Global best solution.
Global best solutions function/fitness value.
If the solution has improved.
Grade of local search run.
- Return type
Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, int]
-
Name
= ['MultipleTrajectorySearch', 'MTS']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
getParameters
()[source]¶ Get parameters values for the algorithm.
- Returns
- Return type
Dict[str, Any]
-
initPopulation
(task)[source]¶ Initialize starting population.
- Parameters
task (Task) – Optimization task.
- Returns
Initialized population.
Initialized populations function/fitness value.
- Additional arguments:
enable (numpy.ndarray): If solution/individual is enabled.
improve (numpy.ndarray): If solution/individual is improved.
SR (numpy.ndarray): Search range.
grades (numpy.ndarray): Grade of solution/individual.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, Dict[str, Any]]
-
runIteration
(task, X, X_f, xb, xb_f, enable, improve, SR, grades, **dparams)[source]¶ Core function of MultipleTrajectorySearch algorithm.
- Parameters
task (Task) – Optimization task.
X (numpy.ndarray) – Current population of individuals.
X_f (numpy.ndarray) – Current individuals function/fitness values.
xb (numpy.ndarray) – Global best individual.
xb_f (float) – Global best individual function/fitness value.
enable (numpy.ndarray) – Enabled status of individuals.
improve (numpy.ndarray) – Improved status of individuals.
SR (numpy.ndarray) – Search ranges of individuals.
grades (numpy.ndarray) – Grades of individuals.
**dparams (Dict[str, Any]) – Additional arguments.
- Returns
Initialized population.
Initialized populations function/fitness value.
New global best solution.
New global best solutions fitness/objective value.
- Additional arguments:
enable (numpy.ndarray): If solution/individual is enabled.
improve (numpy.ndarray): If solution/individual is improved.
SR (numpy.ndarray): Search range.
grades (numpy.ndarray): Grade of solution/individual.
- Return type
Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]
-
setParameters
(M=40, NoLsTests=5, NoLs=5, NoLsBest=5, NoEnabled=17, BONUS1=10, BONUS2=1, LSs=(<function MTS_LS1>, <function MTS_LS2>, <function MTS_LS3>), **ukwargs)[source]¶ Set the arguments of the algorithm.
- Parameters
M (int) – Number of individuals in population.
NoLsTests (int) – Number of test runs on local search algorithms.
NoLs (int) – Number of local search algorithm runs.
NoLsBest (int) – Number of local search algorithm runs on the best solution.
NoEnabled (int) – Number of best solutions used for testing.
BONUS1 (int) – Bonus for improving global best solution.
BONUS2 (int) – Bonus for improving the current solution.
LSs (Iterable[Callable[[numpy.ndarray, float, numpy.ndarray, float, bool, numpy.ndarray, Task, Dict[str, Any]], Tuple[numpy.ndarray, float, numpy.ndarray, float, bool, int, numpy.ndarray]]]) – Local searches to use.
-
static
typeParameters
()[source]¶ Get dictionary with functions for checking values of parameters.
- Returns
M (Callable[[int], bool])
NoLsTests (Callable[[int], bool])
NoLs (Callable[[int], bool])
NoLsBest (Callable[[int], bool])
NoEnabled (Callable[[int], bool])
BONUS1 (Callable[[Union[int, float]], bool])
BONUS2 (Callable[[Union[int, float]], bool])
- Return type
Dict[str, Callable]
-
class
NiaPy.algorithms.other.
MultipleTrajectorySearchV1
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.other.mts.MultipleTrajectorySearch
Implementation of Multiple trajectory search.
- Algorithm:
Multiple trajectory search
- Date:
2018
- Authors:
Klemen Berkovic
- License:
MIT
- Reference URL:
- Reference paper:
Tseng, Lin-Yu, and Chun Chen. “Multiple trajectory search for unconstrained/constrained multi-objective optimization.” Evolutionary Computation, 2009. CEC’09. IEEE Congress on. IEEE, 2009.
See also
NiaPy.algorithms.other.MultipleTrajectorySearch
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['MultipleTrajectorySearchV1', 'MTSv1']¶
-
class
NiaPy.algorithms.other.
NelderMeadMethod
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of the Nelder-Mead method, also known as the downhill simplex or amoeba method.
- Algorithm:
Nelder Mead Method
- Date:
2018
- Authors:
Klemen Berkovič
- License:
MIT
- Reference URL:
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['NelderMeadMethod', 'NMM']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
getParameters
()[source]¶ Get parameters of the algorithm.
- Returns
Parameter name (str): Represents a parameter name
Value of parameter (Any): Represents the value of the parameter
- Return type
Dict[str, Any]
-
initPop
(task, NP, **kwargs)[source]¶ Init starting population.
- Parameters
- Returns
New initialized population.
New initialized population fitness/function values.
- Return type
Tuple[numpy.ndarray, numpy.ndarray[float]]
-
runIteration
(task, X, X_f, xb, fxb, **dparams)[source]¶ Core iteration function of NelderMeadMethod algorithm.
-
setParameters
(NP=None, alpha=0.1, gamma=0.3, rho=-0.2, sigma=-0.2, **ukwargs)[source]¶ Set the arguments of an algorithm.
-
static
typeParameters
()[source]¶ Get dictionary with function for testing correctness of parameters.
- Returns
alpha (Callable[[Union[int, float]], bool])
gamma (Callable[[Union[int, float]], bool])
rho (Callable[[Union[int, float]], bool])
sigma (Callable[[Union[int, float]], bool])
- Return type
Dict[str, Callable]
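One textbook Nelder-Mead iteration (reflection, expansion, contraction, shrink) can be sketched as follows. Note the classical coefficients used here (alpha=1, gamma=2, rho=0.5, sigma=0.5) differ from NiaPy's defaults; `simplex` and `f_vals` are expected to be NumPy arrays.

```python
import numpy as np

def nelder_mead_step(simplex, f_vals, f, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    # One classical Nelder-Mead iteration on a (D+1) x D simplex.
    order = np.argsort(f_vals)
    simplex, f_vals = simplex[order], f_vals[order]
    centroid = simplex[:-1].mean(axis=0)               # centroid of all but the worst
    xr = centroid + alpha * (centroid - simplex[-1])   # reflection
    fr = f(xr)
    if f_vals[0] <= fr < f_vals[-2]:
        simplex[-1], f_vals[-1] = xr, fr
    elif fr < f_vals[0]:                               # expansion
        xe = centroid + gamma * (xr - centroid)
        fe = f(xe)
        if fe < fr:
            simplex[-1], f_vals[-1] = xe, fe
        else:
            simplex[-1], f_vals[-1] = xr, fr
    else:                                              # contraction
        xc = centroid + rho * (simplex[-1] - centroid)
        fc = f(xc)
        if fc < f_vals[-1]:
            simplex[-1], f_vals[-1] = xc, fc
        else:                                          # shrink toward the best point
            simplex[1:] = simplex[0] + sigma * (simplex[1:] - simplex[0])
            f_vals[1:] = [f(x) for x in simplex[1:]]
    return simplex, f_vals
```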
-
class
NiaPy.algorithms.other.
RandomSearch
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of a simple Random Algorithm.
- Algorithm:
Random Search
- Date:
11.10.2020
- Authors:
Iztok Fister Jr., Grega Vrbančič
- License:
MIT
Reference URL: https://en.wikipedia.org/wiki/Random_search
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['RandomSearch', 'RS']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
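Random search reduces to sampling points uniformly inside the bounds and keeping the best one. A standalone sketch (the signature is illustrative, not NiaPy's API):

```python
import numpy as np

def random_search(f, lower, upper, nFES=1000, rnd=np.random):
    # Evaluate nFES uniformly sampled points and keep the best one found.
    best_x, best_f = None, float('inf')
    for _ in range(nFES):
        x = rnd.uniform(lower, upper)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```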
-
class
NiaPy.algorithms.other.
SimulatedAnnealing
(**kwargs)[source]¶ Bases:
NiaPy.algorithms.algorithm.Algorithm
Implementation of Simulated Annealing Algorithm.
- Algorithm:
Simulated Annealing Algorithm
- Date:
2018
- Authors:
Jan Popič and Klemen Berkovič
- License:
MIT
Reference URL:
Reference paper:
- Variables
See also
Initialize algorithm and create name for an algorithm.
- Parameters
seed (int) – Starting seed for random generator.
-
Name
= ['SimulatedAnnealing', 'SA']¶
-
static
algorithmInfo
()[source]¶ Get basic information of algorithm.
- Returns
Basic information of algorithm.
- Return type
-
runIteration
(task, x, xfit, xb, fxb, curT, **dparams)[source]¶ Core function of the algorithm.
- Parameters
- Returns
New solution
New solutions fitness/objective value
New global best solution
New global best solutions fitness/objective value
Additional arguments
- Return type
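The core of simulated annealing is the Metropolis acceptance rule: improvements are always taken, while a worse candidate with fitness difference `delta` is accepted with probability exp(-delta / T) at temperature T. A minimal sketch of just that rule:

```python
import math
import random

def accept(delta, T, rnd=random):
    # Metropolis acceptance: always take improvements (delta < 0); accept a
    # worse candidate with probability exp(-delta / T).
    return delta < 0 or rnd.random() < math.exp(-delta / T)
```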