NiaPy.algorithms

NiaPy.algorithms.basic

Implementation of basic nature-inspired algorithms.

class NiaPy.algorithms.basic.BatAlgorithm(D, NP, nFES, A, r, Qmin, Qmax, benchmark)[source]

Bases: object

Implementation of Bat algorithm.

Algorithm: Bat algorithm

Date: 2015

Authors: Iztok Fister Jr. and Marko Burjek

License: MIT

Reference paper:
Yang, Xin-She. “A new metaheuristic bat-inspired algorithm.” Nature inspired cooperative strategies for optimization (NICSO 2010). Springer, Berlin, Heidelberg, 2010. 65-74.

__init__(self, D, NP, nFES, A, r, Qmin, Qmax, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

A {decimal} – loudness

r {decimal} – pulse rate

Qmin {decimal} – minimum frequency

Qmax {decimal} – maximum frequency

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

best_bat()[source]

Find the best bat.

eval_true()[source]

Check evaluations.

init_bat()[source]

Initialize population.

move_bat()[source]

Move bats in search space.

run()[source]

Run algorithm with initialized parameters.

Return {decimal} – best fitness value found

classmethod simplebounds(val, lower, upper)[source]

Keep it within bounds.
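
A minimal usage sketch for this class. MyBenchmark below is a hypothetical user-defined benchmark object exposing Lower, Upper and a function() factory (it is not part of NiaPy), the parameter values are illustrative, and the keyword names follow the documented constructor arguments:

    from NiaPy.algorithms.basic import BatAlgorithm

    # Hypothetical user-defined benchmark: bounds plus a factory returning the objective.
    class MyBenchmark:
        def __init__(self):
            self.Lower = -5.12  # lower bound of the search space
            self.Upper = 5.12   # upper bound of the search space

        def function(self):
            # objective to minimize: the sphere function
            return lambda D, sol: sum(sol[i] ** 2 for i in range(D))

    # Illustrative parameter values.
    algorithm = BatAlgorithm(D=10, NP=40, nFES=10000, A=0.5, r=0.5,
                             Qmin=0.0, Qmax=2.0, benchmark=MyBenchmark())
    best = algorithm.run()  # {decimal} – best fitness found
    print(best)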

class NiaPy.algorithms.basic.FireflyAlgorithm(D, NP, nFES, alpha, betamin, gamma, benchmark)[source]

Bases: object

Implementation of Firefly algorithm.

Algorithm: Firefly algorithm

Date: 2016

Authors: Iztok Fister Jr. and Iztok Fister

License: MIT

Reference paper:
Fister, I., Fister Jr, I., Yang, X. S., & Brest, J. (2013). A comprehensive review of firefly algorithms. Swarm and Evolutionary Computation, 13, 34-46.

__init__(self, D, NP, nFES, alpha, betamin, gamma, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

alpha {decimal} – alpha parameter

betamin {decimal} – betamin parameter

gamma {decimal} – gamma parameter

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

FindLimits(k)[source]

Find limits.

alpha_new(a)[source]

Optionally recalculate the new alpha value.

eval_true()[source]

Check evaluations.

init_ffa()[source]

Initialize firefly population.

move_ffa()[source]

Move fireflies.

replace_ffa()[source]

Replace the old population according to the new index values.

run()[source]

Run.

sort_ffa()[source]

Sort the firefly population (bubble sort).
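
A minimal usage sketch, reusing the same hypothetical MyBenchmark pattern (Lower, Upper, function()) with illustrative parameter values:

    from NiaPy.algorithms.basic import FireflyAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = FireflyAlgorithm(D=10, NP=20, nFES=10000, alpha=0.5,
                                 betamin=0.2, gamma=1.0, benchmark=MyBenchmark())
    print(algorithm.run())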

class NiaPy.algorithms.basic.DifferentialEvolutionAlgorithm(D, NP, nFES, F, CR, benchmark)[source]

Bases: object

Implementation of Differential evolution algorithm.

Algorithm: Differential evolution algorithm

Date: 2018

Author: Uros Mlakar

License: MIT

Reference paper:
Storn, Rainer, and Kenneth Price. “Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces.” Journal of global optimization 11.4 (1997): 341-359.

__init__(self, D, NP, nFES, F, CR, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

F {decimal} – scaling factor

CR {decimal} – crossover rate

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

evalPopulation()[source]

Evaluate population.

generationStep(Population)[source]

Implement main generation step.

initPopulation()[source]

Initialize population.

run()[source]

Run.
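
A minimal usage sketch with a hypothetical user-defined benchmark object and illustrative control parameters (F, CR):

    from NiaPy.algorithms.basic import DifferentialEvolutionAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = DifferentialEvolutionAlgorithm(D=10, NP=40, nFES=10000,
                                               F=0.5, CR=0.9, benchmark=MyBenchmark())
    print(algorithm.run())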

class NiaPy.algorithms.basic.FlowerPollinationAlgorithm(D, NP, nFES, p, benchmark)[source]

Bases: object

Implementation of Flower Pollination algorithm.

Algorithm: Flower Pollination algorithm

Date: 2018

Authors: Dusan Fister & Iztok Fister Jr.

License: MIT

Reference paper:
Yang, Xin-She. “Flower pollination algorithm for global optimization.” International conference on unconventional computing and natural computation. Springer, Berlin, Heidelberg, 2012.

Implementation is based on the following MATLAB code: https://www.mathworks.com/matlabcentral/fileexchange/45112-flower-pollination-algorithm?requestedDomain=true

__init__(self, D, NP, nFES, p, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

p {decimal} – switch probability (between global and local pollination)

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

Levy()[source]

Levy flight.

best_flower()[source]

Check best solution.

eval_true()[source]

Check evaluations.

init_flower()[source]

Initialize flowers.

move_flower()[source]

Move in search space.

run()[source]

Run.

classmethod simplebounds(val, lower, upper)[source]

Keep it within bounds.
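
A minimal usage sketch with a hypothetical user-defined benchmark object; p is the (illustrative) switch probability:

    from NiaPy.algorithms.basic import FlowerPollinationAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = FlowerPollinationAlgorithm(D=10, NP=25, nFES=10000,
                                           p=0.8, benchmark=MyBenchmark())
    print(algorithm.run())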

class NiaPy.algorithms.basic.GreyWolfOptimizer(D, NP, nFES, benchmark)[source]

Bases: object

Implementation of Grey wolf optimizer.

Algorithm: Grey wolf optimizer

Date: 2018

Author: Iztok Fister Jr.

License: MIT

Reference paper:
Mirjalili, Seyedali, Seyed Mohammad Mirjalili, and Andrew Lewis. “Grey wolf optimizer.” Advances in engineering software 69 (2014): 46-61. The implementation also follows the Grey Wolf Optimizer (GWO) source code, version 1.0 (MATLAB), from MathWorks.

__init__(self, D, NP, nFES, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

bounds(position)[source]

Keep it within bounds.

eval_true()[source]

Check evaluations.

initialization()[source]

Initialize positions.

move()[source]

Move wolves in search space.

run()[source]

Run.
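
A minimal usage sketch with a hypothetical user-defined benchmark object:

    from NiaPy.algorithms.basic import GreyWolfOptimizer

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = GreyWolfOptimizer(D=10, NP=30, nFES=10000, benchmark=MyBenchmark())
    print(algorithm.run())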

class NiaPy.algorithms.basic.GeneticAlgorithm(D, NP, nFES, Ts, Mr, gamma, benchmark)[source]

Bases: object

Implementation of Genetic algorithm.

Algorithm: Genetic algorithm

Date: 2018

Author: Uros Mlakar

License: MIT

__init__(self, D, NP, nFES, Ts, Mr, gamma, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

Ts {integer} – tournament size

Mr {decimal} – mutation rate

gamma {decimal} – gamma parameter

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

CrossOver(parent1, parent2)[source]

Crossover.

Mutate(child)[source]

Mutation.

TournamentSelection()[source]

Tournament selection.

checkForBest(pChromosome)[source]

Check best solution.

init()[source]

Initialize population.

run()[source]

Run.

tryEval(c)[source]

Check evaluations.
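
A minimal usage sketch with a hypothetical user-defined benchmark object and illustrative Ts, Mr and gamma values:

    from NiaPy.algorithms.basic import GeneticAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = GeneticAlgorithm(D=10, NP=40, nFES=10000, Ts=4, Mr=0.05,
                                 gamma=0.2, benchmark=MyBenchmark())
    print(algorithm.run())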

class NiaPy.algorithms.basic.ArtificialBeeColonyAlgorithm(D, NP, nFES, benchmark)[source]

Bases: object

Implementation of Artificial Bee Colony algorithm.

Algorithm: Artificial Bee Colony algorithm

Date: 2018

Author: Uros Mlakar

License: MIT

Reference paper:
Karaboga, Dervis, and Bahriye Basturk. “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm.” Journal of global optimization 39.3 (2007): 459-471.

__init__(self, D, NP, nFES, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

CalculateProbs()[source]

Calculate selection probabilities.

checkForBest(Solution)[source]

Check best solution.

init()[source]

Initialize positions.

run()[source]

Run.

tryEval(b)[source]

Check evaluations.
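
A minimal usage sketch with a hypothetical user-defined benchmark object:

    from NiaPy.algorithms.basic import ArtificialBeeColonyAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = ArtificialBeeColonyAlgorithm(D=10, NP=40, nFES=10000,
                                             benchmark=MyBenchmark())
    print(algorithm.run())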

class NiaPy.algorithms.basic.ParticleSwarmAlgorithm(D, NP, nFES, C1, C2, w, vMin, vMax, benchmark)[source]

Bases: object

Implementation of Particle Swarm Optimization algorithm.

Algorithm: Particle Swarm Optimization algorithm

Date: 2018

Authors: Lucija Brezočnik, Grega Vrbančič, and Iztok Fister Jr.

License: MIT

Reference paper:
Kennedy, J. and Eberhart, R. “Particle Swarm Optimization”. Proceedings of IEEE International Conference on Neural Networks. IV. pp. 1942–1948, 1995.

__init__(self, D, NP, nFES, C1, C2, w, vMin, vMax, benchmark).

Arguments:

NP {integer} – population size

D {integer} – dimension of problem

nFES {integer} – number of function evaluations

C1 {decimal} – cognitive component

C2 {decimal} – social component

w {decimal} – inertia weight

vMin {decimal} – minimal velocity

vMax {decimal} – maximal velocity

benchmark {object} – benchmark implementation object

bounds(position)[source]

Keep it within bounds.

eval_true()[source]

Check evaluations.

init()[source]

Initialize positions.

move_particles()[source]

Move particles in search space.

run()[source]

Run.
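
A minimal usage sketch with a hypothetical user-defined benchmark object and illustrative PSO parameters:

    from NiaPy.algorithms.basic import ParticleSwarmAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = ParticleSwarmAlgorithm(D=10, NP=40, nFES=10000, C1=2.0, C2=2.0,
                                       w=0.7, vMin=-4.0, vMax=4.0,
                                       benchmark=MyBenchmark())
    print(algorithm.run())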

NiaPy.algorithms.modified

Implementation of modified nature-inspired algorithms.

class NiaPy.algorithms.modified.HybridBatAlgorithm(D, NP, nFES, A, r, F, CR, Qmin, Qmax, benchmark)[source]

Bases: object

Implementation of Hybrid bat algorithm.

Algorithm: Hybrid bat algorithm

Date: 2018

Author: Grega Vrbancic

License: MIT

Reference paper:
Fister Jr., Iztok and Fister, Dusan and Yang, Xin-She. “A Hybrid Bat Algorithm”. Elektrotehniski vestnik, 2013. 1-7.

__init__(self, D, NP, nFES, A, r, F, CR, Qmin, Qmax, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

A {decimal} – loudness

r {decimal} – pulse rate

F {decimal} – scaling factor

CR {decimal} – crossover rate

Qmin {decimal} – minimum frequency

Qmax {decimal} – maximum frequency

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

best_bat()[source]

Find the best bat.

eval_true()[source]

Check evaluations.

init_bat()[source]

Initialize population.

move_bat()[source]

Move bats in search space.

run()[source]

Run.

classmethod simplebounds(val, lower, upper)[source]

Keep it within bounds.
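
A minimal usage sketch with a hypothetical user-defined benchmark object; the bat parameters (A, r, Qmin, Qmax) and DE parameters (F, CR) are illustrative:

    from NiaPy.algorithms.modified import HybridBatAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = HybridBatAlgorithm(D=10, NP=40, nFES=10000, A=0.5, r=0.5,
                                   F=0.5, CR=0.9, Qmin=0.0, Qmax=2.0,
                                   benchmark=MyBenchmark())
    print(algorithm.run())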

class NiaPy.algorithms.modified.SelfAdaptiveDifferentialEvolutionAlgorithm(D, NP, nFES, F, CR, Tao, benchmark)[source]

Bases: object

Implementation of Self-adaptive differential evolution algorithm.

Algorithm: Self-adaptive differential evolution algorithm

Date: 2018

Author: Uros Mlakar

License: MIT

Reference paper:
Brest, J., Greiner, S., Boskovic, B., Mernik, M., Zumer, V. Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems. IEEE transactions on evolutionary computation, 10(6), 646-657, 2006.

__init__(self, D, NP, nFES, F, CR, Tao, benchmark).

Arguments:

D {integer} – dimension of problem

NP {integer} – population size

nFES {integer} – number of function evaluations

F {decimal} – scaling factor

CR {decimal} – crossover rate

Tao {decimal} – self-adaptation rate (probability of adjusting F and CR)

benchmark {object} – benchmark implementation object

Raises:
TypeError – Raised when the given benchmark function does not exist.

evalPopulation()[source]

Evaluate population.

generationStep(Population)[source]

Implement main DE/jDE step.

initPopulation()[source]

Initialize population.

run()[source]

Run.

tryEval(v)[source]

Check evaluations.
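
A minimal usage sketch with a hypothetical user-defined benchmark object; F and CR are the initial control parameters and Tao the (illustrative) self-adaptation rate:

    from NiaPy.algorithms.modified import SelfAdaptiveDifferentialEvolutionAlgorithm

    class MyBenchmark:  # hypothetical user-defined benchmark
        def __init__(self):
            self.Lower, self.Upper = -5.12, 5.12  # search space bounds

        def function(self):
            return lambda D, sol: sum(x ** 2 for x in sol)  # sphere function

    algorithm = SelfAdaptiveDifferentialEvolutionAlgorithm(D=10, NP=40, nFES=10000,
                                                           F=0.5, CR=0.9, Tao=0.1,
                                                           benchmark=MyBenchmark())
    print(algorithm.run())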