A Novel Hybrid Firefly Algorithm for Global Optimization

Abstract
Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as gradient-based methods often struggle to deal with such problems, and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called the hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions is employed; these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in terms of avoiding local minima and increasing the convergence rate.
algorithm to escape from local optima and to search more regions on a global scale. This kind of strategy always produces unrepeatable routes for each individual run, even when starting from the same initial points. Though the paths may be slightly different, the final results of these algorithms can often converge to the same optimal results within a given criterion if the algorithm is allowed to run long enough [8].
Nowadays, most stochastic algorithms can be called meta-heuristic algorithms [12]. Most of them have been developed based on biological processes in nature, and these algorithms have started to show their power and efficiency. Genetic Algorithms (GA) [13], Ant Colony Optimization (ACO) [14], Particle Swarm Optimization (PSO) [15–18], Artificial Bee Colony (ABC) [19], Cuckoo Search (CS) [20] and the Firefly Algorithm (FA) [21–24] are some of the most popular algorithms in this class of stochastic algorithms. The disadvantages of these algorithms are the need for properly setting the algorithm-dependent parameters and a large number of iterations. However, these meta-heuristic algorithms have two main advantages. One is a good information-sharing mechanism, which can help the algorithm converge faster under certain conditions; the other is a lower probability of entrapment in local modes.
The paper is organized as follows: the main ideas of the standard firefly algorithm and standard differential evolution are illustrated in Section 2, and the details of our proposed hybrid firefly algorithm are described in Section 3. In Section 4, we present and analyze the experimental results. Finally, Section 5 concludes the work.
intensity I varies with the distance r and the light absorption parameter γ exponentially and monotonically [31]. That is

$$I = I_0 e^{-\gamma r^2} \quad (1)$$

where $I_0$ is the original light intensity at the source (i.e., at the distance r = 0) and γ is the light absorption coefficient. From the idealized rules, we know that in our simulation the attractiveness of a firefly is assumed to be proportional to the light intensity I. So we can define the firefly's attractiveness coefficient β in a similar way to the light intensity I. That is

$$\beta = \beta_0 e^{-\gamma r^2} \quad (2)$$
where r is the distance between two fireflies i and j, measured as the Cartesian distance

$$r_{ij} = \|x_i - x_j\| = \sqrt{\sum_{k=1}^{d}(x_{i,k} - x_{j,k})^2} \quad (3)$$

where d is the number of dimensions. The amount of movement of firefly i towards another, more attractive (brighter) firefly j is determined by

$$x_i \leftarrow x_i + \beta_0 e^{-\gamma r_{ij}^2}(x_j - x_i) + \alpha \varepsilon_i \quad (4)$$
where the first term is the current location of firefly i, the second term is due to the attraction, and the third term is randomization, with the vector of random variables εi drawn from different distributions such as the uniform distribution, the Gaussian distribution or Lévy flights. In the third term, α is a scaling parameter that controls the step size, and it should be linked to the scales of the problem of interest.
According to the above idealized rules and approximations, the pseudo-code of the standard firefly algorithm can be summarized in Algorithm 1.
Algorithm 1 Pseudo-code for the standard FA algorithm
  Objective function f(x), x = (x1, …, xD)^T
  Initialize a population of fireflies xi (i = 1, 2, …, n)
  Calculate the light intensity Ii at xi by f(xi)
  Define the light absorption coefficient γ
  while (t < MaxGeneration)
    for i = 1:n (all n fireflies)
      for j = 1:n (all n fireflies)
        Calculate the distance r between xi and xj using the Cartesian distance equation
        if (Ij > Ii)
          Attractiveness varies with distance r via β0 e^(−γr²)
          Move firefly i towards j in all d dimensions
        end if
        Evaluate new solutions and update light intensity
      end for j
    end for i
    Rank the fireflies and find the current best
  end while
  Post-process results and visualization
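To make these steps concrete, below is a minimal, self-contained Python sketch of the standard FA loop. It is not the authors' implementation: the sphere objective, the uniform randomization term (instead of Lévy flights), the bound clipping and all default parameter values are our own illustrative choices.

```python
import numpy as np

def firefly_algorithm(f, dim=30, n=20, max_gen=200,
                      beta0=1.0, gamma=1.0, alpha=0.2,
                      lower=-100.0, upper=100.0, seed=0):
    """Minimal sketch of the standard FA (Eqs 1-4); defaults are illustrative."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n, dim))      # initialize fireflies
    fit = np.array([f(xi) for xi in x])          # for minimization, lower f = brighter
    for _ in range(max_gen):
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:              # firefly j is "brighter" than i
                    r2 = np.sum((x[i] - x[j]) ** 2)      # squared Cartesian distance
                    beta = beta0 * np.exp(-gamma * r2)   # attractiveness, Eq (2)
                    # move i towards j plus a random perturbation, Eq (4)
                    x[i] += beta * (x[j] - x[i]) + alpha * rng.uniform(-0.5, 0.5, dim)
                    x[i] = np.clip(x[i], lower, upper)
                    fit[i] = f(x[i])
    best = np.argmin(fit)
    return x[best], fit[best]

# usage: minimize the sphere function in 5 dimensions
sphere = lambda x: np.sum(x ** 2)
best_x, best_f = firefly_algorithm(sphere, dim=5, max_gen=100)
print(best_f)
```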
the randomly generated initial starting points to the potentially optimal solution. There are many DE variants. In this paper, we use the so-called DE/rand/1/bin scheme. This variant is probably the most widely used in practice, and it can be briefly described as follows [32].
For a given D-dimensional minimization problem, a population consists of n individual solution vectors. The mutant vector vi can be defined as follows:

$$v_{i,g+1} = x_{r_1,g} + F\,(x_{r_2,g} - x_{r_3,g}) \quad (5)$$

where the indices $r_1, r_2, r_3 \in [1, n]$ correspond to three solutions randomly chosen from the whole population and g is the iteration/generation index. The indices have to be different from each other. In addition, $F \in [0, 2]$ is a perturbation parameter that controls the amplification of the difference vector $x_{r_2,g} - x_{r_3,g}$, though in most cases 0 < F < 1 is used in practice.
The binomial crossover operation tries to produce a new trial vector from the perturbed or mutated vector $v_{i,g+1} = [v_{i1,g+1}, v_{i2,g+1}, \ldots, v_{iD,g+1}]$ and the target vector $x_{i,g} = [x_{i1,g}, x_{i2,g}, \ldots, x_{iD,g}]$:

$$u_{ij,g+1} = \begin{cases} v_{ij,g+1}, & \text{if } r(j) \le C_r \text{ or } j = \mathrm{random}(i), \\ x_{ij,g}, & \text{if } r(j) > C_r \text{ and } j \ne \mathrm{random}(i), \end{cases} \quad (6)$$

where $j \in [1, 2, \ldots, D]$ and r(j) is the jth realization of a uniform random number generator. In addition, $C_r \in [0, 1]$ is the so-called crossover constant. Here, $\mathrm{random}(i) \in [1, 2, \ldots, D]$ is a randomly chosen index, which ensures that the trial vector $u_{i,g+1}$ gets at least one component from the mutated vector $v_{i,g+1}$.
The selection mechanism is similar to those of other evolutionary algorithms, where a greedy acceptance is performed:

$$x_{i,g+1} = \begin{cases} u_{i,g+1}, & \text{if } f(u_{i,g+1}) \le f(x_{i,g}), \\ x_{i,g}, & \text{otherwise}. \end{cases} \quad (7)$$

This means that the update is accepted only if a better objective value is achieved.
Algorithm 2 summarizes the basic steps of the standard differential evolution algorithm.
Algorithm 2 Pseudo-code for the standard DE algorithm
  Initialize the population xi (i = 1, 2, …, n) from the randomly generated initial starting points
  Set the perturbation parameter F and the crossover probability parameter Cr
  while (t < MaxGeneration)
    for i = 1:n (all individuals)
      For each xi, randomly choose 3 different vectors x_r1, x_r2 and x_r3 from the whole population
      Use mutation (Eq 5) to generate a new vector vi
      Generate a random index random(i)
      for j = 1:D
        Generate a uniformly distributed number r(j) in [0, 1]
        Crossover operation (Eq 6): for each parameter vij, set uij,g+1 = vij,g+1 if r(j) ≤ Cr or j = random(i), otherwise uij,g+1 = xij,g
      end for j
      Selection operation (Eq 7): select and update the solution xi
    end for i
  end while
  Post-process results and visualization
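For illustration, here is a minimal Python sketch of the DE/rand/1/bin scheme described by Eqs (5)–(7); the bound handling and parameter defaults are our own illustrative assumptions, not the authors' reference code.

```python
import numpy as np

def de_rand_1_bin(f, dim=30, n=20, max_gen=200, F=0.5, Cr=0.9,
                  lower=-100.0, upper=100.0, seed=0):
    """Minimal sketch of DE/rand/1/bin (Eqs 5-7); defaults are illustrative."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n, dim))
    fit = np.array([f(xi) for xi in x])
    for _ in range(max_gen):
        for i in range(n):
            # three mutually distinct indices, all different from i
            r1, r2, r3 = rng.choice([k for k in range(n) if k != i], 3, replace=False)
            v = x[r1] + F * (x[r2] - x[r3])       # mutation, Eq (5)
            j_rand = rng.integers(dim)            # guarantees >= 1 mutant component
            mask = rng.random(dim) <= Cr
            mask[j_rand] = True
            u = np.clip(np.where(mask, v, x[i]), lower, upper)  # crossover, Eq (6)
            fu = f(u)
            if fu <= fit[i]:                      # greedy selection, Eq (7)
                x[i], fit[i] = u, fu
    best = np.argmin(fit)
    return x[best], fit[best]
```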
  Mix the two groups and regroup them randomly into new groups: G1 and G2
  Evaluate the fitness value of each particle
Until a termination condition is met
End
Post-process results and visualization
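The fragment above mixes and randomly regroups the two subpopulations. A minimal Python sketch of one possible reading of this step is shown below; the paper does not spell out the exact regrouping scheme, so the uniform random re-split used here is an assumption.

```python
import numpy as np

def mix_and_regroup(pop_fa, pop_de, rng):
    """Mix the FA and DE subpopulations and randomly re-split them into two
    new groups G1 and G2 of the original sizes (our reading of the step)."""
    merged = np.vstack([pop_fa, pop_de])   # mix the two groups
    perm = rng.permutation(len(merged))    # random regrouping
    half = len(pop_fa)
    return merged[perm[:half]], merged[perm[half:]]
```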
Though the detailed computational complexity may depend on the structure of the implementation, the complexities of the three meta-heuristic algorithms used in this paper can be easily estimated. For FA, the time complexity is O(n²t), where n is the population size and t is the number of iterations, because there are two nested loops over the population. For DE, the complexity is O(nt). Therefore, for our proposed hybrid approach (HFA), the time complexity is O(n²t/4 + nt/2) because each component (either FA or DE) only uses half of the population. As n is small (in this case, n = 20 or 40) and t is large (in this case, t = 2000), the computational cost is relatively inexpensive because the algorithm complexity is linear in t. The main computational cost will be in the evaluations of the objective functions.
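For concreteness, a back-of-the-envelope comparison with the settings used later (n = 40, t = 2000; our own arithmetic, not from the paper):

$$n^2 t = 40^2 \times 2000 = 3.2\times 10^{6}, \qquad \frac{n^2 t}{4} + \frac{n t}{2} = 8\times 10^{5} + 4\times 10^{4} = 8.4\times 10^{5},$$

so splitting the population roughly quarters the cost of the quadratic FA component, while the linear DE part remains negligible.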
Schwefel's 2.22: $f_2(x) = \sum_{i=1}^{D} |x_i| + \prod_{i=1}^{D} |x_i|$, D = 30, range [−10, 10]^D, f_min = 0

Schwefel's 1.20: $f_3(x) = \sum_{i=1}^{D} \left( \sum_{j=1}^{i} x_j \right)^2$, D = 30, range [−100, 100]^D, f_min = 0

Step: $f_6(x) = \sum_{i=1}^{D} \left( \lfloor x_i + 0.5 \rfloor \right)^2$, D = 30, range [−100, 100]^D, f_min = 0

Quartic Noise: $f_7(x) = \sum_{i=1}^{n} i\, x_i^4 + \mathrm{random}[0, 1)$, D = 30, range [−1.28, 1.28]^D, f_min = 0

doi:10.1371/journal.pone.0163230.t001
Rastrigin: $f_9(x) = \sum_{i=1}^{D} \left[ x_i^2 - 10\cos(2\pi x_i) + 10 \right]$, D = 30, range [−5.12, 5.12]^D, f_min = 0

Ackley: $f_{10}(x) = -20\exp\!\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\!\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$, D = 30, range [−32, 32]^D, f_min = 0

Griewank: $f_{11}(x) = \tfrac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\!\left(\tfrac{x_i}{\sqrt{i}}\right) + 1$, D = 30, range [−600, 600]^D, f_min = 0

Penalized: $f_{12}(x) = \tfrac{\pi}{D}\left\{ 10\sin^2(\pi y_1) + \sum_{i=1}^{D-1} (y_i - 1)^2 \left[1 + 10\sin^2(\pi y_{i+1})\right] + (y_D - 1)^2 \right\} + \sum_{i=1}^{D} u(x_i, 10, 100, 4)$, with $y_i = 1 + \tfrac{1}{4}(x_i + 1)$, D = 30, range [−50, 50]^D, f_min = 0

Generalized Penalized: $f_{13}(x) = \tfrac{1}{10}\left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{D-1} (x_i - 1)^2 \left[1 + \sin^2(3\pi x_{i+1})\right] + (x_D - 1)^2 \left[1 + \sin^2(2\pi x_D)\right] \right\} + \sum_{i=1}^{D} u(x_i, 5, 100, 4)$, D = 30, range [−50, 50]^D, f_min = 0

where, in both penalized functions,

$$u(x_i, a, k, m) = \begin{cases} k(x_i - a)^m, & x_i > a, \\ 0, & -a \le x_i \le a, \\ k(-x_i - a)^m, & x_i < -a. \end{cases}$$

doi:10.1371/journal.pone.0163230.t002
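For readers who wish to reproduce the experiments, a few of the benchmark functions above can be written compactly in Python as follows (these are the standard textbook forms; the NumPy vectorization is our implementation choice):

```python
import numpy as np

def rastrigin(x):            # f9: multimodal, f_min = 0 at x = 0
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)

def ackley(x):               # f10: deep central valley, f_min = 0 at x = 0
    n = len(x)
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)

def griewank(x):             # f11: f_min = 0 at x = 0
    i = np.arange(1, len(x) + 1)
    return np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1

# sanity check: all three vanish at the origin
z = np.zeros(30)
print(rastrigin(z), ackley(z), griewank(z))   # 0.0 0.0 0.0
```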
The test benchmark functions can be divided into two groups in terms of the number of local minima: unimodal functions and multimodal functions. The unimodal test functions have one global optimum, so they are suitable for benchmarking the local exploitation ability of algorithms. Such functions allow us to focus more on the convergence rates of the tested algorithms rather than on the final results. Multimodal test functions have many local minima, and the number of local optima usually increases exponentially with the problem dimension, so they are suitable for benchmarking the global exploration ability of algorithms; that is, they test whether an algorithm can escape from local optima. In some applications, finding a good optimal or suboptimal solution is more important, while other applications place the emphasis on the accuracy of the solutions; the quality of the final results is of greater concern in the latter.
From Table 1, we know that functions f1–f7 are unimodal, high-dimensional problems. Function f5, also known as the 'banana function', has a global optimum inside a long, narrow, flat, parabola-shaped valley. Finding the location of the valley is non-trivial, though not too difficult. However, converging to the global minimum with high accuracy is more difficult, especially for gradient-based algorithms. Function f6 is the step function, characterized by plateaus and discontinuities. In addition, function f7 is a noisy quartic function.
Functions f8–f13 in Table 2 are multimodal, high-dimensional problems, and more details are summarized in Table 2. For example, f8 is a non-convex, multimodal and additively separable function. This seemingly simple function can be deceptive because the global minimum at (420.9687, …, 420.9687) is geometrically distant from the next best local minima in the domain [−500, 500]^D, where D is the number of dimensions. Therefore, many algorithms, including some metaheuristic algorithms, may find it quite challenging to solve. In addition, f9 is also challenging; it is one of the most difficult benchmarks commonly used in the literature because it has multiple, steep wells with many local minima. Another widely used multimodal benchmark function is f10, the Ackley function, which is characterized by a deep valley at the centre and an almost flat outer zone. Consequently, it is quite challenging to solve because most optimization algorithms can easily get trapped in one of its many local minima.
Parameter Settings
To verify the algorithms and analyze the experimental results, our proposed hybrid firefly algorithm is compared to the standard FA and DE as well as PSO, to benchmark the performance and to see if there is any improvement.
In all cases, the population size is set to 40, and the dimension of the benchmark functions is equal to 30. We also set the maximum number of iterations, as the stopping criterion, equal to 2000. The initial population is generated using uniformly distributed random initialization within the ranges or limits of the design variables. In addition, 30 independent runs have been carried out for each function and each algorithm with completely different initial settings. The results from the algorithms are compared using four standard statistical measures: the Minimum, the Maximum, the Mean, and the Standard Deviation (Std) of the fitness values calculated over the 30 independent runs.
For the firefly algorithm, we set the initial attractiveness β0 = 2 × rand, the light absorption coefficient γ = 1/S², where S denotes the average range of the variables, and the randomization parameter α (α = 0.2 × 0.95^iter, where 0.2 is the initial randomness factor and iter is the iteration index), which decreases monotonically and gradually. Finally, we use the Lévy distribution to draw the random numbers because it can occasionally produce some long leaps [37]. The values of the differential evolution algorithm-dependent parameters are F = 0.5 as the scaling factor and Cr = 0.9 as the crossover constant [38]. Additionally, for particle swarm optimization, the learning factors c1 and c2 are both set to 2, and the inertia weight ω decreases linearly from ωmax = 0.9 to ωmin = 0.4 [39].
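The FA parameter schedule above can be reproduced with a couple of small helpers; the function names below are our own, and the interpretation of S as the mean of the variable ranges is an assumption based on the description:

```python
import numpy as np

def alpha_schedule(iteration, alpha0=0.2, decay=0.95):
    """Monotonically decreasing randomness: alpha = 0.2 * 0.95**iter."""
    return alpha0 * decay ** iteration

def gamma_from_range(lower, upper):
    """gamma = 1 / S**2, reading S as the average range of the variables."""
    S = np.mean(np.asarray(upper, dtype=float) - np.asarray(lower, dtype=float))
    return 1.0 / S ** 2

# e.g. for a problem with all 30 variables in [-100, 100]:
print(gamma_from_range([-100] * 30, [100] * 30))   # 1 / 200**2 = 2.5e-05
print(alpha_schedule(2000))                        # essentially zero by the end
```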
It is worth pointing out that in our proposed HFA, the parameters β0, γ, α, εi, F and Cr are all the same as those defined in the standard FA and DE. Specifically, in our implementation, we have divided the whole population into two subgroups (subpopulations), which means that the population size in each of FA and DE is equal to 20. At the same time, we have divided the total of 2000 iterations into 10 sub-iteration stages. In each sub-stage, FA and DE each run 200 iterations in parallel, so the total of 2000 iterations is realized as 10 sub-stages of 200 iterations each.
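Putting the pieces together, the sub-stage structure might look like the following Python sketch; run_fa and run_de are hypothetical stand-ins for the FA and DE loops sketched earlier (adapted to take and return a whole subpopulation), and the parallel execution is omitted for clarity:

```python
import numpy as np

rng = np.random.default_rng(0)
n_total, dim, stages, sub_iters = 40, 30, 10, 200   # settings used in the paper
lower, upper = -100.0, 100.0                        # e.g. the sphere search range
sphere = lambda x: np.sum(x ** 2)

def run_fa(f, pop, iters):
    """Stand-in for the FA loop sketched earlier (hypothetical wrapper)."""
    ...  # FA attraction moves for `iters` iterations
    return pop

def run_de(f, pop, iters):
    """Stand-in for the DE/rand/1/bin loop sketched earlier."""
    ...  # DE mutation/crossover/selection for `iters` iterations
    return pop

pop = rng.uniform(lower, upper, (n_total, dim))
g1, g2 = pop[:n_total // 2], pop[n_total // 2:]     # two subpopulations of 20

for stage in range(stages):
    g1 = run_fa(sphere, g1, sub_iters)              # FA evolves G1 for 200 iters
    g2 = run_de(sphere, g2, sub_iters)              # DE evolves G2 (in parallel)
    merged = np.vstack([g1, g2])                    # mix the two groups ...
    perm = rng.permutation(n_total)                 # ... and regroup randomly
    g1, g2 = merged[perm[:n_total // 2]], merged[perm[n_total // 2:]]
```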
All of the algorithm-dependent parameters are summarised in Table 3.
Table 5. The mean value of unimodal benchmark functions for HFA, FA, DE and PSO over 30 runs.

# Fnc   HFA         FA          DE          PSO
f1      2.64E-171   1.57E-87    1.4155E-08  9.43E-11
f2      2.46E-103   1.73E-44    6.0678E-04  3.66E-08
f3      0.7115      25.714      0.2789      459.69
f4      5.30E-57    1.68E-44    0.8832      6.5934
f5      0.077152    29.053      21.9994     49.686
f6      0           0           0           0
f7      0.000183    0.001582    0.0113      0.031475

doi:10.1371/journal.pone.0163230.t005
of all the relevant unimodal benchmark functions. The results of the Friedman non-parametric test are given in Table 6. According to the p-values in Table 6, we can conclude that HFA is significantly different from FA and PSO. However, the difference becomes insignificant when compared with DE.
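As an aside, such a Friedman test can be reproduced with SciPy's friedmanchisquare, which takes one sample array per algorithm; here we feed it the per-function means from Table 5. Note that the paper appears to report pairwise comparisons, whereas this sketch runs a single joint test over all four algorithms:

```python
from scipy.stats import friedmanchisquare

# one value per unimodal benchmark (f1-f7), taken from Table 5
hfa = [2.64e-171, 2.46e-103, 0.7115, 5.30e-57, 0.077152, 0.0, 0.000183]
fa  = [1.57e-87,  1.73e-44,  25.714, 1.68e-44, 29.053,   0.0, 0.001582]
de  = [1.4155e-08, 6.0678e-04, 0.2789, 0.8832, 21.9994,  0.0, 0.0113]
pso = [9.43e-11,  3.66e-08,  459.69, 6.5934,  49.686,    0.0, 0.031475]

stat, p = friedmanchisquare(hfa, fa, de, pso)
print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")
```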
Figs 1–3 show the convergence curves of the 4 mentioned algorithms for f1, f3 and f5. In these figures, the horizontal axis is the number of iterations and the vertical axis is the fitness value of the benchmark function. It can be seen that HFA performs significantly better than FA, DE, and PSO. For example, f1, the simple sphere function, is a well-known benchmark function. Throughout the generations, HFA displays a faster convergence rate than FA, DE, and PSO due to its better exploitation ability. It is clear that HFA quickly reaches the neighborhood of the global optimum, getting to approximately 10^−16 within only 200 iterations, while DE and PSO can only reach approximately 10^−12 and 10^−8, respectively, after the full 2000 iterations. In fact, HFA has a nearly constant convergence rate throughout the whole run for most of the unimodal benchmark functions, and the results of HFA after 200 iterations (i.e., after only one sub-stage) are better than the final results of DE and PSO after 2000 iterations. Hence, from Figs 1–3, we can say that our proposed HFA has a quicker convergence rate and is able to improve its results steadily over a long time. On the other hand, FA also maintains a fast convergence rate at the beginning; however, it can get stuck in a local optimum very soon, especially in Figs 2 and 3. Hence, we can see that FA cannot prevent premature convergence, due to its poor exploration ability, especially as the iterations proceed. From the observed convergence curves, it is clear that DE and PSO have a very low convergence rate during the whole process compared with HFA and FA.
Multimodal Functions. For the second series of experiments, we use multimodal functions to compare the exploration abilities of the algorithms. The statistical results over 30 independent runs are presented in Table 7; the best mean results are written in bold.
From the statistical results in Table 7, we can see that HFA outperformed the other algorithms on functions f8 and f9. FA is the best for functions f10 and f11. In addition, HFA and FA have almost equal optimization abilities on functions f12 and f13; both can obtain accurate results for these functions.
Similar to what we have done for the unimodal test functions, Friedman tests at a confidence level of 0.95 (significance level α = 0.05) are also conducted for all the multimodal benchmark functions. Table 8 summarizes the mean values of the final results over 30 independent runs.
Fig 1. Comparison between PSO, DE, FA and HFA for the Sphere function.
doi:10.1371/journal.pone.0163230.g001
Fig 2. Comparison between PSO, DE, FA and HFA for Schwefel’s 1.20 function.
doi:10.1371/journal.pone.0163230.g002
Fig 3. Comparison between PSO, DE, FA and HFA for Rosenbrock’s function.
doi:10.1371/journal.pone.0163230.g003
The results of these tests are summarized in Table 9. The p-values in Table 9 show that HFA is significantly different from DE, while the differences become insignificant when compared with FA and PSO.
At the same time, the convergence curves of the different algorithms for f9 and f10 are shown in Figs 4 and 5, where the horizontal axis is the number of iterations and the vertical axis is the fitness value of the benchmark function. According to Fig 4, DE and PSO perform poorly during the whole iterative process. FA maintains a higher convergence rate, but unfortunately it appears to become trapped in local optima after about 200 iterations. HFA can escape from local optima automatically and find the final global best. As can be seen in Fig 5, FA and HFA clearly perform significantly better than DE and PSO. In the beginning, FA displays a faster convergence rate than HFA, but HFA eventually overtakes FA. Thus we can say that, for the Ackley function, both HFA and FA maintain a strong exploration ability and robustness.
Conclusions
In this paper, we have proposed a novel hybrid firefly algorithm (HFA) by combining some of the advantages of both the firefly algorithm and differential evolution. Based on the theoretical analysis and the problem-solving ability of metaheuristic algorithms, we can summarize that HFA has three advantages or improvements. The first is a better balance between exploration and exploitation, due to the parallel use of FA and DE and the population-level information sharing. The experimental results illustrated that FA can provide an excellent convergence rate and a strong exploitation ability, whereas DE is good at exploration through its mutation and crossover operators. Ideally, an algorithm should explore the search space as extensively as possible to find all the promising regions, and simultaneously it should conduct a more refined search in the promising areas so as to improve the precision of the solutions.
Table 8. The mean value of multimodal benchmark functions for HFA, FA, DE and PSO over 30 runs.

# Fnc   HFA        FA         DE          PSO
f8      -12439     -9469.5    -5016.3     -9020.8
f9      3.39E-08   9.3858     175.9112    30.15
f10     1.31E-05   1.25E-14   3.1272E-04  7.10E-06
f11     5.86E-09   0          0.0132      0.013444
f12     1.57E-32   1.57E-32   2.2768E-04  0.024188
f13     1.35E-32   1.35E-32   1.1956E-05  0.0032963

doi:10.1371/journal.pone.0163230.t008
The second improvement is that the selection mechanism used in the proposed approach can enable the solution to converge to the optimum in a better way. This is achieved by first mixing the two subpopulations that are independently evolved using either FA or DE, and then selecting the best solutions from both subpopulations. Thus, it is more likely to find the global optimum than each individual algorithm involved in the hybrid. The third improvement is that the hybrid can increase the diversity of the solutions efficiently and can also help the algorithm avoid stagnation by using a mixing and regrouping mechanism. It can be observed that the attraction operator in FA is a double-edged sword: to some extent, it can accelerate the convergence speed, but it may also mislead the algorithm into getting stuck in some local optima if the diversity of the population becomes low. Technically speaking, this hybrid mechanism can liberate the population from sub-optimal solutions and enable continued progress toward the true global optimum, as has been observed in the simulations.

Fig 4. Comparison between PSO, DE, FA and HFA for Rastrigin's function.
doi:10.1371/journal.pone.0163230.g004
Fig 5. Comparison between PSO, DE, FA and HFA for Ackley's function.
doi:10.1371/journal.pone.0163230.g005
The statistical analyses have also confirmed the theoretical insight of this paper that the three enhancements in the combined approach can explore and exploit the search space more efficiently. It can be seen from the above results that the proposed HFA indeed works well compared to FA, DE and PSO, which has been further confirmed by the results obtained from the Friedman tests.
Future work will explore different ways of mixing and regrouping the population so as to enhance the performance even further. In addition, it will be useful to carry out a more detailed parametric study to see how different sub-stages of iterations can be used to maximize the parallelism and also to reduce the overall number of iterations. Furthermore, it will also be useful to automatically tune these parameters depending on the modality of the problem, so as to solve real-world problems more effectively.
Acknowledgments
This work has been supported financially by the National Natural Science Foundation of
China under the Grant 51109041, by the Fundamental Research Funds for the Central Univer-
sities under the Grant HEUCF160405 and also supported by the China Scholarship Council.
Author Contributions
Conceptualization: XSY LNZ.
Funding acquisition: YTD LNZ.
Methodology: LNZ XSY.
Writing – original draft: LNZ LQL.
Writing – review & editing: XSY LNZ YTD.
References
1. Horng M. H. Vector quantization using the firefly algorithm for image compression. Expert Systems
with Applications. vol. 39, no.1, pp. 1078–1091, 2012. doi: 10.1016/j.eswa.2011.07.108
2. Basu B., Mahanti G. K. Thinning of concentric two-ring circular array antenna using firefly algorithm.
Scientia Iranica. vol. 19, no.6, pp. 1802–1809, 2012. doi: 10.1016/j.scient.2012.06.030
3. Ourique C. O., Biscaia E. C., Pinto J. C. The use of particle swarm optimization for dynamical analysis
in chemical processes. Computers & Chemical Engineering. vol. 26, no.12, pp. 1783–1793, 2002. doi:
10.1016/s0098-1354(02)00153-9
4. Camilo T., Carret C., Silva J. S., Boavida F. An energy-efficient ant-based routing algorithm for wire-
less sensor networks, Ant Colony Optimization and Swarm Intelligence. Springer Berlin Heidelberg.
2006, pp. 49–59.
5. Horst R., Pardalos P. M., Eds. Handbook of Global Optimization. Vol. 2. Springer Science & Business Media. 2013.
6. Kavousi-Fard A., Samet H., Marzbani F. A new hybrid modified firefly algorithm and support vector
regression model for accurate short term load forecasting. Expert systems with applications. vol. 41,
no. 13, pp. 6047–6056, 2014. doi: 10.1016/j.eswa.2014.03.053
32. Price K., Storn R. M., Lampinen J.A. Differential evolution: a practical approach to global optimization.
Springer Science & Business Media. 2006.
33. Blum C., Roli A. Metaheuristics in combinatorial optimization: Overview and conceptual comparison.
ACM Computing Surveys (CSUR). vol. 35, no. 3, pp. 268–308, 2003. doi: 10.1145/937503.937505
34. Yang X. S. Cuckoo search and firefly algorithm: overview and analysis, Cuckoo Search and Firefly
Algorithm. Springer International Publishing. 2014, pp.1–26.
35. Brest J., Greiner S., Bošković B., Mernik M., Zumer V. Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems. IEEE Trans. Evolutionary Computation. vol. 10, no. 6, pp. 646–657, 2006. doi: 10.1109/tevc.2006.872133
36. Yao X., Liu Y., Lin G. Evolutionary programming made faster. IEEE Trans. Evolutionary Computation.
vol. 3, no. 2, pp. 82–102, 1999. doi: 10.1109/4235.771163
37. Brown C. T., Liebovitch L. S., Glendon R. Lévy flights in Dobe Ju/’hoansi foraging patterns. Human
Ecology. vol. 35 no. 1, pp. 129–138, 2007. doi: 10.1007/s10745-006-9083-4
38. Vesterstrom J., Thomsen R. A comparative study of differential evolution, particle swarm optimization, and evolutionary algorithms on numerical benchmark problems. In: Proceedings of the 2004 Congress on Evolutionary Computation (CEC2004). IEEE, 2004, vol. 2, pp. 1980–1987. doi: 10.1109/cec.2004.1331139
39. Shi Y., Eberhart R. C. Parameter selection in particle swarm optimization. Evolutionary programming
VII. Springer Berlin Heidelberg. 1998, pp. 591–600.
40. Yang X. S. Free lunch or no free lunch: that is not just a question? International Journal on Artificial
Intelligence Tools. vol. 21, no. 03, 2012. doi: 10.1142/s0218213012400106
41. Wolpert D. H., Macready W. G. No free lunch theorems for optimization. IEEE Trans. Evolutionary
Computation. vol. 1, no. 1, pp. 67–82, 1997. doi: 10.1109/4235.585893
42. Fister I., Yang X. S., Brest J., Fister I Jr. On the randomized firefly algorithm. Cuckoo Search and Fire-
fly Algorithm. Springer International Publishing. 2014: 27–48.