
2015 3rd International Symposium on Computational and Business Intelligence

Elephant Herding Optimization


Gai-Ge Wang*
School of Computer Science and Technology, Jiangsu Normal University, Xuzhou, China
E-mail: [email protected]

Suash Deb
Dept. of Computer Science & Engineering, Cambridge Institute of Technology, Ranchi, India
E-mail: [email protected]

Leandro dos S. Coelho
Industrial and Systems Engineering Graduate Program, Pontifical Catholic University of Parana, Curitiba, Brazil
E-mail: [email protected]

Abstract—In this paper, a new kind of swarm-based metaheuristic search method, called Elephant Herding Optimization (EHO), is proposed for solving optimization tasks. The EHO method is inspired by the herding behavior of elephant groups. In nature, elephants belonging to different clans live together under the leadership of a matriarch, while male elephants leave their family group when they grow up. These two behaviors can be modelled as the following two operators: the clan updating operator and the separating operator. In EHO, the elephants in each clan are updated using their current position and the matriarch through the clan updating operator. This is followed by the separating operator, which can enhance population diversity in the later search phase. To demonstrate its effectiveness, EHO is benchmarked on fifteen test cases against BBO, DE and GA. The results show that EHO finds better values on most benchmark problems than those three metaheuristic algorithms.

Keywords—Elephant herding optimization; benchmark functions; global optimization; swarm intelligence

I. INTRODUCTION

The ever-increasing complexity of real-world problems makes it extremely difficult for traditional methods to address them. On the other hand, though modern metaheuristic methods cannot provide exact answers, they can generate satisfactory solutions within a reasonable time span. Over the past few years, various kinds of metaheuristic algorithms have been proposed and successfully applied to myriads of real-world optimization problems. Among all metaheuristic methods, swarm-based algorithms [1] are among the most representative and widely used paradigms.

Swarm intelligence (SI) methods are one of the most well-known families of metaheuristics and have been widely used in various applications. Their inspiration originates from the collective behavior of animals. Two of the most widely used SI algorithms are particle swarm optimization (PSO) [2-9] and ant colony optimization (ACO) [10]. The idea of PSO [2] originated from the social behavior of bird flocking. Ants, in nature, are well capable of keeping past paths in mind through pheromone; inspired by this phenomenon, ACO [10, 11] was proposed by Dorigo et al. Recently, more effective swarm intelligence algorithms have been proposed, such as artificial bee colony (ABC) [12, 13], cuckoo search (CS) [14-20], bat algorithm (BA) [21-24], firefly algorithm (FA) [25-28], animal migration optimization (AMO) [29], ant lion optimizer (ALO) [30], big bang-big crunch algorithm (BB-BC) [31-34], charged system search (CSS) [35-37], chaotic swarming of particles (CSP) [38], monarch butterfly optimization (MBO) [39], krill herd (KH) [40-47], multi-verse optimizer (MVO) [48], dragonfly algorithm (DA) [49], grey wolf optimizer (GWO) [50, 51], and wolf search algorithm (WSA) [52], among others.

In general, wild elephants are social in nature, and an elephant group is composed of several clans. The elephants belonging to different clans live together under the leadership of a matriarch, while male elephants remain solitary and leave their family group as they grow up. Inspired by the herding behavior of elephant groups, a new kind of swarm-based heuristic search method, called EHO, is proposed for solving global optimization tasks. These habits of elephants can be used to solve optimization problems: the herding behavior of elephants in nature is idealized into a clan updating operator and a separating operator. In EHO, each elephant implements the clan updating operator to update its position based on its current position and the matriarch of the corresponding clan. Subsequently, the worst elephant is replaced through the separating operator. By comparison with BBO [53-58], DE [59-64] and GA [65], the performance of EHO is investigated through several experiments on fifteen test cases. The results show that EHO can find much fitter solutions on most benchmark problems than the three other methods.

Section II reviews the herding behavior of elephants in nature. Section III discusses how this herding behavior is used to formulate a general-purpose heuristic search. Several simulation results comparing EHO with the other methods on fifteen functions are presented in Section IV. Section V concludes this paper.

II. HERDING BEHAVIOR OF ELEPHANTS

Elephants are among the largest mammals on land. The African elephant and the Asian elephant are the two traditionally recognized species. The long trunk is their most distinctive feature and serves multiple purposes, such as breathing, lifting water and grasping objects.

In nature, elephants are social animals with complex social structures of females and calves. An elephant group is composed of several clans under the leadership of a matriarch, often the oldest cow [66]. A clan consists of one female with her calves, or of several related females. Females prefer to live in family groups, while male elephants tend to live in isolation and leave their family group when growing up. Though male elephants live away from their family group, they can stay in contact with elephants in their clan through low-frequency vibrations [66].

In this paper, the herding behavior of elephants is considered as two operators, which are subsequently idealized to form a general-purpose global optimization method.

978-1-4673-8501-5/15 $31.00 © 2015 IEEE


DOI 10.1109/ISCBI.2015.8
III. ELEPHANT HERDING OPTIMIZATION

In order to make the herding behaviour of elephants solve all kinds of global optimization problems, we simplify it into the following idealized rules.

1) The elephant population is composed of several clans, and each clan has a fixed number of elephants.
2) A fixed number of male elephants leave their family group and live solitarily far away from the main elephant group at each generation.
3) The elephants in each clan live together under the leadership of a matriarch.

A. Clan updating operator

As mentioned before, all the elephants in each clan live together under the leadership of a matriarch. Therefore, for each elephant in clan ci, its next position is influenced by matriarch ci. The elephant j in clan ci is updated as

x_{new,ci,j} = x_{ci,j} + \alpha \times (x_{best,ci} - x_{ci,j}) \times r    (1)

where x_{new,ci,j} and x_{ci,j} are the newly updated and the old position of elephant j in clan ci, respectively. \alpha \in [0,1] is a scale factor that determines the influence of matriarch ci on x_{ci,j}. x_{best,ci} represents matriarch ci, the fittest elephant individual in clan ci. r \in [0,1]; here, a uniform distribution is used.

The fittest elephant in each clan cannot be updated by Eq. (1), since in that case x_{ci,j} = x_{best,ci}. The fittest one is instead updated as

x_{new,ci,j} = \beta \times x_{center,ci}    (2)

where \beta \in [0,1] is a factor that determines the influence of x_{center,ci} on x_{new,ci,j}. The new individual x_{new,ci,j} in Eq. (2) is thus generated from the information contributed by all the elephants in clan ci. x_{center,ci} is the centre of clan ci, and for the d-th dimension it is calculated as

x_{center,ci,d} = \frac{1}{n_{ci}} \sum_{j=1}^{n_{ci}} x_{ci,j,d}    (3)

where 1 \le d \le D indicates the d-th dimension, D is the total number of dimensions, n_{ci} is the number of elephants in clan ci, and x_{ci,j,d} is the d-th dimension of the elephant individual x_{ci,j}. The centre of clan ci, x_{center,ci}, is obtained through D evaluations of Eq. (3).

Based on the description above, the clan updating operator can be formulated as shown in Figure 1.

for ci = 1 to nClan (for all clans in the elephant population) do
    for j = 1 to n_ci (for all elephants in clan ci) do
        Update x_{ci,j} and generate x_{new,ci,j} by Eq. (1).
        if x_{ci,j} = x_{best,ci} then
            Update x_{ci,j} and generate x_{new,ci,j} by Eq. (2).
        end if
    end for j
end for ci

Figure 1. Pseudo code of the clan updating operator

B. Separating operator

In an elephant group, male elephants leave their family group and live alone when they reach puberty. This separating process can be modelled as a separating operator when solving optimization problems. In order to further improve the search ability of the EHO method, we assume that the elephant individual with the worst fitness implements the separating operator at each generation, as shown in Eq. (4):

x_{worst,ci} = x_{min} + (x_{max} - x_{min} + 1) \times rand    (4)

where x_{max} and x_{min} are the upper and lower bounds of the position of an elephant individual, respectively, and x_{worst,ci} is the worst elephant individual in clan ci. rand \in [0,1] is drawn from a stochastic distribution; a uniform distribution over [0,1] is used in the current work.

Accordingly, the separating operator can be formed as shown in Figure 2.

for ci = 1 to nClan (for all clans in the elephant population) do
    Replace the worst elephant in clan ci by Eq. (4).
end for ci

Figure 2. Pseudo code of the separating operator

Based on the description of the clan updating operator and the separating operator, the EHO method is developed; its mainframe can be summarized as shown in Figure 3.

Step 1: Initialization. Set the generation counter t = 1; initialize the population; set the maximum generation MaxGen.
Step 2: While t < MaxGen do
    Sort all the elephants according to their fitness.
    Implement the clan updating operator as shown in Figure 1.
    Implement the separating operator as shown in Figure 2.
    Evaluate the population by the newly updated positions.
    t = t + 1.
Step 3: end while

Figure 3. Pseudo code of the EHO algorithm

IV. SIMULATION RESULTS

In this section, EHO is verified by benchmark evaluation in comparison with three methods (BBO [53], DE [59] and GA [65]) on fifteen test problems (see Table I). F01-F15 are basic benchmarks, while F16-F20 are rotated, shifted, and composition functions selected from IEEE CEC 2005. More information about these test problems can be found in [53, 67, 68]. The dimension of F01-F20 is set to fifteen in this work. In order to obtain fair results, all the implementations are conducted under the same conditions as shown in [69].

For the four methods, both the population size and the maximum number of generations are set to fifty. The parameters of EHO are set as: scale factor α = 0.5, β = 0.1, and number of clans nClan = 5. In the current work, all the clans have the same number of elephants, i.e., n_ci = 10. For BBO, DE and GA, the parameter settings can be found in [69, 70].

TABLE I. BENCHMARK FUNCTIONS

No.  Name             No.  Name           No.  Name
F01  Ackley           F06  Griewank       F11  Penalty #2
F02  Alpine           F07  Holzman 2      F12  Perm
F03  Brown            F08  Levy           F13  Powell
F04  Dixon & Price    F09  Pathological   F14  Quartic
F05  Fletcher-Powell  F10  Penalty #1     F15  Rastrigin

In general, all metaheuristic methods depend on some stochastic distribution; therefore, different runs generate different results. In this work, 100 independent runs are carried out in order to obtain representative statistical results (see Tables II-III). In the following tables, the fittest solution is highlighted in bold font.

In Table II, an entry such as "6.57±0.95" indicates that the mean and standard deviation (Std) of the function value are 6.57 and 0.95, respectively. For the mean and Std values shown in Table II, the EHO method gives the best performance on F01-F04, F06-F08, F10, F11, F13, and F15. BBO shows the best performance on F05, F09 and F14. For F12, DE has the fittest solutions, while GA has the smallest Std on this case.
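The operators of Figures 1-3 can be sketched in Python as follows. This is an illustrative implementation under stated assumptions, not the authors' code: the sphere objective and all identifiers are placeholders, and the parameter values follow the experimental settings above (α = 0.5, β = 0.1, 5 clans of 10 elephants, D = 15).

```python
import random

# Illustrative sketch of EHO (Eqs. 1-4, Figures 1-3). The objective and all
# names are placeholders, not the authors' code.
ALPHA, BETA = 0.5, 0.1        # scale factors from the experimental settings
N_CLAN, N_PER_CLAN = 5, 10    # 5 clans with 10 elephants each
DIM, X_MIN, X_MAX = 15, -5.12, 5.12

def fitness(x):               # stand-in objective (minimization)
    return sum(v * v for v in x)

def clan_updating(clan):
    """Figure 1: Eq. (1) for ordinary elephants, Eq. (2) for the matriarch."""
    best = min(clan, key=fitness)                                       # x_best,ci
    center = [sum(e[d] for e in clan) / len(clan) for d in range(DIM)]  # Eq. (3)
    updated = []
    for x in clan:
        if x is best:                                   # matriarch: Eq. (2)
            updated.append([BETA * c for c in center])
        else:                                           # ordinary elephant: Eq. (1)
            r = random.random()
            updated.append([x[d] + ALPHA * (best[d] - x[d]) * r
                            for d in range(DIM)])
    return updated

def separating(clan):
    """Figure 2: replace the worst elephant by Eq. (4)."""
    worst = max(range(len(clan)), key=lambda i: fitness(clan[i]))
    clan[worst] = [X_MIN + (X_MAX - X_MIN + 1) * random.random()
                   for _ in range(DIM)]
    return clan

def eho(max_gen=50):
    """Figure 3: clan updating followed by separating, each generation."""
    clans = [[[random.uniform(X_MIN, X_MAX) for _ in range(DIM)]
              for _ in range(N_PER_CLAN)] for _ in range(N_CLAN)]
    for _ in range(max_gen):
        clans = [separating(clan_updating(c)) for c in clans]
    return min((e for c in clans for e in c), key=fitness)
```

Note that Eq. (4) as printed can place the replacement slightly above x_max (up to x_max + 1 when rand = 1); the sketch reproduces the equation as given.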

TABLE II. MEAN FUNCTION VALUES OBTAINED BY FOUR METHODS


No.  BBO  DE  EHO  GA
F01 6.57±0.95 18.30±0.47 1.7E-3±1.9E-4 16.69±0.80
F02 1.58±0.78 14.74±1.70 2.5E-4±3.6E-5 24.02±4.27
F03 0.50±1.00 2.90±0.65 2.9E-6±2.2E-7 9.89±4.59
F04 8.7E5±7.8E5 3.5E6±1.5E6 0.89±0.05 1.3E7±9.7E6
F05 8.9E4±2.6E4 2.4E5±5.1E4 6.4E5±1.9E5 2.1E5±8.6E4
F06 8.94±2.75 22.33±4.17 1.00±9.9E-8 44.82±19.61
F07 327.50±317.19 1.9E3±960.30 2.8E-13±1.4E-13 1.3E4±7.7E3
F08 2.65±1.16 15.27±3.84 1.73±0.15 34.71±11.65
F09 5.28±0.46 3.67±0.64 4.90±0.52 5.74±0.58
F10 2.5E4±7.1E4 3.2E5±4.1E5 0.45±0.11 3.1E6±5.3E6
F11 3.3E5±9.7E5 2.7E6±1.7E6 1.89±0.17 1.4E7±1.7E7
F12 6.7E51±2.2E51 5.5E45±1.4E46 7.5E49±2.0E50 6.0E51±1.3E37
F13 153.28±85.38 1.1E3±349.80 5.3E-6±1.9E-6 820.66±472.24
F14 2.2E-16±0.00 0.77±0.31 2.3E-15±7.5E-16 26.88±17.04
F15 2.18±1.45 115.82±14.78 3.6E-5±9.1E-6 47.40±11.29
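The entries of Tables II-III summarize the 100 independent runs as mean ± standard deviation. A minimal sketch of producing such a table entry, using a placeholder random-search run rather than any of the four compared methods:

```python
import random
import statistics

def one_run(evals=1000, dim=15):
    """Placeholder stochastic run: pure random search on a sphere objective."""
    best = float("inf")
    for _ in range(evals):
        x = [random.uniform(-5.12, 5.12) for _ in range(dim)]
        best = min(best, sum(v * v for v in x))
    return best

results = [one_run() for _ in range(100)]   # 100 independent runs
mean = statistics.mean(results)
std = statistics.stdev(results)             # sample standard deviation
print(f"{mean:.2f}±{std:.2f}")              # table-entry style: mean±Std
```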

TABLE III. BEST FUNCTION VALUES OBTAINED BY FOUR METHODS

No.  BBO  DE  EHO  GA
F01 4.08 15.82 1.3E-3 13.76
F02 0.37 9.97 1.8E-4 12.69
F03 2.2E-16 1.63 2.3E-6 2.2E-16
F04 4.8E4 8.7E5 0.74 9.3E5
F05 2.6E4 1.3E5 2.5E5 8.5E4
F06 3.52 13.09 1.00 10.27
F07 60.00 475.88 7.6E-14 1.6E3
F08 0.40 7.17 1.27 12.41
F09 4.34 1.46 3.20 4.07
F10 3.26 32.71 0.23 13.90
F11 118.50 2.6E5 1.33 2.2E5
F12 6.0E51 5.8E37 4.0E45 6.0E51
F13 23.00 334.18 8.8E-7 191.00
F14 2.2E-16 0.18 8.4E-16 2.2E-16
F15 2.2E-16 79.00 1.8E-5 9.00

For the best solutions shown in Table III, EHO has strong search ability and finds the fittest solution on F01, F02, F04, F06, F07, F10, F11, F13, and F16-F18. BBO and DE have the best solutions on F05, F08, F15, F20 and on F09, F12, F19, respectively. It should be noted that, for the F03 and F14 cases, BBO and GA coincidentally find the same final function values.

Moreover, the convergence histories of the four methods on the most representative benchmarks are given in Figures 4-5. Figure 4 shows the convergence history for the F02 Alpine function. For this case, though BBO is able to find a final solution only slightly worse than EHO's, EHO converges sharply towards the optimum within five generations. Figure 5 shows the convergence history for the F03 Brown function. It is clear that EHO performs better than BBO, DE and GA throughout the whole optimization process, while BBO, DE and GA show similar performance.

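The convergence curves discussed above plot the best fitness found so far against the generation count; recording such a history amounts to logging the best-so-far value once per generation. A generic, self-contained sketch, with a placeholder random-search step standing in for any method's generation update:

```python
import random

def track_convergence(max_gen=50, pop=50, dim=15):
    """Return the best-so-far fitness after each generation."""
    objective = lambda x: sum(v * v for v in x)   # placeholder objective
    best, history = float("inf"), []
    for _ in range(max_gen):
        for _ in range(pop):                      # placeholder generation step
            x = [random.uniform(-5.12, 5.12) for _ in range(dim)]
            best = min(best, objective(x))
        history.append(best)                      # one point per generation
    return history

history = track_convergence()
# By construction the curve is non-increasing, like the plots in Figures 4-5.
```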
Figure 4. Convergent curves of the F02 Alpine function

Figure 5. Convergent curves of the F03 Brown function

V. CONCLUSION

In this paper, the behavior of elephant herding is idealized into a clan updating operator and a separating operator. By modelling the behavior of elephant herding in nature, a new kind of swarm-based heuristic search method, called EHO, is proposed for solving global optimization tasks. In the early phase of each EHO generation, each elephant in a clan is updated using clan information through the clan updating operator. Then the worst elephant is replaced by a randomly generated elephant individual through the separating operator. In comparison with BBO, DE and GA, EHO is benchmarked on fifteen test cases. EHO can find much better solutions on most benchmark problems than the three other algorithms.

ACKNOWLEDGMENT

This work was supported by the Jiangsu Province Science Foundation for Youths (No. BK20150239) and the National Natural Science Foundation of China (No. 61503165).

REFERENCES

[1] Z. Cui, and X. Gao, "Theory and applications of swarm intelligence," Neural Computing & Applications, vol. 21, no. 2, pp. 205-206, 2012.
[2] J. Kennedy, and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 1995, pp. 1942-1948.
[3] G. Ram, D. Mandal, R. Kar, and S. P. Ghoshal, "Optimal design of non-uniform circular antenna arrays using PSO with wavelet mutation," International Journal of Bio-Inspired Computation, vol. 6, no. 6, pp. 424-433, 2014.
[4] S. Mirjalili, G.-G. Wang, and L. d. S. Coelho, "Binary optimization using hybrid particle swarm optimization and gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 6, pp. 1423-1435, 2014.
[5] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and S. Deb, "A hybrid method based on krill herd and quantum-behaved particle swarm optimization," Neural Computing and Applications, 2015.
[6] G.-G. Wang, A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "A novel improved accelerated particle swarm optimization algorithm for global numerical optimization," Engineering Computations, vol. 31, no. 7, pp. 1198-1220, 2014.
[7] X. Zhao, B. Song, P. Huang, Z. Wen, J. Weng, and Y. Fan, "An improved discrete immune optimization algorithm based on PSO for QoS-driven web service composition," Applied Soft Computing, vol. 12, no. 8, pp. 2208-2216, 2012.
[8] X. Zhao, "A perturbed particle swarm algorithm for numerical optimization," Applied Soft Computing, vol. 10, no. 1, pp. 119-124, 2010.
[9] S. Mirjalili, and A. Lewis, "S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization," Swarm and Evolutionary Computation, vol. 9, pp. 1-14, 2013.
[10] M. Dorigo, V. Maniezzo, and A. Colorni, "Ant system: optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 26, no. 1, pp. 29-41, 1996.
[11] K. Krynicki, J. Jaen, and J. A. Mocholí, "Ant colony optimisation for resource searching in dynamic peer-to-peer grids," International Journal of Bio-Inspired Computation, vol. 6, no. 3, pp. 153-165, 2014.
[12] D. Karaboga, and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459-471, 2007.
[13] X. Li, and M. Yin, "Self-adaptive constrained artificial bee colony for constrained numerical optimization," Neural Computing and Applications, vol. 24, no. 3-4, pp. 723-734, 2012.
[14] X. S. Yang, and S. Deb, "Cuckoo search via Lévy flights." pp. 210-214.
[15] X.-S. Yang, and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2013.
[16] X. Li, J. Wang, and M. Yin, "Enhancing the performance of cuckoo search algorithm using orthogonal learning method," Neural Computing and Applications, vol. 24, no. 6, pp. 1233-1247, 2013.
[17] G.-G. Wang, A. H. Gandomi, X. Zhao, and H. C. E. Chu, "Hybridizing harmony search algorithm with cuckoo search for global numerical optimization," Soft Computing, 2014.
[18] G.-G. Wang, S. Deb, A. H. Gandomi, Z. Zhang, and A. H. Alavi, "Chaotic cuckoo search," Soft Computing, 2015.
[19] G.-G. Wang, A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "A new hybrid method based on krill herd and cuckoo search for global optimization tasks," International Journal of Bio-Inspired Computation, 2012.
[20] X. Li, and M. Yin, "Modified cuckoo search algorithm with self adaptive parameter method," Information Sciences, vol. 298, pp. 80-97, 2015.
[21] X. S. Yang, Nature-inspired metaheuristic algorithms, 2nd ed., Luniver Press, Frome, 2010.
[22] S. Mirjalili, S. M. Mirjalili, and X.-S. Yang, "Binary bat algorithm," Neural Computing and Applications, vol. 25, no. 3-4, pp. 663-681, 2013.
[23] J.-W. Zhang, and G.-G. Wang, "Image matching using a bat algorithm with mutation," Applied Mechanics and Materials, vol. 203, no. 1, pp. 88-93, 2012.
[24] G.-G. Wang, B. Chang, and Z. Zhang, "A multi-swarm bat algorithm for global optimization." pp. 480-485.
[25] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, "Mixed variable structural optimization using firefly algorithm," Computers & Structures, vol. 89, no. 23-24, pp. 2325-2336, 2011.
[26] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78-84, 2010.
[27] G.-G. Wang, L. Guo, H. Duan, and H. Wang, "A new improved firefly algorithm for global numerical optimization," Journal of Computational and Theoretical Nanoscience, vol. 11, no. 2, pp. 477-485, 2014.
[28] L. Guo, G.-G. Wang, H. Wang, and D. Wang, "An effective hybrid firefly algorithm with harmony search for global numerical optimization," The Scientific World Journal, vol. 2013, pp. 1-10, 2013.
[29] X. Li, J. Zhang, and M. Yin, "Animal migration optimization: an optimization algorithm inspired by animal migration behavior," Neural Computing and Applications, vol. 24, no. 7-8, pp. 1867-1877, 2014.
[30] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[31] O. K. Erol, and I. Eksin, "A new optimization method: Big Bang-Big Crunch," Advances in Engineering Software, vol. 37, no. 2, pp. 106-111, 2006.
[32] A. Kaveh, and S. Talatahari, "Optimal design of Schwedler and ribbed domes via hybrid Big Bang-Big Crunch algorithm," Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412-419, 2010.
[33] A. Kaveh, and S. Talatahari, "A discrete big bang-big crunch algorithm for optimal design of skeletal structures," Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103-122, 2010.
[34] A. Kaveh, and S. Talatahari, "Size optimization of space trusses using Big Bang–Big Crunch algorithm," Computers & Structures, vol. 87, no. 17-18, pp. 1129-1140, 2009.
[35] A. Kaveh, and S. Talatahari, "A novel heuristic optimization method: charged system search," Acta Mechanica, vol. 213, no. 3-4, pp. 267-289, 2010.
[36] S. Talatahari, and R. Sheikholeslami, "Optimum design of gravity and reinforced retaining walls using enhanced charged system search algorithm," KSCE Journal of Civil Engineering, vol. 18, no. 5, pp. 1464-1469, 2014.
[37] A. Kaveh, and S. Talatahari, "Charged system search for optimal design of frame structures," Applied Soft Computing, vol. 12, no. 1, pp. 382-393, 2012.
[38] A. Kaveh, R. Sheikholeslami, S. Talatahari, and M. Keshvari-Ilkhichi, "Chaotic swarming of particles: a new method for size optimization of truss structures," Advances in Engineering Software, vol. 67, pp. 136-147, 2014.
[39] G.-G. Wang, S. Deb, and Z. Cui, "Monarch butterfly optimization," Neural Computing and Applications, 2015.
[40] A. H. Gandomi, and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831-4845, 2012.
[41] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "A chaotic particle-swarm krill herd algorithm for global numerical optimization," Kybernetes, vol. 42, no. 6, pp. 962-978, 2013.
[42] A. H. Gandomi, S. Talatahari, F. Tadbiri, and A. H. Alavi, "Krill herd algorithm for optimum design of truss structures," International Journal of Bio-Inspired Computation, vol. 5, no. 5, pp. 281-288, 2013.
[43] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and G.-S. Hao, "Hybrid krill herd algorithm with differential evolution for global numerical optimization," Neural Computing and Applications, vol. 25, no. 2, pp. 297-308, 2014.
[44] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "An effective krill herd algorithm with migration operator in biogeography-based optimization," Applied Mathematical Modelling, vol. 38, no. 9-10, pp. 2454-2462, 2014.
[45] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, "Stud krill herd algorithm," Neurocomputing, vol. 128, pp. 363-370, 2014.
[46] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and S. Deb, "A multi-stage krill herd algorithm for global numerical optimization," International Journal on Artificial Intelligence Tools, 2015.
[47] L. Guo, G.-G. Wang, A. H. Gandomi, A. H. Alavi, and H. Duan, "A new improved krill herd algorithm for global numerical optimization," Neurocomputing, vol. 138, pp. 392-402, 2014.
[48] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.
[49] S. Mirjalili, "Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems," Neural Computing and Applications, 2015.
[50] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[51] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2014.
[52] S. Fong, S. Deb, and X.-S. Yang, "A heuristic optimization method inspired by wolf preying behavior," Neural Computing and Applications, 2015.
[53] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.
[54] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Let a biogeography-based optimizer train your Multi-Layer Perceptron," Information Sciences, vol. 269, pp. 188-209, 2014.
[55] X. Li, and M. Yin, "Multi-operator based biogeography based optimization with mutation for global numerical optimization," Computers & Mathematics with Applications, vol. 64, no. 9, pp. 2833-2844, 2012.
[56] S. Saremi, S. Mirjalili, and A. Lewis, "Biogeography-based optimisation with chaos," Neural Computing and Applications, vol. 25, no. 5, pp. 1077-1097, 2014.
[57] X.-T. Li, and M.-H. Yin, "Parameter estimation for chaotic systems using the cuckoo search algorithm with an orthogonal learning method," Chinese Physics B, vol. 21, no. 5, pp. 050507, 2012.
[58] X. Li, and M. Yin, "Multiobjective binary biogeography based optimization for feature selection using gene expression data," IEEE Transactions on NanoBioscience, vol. 12, no. 4, pp. 343-353, 2013.
[59] R. Storn, and K. Price, "Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[60] D. Zou, H. Liu, L. Gao, and S. Li, "An improved differential evolution algorithm for the task assignment problem," Engineering Applications of Artificial Intelligence, vol. 24, no. 4, pp. 616-624, 2011.
[61] D. Zou, J. Wu, L. Gao, and S. Li, "A modified differential evolution algorithm for unconstrained optimization problems," Neurocomputing, vol. 120, pp. 469-481, 2013.
[62] D.-x. Zou, L.-q. Gao, and S. Li, "Volterra filter modeling of a nonlinear discrete-time system based on a ranked differential evolution algorithm," Journal of Zhejiang University SCIENCE C, vol. 15, no. 8, pp. 687-696, 2014.
[63] X. Li, and M. Yin, "An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure," Advances in Engineering Software, vol. 55, pp. 10-31, 2013.
[64] X. Li, and M. Yin, "Modified differential evolution with self-adaptive parameters method," Journal of Combinatorial Optimization, 2014.
[65] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, 1998.
[66] R. Sukumar, The Asian Elephant: Ecology and Management, Cambridge University Press, New York, 1993.
[67] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, MA, 2013.
[68] P. Suganthan, N. Hansen, J. Liang, K. Deb, Y. Chen, A. Auger, and S. Tiwari, Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization, Nanyang Technological University, Singapore, 2005.
[69] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, "Incorporating mutation scheme into krill herd algorithm for global numerical optimization," Neural Computing and Applications, vol. 24, no. 3-4, pp. 853-871, 2014.
[70] G.-G. Wang, L. Guo, A. H. Gandomi, G.-S. Hao, and H. Wang, "Chaotic krill herd algorithm," Information Sciences, vol. 274, pp. 17-34, 2014.
