Elephant Herding Optimization
Abstract—In this paper, a new kind of swarm-based metaheuristic search method, called Elephant Herding Optimization (EHO), is proposed for solving optimization tasks. The EHO method is inspired by the herding behavior of elephant groups. In nature, elephants belonging to different clans live together under the leadership of a matriarch, while male elephants leave their family group when they grow up. These two behaviors are modelled as two operators: the clan updating operator and the separating operator. In EHO, the elephants in each clan are updated through the clan updating operator, based on their current positions and the matriarch. The separating operator is then applied, which enhances population diversity in the later search phase. To demonstrate its effectiveness, EHO is benchmarked on fifteen test cases against BBO, DE and GA. The results show that EHO finds better values on most benchmark problems than those three metaheuristic algorithms.

Keywords- Elephant herding optimization; Benchmark functions; Global optimization; Swarm intelligence

I. INTRODUCTION

The ever-increasing complexity of real-world problems makes it extremely difficult for traditional methods to address them. On the other hand, though modern metaheuristic methods cannot provide exact answers, they can generate satisfactory solutions within a reasonable time span. Over the past few years, various kinds of metaheuristic algorithms have been proposed and successfully applied to solve myriads of real-world optimization problems. Among all metaheuristic methods, swarm-based algorithms [1] form one of the most representative and widely used paradigms. Swarm intelligence (SI) methods are among the best-known metaheuristic paradigms and have been widely used in various applications. Their inspiration originates from the collective behavior of animals. Two of the most widely used SI algorithms are particle swarm optimization (PSO) [2-9] and ant colony optimization (ACO) [10]. The idea of PSO [2] originated from the social behavior of bird flocking. Ants, in nature, are well capable of keeping past paths in mind by means of pheromone. Inspired by this phenomenon, ACO [10, 11] was proposed by Dorigo et al. Recently, more effective swarm intelligence algorithms have been proposed, such as artificial bee colony (ABC) [12, 13], cuckoo search (CS) [14-20], bat algorithm (BA) [21-24], firefly algorithm (FA) [25-28], animal migration optimization (AMO) [29], ant lion optimizer (ALO) [30], big bang-big crunch algorithm (BB-BC) [31-34], charged system search (CSS) [35-37], chaotic swarming of particles (CSP) [38], monarch butterfly optimization (MBO) [39], krill herd (KH) [40-47], multi-verse optimizer (MVO) [48], dragonfly algorithm (DA) [49], grey wolf optimizer (GWO) [50, 51], and wolf search algorithm (WSA) [52], among others.

In general, elephants are social in nature, and an elephant group is composed of several clans. The elephants belonging to different clans live together under the leadership of a matriarch, while male elephants remain solitary and leave their family group as they grow up. Inspired by this herding behavior, a new kind of swarm-based heuristic search method, called EHO, is proposed for solving global optimization tasks. The herding behavior of elephants in nature is idealized into a clan updating operator and a separating operator. In EHO, each elephant applies the clan updating operator to update its position, based on its current position and the matriarch of the corresponding clan. Subsequently, the worst elephant is replaced through the separating operator. By comparing with BBO [53-58], DE [59-64] and GA [65], the performance of EHO is investigated in several experiments on fifteen test cases. The results show that EHO finds much fitter solutions on most benchmark problems than the three other methods.

Section 2 reviews the herding behavior of elephants in nature. Section 3 discusses how this herding behavior is used to formulate a general-purpose heuristic search. Several simulation results comparing EHO with other methods on fifteen functions are presented in Section 4. Section 5 concludes this paper.

II. HERDING BEHAVIOR OF ELEPHANTS

Elephants are among the largest mammals on land. The African elephant and the Asian elephant are the two traditionally recognized species. A long, multipurpose trunk is their most representative feature, used for breathing, lifting water and grasping objects.

In nature, elephants are social animals with complex social structures of females and calves. An elephant group is composed of several clans under the leadership of a matriarch, often the oldest cow [66]. A clan consists of one female with her calves, or of a number of related females. Females prefer to live in family groups, while male elephants tend to live in isolation and leave their family group when growing up. Though male elephants live away from their family group, they can stay in contact with elephants in their clan through low-frequency vibrations [66].

In this paper, the herding behavior of elephants is modelled as two operators, which are subsequently idealized to form a general-purpose global optimization method.
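The clan updating and separating operators described above can be sketched in code. Since the formal update equations (Section 3) are not reproduced in this excerpt, the rules below — followers moving toward the clan matriarch, the matriarch derived from the clan centre, and the worst elephant re-initialized uniformly at random — are illustrative assumptions rather than the paper's exact formulas; the parameter names `alpha` and `beta` and all default values are likewise placeholders (only the clan size of 10 follows the experimental setup).

```python
import random

def eho_minimize(f, dim, n_clans=5, clan_size=10, generations=50,
                 alpha=0.5, beta=0.1, lo=-5.0, hi=5.0, seed=0):
    """Hedged sketch of EHO: clan updating + separating operators."""
    rng = random.Random(seed)

    def new_pos():
        return [rng.uniform(lo, hi) for _ in range(dim)]

    clans = [[new_pos() for _ in range(clan_size)] for _ in range(n_clans)]
    for _ in range(generations):
        for clan in clans:
            clan.sort(key=f)                       # clan[0] = matriarch (fittest)
            matriarch = clan[0][:]
            centre = [sum(x[d] for x in clan) / len(clan) for d in range(dim)]
            # Clan updating operator: move each follower toward the matriarch.
            for i in range(1, len(clan)):
                clan[i] = [x + alpha * (m - x) * rng.random()
                           for x, m in zip(clan[i], matriarch)]
            # Matriarch update from the clan centre (assumed rule).
            clan[0] = [beta * c for c in centre]
            # Separating operator: replace the worst elephant at random.
            clan.sort(key=f)
            clan[-1] = new_pos()
    # Return the fittest elephant over all clans.
    return min((x for clan in clans for x in clan), key=f)

# Usage: minimize the sphere function in 10 dimensions.
best = eho_minimize(lambda x: sum(v * v for v in x), dim=10)
```

The separating operator's random re-initialization is what maintains population diversity in the later search phase, as claimed in the abstract.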
elephants, i.e., nci = 10. For BBO, DE and GA, the parameter settings can be found in [69, 70].

TABLE I. BENCHMARK FUNCTIONS

No.  Name             No.  Name          No.  Name
F01  Ackley           F06  Griewank      F11  Penalty #2
F02  Alpine           F07  Holzman 2     F12  Perm
F03  Brown            F08  Levy          F13  Powell
F04  Dixon & Price    F09  Pathological  F14  Quartic
F05  Fletcher-Powell  F10  Penalty #1    F15  Rastrigin

In general, all metaheuristic methods depend on certain stochastic distributions; therefore, different runs generate different results. In this work, 100 independent runs are carried out in order to obtain representative statistical results (see Tables II-III). In the following tables, the fittest solution is highlighted in bold font.

In Table II, "6.57±0.95" indicates that the mean and standard deviation (Std) of the function value are 6.57 and 0.95, respectively. For the mean and Std values shown in Table II, the EHO method performs best on F01-F04, F06-F08, F10, F11, F13, and F15. BBO performs best on F05, F09 and F14. For F12, DE has the fittest solutions, while GA has the smallest Std on this case.
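The "mean±Std over 100 independent runs" protocol above can be reproduced with a small harness. The random-search "optimizer" below is only a stand-in for EHO, BBO, DE or GA, and the run and evaluation counts are illustrative, not the paper's settings.

```python
import math
import random

def random_search(f, dim, evals, rng):
    """Placeholder optimizer: best of `evals` uniform random samples."""
    return min(f([rng.uniform(-5.0, 5.0) for _ in range(dim)])
               for _ in range(evals))

def mean_std(f, dim, runs=100, evals=200):
    # Independent runs use independent seeds, as in the experimental setup.
    bests = [random_search(f, dim, evals, random.Random(run))
             for run in range(runs)]
    mean = sum(bests) / runs
    std = math.sqrt(sum((b - mean) ** 2 for b in bests) / runs)
    return mean, std

sphere = lambda x: sum(v * v for v in x)
m, s = mean_std(sphere, dim=5)
print(f"{m:.2f}±{s:.2f}")  # a "mean±Std" summary in the style of Table II
```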
TABLE III. BEST FUNCTION VALUES OBTAINED BY FOUR METHODS

      BBO      DE       EHO      GA
F01   4.08     15.82    1.3E-3   13.76
F02   0.37     9.97     1.8E-4   12.69
F03   2.2E-16  1.63     2.3E-6   2.2E-16
F04   4.8E4    8.7E5    0.74     9.3E5
F05   2.6E4    1.3E5    2.5E5    8.5E4
F06   3.52     13.09    1.00     10.27
F07   60.00    475.88   7.6E-14  1.6E3
F08   0.40     7.17     1.27     12.41
F09   4.34     1.46     3.20     4.07
F10   3.26     32.71    0.23     13.90
F11   118.50   2.6E5    1.33     2.2E5
F12   6.0E51   5.8E37   4.0E45   6.0E51
F13   23.00    334.18   8.8E-7   191.00
F14   2.2E-16  0.18     8.4E-16  2.2E-16
F15   2.2E-16  79.00    1.8E-5   9.00

For the best solutions shown in Table III, EHO has strong search ability and finds the fittest solution on F01, F02, F04, F06, F07, F10, F11, F13, and F16-F18. BBO and DE have the best solutions on F05, F08, F15, F20 and on F09, F12, F19, respectively. It should be noted that, for the F03 and F14 cases, BBO and GA coincidentally find the same final function values.

Moreover, the convergence behavior of the four methods on the most representative benchmarks is shown in Figures 4-5. Figure 4 shows the convergence history of the F02 Alpine function. For this case, though BBO is able to find a final solution only a little worse than EHO's, EHO converges sharply toward the optimum within five generations. Figure 5 shows the convergence history of the F03 Brown function. It is clear that EHO performs better than BBO, DE and GA during the whole optimization process, while BBO, DE and GA have similar performance.
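Convergence histories like those in Figures 4-5 record the best-so-far objective value at each generation. A minimal tracker is sketched below; the per-generation step is a placeholder hill-climbing perturbation, not the EHO update.

```python
import random

def track_convergence(f, x0, generations=50, seed=1):
    """Record the best-so-far value per generation for a convergence curve."""
    rng = random.Random(seed)
    x, best = list(x0), f(x0)
    history = [best]
    for _ in range(generations):
        cand = [v + rng.gauss(0.0, 0.5) for v in x]  # placeholder step
        if f(cand) < f(x):
            x = cand
        best = min(best, f(x))
        history.append(best)  # best-so-far: curve is non-increasing
    return history

sphere = lambda x: sum(v * v for v in x)
curve = track_convergence(sphere, [3.0, -2.0, 1.5])
```

Plotting `curve` against the generation index reproduces the kind of convergence curve shown in Figures 4-5.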
FIGURE 4. CONVERGENT CURVES OF THE F02 ALPINE FUNCTION

FIGURE 5. CONVERGENT CURVES OF THE F03 BROWN FUNCTION

V. CONCLUSION

In this paper, the herding behavior of elephants is idealized into a clan updating operator and a separating operator. By modelling this behavior, a new kind of swarm-based heuristic search method, called EHO, is proposed for solving global optimization tasks. In the early phase of each generation of EHO, every elephant in a clan is updated using clan information through the clan updating operator. Then, the worst elephant is replaced by a randomly generated individual through the separating operator. EHO is benchmarked on fifteen test cases against BBO, DE and GA, and finds much better solutions on most benchmark problems than the three other algorithms.

ACKNOWLEDGMENT

This work was supported by the Jiangsu Province Science Foundation for Youths (No. BK20150239) and the National Natural Science Foundation of China (No. 61503165).

REFERENCES

[1] Z. Cui, and X. Gao, “Theory and applications of swarm intelligence,” Neural Computing & Applications, vol. 21, no. 2, pp. 205-206, 2012.
[2] J. Kennedy, and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 1995, pp. 1942-1948.
[3] G. Ram, D. Mandal, R. Kar, and S. P. Ghoshal, “Optimal design of non-uniform circular antenna arrays using PSO with wavelet mutation,” International Journal of Bio-Inspired Computation, vol. 6, no. 6, pp. 424-433, 2014.
[4] S. Mirjalili, G.-G. Wang, and L. d. S. Coelho, “Binary optimization using hybrid particle swarm optimization and gravitational search algorithm,” Neural Computing and Applications, vol. 25, no. 6, pp. 1423-1435, 2014.
[5] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and S. Deb, “A hybrid method based on krill herd and quantum-behaved particle swarm optimization,” Neural Computing and Applications, 2015.
[6] G.-G. Wang, A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “A novel improved accelerated particle swarm optimization algorithm for global numerical optimization,” Engineering Computations, vol. 31, no. 7, pp. 1198-1220, 2014.
[7] X. Zhao, B. Song, P. Huang, Z. Wen, J. Weng, and Y. Fan, “An improved discrete immune optimization algorithm based on PSO for QoS-driven web service composition,” Applied Soft Computing, vol. 12, no. 8, pp. 2208-2216, 2012.
[8] X. Zhao, “A perturbed particle swarm algorithm for numerical optimization,” Applied Soft Computing, vol. 10, no. 1, pp. 119-124, 2010.
[9] S. Mirjalili, and A. Lewis, “S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization,” Swarm and Evolutionary Computation, vol. 9, pp. 1-14, 2013.
[10] M. Dorigo, V. Maniezzo, and A. Colorni, “Ant system: optimization by a colony of cooperating agents,” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 26, no. 1, pp. 29-41, 1996.
[11] K. Krynicki, J. Jaen, and J. A. Mocholí, “Ant colony optimisation for resource searching in dynamic peer-to-peer grids,” International Journal of Bio-Inspired Computation, vol. 6, no. 3, pp. 153-165, 2014.
[12] D. Karaboga, and B. Basturk, “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm,” Journal of Global Optimization, vol. 39, no. 3, pp. 459-471, 2007.
[13] X. Li, and M. Yin, “Self-adaptive constrained artificial bee colony for constrained numerical optimization,” Neural Computing and Applications, vol. 24, no. 3-4, pp. 723-734, 2012.
[14] X. S. Yang, and S. Deb, “Cuckoo search via Lévy flights.” pp. 210-214.
[15] X.-S. Yang, and S. Deb, “Cuckoo search: recent advances and applications,” Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2013.
[16] X. Li, J. Wang, and M. Yin, “Enhancing the performance of cuckoo search algorithm using orthogonal learning method,” Neural Computing and Applications, vol. 24, no. 6, pp. 1233-1247, 2013.
[17] G.-G. Wang, A. H. Gandomi, X. Zhao, and H. C. E. Chu, “Hybridizing harmony search algorithm with cuckoo search for global numerical optimization,” Soft Computing, 2014.
[18] G.-G. Wang, S. Deb, A. H. Gandomi, Z. Zhang, and A. H. Alavi, “Chaotic cuckoo search,” Soft Computing, 2015.
[19] G.-G. Wang, A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “A new hybrid method based on krill herd and cuckoo search for global optimization tasks,” International Journal of Bio-Inspired Computation, 2012.
[20] X. Li, and M. Yin, “Modified cuckoo search algorithm with self adaptive parameter method,” Information Sciences, vol. 298, pp. 80-97, 2015.
[21] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, 2nd ed., Luniver Press, Frome, 2010.
[22] S. Mirjalili, S. M. Mirjalili, and X.-S. Yang, “Binary bat algorithm,” Neural Computing and Applications, vol. 25, no. 3-4, pp. 663-681, 2013.
[23] J.-W. Zhang, and G.-G. Wang, “Image matching using a bat algorithm with mutation,” Applied Mechanics and Materials, vol. 203, no. 1, pp. 88-93, 2012.
[24] G.-G. Wang, B. Chang, and Z. Zhang, “A multi-swarm bat algorithm for global optimization.” pp. 480-485.
[25] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “Mixed variable structural optimization using firefly algorithm,” Computers & Structures, vol. 89, no. 23-24, pp. 2325-2336, 2011.
[26] X. S. Yang, “Firefly algorithm, stochastic test functions and design optimisation,” International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78-84, 2010.
[27] G.-G. Wang, L. Guo, H. Duan, and H. Wang, “A new improved firefly algorithm for global numerical optimization,” Journal of Computational and Theoretical Nanoscience, vol. 11, no. 2, pp. 477-485, 2014.
[28] L. Guo, G.-G. Wang, H. Wang, and D. Wang, “An effective hybrid firefly algorithm with harmony search for global numerical optimization,” The Scientific World Journal, vol. 2013, pp. 1-10, 2013.
[29] X. Li, J. Zhang, and M. Yin, “Animal migration optimization: an optimization algorithm inspired by animal migration behavior,” Neural Computing and Applications, vol. 24, no. 7-8, pp. 1867-1877, 2014.
[30] S. Mirjalili, “The ant lion optimizer,” Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[31] O. K. Erol, and I. Eksin, “A new optimization method: Big Bang-Big Crunch,” Advances in Engineering Software, vol. 37, no. 2, pp. 106-111, 2006.
[32] A. Kaveh, and S. Talatahari, “Optimal design of Schwedler and ribbed domes via hybrid Big Bang-Big Crunch algorithm,” Journal of Constructional Steel Research, vol. 66, no. 3, pp. 412-419, 2010.
[33] A. Kaveh, and S. Talatahari, “A discrete big bang-big crunch algorithm for optimal design of skeletal structures,” Asian Journal of Civil Engineering, vol. 11, no. 1, pp. 103-122, 2010.
[34] A. Kaveh, and S. Talatahari, “Size optimization of space trusses using Big Bang-Big Crunch algorithm,” Computers & Structures, vol. 87, no. 17-18, pp. 1129-1140, 2009.
[35] A. Kaveh, and S. Talatahari, “A novel heuristic optimization method: charged system search,” Acta Mechanica, vol. 213, no. 3-4, pp. 267-289, 2010.
[36] S. Talatahari, and R. Sheikholeslami, “Optimum design of gravity and reinforced retaining walls using enhanced charged system search algorithm,” KSCE Journal of Civil Engineering, vol. 18, no. 5, pp. 1464-1469, 2014.
[37] A. Kaveh, and S. Talatahari, “Charged system search for optimal design of frame structures,” Applied Soft Computing, vol. 12, no. 1, pp. 382-393, 2012.
[38] A. Kaveh, R. Sheikholeslami, S. Talatahari, and M. Keshvari-Ilkhichi, “Chaotic swarming of particles: a new method for size optimization of truss structures,” Advances in Engineering Software, vol. 67, pp. 136-147, 2014.
[39] G.-G. Wang, S. Deb, and Z. Cui, “Monarch butterfly optimization,” Neural Computing and Applications, 2015.
[40] A. H. Gandomi, and A. H. Alavi, “Krill herd: a new bio-inspired optimization algorithm,” Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831-4845, 2012.
[41] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, “A chaotic particle-swarm krill herd algorithm for global numerical optimization,” Kybernetes, vol. 42, no. 6, pp. 962-978, 2013.
[42] A. H. Gandomi, S. Talatahari, F. Tadbiri, and A. H. Alavi, “Krill herd algorithm for optimum design of truss structures,” International Journal of Bio-Inspired Computation, vol. 5, no. 5, pp. 281-288, 2013.
[43] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and G.-S. Hao, “Hybrid krill herd algorithm with differential evolution for global numerical optimization,” Neural Computing and Applications, vol. 25, no. 2, pp. 297-308, 2014.
[44] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, “An effective krill herd algorithm with migration operator in biogeography-based optimization,” Applied Mathematical Modelling, vol. 38, no. 9-10, pp. 2454-2462, 2014.
[45] G.-G. Wang, A. H. Gandomi, and A. H. Alavi, “Stud krill herd algorithm,” Neurocomputing, vol. 128, pp. 363-370, 2014.
[46] G.-G. Wang, A. H. Gandomi, A. H. Alavi, and S. Deb, “A multi-stage krill herd algorithm for global numerical optimization,” International Journal on Artificial Intelligence Tools, 2015.
[47] L. Guo, G.-G. Wang, A. H. Gandomi, A. H. Alavi, and H. Duan, “A new improved krill herd algorithm for global numerical optimization,” Neurocomputing, vol. 138, pp. 392-402, 2014.
[48] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, “Multi-verse optimizer: a nature-inspired algorithm for global optimization,” Neural Computing and Applications, 2015.
[49] S. Mirjalili, “Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems,” Neural Computing and Applications, 2015.
[50] S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Grey wolf optimizer,” Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[51] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, “Evolutionary population dynamics and grey wolf optimizer,” Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2014.
[52] S. Fong, S. Deb, and X.-S. Yang, “A heuristic optimization method inspired by wolf preying behavior,” Neural Computing and Applications, 2015.
[53] D. Simon, “Biogeography-based optimization,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702-713, 2008.
[54] S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Let a biogeography-based optimizer train your Multi-Layer Perceptron,” Information Sciences, vol. 269, pp. 188-209, 2014.
[55] X. Li, and M. Yin, “Multi-operator based biogeography based optimization with mutation for global numerical optimization,” Computers & Mathematics with Applications, vol. 64, no. 9, pp. 2833-2844, 2012.
[56] S. Saremi, S. Mirjalili, and A. Lewis, “Biogeography-based optimisation with chaos,” Neural Computing and Applications, vol. 25, no. 5, pp. 1077-1097, 2014.
[57] X.-T. Li, and M.-H. Yin, “Parameter estimation for chaotic systems using the cuckoo search algorithm with an orthogonal learning method,” Chinese Physics B, vol. 21, no. 5, pp. 050507, 2012.
[58] X. Li, and M. Yin, “Multiobjective binary biogeography based optimization for feature selection using gene expression data,” IEEE Transactions on NanoBioscience, vol. 12, no. 4, pp. 343-353, 2013.
[59] R. Storn, and K. Price, “Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[60] D. Zou, H. Liu, L. Gao, and S. Li, “An improved differential evolution algorithm for the task assignment problem,” Engineering Applications of Artificial Intelligence, vol. 24, no. 4, pp. 616-624, 2011.
[61] D. Zou, J. Wu, L. Gao, and S. Li, “A modified differential evolution algorithm for unconstrained optimization problems,” Neurocomputing, vol. 120, pp. 469-481, 2013.
[62] D.-x. Zou, L.-q. Gao, and S. Li, “Volterra filter modeling of a nonlinear discrete-time system based on a ranked differential evolution algorithm,” Journal of Zhejiang University SCIENCE C, vol. 15, no. 8, pp. 687-696, 2014.
[63] X. Li, and M. Yin, “An opposition-based differential evolution algorithm for permutation flow shop scheduling based on diversity measure,” Advances in Engineering Software, vol. 55, pp. 10-31, 2013.
[64] X. Li, and M. Yin, “Modified differential evolution with self-adaptive parameters method,” Journal of Combinatorial Optimization, 2014.
[65] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, 1998.
[66] R. Sukumar, The Asian Elephant: Ecology and Management, Cambridge University Press, New York, 1993.
[67] X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, MA, 2013.
[68] P. Suganthan, N. Hansen, J. Liang, K. Deb, Y. Chen, A. Auger, and S. Tiwari, Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization, Nanyang Technological University, Singapore, 2005.
[69] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, and J. Li, “Incorporating mutation scheme into krill herd algorithm for global numerical optimization,” Neural Computing and Applications, vol. 24, no. 3-4, pp. 853-871, 2014.
[70] G.-G. Wang, L. Guo, A. H. Gandomi, G.-S. Hao, and H. Wang, “Chaotic krill herd algorithm,” Information Sciences, vol. 274, pp. 17-34, 2014.