
Hindawi Publishing Corporation

Journal of Control Science and Engineering


Volume 2013, Article ID 462706, 5 pages
http://dx.doi.org/10.1155/2013/462706

Research Article
An Improved Differential Evolution Algorithm Based on
Adaptive Parameter

Zhehuang Huang (1,2) and Yidong Chen (2,3)

1 School of Mathematical Sciences, Huaqiao University, Quanzhou 362021, China
2 Cognitive Science Department, Xiamen University, Xiamen 361005, China
3 Fujian Key Laboratory of the Brain-Like Intelligent Systems, Xiamen 361005, China

Correspondence should be addressed to Yidong Chen; ydchen [email protected]

Received 2 July 2013; Revised 19 August 2013; Accepted 20 August 2013

Academic Editor: Xiaomei Qi

Copyright © 2013 Z. Huang and Y. Chen. This is an open access article distributed under the Creative Commons Attribution
License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly
cited.

The differential evolution (DE) algorithm is a population-based heuristic global optimization technique that is easy to understand, simple to implement, reliable, and fast. The evolutionary parameters directly influence the performance of the differential evolution algorithm. The adjustment of control parameters is a global behavior, and at present there is no general theory for controlling the parameters during the evolution process. In this paper, we propose an adaptive parameter adjustment method which can dynamically adjust the control parameters according to the evolution stage. Experiments on high-dimensional function optimization show that the improved algorithm has more powerful global exploration ability and faster convergence speed.

1. Introduction

In recent years, intelligent optimization algorithms [1] have come to be regarded as practical tools for nonlinear optimization problems. The differential evolution algorithm [2, 3], first introduced by Storn and Price in 1997, is a novel evolutionary algorithm built on the basis of genetic algorithms. It is a bionic intelligent algorithm that simulates the natural mechanism of biological evolution. Its main idea is to generate a temporary individual from the differences between individuals within the population and then to recombine the population as evolution proceeds. The algorithm has good global convergence and robustness and is well suited to a variety of numerical optimization problems, which has quickly made it a hot topic in the optimization field.

Because it is simple in principle and robust, DE has been applied successfully to all kinds of optimization problems such as constrained global optimization [4], image classification [5], neural networks [6], linear arrays [7], monopole antennas [8], image segmentation [9], and other areas [10-14]. However, the DE algorithm can easily fall into a local optimum when handling multipeak, large-search-space function optimization problems. To improve the optimization performance of DE, many scholars have proposed control parameter methods [15, 16]. Although all of these methods can improve the performance of standard DE to some extent, they still cannot obtain satisfactory results for some functions. In this paper, we propose an adaptive parameter adjustment method based on the evolution stage.

This paper is organized as follows. Related work is described in Section 2. In Section 3 the background of DE is presented. The improved algorithm is presented in Section 4. In Section 5 experimental tests and results are given. Section 6 concludes the paper.

2. Related Work

The DE algorithm has a few parameters. These parameters have a great impact on the performance of the algorithm, such as the quality of the optimal value and the convergence rate, and there is still no good way to determine them. To deal with this problem, researchers have made several attempts. Gamperle et al. [17] reported that it is more difficult
than expected to choose the control parameters of DE. Liu and Lampinen [18] reported that the performance of the DE algorithm is sensitive to the values of the parameters and that different test functions call for different parameter settings.

At present, the parameters are set in mainly three ways:

(1) deterministic parameter setting: the parameters are set by experience, for example, kept at fixed values throughout the entire evolutionary process;

(2) adaptive parameter setting: heuristic rules are used to modify the parameter values according to the current state of the search;

(3) self-adaptive parameter setting: the idea of "evolution of the evolution" is used to implement the self-adaptive parameter setting.

Liu and Lampinen [19] proposed a fuzzy adaptive parameter setting method which can change the parameters dynamically; their experiments show much faster convergence than the traditional DE algorithm when adapting F and CR. In [20], self-adapting control parameters for DE are proposed; the results show that the improved algorithm is better than, or at least comparable to, the standard DE algorithm.

3. Introduction to DE

Compared to other evolutionary algorithms, DE retains a population-based global search strategy and uses a simple differential mutation operation and one-on-one competition, which reduces the genetic complexity of the operation. At the same time, the specific memory capacity of DE enables it to track the current search dynamically and adjust its search strategy, giving it strong global convergence and robustness. It is therefore suitable for solving complex optimization problems. Basic operations such as selection, crossover, and mutation form the basis of the differential algorithm.

In an iterative process, the population of each generation G contains N individuals. Suppose that individual i of generation G is represented as

    X_{i,G} = (x^1_{i,G}, x^2_{i,G}, ..., x^D_{i,G}),   i = 1, 2, ....   (1)

3.1. Mutation Operation. A mutant individual is generated by the following formula:

    V_{i,G+1} = X_{r1,G} + F * (X_{r2,G} - X_{r3,G}).   (2)

Here r1, r2, and r3 are distinct random indices drawn from the interval [1, N], and the variation factor F is a real number in the interval [0, 2]; it controls the amplification of the differential variable X_{r2,G} - X_{r3,G}.

3.2. Crossover Operation. In the differential algorithm, the crossover operation is introduced to increase the diversity of the new population. According to the crossover strategy, the old and new individuals exchange part of their components to form a new individual. New individuals can be represented as

    X_{i,G+1} = (x^1_{i,G+1}, x^2_{i,G+1}, ..., x^D_{i,G+1}),   i = 1, 2, ...,   (3)

where

    x^j_{i,G+1} = V^j_{i,G+1},  if (rand_b(j) <= CR) or (j = m_br(i)),
    x^j_{i,G+1} = x^j_{i,G},    if (rand_b(j) > CR) and (j != m_br(i)),
                                (j = 1, 2, ..., D),   (4)

where rand_b(j) is uniformly distributed in the interval [0, 1], CR is the crossover probability in the interval [0, 1], and m_br(i) is a random integer between 1 and D that guarantees at least one component is inherited from the mutant V_{i,G+1}.

3.3. Selection Operation. The selection operation is a greedy strategy: the candidate individual generated by the mutation and crossover operations competes with the target individual:

    x_{i,G+1} = U_{i,G+1},  if f(U_{i,G+1}) > f(x_{i,G}),
    x_{i,G+1} = x_{i,G},    if f(U_{i,G+1}) <= f(x_{i,G}),   (5)

where f is the fitness function; the trial vector replaces the target only if it has better fitness.

The basic differential evolution (DE) algorithm is shown as Algorithm 1.

Algorithm 1 (the differential evolution algorithm).
(1) Initialize the population size NP, the maximum number of generations Maxinter, the scale factor, and the cross-factor.
(2) Initialize the population pop.
(3) Following the DE/rand/1/bin policy, produce a new generation of individuals by applying
    (a) the mutation operation;
    (b) the crossover operation;
    (c) the selection operation.
(4) Repeat step (3) until the termination criterion is met.

The flow chart of the differential evolution algorithm is shown in Figure 1.

4. The Adaptive Control Parameter Adjustment Method (ADE)

From the standard DE algorithm it is known that the scale factor F and the cross-factor CR not only affect the convergence speed of the algorithm but may also lead to premature convergence. In this paper, we propose an adaptive adjustment method based on the evolution stage. We use a sine function (a quarter cycle) with values in (-1, 0) and a cosine function (a quarter cycle) with values in (0, 1). The graphs of the two functions change slowly at the beginning and at the end and rapidly in the middle, with a gradual overall increase, which makes them very suitable for setting the F and CR values.
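Before turning to the schedules, the DE/rand/1/bin procedure of Algorithm 1 and equations (2), (4), and (5) can be made concrete. The following is a minimal Python sketch, not the authors' code: the function and parameter names are our own, and selection is written for minimization (the trial vector survives when its objective value is lower).

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=30, F=0.6, CR=0.5, max_iter=200, seed=0):
    """Minimize f over a box via DE/rand/1/bin (Algorithm 1)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T   # bounds: sequence of (low, high) pairs
    dim = lo.size
    # Step (2): initialize the population uniformly inside the bounds
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.array([f(x) for x in pop])

    for _ in range(max_iter):
        for i in range(pop_size):
            # Mutation, eq. (2): three distinct indices, all different from i
            r1, r2, r3 = rng.choice(np.delete(np.arange(pop_size), i), 3, replace=False)
            v = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # Binomial crossover, eq. (4): keep at least one mutant component
            j_rand = rng.integers(dim)
            mask = rng.random(dim) <= CR
            mask[j_rand] = True
            u = np.where(mask, v, pop[i])
            # Greedy one-on-one selection, eq. (5), for minimization
            fu = f(u)
            if fu <= fit[i]:
                pop[i], fit[i] = u, fu

    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

# Usage on the Sphere function from Table 1
x_best, f_best = de_rand_1_bin(lambda x: float(np.sum(x**2)),
                               [(-5.12, 5.12)] * 5)
```

This fixed-parameter loop is the baseline that Section 4 modifies by replacing the constant F and CR with stage-dependent values.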
Figure 1: Flow chart of the differential evolution algorithm (initialize population, fitness evaluation, mutation, crossover, selection, termination check).

In the early and late stages the scale factor F and the cross-factor CR are relatively small, with a relatively fast increase in the middle, which is just what the global search of DE needs:

    F  = alpha + (1 - alpha) * sin(pi*t/MAXITER - pi/2),  if t <= MAXITER/2,
    F  = alpha - (1 - alpha) * cos(pi/2 - pi*t/MAXITER),  otherwise,          (6)

    CR = beta + (1 - beta) * sin(pi*t/MAXITER - pi/2),    if t <= MAXITER/2,
    CR = beta - (1 - beta) * cos(pi/2 - pi*t/MAXITER),    otherwise,          (7)

where alpha and beta are constants (we set alpha = 0.8 and beta = 0.75 in the experiments), MAXITER is the maximum number of iterations, and t is the current iteration number.

The procedure for implementing the ADE is given by the following steps.

Algorithm 2 (the improved differential evolution algorithm).
(1) Initialize the population size NP, the maximum number of generations Maxinter, the scale factor F, and the cross-factor CR.
(2) Initialize the population pop.
(3) Update the scale factor F of each individual according to formula (6).
(4) Update the cross-factor CR of each individual according to formula (7).
(5) Perform mutation, crossover, and selection to produce a new generation of individuals.
(6) Repeat steps (3)-(5) until the termination criterion is met.

5. Experimental Results

A set of unconstrained real-valued benchmark functions, shown in Table 1, was used to investigate the effect of the improved algorithm.

Table 1: Functions used to test the effects of ADE.

Function    Expression
Sphere      f1(x) = sum_{t=1}^{n} x_t^2
Rastrigin   f2(x) = sum_{t=1}^{n} (x_t^2 - 10 cos(2 pi x_t) + 10)
Griewank    f3(x) = (1/4000) sum_{t=1}^{n} (x_t - 100)^2 - prod_{t=1}^{n} cos((x_t - 100)/sqrt(t)) + 1
Ackley      f4(x) = 20 + e - 20 exp(-0.2 sqrt(sum_{t=1}^{n} x_t^2 / n)) - exp(sum_{t=1}^{n} cos(2 pi x_t) / n)
Shaffer's   f5(x) = 0.5 + ((sin sqrt(x_1^2 + x_2^2))^2 - 0.5) / (1 + 0.001 (x_1^2 + x_2^2))^2

The results are shown in Table 2. Each entry is the average over 10 repetitions. We set the scale factor F = 0.6 and the cross-factor CR = 0.5 for the standard DE algorithm, and dynamically adjusted F and CR according to the evolution stage for the ADE algorithm.

Table 2: The performances of DE and ADE.

Function    DE Optimal   DE Time (s)   ADE Optimal   ADE Time (s)
Sphere      0.039        0.42          0.019         0.31
Rastrigin   19.07        0.43          4.75          0.36
Griewank    0.43         0.45          0.33          0.28
Ackley      1.16         0.45          1.31          0.27
Shaffer     0.00973      0.43          0.00973       0.35

From Table 2 we can see that neither algorithm performs better on all five functions, but on average ADE is better than DE. For the Sphere, Rastrigin, and Griewank functions, the ADE algorithm effectively improves the accuracy, so that the optimal value obtained is much closer to the theoretical one than with the standard DE algorithm. The Ackley function is multimodal; on it the accuracy of the improved algorithm is not as good as that of the standard DE algorithm, but the difference is small and acceptable. For Shaffer's function, neither algorithm is clearly superior.

For all five functions, there is a significant improvement, as expected, in convergence time. These experimental results show that the improved algorithm can effectively increase the convergence speed while retaining excellent convergence behavior. The convergence curves of the two methods are compared in Figures 2, 3, 4, 5, and 6; they show that, compared with DE, the ADE algorithm has both global search ability and fast convergence speed.

Figures 2-5: Convergence curves (best individual fitness value versus number of generations) of DE and ADE on the Sphere, Rastrigin, Griewank, and Ackley functions.
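The stage-dependent schedules of equations (6) and (7) can be sketched as follows. This is a minimal Python sketch with a helper name of our own (`adaptive_parameter`); the same function serves both F (with alpha) and CR (with beta), and in Algorithm 2 it would be re-evaluated at every generation t before the mutation and crossover steps.

```python
import math

def adaptive_parameter(t, maxiter, base):
    """Evaluate the schedule of equations (6)/(7) at generation t.

    `base` plays the role of alpha (for F) or beta (for CR)."""
    if t <= maxiter / 2:
        # First half: quarter-cycle sine, rising from 2*base - 1 toward base
        return base + (1 - base) * math.sin(math.pi * t / maxiter - math.pi / 2)
    # Second half: quarter-cycle cosine branch, as given in equations (6)/(7)
    return base - (1 - base) * math.cos(math.pi / 2 - math.pi * t / maxiter)

# With the constants used in the experiments, alpha = 0.8 and beta = 0.75:
F_start = adaptive_parameter(0, 100, 0.8)    # 2*alpha - 1 = 0.6
F_mid = adaptive_parameter(50, 100, 0.8)     # alpha = 0.8
CR_start = adaptive_parameter(0, 100, 0.75)  # 2*beta - 1 = 0.5
```

Note that the starting values F = 0.6 and CR = 0.5 produced by this schedule coincide with the fixed values used for the standard DE baseline in Table 2.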
Figure 6: Convergence curves (best individual fitness value versus number of generations) of DE and ADE on Shaffer's function.

6. Conclusion

The scale factor F and the cross-factor CR have a great impact on the performance of the algorithm, such as the quality of the optimal value and the convergence rate, and there is still no good way to determine these parameters in general. In this paper, we propose an adaptive parameter adjustment method based on the evolution stage. The experiments described above show that the improved algorithm has more powerful global exploration ability and faster convergence speed and can be widely applied to other optimization tasks.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant no. 61005052), the Fundamental Research Funds for the Central Universities (Grant no. 2010121068), and the Science and Technology Project of Quanzhou (Grant no. 2012Z91).

References

[1] M. Clerc and J. Kennedy, "The particle swarm-explosion, stability, and convergence in a multi-dimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, pp. 58-73, 2002.
[2] R. Storn and K. Price, "Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[3] R. Storn and K. Price, "Differential evolution for multi-objective optimization," Evolutionary Computation, vol. 4, pp. 8-12, 2003.
[4] H. K. Kim, J. K. Chong, K. Y. Park, and D. A. Lowther, "Differential evolution strategy for constrained global optimization and application to practical engineering problems," IEEE Transactions on Magnetics, vol. 43, no. 4, pp. 1565-1568, 2007.
[5] M. G. H. Omran and A. P. Engelbrecht, "Self-adaptive differential evolution methods for unsupervised image classification," in Proceedings of the IEEE Conference on Cybernetics and Intelligent Systems, pp. 1-6, Bangkok, Thailand, June 2006.
[6] H. Dhahri and A. M. Alimi, "The modified differential evolution and the RBF (MDE-RBF) neural network for time series prediction," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '06), pp. 2938-2943, Vancouver, Canada, July 2006.
[7] S. Yang, Y. B. Gan, and A. Qing, "Sideband suppression in time-modulated linear arrays by the differential evolution algorithm," IEEE Antennas and Wireless Propagation Letters, vol. 1, no. 1, pp. 173-175, 2002.
[8] A. Massa, M. Pastorino, and A. Randazzo, "Optimization of the directivity of a monopulse antenna with a subarray weighting by a hybrid differential evolution method," IEEE Antennas and Wireless Propagation Letters, vol. 5, no. 1, pp. 155-158, 2006.
[9] V. Aslantas and M. Tunckanat, "Differential evolution algorithm for segmentation of wound images," in Proceedings of the IEEE International Symposium on Intelligent Signal Processing (WISP '07), Alcala de Henares, Spain, October 2007.
[10] L. H. Wu, Y. N. Wang, X. F. Yuan, and S. W. Zhou, "Differential evolution algorithm with adaptive second mutation," Chinese Journal of Control and Decision, vol. 21, no. 8, pp. 898-902, 2006.
[11] C. T. Su and C. S. Lee, "Network reconfiguration of distribution systems using improved mixed-integer hybrid differential evolution," IEEE Transactions on Power Delivery, vol. 18, no. 3, pp. 1022-1027, 2003.
[12] M. F. Tasgetiren, P. N. Suganthan, T. J. Chua, and A. Al-Hajri, "Differential evolution algorithms for the generalized assignment problem," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '09), pp. 2606-2613, Trondheim, Norway, May 2009.
[13] W. G. Zhang, H. P. Chen, D. Lu, and H. Shao, "A novel differential evolution algorithm for a single batch-processing machine with non-identical job sizes," in Proceedings of the 4th International Conference on Natural Computation (ICNC '08), pp. 447-451, Jinan, China, October 2008.
[14] T. Sum-Im, G. A. Taylor, M. R. Irvings, and Y. H. Song, "A differential evolution algorithm for multistage transmission expansion planning," in Proceedings of the 42nd International Universities Power Engineering Conference (UPEC '07), pp. 357-364, Brighton, UK, September 2007.
[15] Z. F. Wu, H. K. Huang, B. Yang, and Y. Zhang, "A modified differential evolution algorithm with self-adaptive control parameters," in Proceedings of the 3rd International Conference on Intelligent System and Knowledge Engineering (ISKE '08), pp. 524-527, Xiamen, China, November 2008.
[16] J. Liu and J. Lampinen, "A fuzzy adaptive differential evolution algorithm," Soft Computing, vol. 9, no. 6, pp. 448-462, 2005.
[17] R. Gamperle, S. D. Muller, and P. Koumoutsakos, "A parameter study for differential evolution," in Proceedings of the International Conference on Advances in Intelligent Systems, Fuzzy Systems, Evolutionary Computation (WSEAS '02), pp. 11-15, Interlaken, Switzerland, February 2002.
[18] J. Liu and J. Lampinen, "On setting the control parameter of the differential evolution method," in Proceedings of the 8th International Conference on Soft Computing (MENDEL '02), pp. 11-18, Brno, Czech Republic, June 2002.
[19] J. Liu and J. Lampinen, "A fuzzy adaptive differential evolution algorithm," Soft Computing, vol. 9, no. 6, pp. 448-462, 2005.
[20] J. Brest, S. Greiner, B. Boskovic, M. Mernik, and V. Zumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646-657, 2006.