Particle Swarm Optimization With Gaussian Mutation
Natsuki Higashi and Hitoshi Iba
Graduate School of Frontier Sciences, Dept. of Frontier Informatics, Univ. of Tokyo
Hongo 7-3-1, Bunkyo-ku, Tokyo 113-8656, Japan
0-7803-7914-4/03/$10.00 ©2003 IEEE
the prefixed limit, another restriction called "Vmax" is used. Searching is possible while keeping the individuals inside the search area in this manner.

The positions of respective individuals are updated at each generation, and are expressed by the following equation:

s_i = s_i + v_i

1) Hybrid Particle Swarm [8]: HPSO utilizes the mechanism of PSO and the natural selection mechanism that is usually utilized by evolutionary computation (EC) such as genetic algorithms (GAs). Since the search procedure of PSO deeply depends on pbest and gbest, the searching area is limited by pbest and gbest. By introducing the natural selection mechanism, in contrast, the effect of pbest and gbest gradually vanishes through selection, and a broader search area can be realized. Agent positions with low evaluation values are replaced by those with high evaluation values through selection, while the pbest information of each agent is maintained. Therefore, intensive search in a currently effective area and dependence on past high-evaluation positions are both realized. Fig. 1 illustrates the searching process of HPSO.

The GA is an optimization algorithm that simulates the evolution of creatures. In nature, living organisms become extinct, and those that have best adapted themselves to the environment survive and leave offspring. The fittest genes
spread within the group through repeated survival processes
for the prosperity of the swarm of a species. GA has the aim of
producing solutions (design variables) that give optimal values
with reference to an objective function on the computer, which
is modeled after creatures that have cleverly adapted them-
selves to the environment. In GA, the elements (solutions) in a
state space are expressed as individuals, and every individual
is composed of chromosomes in which design variables are
coded. The values of the objective function are calculated
by converting the chromosomes into design variables via
decoding. In this conversion process, the genotype is expressed
in terms of chromosomes, and an individual's character and
characteristics specified by genotype are called phenotype. A
set of individuals is called a population. From the individuals that form a generation, the fittest are selected, so that those proven to have greater fitness for the environment survive with higher probability into the next generation. The
next generation is formed through genetic operators such as
crossover and mutation for respective individuals. Solution
searching is pursued by repeating a series of these operations.
We expect that the number of individuals with higher fitness
(that is, those closer to optimal solutions) increases as the
search makes progress, thereby an optimum solution can be
achieved. The above describes the basic concept of GA.

Fig. 1. Concept of searching process by HPSO.
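The natural-selection step of HPSO (replacing poorly evaluated agent positions with well-evaluated ones while leaving each agent's pbest untouched) can be sketched as follows; the list layout and the replacement fraction are illustrative assumptions, not values taken from [8]:

```python
def hpso_selection(positions, fitness, frac=0.5):
    """HPSO-style natural selection (minimization): copy the positions of
    the best-evaluated agents over the worst-evaluated ones.  Each agent's
    pbest is deliberately left untouched, as in HPSO.  `frac` (fraction of
    the swarm replaced) is an illustrative assumption."""
    n = len(positions)
    k = int(n * frac)
    order = sorted(range(n), key=lambda i: fitness[i])  # best first
    for good, bad in zip(order[:k], order[-k:][::-1]):
        positions[bad] = list(positions[good])          # overwrite a bad slot
    return positions

# toy usage: 6 agents in 2-D, fitness = squared distance to the origin
pos = [[0.1, 0.2], [3.0, 3.0], [0.0, 0.1], [5.0, 5.0], [1.0, 1.0], [4.0, 4.0]]
fit = [x * x + y * y for x, y in pos]
pos = hpso_selection(pos, fit, frac=0.5)
```

Because only positions are copied, an agent that inherits a good position still remembers its own past best, which is exactly the mix of intensive search and historical dependence described above.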
GA usually expresses chromosomes in terms of bit strings of {0, 1}. Although a search taking into account design variables is considered effective in the case of optimizing a continuous function, we have a problem in that the continuity of a state space is not necessarily reflected in bit-string coding. Accordingly, we use real-number vectors for the chromosomes in a real-valued GA. It has been reported that good solutions are obtained from the real-valued GA in comparison to the bit-string GA, because the former allows us to make searches that allow for the shapes of objective functions by adopting real-number values.

2) Hybrid Particle Swarm Optimiser with Breeding and Subpopulations [4]: The structure of the hybrid model is illustrated in Fig. 2.

begin
  initialize
  while (not terminal-condition) do
  begin
    evaluate
    calculate new velocity vectors
    move
    breed
  end
end
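Under illustrative assumptions (a sphere objective, the standard inertia-weight velocity update, and a small breeding probability, none of which are taken from [4]), the initialize/evaluate/move/breed loop of Fig. 2 maps onto code roughly as:

```python
import random

def hybrid_pso(n=20, dim=2, iters=100, w=0.7, c1=1.5, c2=1.5, p_breed=0.2):
    """Sketch of the begin/initialize/evaluate/move/breed loop of Fig. 2.
    The objective and all parameter values are illustrative assumptions."""
    f = lambda x: sum(v * v for v in x)                  # sphere objective (minimize)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]  # initialize
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pos, key=f)[:]
    for _ in range(iters):                               # while (not terminal-condition) do
        for i in range(n):                               # evaluate
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
            if f(pos[i]) < f(gbest):
                gbest = pos[i][:]
        for i in range(n):
            for d in range(dim):                         # calculate new velocity vectors
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]                   # move
        for i in range(0, n - 1, 2):                     # breed: arithmetic crossover on pairs
            if random.random() < p_breed:
                p = random.random()
                a, b = pos[i], pos[i + 1]
                pos[i] = [p * x + (1 - p) * y for x, y in zip(a, b)]
                pos[i + 1] = [p * y + (1 - p) * x for x, y in zip(a, b)]
    return gbest

random.seed(0)
best = hybrid_pso()
```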
C. Previous Studies on the Combination of Evolutionary Calculation Technique with PSO
Fig. 2. The structure of the hybrid model.
In previous articles, an algorithm called Hybrid Particle Swarm, which combines ideas from evolutionary computation with the PSO, has been proposed.

The position of the offspring is found for each dimension by arithmetic crossover on the positions of the parents, i.e.,

child1(x_i) = p_i × parent1(x_i) + (1 − p_i) × parent2(x_i)
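The crossover above can be written directly as code; producing a mirror-image second child (swapping the roles of the parents) is a common convention assumed here, not something stated in this excerpt:

```python
import random

def arithmetic_crossover(parent1, parent2, rng=random):
    """Per-dimension arithmetic crossover: each offspring coordinate is a
    random convex combination of the parent coordinates.  The mirror-image
    second child is a common convention, assumed here."""
    child1, child2 = [], []
    for x1, x2 in zip(parent1, parent2):
        p = rng.random()                     # p_i drawn uniformly in [0, 1)
        child1.append(p * x1 + (1 - p) * x2)
        child2.append(p * x2 + (1 - p) * x1)
    return child1, child2

c1, c2 = arithmetic_crossover([0.0, 0.0], [1.0, 1.0])
```

Because each coordinate is a convex combination, offspring always stay inside the axis-aligned box spanned by the two parents.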
tions. The functions introduced by DeJong are often used as quantitative evaluation means for the benchmark tests of GA and other techniques. They are defined below. The standard functions are used to find minimum values. As can be seen in the following definitions, f4, f5 and f6 are especially difficult compared to the other functions. This is because f4 contains noise at each point, while f5 and f6 have numerous peaks. GAUSS(0, 1) in f4 indicates the addition of values in accordance with the normal distribution with mean 0 and variance 1.
A. DeJong's Standard Functions

- parabola:
  f1(x, y) = x^2 + y^2
- Rosenbrock's saddle:
  f2(x, y) = 100(x^2 − y)^2 + (1 − x)^2
- step function:
  f3(x, y) = ⌊x⌋ + ⌊y⌋
- quadratic with noise:
  f4(x, y) = x^4 + y^4 + GAUSS(0, 1)
- original Rastrigin function:
  f5(x, y) = 20 + x^2 − 10 cos(2πx) + y^2 − 10 cos(2πy)
- generalized Rastrigin function
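The reconstructed 2-D forms above translate directly into code; GAUSS(0, 1) is drawn with `random.gauss`, and the polynomial term in f4 follows the reconstruction here, which may differ from DeJong's original weighted sum:

```python
import math
import random

def f1(x, y):  # parabola
    return x * x + y * y

def f2(x, y):  # Rosenbrock's saddle
    return 100.0 * (x * x - y) ** 2 + (1.0 - x) ** 2

def f3(x, y):  # step function
    return math.floor(x) + math.floor(y)

def f4(x, y, rng=random):  # quadratic with noise; GAUSS(0, 1) = normal, mean 0, variance 1
    return x ** 4 + y ** 4 + rng.gauss(0.0, 1.0)

def f5(x, y):  # original Rastrigin function
    return (20.0 + x * x - 10.0 * math.cos(2.0 * math.pi * x)
                 + y * y - 10.0 * math.cos(2.0 * math.pi * y))
```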
TABLE II
PARAMETERS

Generation: (illegible)
φ1/φ2: upper limits were set to 2.0
Inertia weight: (illegible)
Dimension: (illegible)
Mutation: 0.1
Function: (illegible)
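TABLE II lists a mutation probability of 0.1. One common form of Gaussian mutation for a particle position, shown here purely as an illustrative sketch (the paper's exact operator and deviation are not given in this excerpt), is:

```python
import random

def gaussian_mutation(position, rate=0.1, sigma=0.1, rng=random):
    """With probability `rate` per coordinate, add N(0, sigma) noise.
    Only the 0.1 rate comes from TABLE II; `sigma` and the per-coordinate
    application are illustrative assumptions."""
    return [x + rng.gauss(0.0, sigma) if rng.random() < rate else x
            for x in position]

random.seed(0)
mutated = gaussian_mutation([1.0, 2.0, 3.0], rate=1.0, sigma=0.05)
```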
(Result figures comparing PSO and PSO with Gaussian Mutation; the plotted values are not legible in the scan.)
Fig. 13. The Target Network.
Fig. 14. Acquired Network by PSO with Gaussian Mutation.
TABLE V
PSO PARAMETERS
(values illegible in the scan)

TABLE VII
GA PARAMETERS
... | Mutation rate
5000 | 200 | 0.99 | 0.001

ACKNOWLEDGMENT

REFERENCES

[1] S. Ando and H. Iba, "The Matrix Modeling of Gene Regulatory Networks - Reverse Engineering by Genetic Algorithms -", Proceedings of Atlantic Symposium on Computational Biology and Genome Information Systems & Technology, 2001.
[2] P. J. Angeline, "Evolutionary Optimization Versus Particle Swarm Optimization: Philosophy and Performance Differences", Evolutionary Programming VII (1998), Lecture Notes in Computer Science 1447, 601-610, Springer.
[3] R. C. Eberhart and Y. Shi, "Comparison between Genetic Algorithms and Particle Swarm Optimization", Evolutionary Programming VII (1998), Lecture Notes in Computer Science 1447, 611-616, Springer.
[4] M. Løvbjerg, T. K. Rasmussen and T. Krink, "Hybrid Particle Swarm Optimiser with Breeding and Subpopulations", Proceedings of the Genetic and Evolutionary Computation Conference, 2001.
[5] I. H. Osman and J. P. Kelly, "Meta-Heuristics: Theory and Applications", Kluwer Academic Publishers, 1996.
[6] J. Kennedy and R. C. Eberhart, "Particle Swarm Optimization", Proceedings of the 1995 IEEE International Conference on Neural Networks, vol. 4, 1942-1948, IEEE Press.
[7] J. Kennedy and W. M. Spears, "Matching Algorithms to Problems: An Experimental Test of the Particle Swarm and Some Genetic Algorithms on the Multimodal Problem Generator", Proceedings of the IEEE Int'l Conference on Evolutionary Computation, 1998.
[8] P. J. Angeline, "Using Selection to Improve Particle Swarm Optimization", Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC), 1998.
[9] A. H. Wright, "Genetic Algorithms for Real Parameter Optimization", Foundations of Genetic Algorithms, Rawlins, G. J. E. (ed.), Morgan Kaufmann, 1991.
[10] D. Tominaga, N. Koga and M. Okamoto, "Efficient Numerical Optimization Algorithm Based on Genetic Algorithm for Inverse Problem", in Proc. of the Genetic and Evolutionary Computation Conference (GECCO), 2000.
[11] F. van den Bergh and A. P. Engelbrecht, "A New Locally Convergent Particle Swarm Optimizer", IEEE Conference on Systems, Man, and Cybernetics, 2002.