
Particle Swarm Optimization with Gaussian Mutation

Natsuki Higashi and Hitoshi Iba
Graduate School of Frontier Sciences, Dept. of Frontier Informatics, Univ. of Tokyo
Hongo 7-3-1, Bunkyo-ku, Tokyo 113-8656, Japan
Email: [email protected], [email protected]

Abstract--In this paper we present Particle Swarm Optimization with Gaussian Mutation, combining the idea of the particle swarm with concepts from Evolutionary Algorithms. This method combines the traditional velocity and position update rules with the idea of Gaussian Mutation. The model is tested and compared with the standard PSO and the standard GA. The comparative experiments have been conducted on unimodal and multimodal functions. PSO with Gaussian Mutation is able to obtain results superior to those of GA. We also apply PSO with Gaussian Mutation to a gene network. Consequently, it has succeeded in acquiring better results than those obtained by GA and PSO alone.

0-7803-7914-4/03/$10.00 (c)2003 IEEE

I. INTRODUCTION

At present, optimization algorithms such as the genetic algorithm (GA), genetic programming (GP), and evolutionary programming (EP) are well known. In this paper, we introduce the concept of particle swarm optimization (PSO), which is slightly different from these well-known algorithms, and verify the improvement of PSO by taking hints from GA. GA, GP, PSO, and simulated annealing (SA) are generically named "meta-heuristics." These search techniques are modeled on biological and physical phenomena.

The "Particle Swarm Optimization" algorithm belongs to the field of swarm intelligence, and was first introduced by Russell C. Eberhart and James Kennedy in 1995 as a substitute for GA. The PSO algorithm was invented with reference to the social behavior of bird flocks. Unlike the GA technique, which employs genetic manipulations, the subsequent actions of the respective individuals are influenced by their own movements and those of their companions. Studies after the launch of the theory have shown that PSO can perform on even ground with GA techniques on the problem of function optimization. A comparison between PSO and standard GA is given in References [7], [3], and [2].

Recently, as shown by the article [4] published by Morten Lovbjerg et al. in 2001, many articles have pursued good performance by combining PSO with concepts from evolutionary calculation techniques. The technique proposed by them is called "hybrid PSO," which integrates techniques such as subpopulations and breeding into the velocity and position concepts of the pure PSO algorithm.

Following these previous studies, we introduce a new approach to PSO. The main contributions of this paper can be summarized as follows:

1) We implement an extended version of PSO, which employs a mutation mechanism used in the real-valued GA.
2) We empirically show that PSO with Gaussian Mutation achieves better performance than PSO and the real-valued GA alone on several benchmark functions.
3) We apply PSO with Gaussian Mutation to a real-world task, i.e., the inference of a gene network. Consequently, it has succeeded in acquiring better results than those obtained by the traditional method.

The rest of this paper is organized as follows. In Section II, we introduce the background of our research, i.e., details of PSO and GA. Previous work on extensions of PSO is also discussed. Section III describes our approach, i.e., the integration of PSO and Gaussian mutation. Benchmark functions are used to examine the effectiveness of our method; their experimental results are shown in Section IV. The real-world application of our approach is described in Section V. We give some conclusions in Section VI.

II. BACKGROUND OF THE STUDY

A. PSO Algorithm

In the basic PSO proposed by Kennedy et al., many individuals move around in a multi-dimensional space, and such a fundamental PSO is applicable to numerical problems [6]. Each individual memorizes its position vector x_i and velocity vector v_i, as well as the position p_i at which the individual has acquired its best fitness. Furthermore, the respective individuals share the position p_g at which the best fitness over all individuals has been acquired.

The velocity of each individual is updated using the best position acquired over all individuals over the generations, and the best position acquired by the respective individual over the generations. Updating is executed by the following formula:

    v_i = chi * (omega * v_i + phi1 * (p_i - x_i) + phi2 * (p_g - x_i))

where chi (a random number from 0.9 to 1.0) is the constriction coefficient and omega is the inertia weight. phi1 and phi2 are random numbers, inherent to the respective individuals and dimensions, whose upper limit is 2. If the velocity exceeds
the prefixed limit, another restriction called "V_max" is applied. Searching is possible while keeping the individuals inside the search area in this manner.

The positions of the respective individuals are updated every generation by the following equation:

    x_i = x_i + v_i

B. GA

The GA is an optimization algorithm that simulates the evolution of creatures. In nature, living organisms become extinct, and those that have best adapted themselves to the environment survive and leave offspring. The fittest genes spread within the group through repeated survival processes for the prosperity of the swarm of a species. GA aims at producing, on the computer, solutions (design variables) that give optimal values with reference to an objective function; it is modeled after creatures that have cleverly adapted themselves to the environment. In GA, the elements (solutions) in a state space are expressed as individuals, and every individual is composed of chromosomes in which design variables are coded. The values of the objective function are calculated by converting the chromosomes into design variables via decoding. In this conversion process, the genotype is expressed in terms of chromosomes, and an individual's character and characteristics specified by the genotype are called the phenotype. A set of individuals is called a population, from which the fittest individuals are selected, so that those proven to have greater fitness for the environment survive with higher probability into the next generation. The next generation is formed through genetic operators such as crossover and mutation applied to the respective individuals. Solution searching is pursued by repeating a series of these operations. We expect that the number of individuals with higher fitness (that is, those closer to optimal solutions) increases as the search makes progress, so that an optimal solution can be achieved. The above describes the basic concept of GA.

GA usually expresses chromosomes in terms of bit strings of {0, 1}. Although a search taking design variables into account is considered effective in the case of optimizing a continuous function, there is a problem in that the continuity of a state space is not necessarily reflected in bit-string coding. Accordingly, we use real-number vectors for the chromosomes in a real-valued GA. It has been reported that better solutions are obtained from the real-valued GA than from the bit-string GA, because the former allows searches that follow the shapes of the objective functions by adopting real-number values.

C. Previous Studies on the Combination of Evolutionary Calculation Techniques with PSO

In previous articles, an algorithm called Hybrid Particle Swarm, which combines ideas from evolutionary computation with PSO, has been proposed.

1) Hybrid Particle Swarm [8]: HPSO utilizes the mechanism of PSO and the natural selection mechanism usually employed by evolutionary computation methods such as genetic algorithms (GAs). Since the search procedure of PSO deeply depends on pbest and gbest, the searching area is limited by pbest and gbest. By contrast, by introducing the natural selection mechanism, the effect of pbest and gbest is gradually attenuated by the selection, and a broader search area can be realized. Agent positions with low evaluation values are replaced by those with high evaluation values through the selection. On the other hand, the pbest information of each agent is maintained. Therefore, intensive search in a currently effective area and dependence on past high-evaluation positions are both realized. Fig. 1 illustrates the searching process of HPSO.

Fig. 1. Concept of searching process by HPSO.

2) Hybrid Particle Swarm Optimiser with Breeding and Subpopulations [4]: The structure of the hybrid model is illustrated in Fig. 2.

    begin
      initialize
      while (not terminal-condition) do
      begin
        evaluate
        calculate new velocity vectors
        move
        breed
      end
    end

Fig. 2. The structure of the hybrid model.

The position of the offspring is found for each dimension by arithmetic crossover on the positions of the parents, i.e.,

    child1(x_i) = p_i * parent1(x_i) + (1 - p_i) * parent2(x_i)

where p_i is a uniformly distributed random value between 0 and 1. The velocity vector of the offspring is calculated as the sum of the velocity vectors of the parents, normalized to the original length of each parent velocity vector.

Fig. 3. Concept of searching process by PSO with Gaussian Mutation. V_pbest: velocity based on pbest, V_gbest: velocity based on gbest.

The motivation behind the crossover is that offspring particles benefit from both parents. In theory this allows good examination of the search space between particles. Having two particles on different suboptimal peaks breed could result in an escape from a local optimum, and thus aid in achieving a better one.
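As a concrete illustration, the arithmetic crossover and velocity recombination described above can be sketched as follows. This is a minimal sketch, not code from [4]; the function name `breed` and the (position, velocity) tuple layout are our own.

```python
import random

def breed(parent1, parent2):
    # Arithmetic crossover of two particles; each particle is a
    # (position, velocity) pair of equal-length lists.
    (x1, v1), (x2, v2) = parent1, parent2
    dims = len(x1)
    p = random.random()  # uniformly distributed random value in [0, 1)
    child_x1 = [p * x1[d] + (1.0 - p) * x2[d] for d in range(dims)]
    child_x2 = [p * x2[d] + (1.0 - p) * x1[d] for d in range(dims)]

    # Offspring velocity: sum of the parent velocities, renormalized
    # to the original length (speed) of each parent velocity vector.
    v_sum = [v1[d] + v2[d] for d in range(dims)]
    norm = sum(c * c for c in v_sum) ** 0.5 or 1.0

    def scaled_to(speed):
        return [c / norm * speed for c in v_sum]

    speed1 = sum(c * c for c in v1) ** 0.5
    speed2 = sum(c * c for c in v2) ** 0.5
    return (child_x1, scaled_to(speed1)), (child_x2, scaled_to(speed2))
```

Note that each child position lies between the two parent positions in every dimension, while each child keeps its parent's speed along the combined direction, which is what allows the "examination of the search space between particles" mentioned above.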

Fig. 4. Concept of searching process by PSO with Gaussian Mutation for multiple dimensions. V_pbest: velocity based on pbest, V_gbest: velocity based on gbest.

III. PROPOSED TECHNIQUE

Based on the above-mentioned previous studies, we implement an extended version of PSO, which integrates a mechanism from the real-valued GA into our technique while keeping the advantages of PSO.

A. Problems of Hybrid Particle Swarm

As shown in Fig. 1, searching via HPSO is conducted so that individuals move from spots with poor performance to spots with good performance. The efficiency of this type of search is surely high, because focused searching is available near optimal solutions in a relatively simple search space. It is clear, however, that good performance is not obtained on a multimodal function with many sharp peaks, because PSO's original problem of being trapped in a local minimum is not solved. In addition, this technique lacks the capacity to produce multiple minima, as in speciation.

The crossover in the HPSO with Breeding and Subpopulations is by no means a superior crossover, given that it is preferable to have the capacity for producing as many diversified solutions as possible under the restrictive condition of the "heredity of statistics values."

To solve the above difficulties, we propose a new approach to PSO in the next section. Compared to HPSO, this technique maintains PSO's typical advantages, although, like its counterparts, it takes hints from evolutionary calculation techniques. Our proposal is capable of producing multiple minima, as in speciation, while avoiding entrapment in local minima, thanks to Gaussian mutation.

B. Introduction of Mutation via Gaussian Mutation

We integrate a mutation often used in GA into PSO. However, we do not follow a process in which every individual of the simple PSO moves to another position inside the search area with a predetermined probability, unaffected by the other individuals; rather, we leave a certain ambiguity in the transition to the next generation by means of Gaussian mutation. This technique employs the following equation:

    mut(x) = x * (1 + gaussian(sigma))

where sigma is set to 0.1 times the length of the search space in one dimension; this setting comes from experience obtained in preliminary experiments. Here x is the numerical value held by each individual. Individuals are selected with a predetermined probability, and their positions are perturbed according to the Gaussian distribution. Wide-ranging searches are possible at the initial search stage, and search efficiency is improved at the middle and final stages by gradually reducing the appearance ratio of the Gaussian mutation from its initial value (see Fig. 3). This rate is decreased linearly, starting at 1.0 and ending at 0. When the object of evaluation is a multidimensional problem, a certain surface, i.e., a randomly chosen dimension, is mutated, as shown in Fig. 4.

IV. PROBLEM SETTING

To check the effectiveness of these algorithms, we performed a simple experiment. First, we defined standard functions. The functions introduced by De Jong are often used as a quantitative evaluation means for benchmark tests of GA and other techniques. They are defined below. The standard functions are used to find minimum values. As can be seen from the following definitions, f4, f5 and f6 are especially difficult compared to the other functions. This is because f4 contains noise at each point, and f5 and f6 have numerous peaks. The term +GAUSS(0, 1) in f4 indicates the addition of values in accordance with the normal distribution with mean 0 and variance 1.

A. De Jong's Standard Functions

- parabola
    f1(x, y) = x^2 + y^2
- Rosenbrock's saddle
    f2(x, y) = 100(x^2 - y)^2 + (1 - x)^2
- step function
    f3(x, y) = floor(x) + floor(y)
- quadratic with noise
    f4(x, y) = x^4 + 2y^4 + GAUSS(0, 1)
- original Rastrigin function
    f5(x, y) = 20 + x^2 - 10cos(2*pi*x) + y^2 - 10cos(2*pi*y)
- generalized Rastrigin function

Search space ranges for the experiments are listed in Table I, and PSO parameters are listed in Table II.
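For reference, the functions with explicit definitions above can be written directly in code. This is a sketch: the noiseless part of f4 was partly illegible in our copy, so the x^4 + 2y^4 form, following De Jong's usual sum of i*x_i^4, is an assumption; f6's definition did not survive and is omitted.

```python
import math
import random

def f1(x, y):
    """Parabola (sphere)."""
    return x ** 2 + y ** 2

def f2(x, y):
    """Rosenbrock's saddle."""
    return 100.0 * (x ** 2 - y) ** 2 + (1.0 - x) ** 2

def f3(x, y):
    """Step function."""
    return math.floor(x) + math.floor(y)

def f4(x, y):
    """Quadratic with noise; GAUSS(0, 1) is normal noise with mean 0,
    variance 1.  The x**4 + 2*y**4 part is our assumption (see above)."""
    return x ** 4 + 2.0 * y ** 4 + random.gauss(0.0, 1.0)

def f5(x, y):
    """Original Rastrigin function."""
    return (20.0 + x ** 2 - 10.0 * math.cos(2.0 * math.pi * x)
                 + y ** 2 - 10.0 * math.cos(2.0 * math.pi * y))
```

All functions are to be minimized; f5 has its global minimum f5(0, 0) = 0 surrounded by many local minima, which is what makes it hard for plain PSO.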
TABLE I
SEARCH SPACE FOR EACH TEST FUNCTION.

Function   Search space
f1         -5.11 <= x_i <= 5.11
f2         -5.11 <= x_i <= 5.11
f4         -1.27 <= x_i <= 1.28
f5         -5.11 <= x_i <= 5.11
f6         -10 <= x_i <= 10

Fig. 6. Standard PSO versus PSO with Gaussian Mutation for F2.

TABLE II
PARAMETERS.

Parameter        PSO, PSO with Gaussian Mutation
Population
Generation
phi1, phi2       upper limits were set to 2.0
Inertia weight
Dimension
Mutation         0.1
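Putting Sections II and III together, one generation of PSO with Gaussian Mutation might look like the following. This is a sketch under our own naming, not the authors' code: the paper gives no pseudocode, the default parameter values are illustrative, and mutating exactly one randomly chosen dimension follows our reading of Section III-B.

```python
import random

def gaussian_mutation(x, sigma):
    """mut(x) = x * (1 + gaussian(sigma)) from Section III-B."""
    return x * (1.0 + random.gauss(0.0, sigma))

def pso_gm_step(swarm, pbest, gbest, bounds, gen, max_gen,
                w=0.9, phi_max=2.0, vmax=1.0):
    """One generation of PSO with Gaussian Mutation (a sketch).

    swarm is a list of [position, velocity] pairs (lists, updated in
    place); sigma is 0.1 times the search-space length, and the mutation
    rate decreases linearly from 1.0 to 0.0 over the run.
    """
    lo, hi = bounds
    sigma = 0.1 * (hi - lo)
    mut_rate = 1.0 - gen / max_gen           # linear decrease 1.0 -> 0.0
    for i, (x, v) in enumerate(swarm):
        chi = random.uniform(0.9, 1.0)       # constriction coefficient
        for d in range(len(x)):
            phi1 = random.uniform(0.0, phi_max)
            phi2 = random.uniform(0.0, phi_max)
            v[d] = chi * (w * v[d]
                          + phi1 * (pbest[i][d] - x[d])
                          + phi2 * (gbest[d] - x[d]))
            v[d] = max(-vmax, min(vmax, v[d]))   # V_max restriction
            x[d] += v[d]
        if random.random() < mut_rate:           # mutate one random dimension
            d = random.randrange(len(x))
            x[d] = gaussian_mutation(x[d], sigma)
    return swarm
```

Evaluating the swarm and updating pbest/gbest after each step would complete the loop; those bookkeeping details are omitted here.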

Fig. 7. Standard PSO versus PSO with Gaussian Mutation for F3.


A total of 20 runs were conducted for each experiment. Table III shows the resulting values.

TABLE III
AVERAGE BEST FITNESS OF 20 RUNS FOR EXPERIMENTS.

Function      Generation   PSO          PSO with Gaussian Mutation
Parabola      1            97.81152     101.3936
              400          0.079319     0.04926
              800          0.001493     0.000209
              1200         0.000608     1.62E-05
              1600         0.000439     7.04E-07
              2000         0.00025      6.24E-08
Rosenbrock    1            13.3093      13.04373
              400          9.97E-05     0.00186
              800          5.84E-05     0.000649
              1200         5.84E-05     0.000477
              1600         5.83E-05     0.000271
              2000         5.83E-05     6.66E-05
Step          1            -25.6        -24.85
              400          -66.6        -70.85
              800          -66.6        -71.4
              1200         -66.6        -71.4
              1600         -66.6        -71.4
              2000         -66.65       -71.4
Quadratic     1            38.44913     32.45282
              400          8.28E-05     2.94E-05
              800          7.90E-08     3.79E-09
              1200         1.27E-08     2.51E-11
              1600         4.69E-10     1.30E-13
              2000         1.53E-11     8.45E-16
Original      1            280.5407     275.3716
              400          46.76343     64.97984
              800          14.27824     22.42683
              1200         9.348036     14.0253
              1600         5.163357     5.895667
              2000         3.436234     1.759222
Generalized   1            1.09745      1.09415
              400          1.000093     1.000114
              800          1.000015     1
              1200         1.000008     1
              1600         1.000008     1
              2000         1.000008     1

Fig. 8. Standard PSO versus PSO with Gaussian Mutation for F4.

Fig. 9. Standard PSO versus PSO with Gaussian Mutation for F5.

B. Comparison of Results between PSO and PSO with Gaussian Mutation
The performance results are shown in Figs. 5-10, which plot the fitness values against the number of generations. With reference to these results, PSO with Gaussian Mutation performed better than normal PSO on the unimodal functions f1 to f4, whereas it turns out to be inferior to normal PSO at an early stage on the f2 and f5 functions. One of the major causes is the interaction of the normal PSO algorithm acting inside the population. For unimodals, which have only one peak, individuals affect and pull each other, thereby enabling us to clearly find the best-fit values or those close to them. For multimodals, when an individual finds a spot with relatively good performance, other individuals are attracted by that individual, even if better fitness values exist elsewhere, resulting in a local minimum. It follows that getting away from this problem is very difficult, because no means such as mutation is at hand. There are solutions (see [11]).

Fig. 10. Standard PSO versus PSO with Gaussian Mutation for F6.

Table II shows the parameters used in the experiment, with the results in Table III. According to these results, the combination of PSO with Gaussian Mutation allows us to achieve better performance than normal PSO on both the unimodals and the multimodals.

V. APPLICATION OF THE IMPROVED-VERSION PSO TO REAL-WORLD PROBLEMS

A. Application to the Inference of a Gene Network

1) Gene Network: The gene is a biological concept expressing the factor that conveys the genetic characteristics of organisms. At present, it is known that DNA is the basic transfer substance of genetic information, and that proteins produced and built on such information create genetic characters. A base sequence corresponding to one protein is called a gene. DNA as a base sequence holds genetic data, but a very complicated generative system is necessary to allow the data to be converted into the components of bodies such as organs. The mechanism may be explained by a gene network in which numerous genes control each other.

Fig. 11 shows an example of a gene network (the flow of DNA -> RNA -> protein -> metabolic product). The substances are involved in several reactions, and even a single reaction has chain-like effects on the whole generative system. The presumption of the gene network is an inverse problem: to recover the structure of the network, which has complicated connections, from the representation of each gene. The inverse problem of a dynamic network with many variables is very difficult. As the number of parameters increases, the search area, the complexity of the problem, and the amount of necessary data increase exponentially.

Fig. 11. A Typical Gene Circuitry [10].

The difficulty of the problem partly derives from data characteristics. With the present technology, the noise level included in DNA micro-array experiment data is said to reach 30% to 200%. Time-series data obtained in a reaction process are very short, thus allowing us to utilize only ten or so pieces of discrete data. The experiment is not cheap, and an environment for easily repeating experiments has not been arranged. Accordingly, we are obliged to accept improper data with different error levels and incomplete data.

2) Modeling of Gene Network: We used a gene network with a quasi-linear model using real-valued property values in this experiment. Fig. 12 shows an example of a simple gene network in the form of a directed graph. The network in Fig. 12 is expressed as a correlation matrix (w) in Table IV. A control route is expressed by its input direction and weight; the weights indicated between the nodes express, as positive or negative values, the degree of control. In this weighted network, a variety of environmental variables for the experiment can be modeled, and the time-series representation of all genes can be generated out of arbitrary initial values of the same unit as that for the micro-array experiment.

Fig. 12. A Sample Gene Network.

TABLE IV
WEIGHT MATRIX OF SAMPLE NETWORK.

In this model, the behavior of the N genes is determined as follows. The given initial representation of each gene is defined as z_i(0) (i = 1, 2, ..., N). The state quantity S_i(t) of gene i is determined by summing the representation levels z_j(t) of the control genes j, weighted by w_ji, the weights of the control route matrix:

    S_i(t) = sum_j w_ji * z_j(t)

The representation z_i(t+1) at time t+1 is determined by the state quantity S_i(t), where m_i is the maximum representation of gene i. The time-series expression pattern is obtained by substituting the representations of all genes into the recurrence formula sequentially.

3) Setting of Fitness: Searching by PSO is executed with the expression pattern as an input, and the evaluation of each individual is determined by the error in its expression pattern. The sum delta of absolute errors at all the data points is obtained by comparing the representation z_i(t), produced under the two equations above, to the target expression pattern y_i(t):

    delta = sum_{i=0}^{N} sum_{t=0}^{T} |y_i(t) - z_i(t)|

4) PSO-based Inference Model: We made an experiment to infer the original correlation matrix from the expression pattern shown in Fig. 13. The chromosome of a gene network with N genes encodes the N x N weight matrix, making the inference an N^2-dimensional problem.

Fig. 13. The Target Network.

Fig. 14. Acquired Network by PSO with Gaussian Mutation.

TABLE V
PSO PARAMETERS.

Population   Generation   V_max   Inertia weight   Mutation
5000         200          1.0     0.9              0.1
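Subsections 2) and 3) above can be sketched in code. This is an illustration under stated assumptions: the sigmoid squashing of S_i(t) into [0, m_i] is our guess at the recurrence, since the paper's exact formula for z_i(t+1) did not survive in our copy, and the 2-gene weight matrix in the test below is made up.

```python
import math

def simulate(w, z0, m, steps):
    """Quasi-linear gene-network model (a sketch).

    S_i(t) is the weighted sum of all gene levels; z_i(t+1) squashes
    S_i(t) into [0, m_i] with a sigmoid (our assumption, see above).
    Returns the time series [z(0), z(1), ..., z(steps)].
    """
    n = len(z0)
    series = [list(z0)]
    for _ in range(steps):
        z = series[-1]
        s = [sum(w[j][i] * z[j] for j in range(n)) for i in range(n)]
        series.append([m[i] / (1.0 + math.exp(-s[i])) for i in range(n)])
    return series

def fitness(target, predicted):
    """delta = sum over genes and time of |y_i(t) - z_i(t)|; lower is better."""
    return sum(abs(yt - zt)
               for y_row, z_row in zip(target, predicted)
               for yt, zt in zip(y_row, z_row))
```

In an inference run, each PSO particle would hold a candidate N x N weight matrix, `simulate` would produce its expression pattern, and `fitness` against the target pattern would serve as the evaluation.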

In assuming the architecture of this network with the PSO-based method, we used the contraction mechanism

    x = (x + z_T)/Constant

so that a structure with fewer control routes is more likely to be selected. A network with too many connections would show chaotic behavior, which is not consistent with biological data. Thus, the above assumption is justified by biological validity.

For each parameter, we pursued a target expression pattern with the maximum representation m_i = 5.0 and an initial concentration z(0) of (5, 0, 0, 0, 0). Moreover, an error within 10% of the absolute values was added.

Fig. 15. The Expression Pattern of the Acquired Network.

The parameter error of the best network obtained with our method was 8.52%. The network is shown in Fig. 14. All routes (connections among the nodes) were accurately inferred. Taking into account the 10% error of absolute values added to the target representation of the genes, the experimental results above demonstrate good performance. On the other hand, the standard PSO gave an error of 12.54%. The acquired structure included several wrong guesses, i.e., false positive and false negative links.

Table VI summarizes the results of the comparison of both methods over 30 runs. Consequently, the expression patterns of the network acquired by PSO with Gaussian Mutation and of the target network became almost the same.

TABLE VI
RESULT OF COMPARISON (AVERAGED ERROR).

PSO with Gaussian   PSO      GA
8.52%               12.54%   9.1%

Fig. 16. Acquired Network by PSO.

5) Comparison with the GA-based Inference Model [1]: Ando and Iba conducted an experiment in inferring the target network of Fig. 13. They translated a weight matrix (e.g., Table IV) into a one-dimensional chromosome and used a real-valued GA with one-point crossover and Gaussian mutation. They employed a steady-state strategy with the parameters in Table VII. The 20 runs were conducted without a multi-stage setup (see [1] for details).

They reported that the best parameter error was 9.1% (see Table VII for the GA parameters). The acquired topology is given in Fig. 17. By comparing Fig. 14 and Fig. 17, we observe that both methods are successful in acquiring the precise causal relationships in a gene network. However, as can be seen from Table VI, we can confirm the slight superiority of our approach in terms of the error estimate.

TABLE VII
GA PARAMETERS.

Population   Generation   Crossover rate   Mutation rate
5000         200          0.99             0.001

Fig. 17. Acquired Network by GA.

VI. CONCLUSIONS

In this article, we proposed a technique in which PSO is combined with Gaussian Mutation in the evolutionary calculation framework, and verified its effectiveness using De Jong's standard functions. The improved-version PSO is expected to be an advanced algorithm that inherits the features of both: simple PSO is strong on unimodals, and the real-valued GA is strong on multimodals. Finally, the real-world application to the inference of a gene network showed its validity. In previous works, GA was applied to this task. We integrated GA with PSO so that the proposed method gave better performance than either GA or PSO alone.

In future work, we plan to apply this technique to more complicated and difficult problems. We would also like to develop other algorithms that perform well by combining the HPSO concept with the proposed algorithm.

ACKNOWLEDGMENT

This work was partially supported by the Grants-in-Aid for Scientific Research on Priority Areas (C), "Genome Information Sciences" (No. 12208004) from the Ministry of Education, Culture, Sports, Science and Technology in Japan.

REFERENCES

[1] S. Ando and H. Iba, "The Matrix Modeling of Gene Regulatory Networks -- Reverse Engineering by Genetic Algorithms --," Proceedings of the Atlantic Symposium on Computational Biology and Genome Information Systems & Technology, 2001.
[2] P. J. Angeline, "Evolutionary Optimization Versus Particle Swarm Optimization: Philosophy and Performance Differences," Evolutionary Programming VII (1998), Lecture Notes in Computer Science 1447, 601-610, Springer.
[3] R. C. Eberhart and Y. Shi, "Comparison between Genetic Algorithms and Particle Swarm Optimization," Evolutionary Programming VII (1998), Lecture Notes in Computer Science 1447, 611-616, Springer.
[4] M. Lovbjerg, T. K. Rasmussen and T. Krink, "Hybrid Particle Swarm Optimiser with Breeding and Subpopulations," Proceedings of the Genetic and Evolutionary Computation Conference, 2001.
[5] I. H. Osman and J. P. Kelly, "Meta-Heuristics: Theory and Applications," Kluwer Academic Publishers, 1996.
[6] J. Kennedy and R. C. Eberhart, "Particle Swarm Optimization," Proceedings of the 1995 IEEE International Conference on Neural Networks, vol. 4, 1942-1948, IEEE Press.
[7] J. Kennedy and W. M. Spears, "Matching Algorithms to Problems: An Experimental Test of the Particle Swarm and Some Genetic Algorithms on the Multimodal Problem Generator," Proceedings of the IEEE Int'l Conference on Evolutionary Computation, 1998.
[8] P. J. Angeline, "Using Selection to Improve Particle Swarm Optimization," Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC), 1998.
[9] A. H. Wright, "Genetic Algorithms for Real Parameter Optimization," Foundations of Genetic Algorithms, Rawlins, G. J. E. (ed.), Morgan Kaufmann, 1991.
[10] D. Tominaga, N. Koga and M. Okamoto, "Efficient Numerical Optimization Algorithm Based on Genetic Algorithm for Inverse Problem," in Proc. of the Genetic and Evolutionary Computation Conference (GECCO), 2000.
[11] F. van den Bergh and A. P. Engelbrecht, "A New Locally Convergent Particle Swarm Optimizer," IEEE Conference on Systems, Man, and Cybernetics, 2002.
