SITutorial

Summary: Individuals influence each other and become more similar over time, developing correlated beliefs within a population. This leads to polarization, with homogeneous groups differing from one another. Social behavior increases an individual's ability to adapt and problem solve, relating to intelligence. Particle swarm optimization is an evolutionary computation technique inspired by swarm intelligence in nature. It involves a population of potential solutions interacting and adapting to their environment to collectively find an optimal solution.

Swarm Intelligence

Special thanks to:

Russell C. Eberhart
Chairman, Department of Electrical and Computer Engineering
Purdue School of Engineering and Technology at IUPUI
Vice President, Computelligence LLC
Indianapolis, Indiana, USA
[email protected]

Jim Kennedy
Bureau of Labor Statistics
Washington, DC

Outline of Presentation

• A Social Psychology Paradigm Tour
• A Brief Tour of Evolutionary Computation
• Introduction to Particle Swarm Optimization
• Evolving Fuzzy Systems
• Evolving Artificial Neural Networks
• Examples of Recent Applications

A Social Psychology Paradigm Tour

• Latané's dynamic social impact theory
• Axelrod's culture model
• Kennedy's adaptive culture model

Latané's Dynamic Social Impact Theory

• Behaviors of individuals can be explained in terms of the self-organizing properties of their social system
• Clusters of individuals develop similar beliefs
• Subpopulations diverge from one another (polarization)

Dynamic Social Impact Theory Characteristics

• Consolidation: Opinion diversity is reduced as individuals are exposed to majority arguments
• Clustering: Individuals become more like their neighbors in social space
• Correlation: Attitudes that were originally independent tend to become associated
• Continuing diversity: Clustering prevents minority views from complete consolidation

Dynamic Social Impact Theory: Summary

• Individuals influence one another, and in doing so become more similar
• Patterns of belief held by individuals tend to correlate within regions of a population
• This model is consistent with findings in the fields of social psychology, sociology, economics, and anthropology

Axelrod's Culture Model

• Populations of individuals are pictured as strings of symbols, or "features"
• Probability of interaction between two individuals is a function of their similarity
• Individuals become more similar as a result of interactions
• The observed dynamic is polarization: homogeneous subpopulations that differ from one another

Kennedy's Adaptive Culture Model

• No effect of similarity on probability of interaction
• The effect of similarity is negative, in that it is dissimilarity that creates boundaries between cultural regions
• Interaction occurs if fitnesses are different

Culture and Cognition Summary

• Individuals searching for solutions learn from the experiences of others (individuals learn from their neighbors)
• An observer of the population perceives phenomena of which the individuals are the parts (individuals that interact frequently become similar)
• Culture affects the performance of individuals that comprise it (individuals gain benefit by imitating their neighbors)

So, What About Intelligence?

• Social behavior increases the ability of an individual to adapt
• There is a relationship between adaptability and intelligence
• Intelligence arises from interactions among individuals

A Brief Tour of Evolutionary Computation

• Evolutionary computation: machine learning optimization and classification paradigms roughly based on mechanisms of evolution such as biological genetics and natural selection

Features of Evolutionary Computation (EC) Paradigms

• EC paradigms utilize a population of points (potential solutions) in their search
• EC paradigms use direct "fitness" information instead of function derivatives or other related knowledge
• EC paradigms use probabilistic, rather than deterministic, transition rules

Evolutionary Computation Algorithms

1. Initialize the population
2. Calculate the fitness of each individual in the population
3. Reproduce selected individuals to form a new population
4. Perform evolutionary operations such as crossover and mutation on the population
5. Loop to step 2 until some condition is met
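The five steps above can be sketched as a minimal evolutionary loop. This is an illustrative sketch, not code from the tutorial: the truncation selection and Gaussian mutation used here are assumed stand-ins for whatever operators a particular EC paradigm actually uses, and crossover is omitted.

```python
import random

def evolve(fitness, dim=3, pop_size=20, generations=100, seed=1):
    """Minimal EC loop: initialize, evaluate, reproduce, mutate, repeat."""
    rng = random.Random(seed)
    # 1. Initialize the population with random real-valued individuals
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # 2. Calculate the fitness of each individual (lower is better here)
        pop.sort(key=fitness)
        # 3. Reproduce selected individuals: copy the better half twice
        pop = [list(ind) for ind in pop[:pop_size // 2] for _ in (0, 1)]
        # 4. Evolutionary operations: Gaussian mutation (crossover omitted)
        for ind in pop[1:]:          # pop[0] kept intact as an elite
            for i in range(dim):
                if rng.random() < 0.2:
                    ind[i] += rng.gauss(0, 0.3)
        # 5. Loop until the generation budget is met
    return min(pop, key=fitness)

best = evolve(lambda x: sum(v * v for v in x))   # minimize the sphere function
```

Any EC paradigm from the next slide fits this skeleton; they differ mainly in how steps 3 and 4 are realized.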

Evolutionary Computation Paradigms

• Genetic algorithms (GAs) - John Holland
• Evolutionary programming (EP) - Larry Fogel
• Evolution strategies (ES) - I. Rechenberg
• Genetic programming (GP) - John Koza
• Particle swarm optimization (PSO) - Kennedy & Eberhart

SWARMS

• Coherence without choreography (Bonabeau, Millonas, J.-L. Deneubourg, Langton, etc.)
• Particle swarms (physical position not a factor)

Intelligent Swarm (Mark Millonas, Santa Fe Institute)

• A population of interacting individuals that optimizes a function or goal by collectively adapting to the local and/or global environment
• Swarm intelligence ≅ collective adaptation

Basic Principles of Swarm Intelligence

• Proximity principle: the population should be able to carry out simple space and time computations
• Quality principle: the population should be able to respond to quality factors in the environment
• Diverse response principle: the population should not commit its activities along excessively narrow channels
• Stability principle: the population should not change its mode of behavior every time the environment changes
• Adaptability principle: the population must be able to change behavior mode when it's worth the computational price

Introduction to Particle Swarm Optimization

• A "swarm" is an apparently disorganized collection (population) of moving individuals that tend to cluster together while each individual seems to be moving in a random direction
• We also use "swarm" to describe a certain family of social processes

Introduction to Particle Swarm Optimization (PSO), Continued

• A concept for optimizing nonlinear functions
• Has roots in artificial life and evolutionary computation
• Developed by Kennedy and Eberhart (1995)
• Simple in concept
• Easy to implement
• Computationally efficient
• Effective on a variety of problems

Evolution of PSO Concept and Paradigm

• Discovered through simplified social model simulation
• Related to bird flocking, fish schooling, and swarming theory
• Related to evolutionary computation; some similarities to genetic algorithms and evolution strategies
• Kennedy developed the "cornfield vector" for birds seeking food
• Bird flock became a swarm
• Expanded to multidimensional search
• Incorporated acceleration by distance
• Paradigm simplified

Features of Particle Swarm Optimization

• Population initialized by assigning random positions and velocities; potential solutions are then flown through hyperspace
• Each particle keeps track of its "best" (highest fitness) position in hyperspace:
  - "pbest" for an individual particle
  - "gbest" for the best in the population
  - "lbest" for the best in a defined neighborhood
• At each time step, each particle stochastically accelerates toward its pbest and gbest (or lbest)
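The per-particle bookkeeping just described (position, velocity, and personal best) can be captured in a small record. This is a hypothetical sketch; the field and method names are mine, not the tutorial's, and it assumes a minimization problem.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Particle:
    """One particle: current position/velocity plus its best-seen position."""
    x: List[float]                     # current position in hyperspace
    v: List[float]                     # current velocity
    pbest_x: List[float] = None        # best position found so far
    pbest_fit: float = float("inf")    # fitness at pbest_x (minimizing)

    def evaluate(self, fitness: Callable[[List[float]], float]) -> float:
        """Score the current position and update pbest if it improved."""
        f = fitness(self.x)
        if f < self.pbest_fit:
            self.pbest_fit = f
            self.pbest_x = list(self.x)
        return f
```

gbest is then simply the pbest with the best fitness across the whole swarm, and lbest the same taken over a neighborhood.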

Particle Swarm Optimization Process

1. Initialize population in hyperspace.
2. Evaluate fitness of individual particles.
3. Modify velocities based on previous best and global (or neighborhood) best.
4. Terminate on some condition.
5. Go to step 2.

PSO Velocity Update Equations

Global version:

    v_id = w * v_id + c1 * rand() * (p_id - x_id) + c2 * Rand() * (p_gd - x_id)
    x_id = x_id + v_id

where d is the dimension, c1 and c2 are positive constants, rand and Rand are random functions, and w is the inertia weight. For the neighborhood version, change p_gd to p_ld.
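The update equations translate directly into code. A sketch assuming minimization; the default c1 = c2 = 2 and the optional velocity clamp follow the parameter conventions discussed later in the tutorial, and the function name is mine.

```python
import random

def pso_update(x, v, pbest, gbest, w=0.9, c1=2.0, c2=2.0, vmax=None,
               rng=random):
    """One velocity/position update for a single particle (global version):
    v_id = w*v_id + c1*rand()*(p_id - x_id) + c2*Rand()*(p_gd - x_id)
    x_id = x_id + v_id
    For the neighborhood (lbest) version, pass the neighborhood best as gbest.
    """
    new_v, new_x = [], []
    for d in range(len(x)):
        vd = (w * v[d]
              + c1 * rng.random() * (pbest[d] - x[d])
              + c2 * rng.random() * (gbest[d] - x[d]))
        if vmax is not None:                 # clamp to the dynamic range
            vd = max(-vmax, min(vmax, vd))
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v
```

Note that when a particle sits exactly at both its pbest and gbest, the random attraction terms vanish and only the inertia term w*v remains.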

Further Details of PSO

• Performance of each particle is measured according to a predefined fitness function
• Inertia weight influences the tradeoff between global and local exploration
• A good approach is to reduce the inertia weight during the run (e.g., from 0.9 to 0.4 over 1000 generations)
• Usually set c1 and c2 to 2
• Usually set maximum velocity to the dynamic range of each variable

PSO Adherence to Swarm Intelligence Principles

• Proximity: n-dimensional space calculations carried out over a series of time steps
• Quality: population responds to quality factors pbest and gbest (or lbest)
• Stability: population changes state only when gbest (or lbest) changes
• Adaptability: population does change state when gbest (or lbest) changes
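The recommended inertia-weight schedule (e.g., 0.9 down to 0.4 over the run) is a simple linear ramp. A minimal sketch; the function name is mine.

```python
def inertia_weight(gen, max_gen, w_start=0.9, w_end=0.4):
    """Linearly decrease the inertia weight from w_start to w_end over the
    run, shifting the search from global toward local exploration."""
    return w_start - (w_start - w_end) * gen / max_gen
```

Early generations (w near 0.9) favor wide-ranging exploration; late generations (w near 0.4) damp velocities so particles refine the current best regions.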

Benchmark Tests

• De Jong's test set
• Schaffer's F6 function
• Evolve neural network weights:
  - Iris data set
  - Electric vehicle state-of-charge system
• Over 20 other benchmark functions tested

Evolving Fuzzy Systems

• Develop (evolve) fuzzy expert systems using evolutionary algorithms such as GA or PSO:
  - Evolve rules
  - Evolve membership function types
  - Evolve membership function locations
• In turn, adapt parameters of the EA using fuzzy rules
  - For example: "If variance of fitness is low, set mutation rate high"

Journal Paper

"Implementation of Evolutionary Fuzzy Systems"
Authors: Shi, Eberhart, Chen
IEEE Transactions on Fuzzy Systems, April 1999

Evolving Artificial Neural Networks: Outline

• Introduction
• Definitions and review of previous work
• Advantages and disadvantages of previous approaches
• Using particle swarm optimization (PSO)
• An example application
• Conclusions

Introduction

• Neural networks are very good at some problems, such as mapping input vectors to outputs
• Evolutionary algorithms are very good at other problems, such as optimization
• Hybrid tools are possible that are better than either approach by itself
• Review articles on evolving neural networks: Schaffer, Whitley, and Eshelman (1992); Yao (1995); and Fogel (1998)
• Evolutionary algorithms are usually used to evolve network weights, but sometimes used to evolve structures and/or learning algorithms

Evolving Neural Networks with Particle Swarm Optimization

• Evolve a neural network capable of being a universal approximator, such as a backpropagation or radial basis function network
• In backpropagation, the most common PE transfer function is the sigmoid: output = 1/(1 + e^(-input))
• Eberhart, Dobbins, and Simpson (1996) first used PSO to evolve network weights (replacing the backpropagation learning algorithm)
• PSO can also be used to indirectly evolve the structure of a network; an added benefit is that preprocessing of input data is made unnecessary

Evolving Neural Networks with Particle Swarm Optimization, Continued

• Evolve both the network weights and the slopes of the sigmoid transfer functions of hidden and output PEs
• If the transfer function is now output = 1/(1 + e^(-k*input)), then we are evolving k in addition to evolving the weights
• The method is general, and can be applied to other topologies and other transfer functions
• Flexibility is gained by allowing slopes to be positive or negative; a change in sign for the slope is equivalent to a change in signs of all input weights

Evolving the Network Structure with PSO

• If an evolved slope is sufficiently small, the sigmoid output can be clamped to 0.5 and the hidden PE removed. Weights from the bias PE to each PE in the next layer are increased by one-half the value of the weight from the removed PE to that next-layer PE. PEs are thus pruned, reducing network complexity.
• If an evolved slope is sufficiently high, the sigmoid transfer function can be replaced by a step transfer function. This works with large negative or positive slopes. Network computational complexity is thus reduced.
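The two pruning limits can be seen numerically from the slope-dependent sigmoid itself. A small sketch (function name mine): a near-zero slope k makes the output essentially 0.5 regardless of the input, while a large |k| makes the sigmoid approximate a step function.

```python
import math

def sigmoid(net_input, k=1.0):
    """PE transfer function with evolvable slope k:
    output = 1 / (1 + e^(-k * input))."""
    return 1.0 / (1.0 + math.exp(-k * net_input))
```

With k ≈ 0, the PE contributes a constant 0.5 (so it can be removed and folded into the bias, as described above); with large k, a cheap step function can replace the exponential.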

Evolving the Network Structure with PSO, Continued

• Since slopes can evolve to large values, input normalization is generally not needed. This simplifies the application process and shortens development time.
• The PSO process is continuous, so neural network evolution is also continuous. No sudden discontinuities exist such as those that plague other approaches.
• This approach is now protected by a U.S. patent.

Tracking and Optimizing Dynamic Systems with Particle Swarms

Acknowledgments: Yuhui Shi and Xiaohui Hu

Outline

• Brief review of particle swarm optimization
• Types of dynamic systems
• Practical application requirements
• Previous work
• Experimental design
• Results
• Conclusions and future effort

Original Version with Inertia Weight

    v_id = w * v_id + c1 * rand() * (p_id - x_id) + c2 * Rand() * (p_gd - x_id)
    x_id = x_id + v_id

where d is the dimension, c1 and c2 are positive constants, rand and Rand are random functions, and w is the inertia weight. For the neighborhood version, change p_gd to p_ld.

Constriction Factor Version

    v_id = K * [v_id + c1 * rand() * (p_id - x_id) + c2 * Rand() * (p_gd - x_id)]

    K = 2 / |2 - φ - sqrt(φ^2 - 4φ)|

where φ = c1 + c2, φ > 4. (φ was set to 4.1, so K ≈ 0.729.)

Dynamic System Types

• Location of optimum value can change
• Optimum value can vary
• Number of optima can change
• Combinations of the above can occur

In this project, we varied the location of the optimum.
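The constriction coefficient above is easy to verify numerically. A sketch (function name mine) with defaults c1 = c2 = 2.05, which gives the slide's φ = 4.1:

```python
import math

def constriction_factor(c1=2.05, c2=2.05):
    """Constriction coefficient K = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|
    with phi = c1 + c2, valid only for phi > 4."""
    phi = c1 + c2
    if phi <= 4.0:
        raise ValueError("constriction requires phi = c1 + c2 > 4")
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))
```

For φ = 4.1 this evaluates to approximately 0.7298, matching the K ≈ 0.729 quoted on the slide.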

Practical Application Requirements

• Few practical problems are static; most are dynamic
• Most time is spent re-optimizing (re-scheduling, etc.)
• Many systems involve machines and people:
  - These systems have inertia
  - 10-100 seconds are often available for re-optimization
• Eberhart's Law of Sufficiency applies: if the solution is good enough, fast enough, and cheap enough, then it is sufficient

Previous Work

• Test function: parabolic function

    error = Σ_{i=1}^{N} (x_i - offset)^2

• offset = offset + severity
• Severity: 0.01, 0.1, 0.5
• 2000 evaluations per change
• 3 dimensions, dynamic range -50 to +50
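The dynamic parabolic test function above can be sketched directly: each change shifts the optimum by `severity` along every dimension. The factory-style packaging is my own, not from the cited work.

```python
def make_dynamic_parabola(severity=0.1):
    """Parabolic test function error = sum_i (x_i - offset)^2 whose optimum
    location moves by `severity` along every dimension at each change."""
    state = {"offset": 0.0}
    def error(x):
        return sum((xi - state["offset"]) ** 2 for xi in x)
    def change():                       # offset = offset + severity
        state["offset"] += severity
    return error, change
```

Calling `change()` every 2000 evaluations reproduces the experimental setup described here: the optimizer must re-find an optimum that has just moved out from under it.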

Previous Work: References

• Angeline, P. J. (1997). Tracking extrema in dynamic environments. Proc. Evol. Programming VI, Indianapolis, IN. Berlin: Springer-Verlag, pp. 335-345.
• Bäck, T. (1998). On the behavior of evolutionary algorithms in dynamic environments. Proc. Int. Conf. on Evol. Computation, Anchorage, AK. Piscataway, NJ: IEEE Press, pp. 446-451.

Experimental Design

• Two possibilities with the swarm:
  - Continue on from where we were
  - Re-initialize the swarm
• Inertia weight of [0.5 + (Rnd/2.0)] used
• 20 particles; update interval of 100 generations
• When a change occurred:
  - Retained the position of each particle
  - Reset values of pbest (also of gbest)
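The change-handling policy above (keep positions, reset memories) can be sketched as follows. The dict-based particle layout and function name are mine, assumed for illustration; the key idea is that stale pbest values would otherwise keep attracting the swarm to the old optimum.

```python
def reset_memories(swarm, fitness):
    """On an environment change: keep each particle's position, but discard
    old pbest values and re-evaluate at the current position. Returns the
    resulting gbest position and fitness (minimizing)."""
    for p in swarm:
        p["pbest_x"] = list(p["x"])
        p["pbest_fit"] = fitness(p["x"])
    best = min(swarm, key=lambda p: p["pbest_fit"])
    return best["pbest_x"], best["pbest_fit"]
```

Because positions and velocities are retained, the swarm keeps its momentum and spatial spread, which is what lets it track a moving optimum rather than restart from scratch.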

[Figures: PSO average best value over all runs vs. iteration, three dimensions, for severity = 0.5 and severity = 0.1; log-scale vertical axis spanning roughly 10^4 down to 10^-10.]

[Figures: PSO average best value over all runs, 10 dimensions, for severity = 0.1 and severity = 0.5; log-scale vertical axis.]

[Figure: PSO average best value over all runs, severity = 1.0, 10 dimensions; log-scale vertical axis.]

Comparison of Results: Error Values Obtained in 2000 Evaluations

                   Severity 0.1       Severity 0.5
  Angeline         5x10^-4 - 10^-3    0.01 - 0.10
  Bäck             2x10^-5            10^-3
  Eberhart & Shi   10^-10 - 10^-9     10^-9 - 10^-8

Conclusions and Future Efforts

• Our results, including those in 10 dimensions and with severity = 1, are promising
• We are applying the approach to other benchmark functions, and to practical logistics applications

Example Application: Reactive Power and Voltage Control

• Japanese electric utility
• PSO used to determine control strategy
• Continuous and discrete control variables
• Hybrid binary/real-valued version of PSO developed
• System voltage stability achieved using a continuation power flow technique

Scheduling System for Integrated Automated Container Terminal

• Objective: develop a planning and scheduling algorithm for fully integrated automated container terminals
• Approach: fuzzy system and evolutionary programming

Scheduling System for IACT - Workflow

[Workflow diagram with components: Container Reservations, Yard Planning, Container Sequence Planning, Container Yard, Machine Planning, Machines, Container Locations, Machine Worklists, and Machine Operations; fuzzy reasoning operates on the facility state, and evolutionary programming supports the planning steps.]

Container Planning Sequences

• 500 containers
• Move from yard to staging area along the berth
• Planning results: number of movements [results figure not shown]

More Examples of Recent Applications

• Scheduling (Marine Corps logistics)
• Manufacturing (product content combination optimization)
• Figure of merit for electric vehicle battery pack
• Medical analysis/diagnosis (Parkinson's disease and essential tremor)
• Human performance prediction (cognitive and physical)

Original Book

• Title: Computational Intelligence PC Tools
• Authors: Eberhart, Dobbins, and Simpson
• Publisher: Academic Press
• Year published: 1996

Recent Book

• Title: Swarm Intelligence
• Authors: Kennedy, Eberhart, and Shi
• Publisher: Morgan Kaufmann division of Academic Press
• Publication date: 2001

New Book

• Computational Intelligence: Concepts to Implementations, Eberhart and Shi, Morgan Kaufmann, 2004.
