SC_Unit-2
Evolutionary Computation
Concepts and Paradigms
Chapter 3: Outline
•Focus is on people:
* A broad sample, not exhaustive
* In chronological order for each field
Genetic Algorithms
A. S. Fraser
•Worked in 1950s in Australia
•Biologist using computers to simulate natural genetic systems
•Worked on epistasis (masking of the effect of one gene by another)
•Used a 15-bit string as representation; 5 bits per parameter
•Selected “parents” according to “fitness function” values [-1,1]
•Didn’t consider applications to artificial systems
John Holland
•Has had more influence on GA field than any other person
•Has been at the Univ. of Michigan since the early 1960s
•Received first Ph.D. in Computer Science in the U. S. (under Arthur Burks)
•Holland “… created the genetic algorithm field …”
•Published and taught in the field of adaptive systems in 1960s
•Was a pioneer in the use of a population of individuals to do search
•Derived schema theorem which shows that more fit schema are more likely
to reproduce
•Used reproduction, crossover, and mutation as early as 1960s
•GA was originally called a “genetic plan”
Lawrence (Dave) Davis
•Self-taught in GAs
•Used GAs for 2-D bin packing in chip layout at TI
•Edited “Handbook of Genetic Algorithms”
Tutorial
Case studies
Software diskette: OOGA (Lisp) and GENESIS (C)
Evolutionary Programming
Larry J. Fogel
•Developed evolutionary programming (EP) in 1960s
•EP utilizes survival of the fittest (or “survival of the more skillful”), but the
only operation on structures is mutation
•Interests in finite state machines and machine intelligence
•EP “… abstracts evolution as a top-down process of adaptive behavior,
rather than a bottom-up process of adaptive genetics …”
•1960s book was controversial
•EP suffered from lack of funding due to numerics versus symbolics
controversy
•David Fogel and others now carrying on the work
Evolution Strategies
I. Rechenberg
•Developed evolution strategies (ES) in Germany in the 1960s, together with H.-P. Schwefel
•Compare with other techniques - see how EC fits in with other approaches
Definition: Evolutionary Computation
•Mostly optimization
* non-differentiable
* many local optima
* may not know optimum
* system may be dynamic, changing with time, or even chaotic
•Overview
•Terminology
•Review of GA operations
•Fitness evaluation
•Copying strings
•Randomize population
•Now have first generation, with total fitness of 6.313, and two
individuals each with fitness > .99
•Representation of variables
•Population size
•Population initialization
•Fitness calculation
•Reproduction
•Crossover
•Inversion
•Mutation
•Selecting number of generations
Representation of Variables
Consider the example where 127 is 01111111 and 128 is 10000000: two adjacent
values whose binary representations differ in all 8 bits (a “Hamming cliff”).
Gray code avoids this; adjacent values differ in exactly one bit.
To get Gray code from binary code, the leftmost bit is the same: G1 = B1, then
Gi = XOR(Bi, Bi-1) for i >= 2, where Gi is the ith Gray code bit and Bi is the
ith binary bit (numbering from the left).
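The binary-to-Gray conversion above can be sketched as a one-line bit operation (XOR of each bit with the bit to its left is equivalent to XORing the number with itself shifted right by one):

```python
def binary_to_gray(n: int) -> int:
    # Leftmost bit unchanged; each remaining Gray bit is the XOR of
    # adjacent binary bits -- equivalent to n XOR (n >> 1).
    return n ^ (n >> 1)

def hamming(a: int, b: int) -> int:
    # Number of bit positions in which a and b differ.
    return bin(a ^ b).count("1")

# Binary 127 (01111111) and 128 (10000000) differ in all 8 bits,
# but their Gray codes differ in only one.
print(hamming(127, 128))                                   # 8
print(hamming(binary_to_gray(127), binary_to_gray(128)))   # 1
```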
Dynamic Range and Resolution
•Often start with relatively high crossover rate, and reduce it during
the run
c=3
n=7
Inversion
Can use 9 bits per state; 36 bits total per population member
Payoff Function
                 Player 1
                 C       D
Player 2   C    3|3     0|5
           D    5|0     1|1
(Format is Player2|Player1)
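The payoff matrix above can be encoded directly as a lookup table; the value order follows the slide's Player2|Player1 format:

```python
# Key is (player2_move, player1_move); value is (player2_payoff, player1_payoff),
# matching the Player2|Player1 format of the table above.
PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

p2, p1 = PAYOFF[("C", "D")]  # Player 2 cooperates, Player 1 defects
print(p2, p1)                # 0 5
```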
A 7-State Finite State Machine
Function Optimization
•Procedure:
* Set population size (=50 for this problem)
* Initialize values of variables over [-5,5]
* Calculate fitness values (1/Euclidean_dist_fm_origin)
* Mutate each parent to produce one child
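The procedure above can be sketched as follows. The dimensionality and the Gaussian step size sigma are assumptions (the slide does not specify them), and a guard is added because the slide's fitness, 1/Euclidean_dist_fm_origin, is undefined exactly at the origin:

```python
import math
import random

POP_SIZE = 50  # as set in the slide for this problem

def fitness(x):
    # 1 / Euclidean distance from the origin (slide's definition);
    # guard against division by zero at the optimum itself.
    d = math.sqrt(sum(v * v for v in x))
    return 1.0 / d if d > 0 else float("inf")

def init_population(dims=3):
    # Initialize variable values over [-5, 5].
    return [[random.uniform(-5, 5) for _ in range(dims)]
            for _ in range(POP_SIZE)]

def mutate(parent, sigma=0.1):
    # EP-style mutation: each parent produces exactly one child
    # by adding Gaussian noise to every variable (sigma is an assumption).
    return [v + random.gauss(0, sigma) for v in parent]

pop = init_population()
children = [mutate(p) for p in pop]
```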
EP Mutation Process
Function Optimization - Mutation
x_i_new = x_A,i_old + C (x_B,i_old - x_A,i_old)    (Often, C = 0.5)
where A and B are the two parents and i indexes the variables
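The update above blends each variable of two parents A and B; with C = 0.5 the child is the midpoint. A minimal sketch:

```python
def intermediate_recombination(parent_a, parent_b, c=0.5):
    # x_i_new = xA_i + C * (xB_i - xA_i); c = 0.5 gives the midpoint.
    return [a + c * (b - a) for a, b in zip(parent_a, parent_b)]

print(intermediate_recombination([0.0, 2.0], [4.0, 6.0]))  # [2.0, 4.0]
```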
Selection Methods
1.Initialize population
2.Perform recombination using the parents to form children
3. Mutate all children
4. Evaluate all population members
5. Select fittest for new population
6. If termination criteria not met, go to step 2
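The six steps above can be sketched as a generic loop; the evaluate, recombine, and mutate operators here are placeholders supplied by the caller, not operators specified in the slide:

```python
import random

def evolve(pop, evaluate, recombine, mutate, max_gens=100):
    # Steps 2-6 of the procedure above; step 1 (initialization) is the
    # population passed in.  evaluate is maximized.
    for _ in range(max_gens):
        parents = pop
        # Step 2: recombination using the parents to form children.
        children = [recombine(random.choice(parents), random.choice(parents))
                    for _ in range(len(parents))]
        # Step 3: mutate all children.
        children = [mutate(c) for c in children]
        # Steps 4-5: evaluate all members, select the fittest.
        scored = sorted(parents + children, key=evaluate, reverse=True)
        pop = scored[:len(parents)]
        # Step 6: loop until max_gens (a simple termination criterion).
    return pop

# Toy usage: maximize -x^2 over scalars, i.e. drive x toward 0.
best = evolve([random.uniform(-5, 5) for _ in range(20)],
              evaluate=lambda x: -x * x,
              recombine=lambda a, b: 0.5 * (a + b),
              mutate=lambda x: x + random.gauss(0, 0.1))[0]
```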
Genetic Programming
•Genetic programming (GP) evolves hierarchical computer programs
•The goal is to obtain the desired output for a given set of inputs,
within the search domain
Differences between GPs and Generic GAs
•Population members of GPs are executable structures (generally,
computer programs) rather than strings of bits and/or variables
•Control parameters
* Population size
* Max. no. of generations
* Reproduction probability
* Crossover probability
* Max. depth allowed (initial and final)
Termination of GP
Full method - Each limb extends for full allowed depth. Only
functions are selected until max depth is reached, then only
terminals are selected.
Ramped Half-and-Half Approach
* Within each depth, 1/2 of the programs are built using grow
approach, 1/2 using full approach
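The full, grow, and ramped half-and-half initialization schemes above can be sketched as follows; the function and terminal sets are hypothetical, and the 0.5 terminal probability in the grow method is an assumption:

```python
import random

FUNCTIONS = {"+": 2, "*": 2}   # hypothetical function set (name: arity)
TERMINALS = ["x", "1"]         # hypothetical terminal set

def make_tree(depth, method):
    # Full: only functions are selected until max depth is reached,
    # then only terminals.  Grow: functions or terminals may be
    # selected at any point above max depth.
    if depth == 0 or (method == "grow" and random.random() < 0.5):
        return random.choice(TERMINALS)
    f = random.choice(list(FUNCTIONS))
    return [f] + [make_tree(depth - 1, method) for _ in range(FUNCTIONS[f])]

def ramped_half_and_half(pop_size, max_depth):
    # Depths are ramped over 2..max_depth; within each depth, half the
    # programs are built with grow and half with full.
    pop = []
    for i in range(pop_size):
        depth = 2 + i % (max_depth - 1)
        pop.append(make_tree(depth, "grow" if i % 2 else "full"))
    return pop

pop = ramped_half_and_half(10, 6)
```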
GP Fitness
•Pick one point randomly in each parent for crossover - point can be
anywhere in program
•Simple in concept
•Easy to implement
•Computationally efficient
•Paradigm simplified
Particle Swarm Optimization
Process
1. Initialize a population of particles with random positions and velocities
2. Evaluate each particle’s fitness
3. Update each particle’s personal best (pbest) and the global best (gbest)
4. Update each particle’s velocity and position
5. Go to step 2 until the termination criterion is met
PSO Velocity Update Equations
•Function optimization
•De Jong’s test set
•Schaffer’s f6 function
•Neural network training
•XOR
•Fisher’s iris data
•EEG data
•2500-pattern SOC test set
•Benchmark tests
•Compare gbest and lbest
•Vary neighborhood in lbest
Schaffer’s F6 Function
VMAX
vid = w*vid + c1*rand()*(pid - xid) + c2*Rand()*(pgd - xid)
xid = xid + vid
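The update equations above can be sketched per particle as follows. rand() and Rand() are independent uniform draws; the w, c1, and c2 defaults are common published choices, not values taken from this slide:

```python
import random

def pso_step(x, v, pbest, gbest, w=0.729, c1=1.49445, c2=1.49445, vmax=None):
    # One velocity/position update for a single particle:
    #   vid = w*vid + c1*rand()*(pid - xid) + c2*Rand()*(pgd - xid)
    #   xid = xid + vid
    new_x, new_v = [], []
    for d in range(len(x)):
        vd = (w * v[d]
              + c1 * random.random() * (pbest[d] - x[d])
              + c2 * random.random() * (gbest[d] - x[d]))
        if vmax is not None:
            vd = max(-vmax, min(vmax, vd))  # clamp each component to Vmax
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v
```

With pbest = gbest = (1, 1) and the particle at rest at the origin, each new coordinate lands in [0, c1 + c2].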
Constriction F., Vmax = 100K     552    96
Constriction F., Vmax = Xmax     530    78
Rosenbrock Function
IUPUI
Fuzzy Adaptive Inertia Weight: A Preview
Outline
Constriction Factor Version
K = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, where phi = c1 + c2, phi > 4
(phi was set to 4.1, so K = .729)
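A quick check of the constriction factor formula, with the defaults chosen so that phi = c1 + c2 = 4.1 as in the slide:

```python
import math

def constriction(c1=2.05, c2=2.05):
    # K = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, valid for phi > 4.
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

print(round(constriction(), 4))  # 0.7298
```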
Dynamic System Types
Practical Application Requirements
Experimental Design
[Figures: “PSO average best value over all runs” (log scale), for severity = 0.5 and 0.1 in three dimensions, and for severity = 0.1, 0.5, and 1.0 in 10 dimensions]
Comparison of Results: Error Values Obtained in 2000 Evaluations
Severity 0.1    Severity 0.5