
Theory of Evolutionary Computation

Chapter 2: Theoretical Foundations of Evolutionary Computation

Dirk Sudholt
Aims for today

Introduce more advanced Randomised Search Heuristics
Discuss No Free Lunch (NFL) theorems
Recap foundations from probability theory
Give a first example of a runtime analysis
Introduce the fitness-level method as a simple analysis tool for proving upper runtime bounds


History of Evolutionary Computation
Evolutionary Computation emerged from four strands:

Evolutionary Programming, 1962 (Lawrence J. Fogel, USA)
▶ evolving finite-state machines

Evolution Strategies, 1964 (Ingo Rechenberg, Hans-Paul Schwefel, Germany)
▶ focus on continuous spaces ℝ^d and mutation as variation operator

Genetic Algorithms, 1960s (John H. Holland, USA)
▶ focus on binary spaces {0, 1}^n and recombination as variation operator

Genetic Programming, 1990s (John R. Koza, USA)
▶ evolving computer programs

Nowadays, EC is a subfield of AI/Computational Intelligence and an umbrella term for all the above and further randomised search heuristics, including
estimation-of-distribution algorithms
simulated annealing
swarm intelligence
artificial immune systems
...
In Operational Research/Operations Management, these algorithms are also known as metaheuristics.
Swarm Intelligence

Collective behavior of a “swarm” of agents.

Examples from Nature
[Photos: by Mehmet Karatay, and by Yewenyi at the English language Wikipedia]

Plenty of inspiration for optimization.


Ant Colony Optimization

[Figures: sketch by Johann Dréo; photo by Mehmet Karatay]


Swarm Intelligence paradigms
Ant colony optimization (ACO) [Dorigo, 1992]
inspired by the foraging behavior of ants
artificial ants construct solutions using pheromones
pheromones indicate the attractiveness of solution components

Particle swarm optimization (PSO) [Kennedy and Eberhart, 1995]
mimics the search of bird flocks and fish schools
particles “fly” through the search space
each particle is attracted by its own best position and the best position of its neighbors

And then the flood gates opened...

Everyone and their dog proposed metaheuristics based on...

African buffalos, algae, amoebas, Andean condors, ant lions, bacteria, barnacles (mating), bats, bees, bumblebees, queen bees, beetles, big bang, biogeography, bisons, black holes, blind naked mole rats, brainstorming, buses, butterflies, camels, cancers, cats, central force, charged systems, cheetahs, chemical reactions, chicks, chicken swarms, clouds, cockroaches, colliding bodies, consultants, coral reefs, covid-19, coyotes, crows, crystal energy, cuckoos, ...

Source: http://fcampelo.github.io/EC-Bestiary/
Which Metaheuristic Paradigm is the Best?
Past views of metaheuristics (see sketch):
Metaheuristics are good across all problems.
Only beaten by specialised algorithms on few problems.



No Free Lunch Theorems
We will consider sets F of functions closed under permutations.
If f ∈ F, then all permutations of f-values are also functions in F.

            000  001  010  011  100  101  110  111
  f1 ∈ F     0    1    2    3    4    5    6    7
  f2 ∈ F     4    7    3    0    6    5    1    2
  f3 ∈ F     6    2    5    7    0    4    1    3
  ...       ...  ...  ...  ...  ...  ...  ...  ...

No Free Lunch Theorem [Wolpert and Macready, 1997, Droste et al., 2002]
Consider search algorithms for functions f ∈ F where F is closed under permutations. Let T(A) be the average number of different search points sampled by A before an optimum is found (under the uniform distribution on F). Then for any two search algorithms A, B we have T(A) = T(B).

All search algorithms have the same average performance on F!

Formal proof in Droste, Jansen, and Wegener [2002].
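The theorem can be checked by brute force on a tiny search space. The following sketch (illustrative code, not part of the slides) takes the 8-point space {0,1}^3 and lets F be all 8! assignments of the values 0-7 to the points, so F is closed under permutations. It compares a fixed-order scan with an adaptive hill climber; both sample 4.5 distinct points on average before finding the optimum, as the theorem predicts.

```python
# Brute-force sanity check of the No Free Lunch theorem (illustrative sketch).
# Points of {0,1}^3 are encoded as integers 0..7; F is every permutation of
# the values 0..7, so F is closed under permutations by construction.
from itertools import permutations

def scan_alg(history):
    """Fixed enumeration order 000, 001, ..., 111."""
    queried = {x for x, _ in history}
    return next(p for p in range(8) if p not in queried)

def greedy_alg(history):
    """Adaptive hill climber: query an unqueried Hamming neighbour of the
    best point seen so far, falling back to the fixed order."""
    queried = {x for x, _ in history}
    if history:
        best = max(history, key=lambda t: t[1])[0]
        for bit in range(3):
            if best ^ (1 << bit) not in queried:
                return best ^ (1 << bit)
    return next(p for p in range(8) if p not in queried)

def samples_until_optimum(alg, f):
    """Count the distinct points alg queries until it sees the maximum (7)."""
    history = []
    while True:
        x = alg(history)
        history.append((x, f[x]))
        if f[x] == 7:
            return len(history)

functions = list(permutations(range(8)))   # all of F: 8! = 40320 functions
for alg in (scan_alg, greedy_alg):
    total = sum(samples_until_optimum(alg, f) for f in functions)
    print(alg.__name__, total / len(functions))   # both print 4.5
```

Neither algorithm ever resamples a point, so the NFL theorem applies to both, and the "clever" hill climber gains nothing on average.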


Discussion of No Free Lunch
The No Free Lunch Theorem considers all functions in a set F closed under permutations.
▶ That is, if f ∈ F, then every other assignment of fitness values to search points is also a function in F!
There is no structure to F!
That’s why all search algorithms have the same performance.

Typical characteristics of realistic problems
Some degree of smoothness
Good search points are close to other good search points
(But we may get stuck in local optima.)

Lessons learned from No Free Lunch
Disproves wrong claims that Algorithm A is better than Algorithm B on all problems.
But: the No Free Lunch scenario is not interesting.
To gain more insight, we have to study concrete problems!


How Useful is Crossover? Early theoretical approaches...
Schema Theory [Holland, 1975]
∗ ∗ 1 ∗ 0 0 ∗ ∗
Schemata with high fitness and few defining bits spread disproportionally.
true for one generation
not true for multiple generations!

Building-block hypothesis
Crossover is effective because it combines good ’building blocks’.

Mitchell, Forrest, and Holland [1992]: “We designed a problem with building blocks on which schema theory predicts: GAs outperform hill climbers.”
Forrest and Mitchell [1993]: “We ran experiments and found out: hill climbers outperform GAs.”

Conclusion
Need mathematical rigour – theorems and proofs!
Brief History of Runtime Analysis

Heinz Mühlenbein, 1992: non-rigorous runtime analyses

Günter Rudolph, 1997: first rigorous runtime analyses

From 1997: Ingo Wegener and members of his Chair (Thomas Jansen, Stefan Droste) in Dortmund; Collaborative Research Centre “Computational Intelligence” (12 years)

Around 2000: Jun He and Xin Yao in Birmingham

From 2004: Frank Neumann at Kiel and then MPI Saarbrücken

From 2006: Benjamin Doerr and others at MPI Saarbrücken

Now: research groups at Aberystwyth, Adelaide, Beijing, Birmingham, Copenhagen, Dortmund, Leiden, Minnesota, Nanjing, Paderborn, Paris, Passau, Potsdam, Sheffield, Singapore, Zurich, ...


Foundations from Probability Theory

Treasure trove for foundations and methods: the book chapter by Doerr [2020].

Probabilities
Prob(A) is the probability of event A; Prob(Ā) = 1 − Prob(A)
Prob(A ∪ B) = Prob(A) + Prob(B) − Prob(A ∩ B) ≤ Prob(A) + Prob(B)
If A and B are independent, Prob(A ∩ B) = Prob(A) · Prob(B)
Conditional probabilities: Prob(A | B) = Prob(A ∩ B) / Prob(B)

Expectations
E(X) = Σ_x Prob(X = x) · x
If X only assumes values in ℕ, E(X) = Σ_{x=1}^∞ Prob(X ≥ x)
Linearity of expectation: E(X + Y) = E(X) + E(Y)
Law of total probability: E(X) = E(X | A) · Prob(A) + E(X | Ā) · Prob(Ā)
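As a quick numerical sanity check (an illustrative sketch, not from the slides), both formulas for the expectation can be compared for a geometric random variable, whose distribution appears on the next slide:

```python
# Check E(X) = sum_{x>=1} Prob(X >= x) for X ~ Geom(p), truncating the
# infinite sums once the tail probability is negligible.
p = 0.3
# For the geometric distribution, Prob(X >= x) = (1 - p)^(x - 1).
tail_sum = sum((1 - p) ** (x - 1) for x in range(1, 200))
direct = sum(x * (1 - p) ** (x - 1) * p for x in range(1, 200))
print(tail_sum, direct, 1 / p)   # all three are (approximately) 1/0.3
```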


Foundations (2)

Common probability distributions and their notations with parameters
Uniform Unif(S): Prob(x) = 1/|S| for every x ∈ S
Bernoulli Ber(p): Prob(X = 1) = p = 1 − Prob(X = 0), E(X) = p
Geometric Geom(p): Prob(X = i) = (1 − p)^{i−1} · p, E(X) = 1/p
Binomial Bin(n, p): Prob(X = i) = (n choose i) · p^i · (1 − p)^{n−i}, E(X) = pn

Inequalities
For all x ∈ ℝ: 1 + x ≤ e^x
For all n ∈ ℕ: (1 − 1/n)^n ≤ 1/e ≤ (1 − 1/n)^{n−1}
Harmonic numbers H(n) := Σ_{i=1}^n 1/i satisfy ln(n) ≤ H(n) ≤ ln(n) + 1
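These bounds are easy to confirm numerically; the following snippet (an illustrative check) evaluates both chains of inequalities for several values of n:

```python
import math

for n in (2, 10, 100, 1000):
    lower = (1 - 1 / n) ** n          # lower bound: at most 1/e
    upper = (1 - 1 / n) ** (n - 1)    # upper bound: at least 1/e
    harmonic = sum(1 / i for i in range(1, n + 1))
    assert lower <= 1 / math.e <= upper
    assert math.log(n) <= harmonic <= math.log(n) + 1
    print(n, lower, 1 / math.e, upper, harmonic - math.log(n))
```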


Cracking Codes with Randomised Search Heuristics

Task: Find a hidden target string.
  target    ?  ?  ?  ?  ?  ?  ?  ?
  solution  1  1  1  1  0  0  1  0
Fitness: number of correct bits.

Task: Find the all-ones string.
  target    1  1  1  1  1  1  1  1
  solution  1  0  1  0  1  0  0  1
Fitness: number of 1-bits (OneMax).

Both tasks are equivalent: renaming bit values position by position maps any hidden target string to the all-ones string.
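The following slides analyse Randomised Local Search (RLS) on OneMax. This extract does not spell RLS out, so the sketch below assumes the standard textbook definition: start with a uniformly random bit string and, in each step, flip exactly one uniformly chosen bit, keeping the result if the fitness does not decrease. The function names are our own.

```python
import random

def onemax(x):
    """Number of 1-bits, i.e. number of 'correct' bits for target 1^n."""
    return sum(x)

def rls(n, rng=random):
    """Randomised Local Search on OneMax (sketch of the standard algorithm);
    returns the number of iterations until the all-ones string is found."""
    x = [rng.randrange(2) for _ in range(n)]
    fx = onemax(x)
    iterations = 0
    while fx < n:
        iterations += 1
        i = rng.randrange(n)   # flip exactly one uniformly random bit
        x[i] ^= 1
        fy = onemax(x)
        if fy >= fx:
            fx = fy            # accept: fitness did not decrease
        else:
            x[i] ^= 1          # reject: revert the flip
    return iterations

print(rls(100))   # typically a few hundred steps; cf. the n ln(n) + n bound
```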
Collecting Coupons
RLS on OneMax is like trying to collect n coupons.

Worst case: assume we start with no coupons, i.e. 0^n.

If we have collected i coupons, the probability of getting a new one is (n − i)/n.

The expected waiting time (# draws) for this is n/(n − i).

Summing up all these times gives an upper bound of
Σ_{i=0}^{n−1} n/(n − i) = n · Σ_{i=1}^{n} 1/i ≤ n(ln(n) + 1).

Theorem
The expected running time of RLS on OneMax is at most n ln(n) + n.
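A short simulation (illustrative sketch) of the coupon collector process shows the empirical mean sitting below the bound n(ln(n) + 1):

```python
import math, random

def collect_all(n, rng):
    """Draw uniform coupons from {0, ..., n-1} until every coupon has been
    seen at least once; return the number of draws."""
    seen, draws = set(), 0
    while len(seen) < n:
        draws += 1
        seen.add(rng.randrange(n))
    return draws

rng = random.Random(1)
n, runs = 100, 1000
mean = sum(collect_all(n, rng) for _ in range(runs)) / runs
print(mean, n * (math.log(n) + 1))   # empirical mean vs. upper bound ~560.5
```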


The (1+1) Evolutionary Algorithm

(1+1) EA for maximization of f : {0, 1}^n → ℝ
  Choose x ∈ {0, 1}^n uniformly at random.
  repeat forever
    Create y by flipping each bit in x independently with probability 1/n.
    if f(y) ≥ f(x) then x := y.

Properties:
reflects basic principle of mutation and selection
stochastic hill climber
flips one bit in expectation
can mimic one step of RLS
can escape from local optima by flipping many bits
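The pseudocode translates directly into a few lines of Python. The sketch below is an illustrative implementation (the function names and the stopping predicate are our own additions), using OneMax as the example fitness function:

```python
import random

def one_plus_one_ea(f, n, stop, rng=random):
    """(1+1) EA maximising f over bit strings of length n; 'stop' is a
    predicate on the current fitness that ends the run (a sketch)."""
    x = [rng.randrange(2) for _ in range(n)]
    fx = f(x)
    iterations = 0
    while not stop(fx):
        iterations += 1
        # standard bit mutation: flip each bit independently with prob. 1/n
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        fy = f(y)
        if fy >= fx:            # elitist selection: accept if not worse
            x, fx = y, fy
    return x, iterations

onemax = sum   # fitness: number of 1-bits
x, t = one_plus_one_ea(onemax, 100, stop=lambda fx: fx == 100)
print(t)       # expected O(n log n) iterations on OneMax
```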


One Size Fits All...
Evolutionary algorithms like the simple (1+1) EA ...

... solve in expected poly-time
sorting (maximize sortedness) [Scharnow, Tinnefeld, Wegener, 2005] [Baumann, Rutter, Sudholt, 2024]
shortest paths [Scharnow, Tinnefeld, Wegener, 2004]
minimum spanning trees [Neumann and Wegener, 2007]
matroid optimization [Reichel and Skutella, 2007]
Eulerian cycles [Neumann, 2008 and follow-up work]
3-SAT instances with planted optima [Doerr, Neumann, and Sutton, 2015]

... are poly-time randomised approximation schemes for
maximum matchings [Giel and Wegener, 2003]
PARTITION/makespan scheduling [Witt, 2005]
multiobjective shortest paths [Horoba, 2010]


One Size Fits All (continued)

EAs can mimic the behavior of
tailored algorithms
▶ Eulerian cycles: (1+1) EA mimics Hierholzer’s algorithm
▶ Maximum matchings: (1+1) EA finds augmenting paths
▶ Partition: (1+1) EA mimics Graham’s algorithm
▶ TSP: (1+1) EA mimics the k-Opt algorithm
dynamic programming algorithms [Doerr, Eremeev, Horoba, Neumann, Theile, 2009]
greedy algorithms
fixed-parameter tractable algorithms [Kratsch and Neumann, 2009 and follow-on work]


Fitness-level Method for the (1+1) EA

[Figure: the search space partitioned into levels A1, ..., A7, stacked in order of increasing fitness; Pr((1+1) EA leaves Ai) ≥ si.]

Definition (Fitness-based partitions)
For two sets A, B ⊆ {0, 1}^n and a fitness function f, let A <_f B if f(a) < f(b) for all a ∈ A and all b ∈ B.
A fitness-based partition is a partition of the search space into non-empty sets A_1, ..., A_m such that A_1 <_f A_2 <_f ... <_f A_m and A_m only contains global optima.
We say that the (1+1) EA is in A_i or on level i if its current search point is in A_i.

Theorem (Fitness-level method for proving upper bounds)
Given a fitness-based partition A_1, ..., A_m, let s_i be a lower bound on the probability of creating a new offspring in A_{≥i+1} := A_{i+1} ∪ ... ∪ A_m, provided the (1+1) EA is in A_i. Then the expected optimisation time on f is bounded by
Σ_{i=1}^{m−1} 1/s_i.


Proof of the fitness-level method for upper bounds
Proof sketch: since the (1+1) EA never accepts a search point of lower fitness, it never returns to a level it has left; levels are visited in increasing order, each at most once. While the algorithm is on level i < m, every iteration moves it to a higher level with probability at least s_i, so the time spent on level i is dominated by a geometric random variable with parameter s_i and has expectation at most 1/s_i. Summing over the levels i = 1, ..., m − 1 and applying linearity of expectation yields the bound Σ_{i=1}^{m−1} 1/s_i. ∎
Fitness-Level Method: (1+1) EA on OneMax

OneMax(x) := Σ_{i=1}^n x_i

Fitness-level partition: A_i := {x | OneMax(x) = i}.

Sufficient condition for leaving A_i: flip exactly one 0-bit and keep all other bits. This happens with probability at least
s_i := (n − i) · (1/n) · (1 − 1/n)^{n−1} ≥ (n − i)/(en),
so the expected optimisation time is at most Σ_{i=0}^{n−1} en/(n − i) ≤ en(ln(n) + 1) = O(n log n).
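The resulting bound is easy to evaluate numerically; the sketch below (illustrative) compares the exact level sum with the closed-form estimate en(ln(n) + 1):

```python
import math

def onemax_fitness_level_bound(n):
    """Sum of 1/s_i over levels i = 0..n-1 with s_i = (n - i) / (e n)."""
    return sum(math.e * n / (n - i) for i in range(n))

for n in (10, 100, 1000):
    exact_sum = onemax_fitness_level_bound(n)   # = e * n * H(n)
    estimate = math.e * n * (math.log(n) + 1)
    print(n, round(exact_sum), round(estimate))  # exact_sum <= estimate
```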


Fitness-Level Method: (1+1) EA on LeadingOnes

LO(x) := Σ_{i=1}^n Π_{j=1}^i x_j counts the number of leading ones (e.g. LO(11101100) = 3).

Fitness-level partition: A_i := {x | LO(x) = i}.

Sufficient condition for leaving A_i: flip the first 0-bit and keep all other bits, which happens with probability s_i := (1/n) · (1 − 1/n)^{n−1} ≥ 1/(en). This yields an expected optimisation time of at most Σ_{i=0}^{n−1} en = en² = O(n²).


Jump_k: A Function With Tuneable Difficulty

Jump_k [Jansen and Wegener, 2002]: a “jump” of k bits is required.

[Figure: fitness plotted against the number of 1-bits; fitness increases up to n − k 1-bits, and the global optimum is 1^n.]

Take s_0, ..., s_{n−1} as for OneMax and s_n = (1/n)^k (1 − 1/n)^{n−k} ≥ 1/(en^k).

Expected optimisation time of the (1+1) EA on Jump_k is O(n log n + n^k).
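This extract shows Jump_k only as a plot, so the sketch below assumes the common formulation from the literature: fitness k + |x|₁ when |x|₁ ≤ n − k or x = 1^n, and n − |x|₁ inside the gap. It illustrates why improving on a local optimum requires flipping all k remaining 0-bits at once:

```python
def jump(x, k):
    """One common formulation of Jump_k (an assumption; the slide only shows
    the plot): fitness rises with the number of 1-bits up to n - k, the
    optimum is the all-ones string, and the strings in between form a gap."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones   # the "gap": more 1-bits means lower fitness here

n, k = 10, 3
local_opt = [1] * (n - k) + [0] * k
print(jump(local_opt, k))                # 10: best fitness short of the optimum
print(jump([1] * (n - 2) + [0] * 2, k))  # 2: inside the gap, worse than local_opt
print(jump([1] * n, k))                  # 13: the global optimum 1^n
```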


(1+1) EA Always Finds an Optimum

Theorem
The (1+1) EA optimises every function in expected time at most n^n.

Same for all EAs that use standard mutation operators.

Fitness-level partition:
A_0 = {0, 1}^n \ OPT
A_1 = OPT

Worst case for A_0: all n bits have to flip. So
s_0 ≥ (1/n)^n
and we get an upper bound of
Σ_{i=0}^{0} 1/s_i = 1/s_0 ≤ n^n.
Outlook
Methods for the analysis of RSHs
Fitness-level method (and extensions)
Drift analysis
Tail bounds, typical runs
Random walks

Design aspects
How useful are populations?
How to ensure diversity within the population?
How important is recombination?
Parallel variants of evolutionary algorithms
Parameter control: how to learn good parameters

Focus will be on
Single-objective, discrete, fixed-size problems
Runtime analysis (other theories are available)
Illustrative, easy-to-describe problems that we can understand
Conclusions

No Free Lunch theorems state that any two search heuristics have the same average performance over a set of functions closed under permutations.
▶ But the No Free Lunch scenario is neither realistic nor interesting.
▶ Need to consider specific problem classes for meaningful results.
Early approaches (like schema theory) lacked rigour and led to false claims.
Reviewed foundations and tools from probability theory.

Runtime analysis
Seen a first runtime analysis: RLS cracks codes (optimises OneMax) on n bits in expected time O(n log n).
The fitness-level method is a simple method for obtaining upper bounds for the (1+1) EA.
Runtime bounds for the (1+1) EA on OneMax, LeadingOnes and Jump.
The runtime of EAs with standard bit mutation is bounded by n^n.


References I
B. Doerr. Probabilistic tools for the analysis of randomized optimization heuristics. In B. Doerr and F. Neumann, editors, Theory of Evolutionary Computation: Recent Developments in Discrete Optimization, pages 1–87. Springer, 2020.
M. Dorigo. Optimization, Learning and Natural Algorithms. PhD thesis, Politecnico di Milano, 1992.
S. Droste, T. Jansen, and I. Wegener. On the analysis of the (1+1) evolutionary algorithm. Theoretical Computer Science, 276(1–2):51–81, 2002.
S. Forrest and M. Mitchell. Relative building block fitness and the building block hypothesis. In Proc. of FOGA 2, pages 198–226. Morgan Kaufmann, 1993.
J. H. Holland. Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, 1975.
T. Jansen and I. Wegener. On the analysis of evolutionary algorithms—a proof that crossover really can help. Algorithmica, 34(1):47–66, 2002.
J. Kennedy and R. C. Eberhart. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, pages 1942–1948. IEEE Press, 1995.
M. Mitchell, S. Forrest, and J. H. Holland. The royal road function for genetic algorithms: fitness landscapes and GA performance. In Proc. of the 1st European Conference on Artificial Life, pages 245–254. MIT Press, 1992.
D. H. Wolpert and W. G. Macready. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67–82, 1997.
