AIOT-UNIT 3
Variation Operators: Mechanisms like mutation and crossover to generate diversity and
create new solutions.
Fitness Function: A function to evaluate and assign a quality score to individuals based on
how well they solve the problem.
Replacement Strategy: A method to decide which individuals survive to the next generation.
3. Define Elitism.
Elitism in evolutionary algorithms refers to a mechanism where the best-performing
individuals in a population are directly carried over to the next generation without
undergoing variation (e.g., crossover or mutation). This ensures that the best solutions found
so far are preserved and not lost due to random changes.
Example
Population: [Individual1(90), Individual2(80), Individual3(70), Individual4(60)] (fitness scores in parentheses).
Elitism: Retain the top 1 individual (Individual1(90)).
Next Generation: [Individual1(90), New_Individuals].
Here, Individual1 is directly preserved in the new population due to its high fitness, ensuring the best solution isn't lost.
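The elitism step above can be sketched in a few lines of Python. The population, its fitness values, and the placeholder variation step are illustrative assumptions; a real GA would generate offspring via crossover and mutation:

```python
import random

# Hypothetical population: (name, fitness) pairs from the example above.
population = [("Individual1", 90), ("Individual2", 80),
              ("Individual3", 70), ("Individual4", 60)]

def next_generation(population, elite_count=1):
    """Carry the top `elite_count` individuals over unchanged (elitism),
    then fill the rest of the new generation with offspring."""
    ranked = sorted(population, key=lambda ind: ind[1], reverse=True)
    elites = ranked[:elite_count]
    # Placeholder variation step: here we just clone random parents;
    # a real GA would apply crossover and mutation instead.
    offspring = [random.choice(ranked)
                 for _ in range(len(population) - elite_count)]
    return elites + offspring

new_pop = next_generation(population)
# Individual1 (fitness 90) is always carried over to the new generation.
```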
4. Briefly explain parent selection mechanism.
The parent selection mechanism in evolutionary algorithms is the process of choosing
individuals from the current population to serve as parents for producing offspring in the
next generation. This step is critical for guiding the algorithm toward better solutions.
Common Methods
A. Roulette Wheel Selection: Parents are chosen probabilistically based on fitness. Higher
fitness means a higher probability of being selected.
B. Tournament Selection: A subset of individuals is randomly selected, and the one with
the highest fitness in the subset becomes a parent.
C. Rank-Based Selection: Individuals are ranked by fitness, and selection probability is
proportional to rank rather than raw fitness.
D. Stochastic Universal Sampling (SUS): Ensures a more uniform selection of parents by
distributing selection probabilities evenly across the population.
E. Random Selection: Parents are chosen randomly, regardless of fitness (used occasionally
to maintain diversity).
Key Idea
The parent selection mechanism balances exploitation (using fit individuals) and exploration
(allowing less fit individuals a chance to contribute).
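As an illustration, the two most common methods above (roulette wheel and tournament selection) can be sketched as follows; the small population and fitness values are made up for the example:

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Pick one parent with probability proportional to fitness
    (assumes all fitness values are non-negative)."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point rounding

def tournament_select(population, fitnesses, k=3):
    """Pick k random individuals and return the fittest of them."""
    contenders = random.sample(list(zip(population, fitnesses)), k)
    return max(contenders, key=lambda pair: pair[1])[0]

pop = ["A", "B", "C", "D"]
fits = [90, 80, 70, 60]
parent = tournament_select(pop, fits, k=2)
```

Tournament selection is often preferred in practice because it needs no fitness normalisation and its selection pressure is tuned by a single parameter, k.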
1. Initialization:
o Generate an initial population of candidate solutions, usually at random.
2. Evaluation:
o Compute the fitness of each individual using the fitness function.
3. Selection:
o Choose the fitter individuals as parents for reproduction.
4. Reproduction:
o Create offspring by applying variation operators such as crossover and mutation.
5. Replacement:
o Form the next generation by replacing some or all of the current population with
offspring.
6. Termination:
o Stop the algorithm when a predefined stopping criterion is met (e.g., maximum
generations or acceptable fitness level).
1. Genetic Algorithm (GA): Uses binary or real-valued encoding for individuals and focuses on
crossover and mutation.
2. Evolution Strategies (ES): Operate on real-valued vectors and rely on self-adaptive
mutation step sizes.
3. Genetic Programming (GP): Evolves tree-like structures, typically for symbolic regression or
program synthesis.
4. Differential Evolution (DE): Constructs new candidates from scaled differences between
existing population vectors.
5. Evolutionary Programming (EP): Emphasizes mutation rather than crossover and is often
used for continuous optimization.
Initialization:
Random Population: A set of initial solutions (individuals) is generated
randomly.
Evaluation:
Selection:
Parent Selection: Individuals with higher fitness are more likely to be selected as
parents for the next generation.
Selection Methods: Roulette wheel, tournament, and rank-based selection are commonly used.
Reproduction:
Two-Point Crossover: Two crossover points are chosen, and the genetic
material between these points is swapped.
Mutation:
Mutation Rate: The probability of mutation for each gene is typically low.
Replacement:
New Generation: The newly created offspring replace the least fit individuals in
the population.
Elitism: The fittest individuals from the current generation may be directly copied
to the next generation to preserve valuable genetic material.
The process is repeated iteratively until a termination condition is met, such as reaching
a maximum number of generations or a satisfactory fitness level.
2. Narrate in detail about the features of Evolutionary Computing and list some of its merits
and demerits.
1. Population-Based Search
o EC maintains a population of candidate solutions rather than a single solution,
allowing many regions of the search space to be explored simultaneously.
2. Genetic Operators
o Variation operators such as selection, crossover, and mutation generate new
candidate solutions from existing ones.
3. Fitness Evaluation
o Each individual solution is evaluated using a fitness function that determines how
well it solves the problem. This guides the evolutionary process.
4. Adaptability
o Populations adapt over generations, so the search can track changing problem
conditions.
5. Domain Independence
o These algorithms do not require specific knowledge of the problem domain, making
them versatile and applicable to a wide range of problems.
6. Stochastic Nature
o Randomness in selection, crossover, and mutation helps the search escape local
optima.
7. Parallelism
o Fitness evaluations of different individuals are independent, so they can be
performed in parallel.
1. Robustness
o Performs well on noisy, discontinuous, or poorly understood problems.
2. Flexibility
o Applicable to many problem types without requiring gradient or derivative
information.
3. Global Search
o Reduces the risk of being trapped in local optima by exploring a broad solution
space.
4. Ease of Implementation
o The basic algorithm is conceptually simple and straightforward to implement.
5. Parallel Computation
o Population members can be evaluated concurrently, speeding up the search.
1. Computationally Expensive
o Requires many fitness evaluations, which can be costly for large populations or
expensive objective functions.
2. No Guarantee of Optimality
o While EC is effective at finding good solutions, it does not guarantee finding the
global optimum.
3. Parameter Sensitivity
o Performance heavily depends on parameters like population size, mutation rate, and
crossover rate, which may require fine-tuning.
4. Slow Convergence
o Convergence to high-quality solutions can take many generations, especially for
large search spaces.
5. Randomness Dependence
o The stochastic nature of EC can lead to inconsistent results across runs, requiring
multiple trials.
Conclusion
Evolutionary Computing offers a powerful and flexible approach for solving optimization
problems, especially those that are complex or poorly defined. However, its
computational demands and reliance on careful parameter tuning can be limiting factors.
Understanding the trade-offs is crucial for its effective application in real-world
scenarios.
3. Elaborate in detail about Genetic Algorithms, taking one real-time example.
Genetic Algorithms (GAs) are adaptive heuristic search algorithms that belong to the larger class of
evolutionary algorithms. Genetic algorithms are based on the ideas of natural selection and genetics.
They intelligently exploit random search, using historical data to direct the search into regions of
better performance in the solution space. They are commonly used to generate high-quality
solutions for optimization and search problems.
Genetic algorithms simulate the process of natural selection which means those species that can
adapt to changes in their environment can survive and reproduce and go to the next generation. In
simple words, they simulate “survival of the fittest” among individuals of consecutive generations to
solve a problem. Each generation consists of a population of individuals, and each individual
represents a point in the search space and a possible solution. Each individual is represented as a
string of characters/integers/floats/bits. This string is analogous to a chromosome.
Genetic algorithms are based on an analogy with the genetic structure and behavior of
chromosomes of the population. Following is the foundation of GAs based on this analogy –
1. Individuals in a population compete for resources and mates.
2. Those individuals who are successful (fittest) then mate to create more offspring than others
3. Genes from the "fittest" parents propagate throughout the generations; that is, sometimes
parents create offspring which are better than either parent.
Search space
The population of individuals is maintained within the search space. Each individual represents a
solution in the search space for the given problem. Each individual is coded as a finite-length vector
(analogous to a chromosome) of components. These variable components are analogous to genes.
Thus a chromosome (individual) is composed of several genes (variable components).
Fitness Score
A fitness score is given to each individual which shows the ability of that individual to "compete".
Individuals having an optimal (or near-optimal) fitness score are sought.
The GA maintains a population of n individuals (chromosomes/solutions) along with their fitness
scores. Individuals having better fitness scores are given more chance to reproduce than others:
they are selected to mate and produce better offspring by combining the chromosomes of the
parents. Since the population size is static, room has to be created for the new arrivals, so some
individuals die and are replaced by them, eventually creating a new generation once all the mating
opportunities of the old population are exhausted. It is hoped that over successive generations
better solutions will arrive while the least fit die out.
Each new generation has, on average, more "good genes" than the individuals of previous
generations, and thus better "partial solutions". Once the offspring produced show no significant
difference from the offspring produced by previous populations, the population has converged, and
the algorithm is said to have converged to a set of solutions for the problem.
Operators of Genetic Algorithms
Once the initial generation is created, the algorithm evolves the generation using the following
operators –
1) Selection Operator: The idea is to give preference to the individuals with good fitness scores and
allow them to pass their genes to successive generations.
2) Crossover Operator: This represents mating between individuals. Two individuals are selected
using selection operator and crossover sites are chosen randomly. Then the genes at these crossover
sites are exchanged, thus creating a completely new individual (offspring).
3) Mutation Operator: The key idea is to insert random genes into offspring to maintain the diversity
of the population and avoid premature convergence.
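Since the crossover and mutation operators are described above only in words, here is a minimal sketch of both on bit-string chromosomes; the parent strings are illustrative:

```python
import random

def two_point_crossover(parent1, parent2, rng=random):
    """Swap the genes lying between two randomly chosen crossover points."""
    a, b = sorted(rng.sample(range(1, len(parent1)), 2))
    child1 = parent1[:a] + parent2[a:b] + parent1[b:]
    child2 = parent2[:a] + parent1[a:b] + parent2[b:]
    return child1, child2

def mutate(chromosome, rate=0.05, genes="01", rng=random):
    """Replace each gene with a random one at a low probability (the
    mutation rate), injecting diversity into the offspring."""
    return "".join(rng.choice(genes) if rng.random() < rate else g
                   for g in chromosome)

p1, p2 = "00000000", "11111111"
c1, c2 = two_point_crossover(p1, p2)
# The middle segment of each parent is swapped into the other child.
```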
Given a target string, the goal is to produce the target string starting from a random string of the
same length. The following analogies are made –
Characters A-Z, a-z, 0-9, and other special symbols are considered as genes
A string generated by these characters is considered as chromosome/solution/Individual
The fitness score is the number of characters which differ from the characters in the target string at
each index, so individuals having a lower fitness value are given more preference.
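A minimal implementation of this string-matching GA might look as follows; the target string "Hello World", the population size, the elitism fraction, and the mutation rate are all assumptions for illustration:

```python
import random

GENES = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ 0123456789"
TARGET = "Hello World"  # illustrative target; any string over GENES works

def fitness(individual):
    """Number of positions differing from the target (lower is better)."""
    return sum(1 for a, b in zip(individual, TARGET) if a != b)

def evolve(pop_size=100, max_gens=5000, seed=42):
    rng = random.Random(seed)
    new_char = lambda: rng.choice(GENES)
    pop = ["".join(new_char() for _ in TARGET) for _ in range(pop_size)]
    for gen in range(max_gens):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:          # perfect match found
            return pop[0], gen
        next_pop = pop[:pop_size // 10]   # elitism: keep the best 10%
        while len(next_pop) < pop_size:
            p1 = rng.choice(pop[:pop_size // 2])  # parents from top 50%
            p2 = rng.choice(pop[:pop_size // 2])
            cut = rng.randrange(1, len(TARGET))   # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Mutation: replace each gene with probability 0.1.
            child = "".join(new_char() if rng.random() < 0.1 else g
                            for g in child)
            next_pop.append(child)
        pop = next_pop
    pop.sort(key=fitness)
    return pop[0], max_gens

best, generations = evolve()
```

Because elitism copies the best individuals unchanged, the best fitness never worsens between generations, and mutation guarantees steady progress toward a zero-mismatch string.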
Unlike traditional AI methods, GAs do not break on slight changes in the input or in the presence of
noise. Applications of GAs include mutation testing and code breaking.
PART C
Evolutionary computation represents a class of nature-inspired algorithms that mimic the process of
natural selection to solve complex optimization and search problems. Understanding the principles
and applications of evolutionary computation is essential for leveraging its potential in diverse
domains, from engineering to finance and beyond.
Evolutionary computation is based on the principles of natural selection, genetic inheritance, and
survival of the fittest, where candidate solutions evolve and improve over successive generations.
1. Genetic Algorithms (GA): Utilizing the concepts of selection, crossover, and mutation to
evolve a population of candidate solutions toward optimal or near-optimal solutions.
2. Differential Evolution (DE): Employing mutation and crossover operations on vectors of
parameter values to navigate the search space and converge to optimal solutions.
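As an illustration of the DE mutation-and-crossover scheme just described, here is a sketch of one DE/rand/1/bin generation applied to a toy sphere function; the control parameters F and CR are conventional defaults, not values taken from the text:

```python
import random

def de_step(pop, fitness, F=0.8, CR=0.9, rng=None):
    """One DE/rand/1/bin generation: for each target vector, build a mutant
    from three other random vectors, binomially cross it with the target,
    and keep whichever of trial and target has the better fitness."""
    rng = rng or random.Random()
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        a, b, c = rng.sample([v for j, v in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
        jrand = rng.randrange(dim)  # ensure at least one mutant gene survives
        trial = [mutant[d] if (rng.random() < CR or d == jrand) else target[d]
                 for d in range(dim)]
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop

sphere = lambda v: sum(x * x for x in v)  # minimised at the origin
rng = random.Random(1)
pop = [[rng.uniform(-5, 5) for _ in range(3)] for _ in range(12)]
for _ in range(50):
    pop = de_step(pop, sphere, rng=rng)
best = min(pop, key=sphere)
```

The greedy per-index replacement means the best fitness in the population can never get worse from one generation to the next.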
1. Optimization
o EC is widely used in solving optimization problems across engineering, business, and
science.
Example: Optimizing supply chain networks for cost reduction and efficiency
improvement.
2. Engineering Design
Example: Structural optimization for materials with high strength and low
weight.
3. Robotics
o Used to evolve robot controllers, gaits, and motion plans.
4. Bioinformatics
o Solving complex biological and medical problems like gene sequencing and drug
design.
7. Telecommunications
o Solves network optimization and resource allocation challenges.
8. Environmental Science
Real-World Examples
2. Aerospace Design
o NASA uses genetic algorithms to optimize antenna design for spacecraft, ensuring
minimal weight and maximal efficiency.
3. Drug Discovery
o Algorithms are used to search massive chemical spaces for molecules with desired
properties, speeding up drug development.
4. Automated Scheduling
5. Smart Grids
o Used to manage energy distribution and demand response, improving the efficiency
of smart grids.
1. Versatility
o Applicable to a wide range of problem domains and representations.
2. Robustness
o Performs reliably even with noisy or incomplete problem information.
3. Adaptability
o Can adjust to changing environments and objectives over generations.
4. Scalability
o Scales to large problems, aided by naturally parallel fitness evaluation.
1. Computational Cost: Evolutionary algorithms can require a large number of fitness
evaluations, which becomes expensive for large populations or costly objective functions.
2. Parameter Tuning: The selection of algorithmic parameters and control settings can impact
the performance and convergence of evolutionary computation algorithms, requiring careful
tuning and optimization for different problem instances.
3. Convergence Speed: Balancing the trade-off between exploration and exploitation to achieve
faster convergence to high-quality solutions.