Unit 0.4

Evolutionary computing

Evolutionary computing is a branch of soft computing that draws inspiration from biological evolution and natural
selection processes to solve optimization and search problems. It encompasses a variety of computational techniques,
including genetic algorithms, genetic programming, evolutionary strategies, and evolutionary programming. Here's an
introduction to evolutionary computing:

Basic Concept:

Evolutionary computing algorithms simulate the process of natural selection to evolve solutions to complex problems.
They maintain a population of candidate solutions (individuals), subject them to selection, crossover, and mutation
operators, and iteratively improve them over generations.

Key Components:

1. Population: A set of candidate solutions represented as individuals in the search space.

2. Fitness Function: A measure of the quality or suitability of each individual solution. It quantifies how well an individual
performs with respect to the problem being solved.

3. Selection: Mechanisms to choose individuals from the population for reproduction based on their fitness. Individuals
with higher fitness values are more likely to be selected.

4. Crossover (Recombination): A genetic operator that combines genetic material from two parent individuals to create
offspring. It promotes the exchange of information between individuals.

5. Mutation: A genetic operator that introduces random changes to individuals to maintain diversity in the population
and explore new regions of the search space.
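To make components 2 and 3 concrete, here is a minimal sketch of a fitness function and fitness-proportionate selection over bit-string individuals. The "one-max" fitness (count of 1-bits) and all parameter choices are illustrative assumptions, not part of any standard:

```python
import random

# Hypothetical fitness: count of 1-bits in a bit-string individual ("one-max").
def fitness(individual):
    return sum(individual)

# Fitness-proportionate (roulette-wheel) selection: individuals with higher
# fitness are more likely to be chosen as parents.
def select(population):
    weights = [fitness(ind) for ind in population]
    return random.choices(population, weights=weights, k=1)[0]

population = [[random.randint(0, 1) for _ in range(8)] for _ in range(6)]
parent = select(population)
print(parent, fitness(parent))
```

Selection never invents new individuals; it only biases which existing ones get to reproduce, which is why crossover and mutation are still needed.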

Main Techniques:
1. Genetic Algorithms (GAs): GAs use the principles of natural selection and genetics to evolve solutions. They operate on a population of candidate solutions represented as chromosomes (typically binary strings) and apply selection, crossover, and mutation operators to iteratively improve the solutions.

2. Genetic Programming (GP): GP extends GAs to evolve computer programs or symbolic expressions represented as trees. It evolves a population of programs through genetic operations such as crossover and mutation.

3. Evolutionary Strategies (ES): ES focuses on optimizing real-valued parameters. It employs self-adaptive mechanisms to adapt the step sizes of mutations during evolution.

4. Evolutionary Programming (EP): EP emphasizes self-adaptation and direct encoding of parameters. It evolves a population of fixed-length parameter vectors using mutation operators.

Applications:
1. Optimization: Evolutionary computing is widely used to solve optimization problems in various domains, including engineering, finance, logistics, and scheduling.

2. Machine Learning: It has applications in feature selection, parameter optimization, and evolutionary-based learning algorithms.

3. Robotics: Evolutionary algorithms are employed in robotics for path planning, robot control, and robot design optimization.

4. Bioinformatics: They are used for sequence alignment, protein folding prediction, and gene regulatory network inference.

5. Data Mining: Evolutionary algorithms can be applied to clustering, classification, and association rule mining tasks.

Overview of Evolutionary Computing

Evolutionary computing, a subset of soft computing, encompasses a family of computational techniques inspired by biological evolution and natural selection. These methods employ population-based search and optimization algorithms to find solutions to complex problems across various domains. Here's an overview of evolutionary computing:

Principles:

1. Inspiration from Biology: Evolutionary computing draws inspiration from the principles of biological
evolution, such as natural selection, mutation, recombination, and survival of the fittest.
2. Population-based Search: Unlike traditional optimization methods that operate on single solutions,
evolutionary computing maintains a population of candidate solutions (individuals) that evolve over
generations through genetic operators.
3. Iterative Improvement: Evolutionary algorithms iteratively improve the population of solutions by
applying selection, crossover (recombination), and mutation operators to generate new candidate
solutions.

Main Techniques:

1. Genetic Algorithms (GAs): GAs are one of the most popular evolutionary computing techniques. They
operate on a population of potential solutions represented as chromosomes, typically binary strings.
GAs use selection, crossover, and mutation operators to evolve solutions towards optimal or near-
optimal solutions.
2. Genetic Programming (GP): GP extends the principles of GAs to evolve computer programs or
symbolic expressions represented as trees. It evolves populations of programs through genetic
operations such as crossover and mutation, aiming to optimize their performance on a given task.
3. Evolutionary Strategies (ES): ES focuses on optimizing real-valued parameters. It employs self-
adaptive mechanisms to adapt the step sizes of mutations during evolution, making it well-suited for
continuous optimization problems.
4. Evolutionary Programming (EP): EP emphasizes self-adaptation and direct encoding of parameters. It
evolves populations of fixed-length parameter vectors using mutation operators, allowing it to handle
various optimization tasks.

Workflow:

1. Initialization: A population of candidate solutions is initialized randomly or using heuristic methods.


2. Evaluation: Each individual in the population is evaluated using a fitness function, which quantifies the
quality or suitability of the solution with respect to the problem being solved.
3. Selection: Individuals are selected from the population based on their fitness values, with higher fitness
individuals more likely to be selected for reproduction.
4. Reproduction (Crossover and Mutation): Selected individuals undergo reproduction through genetic
operators such as crossover and mutation to generate offspring, introducing diversity and exploration in
the population.
5. Replacement: Offspring replace individuals in the population based on selection criteria, maintaining
the population size.
6. Termination Criterion: The evolutionary process continues iteratively until a termination criterion is
met, such as reaching a maximum number of generations or achieving a satisfactory solution.
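The six workflow steps above can be sketched in code. The following is a minimal, illustrative implementation (the one-max objective, tournament selection, and all parameter values are assumptions chosen for the example, not prescriptions):

```python
import random

def fitness(ind):                       # 2. Evaluation: count of 1-bits (one-max)
    return sum(ind)

def tournament(pop, k=3):               # 3. Selection: best of k random picks
    return max(random.sample(pop, k), key=fitness)

def crossover(p1, p2):                  # 4a. Single-point crossover
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:]

def mutate(ind, rate=0.05):             # 4b. Bit-flip mutation
    return [1 - g if random.random() < rate else g for g in ind]

def evolve(n=20, length=16, generations=50):
    # 1. Initialization: random population of bit strings
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(n)]
    for _ in range(generations):        # 6. Termination: fixed generation budget
        # 5. Replacement: offspring fully replace the old population
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(n)]
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Here replacement is generational (the whole population is replaced each iteration); steady-state variants instead replace only a few individuals per step.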
Genetic Algorithms



Genetic Algorithms (GAs) are adaptive heuristic search algorithms that belong to the larger class of evolutionary algorithms. Genetic algorithms are based on the ideas of natural selection and genetics. They are an intelligent exploitation of random search, provided with historical data to direct the search into regions of better performance in the solution space. They are commonly used to generate high-quality solutions for optimization and search problems.
Genetic algorithms simulate the process of natural selection, in which species that can adapt to changes in their environment survive, reproduce, and pass on to the next generation. In simple words, they simulate "survival of the fittest" among individuals of consecutive generations to solve a problem. Each generation consists of a population of individuals, and each individual represents a point in the search space and a possible solution. Each individual is represented as a string of characters/integers/floats/bits. This string is analogous to a chromosome.
Foundation of Genetic Algorithms
Genetic algorithms are based on an analogy with the genetic structure and behavior of chromosomes in a population. The foundation of GAs, based on this analogy, is as follows –
1. Individuals in the population compete for resources and mates.
2. Those individuals who are successful (fittest) then mate to create more offspring than others.
3. Genes from the "fittest" parents propagate throughout the generation; that is, sometimes parents create offspring that are better than either parent.
4. Thus each successive generation is better suited to its environment.
Search space
The population of individuals is maintained within the search space. Each individual represents a solution in the search space for the given problem. Each individual is coded as a finite-length vector (analogous to a chromosome) of components. These variable components are analogous to genes. Thus a chromosome (individual) is composed of several genes (variable components).

Fitness Score
A fitness score is given to each individual which shows the ability of an individual to "compete". Individuals with optimal (or near-optimal) fitness scores are sought.

The GA maintains a population of n individuals (chromosomes/solutions) along with their fitness scores. The individuals having better fitness scores are given more chances to reproduce than others. The individuals with better fitness scores are selected to mate and produce better offspring by combining the chromosomes of the parents. The population size is static, so room has to be created for new arrivals. Thus, some individuals die and are replaced by new arrivals, eventually creating a new generation once all the mating opportunities of the old population are exhausted. It is hoped that over successive generations better solutions will arrive while the least fit die out.

Each new generation has, on average, more "better genes" than the individuals (solutions) of previous generations. Thus each new generation has better "partial solutions" than previous generations. Once the offspring produced show no significant difference from the offspring produced by previous populations, the population has converged. The algorithm is then said to have converged to a set of solutions for the problem.
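In practice, convergence as described above is usually detected numerically. One common sketch (the function name, `patience`, and `eps` are illustrative assumptions) is to stop when the best fitness has not improved over a window of recent generations:

```python
# Illustrative convergence test: report convergence when the best fitness has
# not improved by more than `eps` over the last `patience` generations.
def has_converged(best_history, patience=10, eps=1e-6):
    if len(best_history) <= patience:
        return False
    return best_history[-1] - best_history[-1 - patience] < eps
```

A caller would append the best fitness of each generation to `best_history` and stop the loop once `has_converged` returns `True`.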

Operators of Genetic Algorithms


Once the initial generation is created, the algorithm evolves the generation using the following operators –
1) Selection Operator: The idea is to give preference to individuals with good fitness scores and allow them to pass their genes on to successive generations.
2) Crossover Operator: This represents mating between individuals. Two individuals are selected using the selection operator and crossover sites are chosen randomly. The genes at these crossover sites are then exchanged, creating a completely new individual (offspring). For example –

3) Mutation Operator: The key idea is to insert random genes into offspring to maintain diversity in the population and avoid premature convergence. For example –

The whole algorithm can be summarized as –


1) Randomly initialize population p
2) Determine fitness of population
3) Until convergence repeat:
a) Select parents from population
b) Crossover and generate new population
c) Perform mutation on new population
d) Calculate fitness for new population

Genetic algorithms

Genetic algorithms (GAs) are a type of evolutionary algorithm used in soft computing for optimization problems. They are inspired by the process of natural selection and genetics and are particularly effective at solving complex optimization problems that are difficult to solve using traditional methods. Here are some key points about genetic algorithms and optimization in soft computing:

Key Points:
1. Genetic Algorithm Basics:
 A genetic algorithm is a metaheuristic that uses principles of natural selection and genetics to search for optimal solutions to optimization problems.
 It involves a population of candidate solutions that evolve over generations through the application of genetic operators such as mutation, crossover, and selection.

2. Optimization in Soft Computing:


 Optimization is a fundamental problem in soft computing, and genetic algorithms are a powerful tool for solving complex optimization problems.
 Soft computing is a field that combines techniques from computer science, mathematics, and engineering to solve complex problems that are difficult to solve using traditional methods.

3. Genetic Algorithm for Optimization:


 Genetic algorithms are used to optimize a wide range of problems, including continuous and discrete optimization, multi-objective optimization, and constrained optimization.
 They are particularly effective at solving problems that are difficult to solve using traditional methods, such as those with multiple local optima or those that require a large number of evaluations.

4. Advantages of Genetic Algorithms:


 Genetic algorithms are robust and can handle noisy or incomplete data.
 They are adaptive and can adjust to changing problem conditions.

 They can handle large search spaces and are effective in solving problems with multiple local optima.

5. Applications of Genetic Algorithms:


 Genetic algorithms have been applied to a wide range of fields, including engineering, economics, finance, and computer
science.
 They are used to optimize complex systems, such as neural networks and fuzzy systems, and to solve problems in areas such as image processing and signal processing.

The Schema Theorem:-

The Schema Theorem is a fundamental concept in soft computing and genetic algorithms. It states that the frequency of short, low-order schemata with above-average fitness increases exponentially through generations. The theorem was proposed by John Holland in the 1970s and is considered a key component of the building block hypothesis, which suggests that genetic algorithms are successful because they identify and recombine low-order, low-defining-length schemata with above-average fitness.
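The quantities the theorem talks about can be made concrete with small helper functions. In Holland's notation a schema is a template over the alphabet {0, 1, *}, where `*` is a wildcard; its order is the number of fixed positions and its defining length is the distance between the outermost fixed positions. An illustrative sketch:

```python
# '*' is a wildcard that matches either bit.
def matches(schema, string):
    return all(s == '*' or s == c for s, c in zip(schema, string))

# Order: number of fixed (non-wildcard) positions.
def order(schema):
    return sum(1 for s in schema if s != '*')

# Defining length: distance between the outermost fixed positions.
def defining_length(schema):
    fixed = [i for i, s in enumerate(schema) if s != '*']
    return fixed[-1] - fixed[0] if fixed else 0

schema = '1**0'
print(order(schema), defining_length(schema))            # 2 3
print(matches(schema, '1010'), matches(schema, '0010'))  # True False
```

Short defining length matters because single-point crossover is less likely to cut between the schema's outermost fixed positions, so such schemata tend to survive recombination.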

Key Points:
1. Schema Definition:
 A schema is a template that identifies a collection of strings that share similarities at specific locations.
 It is a subset of cylinder sets and constitutes a topological space.

2. Schema Theorem:
 The schema theorem states that the frequency of short, low-order schemata with above-average fitness increases
exponentially through generations.
 This theorem is based on the idea that genetic algorithms are successful because they identify and recombine low-order,
low-defining-length schemata with above-average fitness.

3. Building Block Hypothesis:


 The building block hypothesis suggests that low-order, low-defining-length schemata with above-average fitness serve as
a foundation for the success of genetic algorithms.
 These building blocks are recombined to form higher-order schemata with above-average fitness, leading to the
exponential increase in frequency of these schemata.

4. Implications:
 The schema theorem has significant implications for the design and implementation of genetic algorithms.
 It suggests that genetic algorithms should focus on identifying and recombining low-order, low-defining-length schemata
with above-average fitness to achieve optimal solutions

Genetic algorithm fundamental


Before beginning a discussion on Genetic Algorithms, it is essential to be familiar with some basic
terminology which will be used throughout this tutorial.

 Population − It is a subset of all the possible (encoded) solutions to the given problem. The population for a GA is analogous to a population of human beings, except that instead of human beings we have candidate solutions.
 Chromosomes − A chromosome is one such solution to the given problem.
 Gene − A gene is one element position of a chromosome.
 Allele − It is the value a gene takes for a particular chromosome.

 Genotype − Genotype is the population in the computation space. In the computation space,
the solutions are represented in a way which can be easily understood and manipulated using
a computing system.
 Phenotype − Phenotype is the population in the actual real-world solution space, in which solutions are represented as they are in real-world situations.
 Decoding and Encoding − For simple problems, the phenotype and genotype spaces are the same. However, in most cases, the phenotype and genotype spaces are different. Decoding is the process of transforming a solution from the genotype space to the phenotype space, while encoding is the process of transforming from the phenotype space to the genotype space. Decoding should be fast as it is carried out repeatedly in a GA during the fitness value calculation.
For example, consider the 0/1 Knapsack Problem. The phenotype space consists of solutions which just contain the item numbers of the items to be picked.
However, in the genotype space it can be represented as a binary string of length n (where n is the number of items). A 1 at position x represents that the xth item is picked, while a 0 represents that it is not. This is a case where the genotype and phenotype spaces are different.

 Fitness Function − A fitness function simply defined is a function which takes the solution
as input and produces the suitability of the solution as the output. In some cases, the fitness
function and the objective function may be the same, while in others it might be different
based on the problem.
 Genetic Operators − These alter the genetic composition of the offspring. These include
crossover, mutation, selection, etc.
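The genotype/phenotype distinction from the 0/1 Knapsack example can be sketched with two small helper functions (illustrative names, using the common convention that a 1 at position x means item x is picked):

```python
# Decode: genotype (binary string) -> phenotype (list of picked item numbers).
def decode(genotype):
    return [i for i, bit in enumerate(genotype) if bit == 1]

# Encode: phenotype (list of picked item numbers) -> genotype of length n.
def encode(phenotype, n):
    return [1 if i in phenotype else 0 for i in range(n)]

genotype = [1, 0, 1, 1, 0]
print(decode(genotype))          # [0, 2, 3]
print(encode([0, 2, 3], 5))      # [1, 0, 1, 1, 0]
```

A GA would apply crossover and mutation on the genotype, and call `decode` only when evaluating fitness, which is why decoding needs to be fast.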
Basic Structure

The basic structure of a GA is as follows −

We start with an initial population (which may be generated at random or seeded by other heuristics), select parents from this population for mating, and apply crossover and mutation operators to the parents to generate new offspring. Finally, these offspring replace the existing individuals in the population and the process repeats. In this way genetic algorithms try to mimic natural evolution to some extent.

Each of the following steps are covered as a separate chapter later in this tutorial.

A generalized pseudo-code for a GA is explained in the following program −

GA()
   initialize population
   find fitness of population
   while (termination criteria not reached) do
      parent selection
      crossover with probability pc
      mutation with probability pm
      decode and fitness calculation
      survivor selection
      find best
   return best

Operators used in Genetic Algorithms:-

Here are the key points about the operators used in Genetic Algorithms (GAs) in soft computing:

1. Crossover (Recombination):
 Crossover is the process of taking two parent solutions and producing a child.
 It is applied to the mating pool with the hope that it creates a better offspring.
 There are different types of crossover techniques:
 Single-Point Crossover: The two mating chromosomes are cut once at corresponding points and the sections after the
cuts are exchanged.
 Two-Point Crossover: Two crossover points are chosen and the contents between these points are exchanged between
two mated parents.

2. Inversion:
 The inversion operator reverses the order of the bits between two random sites.
 It is used to introduce diversity in the population.

3. Deletion:
 Deletion operator involves deleting bits between two random sites.
 There are two types of deletion:
 Deletion and Duplication: Any two or three bits are selected at random, deleted, and the bits preceding them are duplicated in their place.
 Deletion and Regeneration: Bits between the cross site are deleted and regenerated randomly.

4. Mutation:
 The mutation operator involves flipping a bit, changing 0 to 1 and vice versa.
 It is used to prevent the algorithm from being trapped in a local minimum and to maintain diversity in the population.

5. Selection:
 Selection operator is used to select the fittest individuals from the population.
 There are different types of selection techniques:
 Roulette Wheel Selection: Each individual is selected with probability proportional to its fitness score.
 Boltzmann Selection: Selection probabilities follow a Boltzmann distribution controlled by a temperature parameter that is gradually lowered, shifting the search from exploration to exploitation.

6. Fitness Function:
 The fitness function is used to evaluate the fitness of each individual in the population.
 It is typically the value of the objective function in the optimization problem being addressed.

7. Population:
 The population is the set of individuals that are evolved through generations.
 It is typically initialized with a random set of individuals.

8. Generation:
 A generation is a term used to describe the population in each iteration of the evolution.
 Each generation is created by stochastically selecting the fittest individuals from the current population, recombining their
genomes, and introducing random mutations.

9. Termination Criteria:
 The algorithm typically ends when the population has reached a desirable fitness level or the maximum number of
generations has been produced.
These operators are used in combination to evolve the population towards better solutions.
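Several of the operators listed above can be sketched directly in code. The function names and the default mutation rate below are illustrative assumptions:

```python
import random

def single_point_crossover(p1, p2):
    # Cut both parents once at the same point and exchange the tails.
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def two_point_crossover(p1, p2):
    # Choose two crossover points and exchange the contents between them.
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return (p1[:a] + p2[a:b] + p1[b:],
            p2[:a] + p1[a:b] + p2[b:])

def bit_flip_mutation(ind, rate=0.1):
    # Flip each bit independently with probability `rate`.
    return [1 - g if random.random() < rate else g for g in ind]
```

For example, `single_point_crossover([0]*8, [1]*8)` yields two complementary children, one starting with 0s and ending with 1s and one with the reverse pattern.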

GA (GENETIC ALGORITHM):
GA (Genetic Algorithm) is good at taking large, potentially huge, search spaces and navigating them in search of optimal solutions which we might otherwise not find in a lifetime. GAs are better than other traditional algorithms in that they are more robust: they do not break easily even if the inputs are changed slightly or in the presence of reasonable noise.
GAs are used to solve complicated optimization problems, such as organizing timetables, job-shop scheduling, and playing games. The concept of a GA is directly derived from natural evolution and heredity, i.e. inheritance, where a child inherits characteristics (stored in the chromosomes) from its parents.
Operators in Genetic algorithm :-
1. Crossover (Recombination):-
Crossover is the process of taking two parent solutions and producing a child from them. After the selection (reproduction) process, the population is enriched with better individuals. The crossover operator is applied to the mating pool with the hope that it creates better offspring.
The various crossover techniques are –
i) Single-Point Crossover – Here the two mating chromosomes are cut once at corresponding points and the sections after the cuts are exchanged.

ii) Two-Point Crossover – Here two crossover points are chosen and the contents between these points are exchanged between two mated parents.

2. Inversion:-
The inversion operator reverses the order of the bits between two random sites. For example,
01 0011 1
becomes, after inverting the middle segment,
01 1100 1
3. Deletion:-
i) Deletion and duplication – Here any two or three bits are selected at random, deleted, and their preceding bits are duplicated in their place.
before deletion: 00 1001 0
deletion: 00 10_ _ 0
duplication: 00 1010 0
ii) Deletion and regeneration – Here the bits between the cross sites are deleted and regenerated randomly.
10 0110 1
10 _ _ _ _ 1
10 1101 1
4. Mutation:-
After crossover, the strings are subjected to mutation. Mutation prevents the algorithm from being trapped in a local minimum. It plays the role of recovering lost genetic material as well as randomly distributing genetic information. It helps the search escape the trap of local minima and maintains diversity in the population. Mutation of a bit involves flipping it, changing 0 to 1 and vice versa.
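The bit-level examples above can be reproduced with short helper functions. This is an illustrative sketch; the half-open index convention (site `j` excluded) is an assumption:

```python
import random

def inversion(ind, i, j):
    # Reverse the order of the bits between sites i and j.
    return ind[:i] + ind[i:j][::-1] + ind[j:]

def deletion_and_regeneration(ind, i, j):
    # Delete the bits between the cross sites and regenerate them randomly.
    return ind[:i] + [random.randint(0, 1) for _ in range(j - i)] + ind[j:]

# Reproducing the inversion example: 01|0011|1 -> 01|1100|1
print(inversion([0, 1, 0, 0, 1, 1, 1], 2, 6))   # [0, 1, 1, 1, 0, 0, 1]
```

Note that both operators preserve string length, unlike plain deletion, so the chromosome encoding stays valid.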

The integration of genetic algorithms (GAs) with neural networks (NNs):-

The integration of genetic algorithms (GAs) with neural networks (NNs) in soft computing involves combining the
strengths of both techniques to optimize and enhance the performance of neural network-based systems. Here are some
key points about the integration of GAs with NNs:

1. Genetic Algorithm Operators:


 Selection: Roulette wheel selection and Boltzmann selection are used to select the fittest individuals from the population.
 Crossover (Recombination): Single-point crossover and two-point crossover are used to combine the genetic
information of two parent individuals.
 Mutation: Random mutation is used to introduce new genetic variations in the population.
 Inversion: Inversion operator is used to invert the bits between two random sites.
2. Neural Network Operators:
 Weight Initialization: Genetic algorithms can be used to initialize the weights of a neural network.
 Hyperparameter Tuning: Genetic algorithms can optimize hyperparameters such as learning rates, batch sizes, dropout
rates, and weight initialization schemes for neural networks.
 Neuroevolution: Genetic algorithms can evolve neural network weights and biases directly.
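The neuroevolution idea can be illustrated with a toy sketch. Everything here — the logical-AND task, the elitist truncation-selection loop, and all parameter values — is an assumption chosen for illustration; it evolves the three real-valued weights of a single neuron rather than a full network:

```python
import random

# Training data for logical AND: ((x1, x2), target).
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def predict(w, x):
    # Single neuron with a hard threshold: w = [w1, w2, bias].
    return 1 if w[0] * x[0] + w[1] * x[1] + w[2] > 0 else 0

def fitness(w):
    # Number of correctly classified cases (0..4).
    return sum(predict(w, x) == y for x, y in DATA)

def mutate(w, sigma=0.3):
    # Gaussian perturbation of each weight (ES-style mutation).
    return [g + random.gauss(0, sigma) for g in w]

# Elitist loop: keep the 10 fittest, refill with mutants of the survivors.
pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(20)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = max(pop, key=fitness)
print(fitness(best))   # typically reaches 4 (all four cases correct)
```

Evolving weights this way needs no gradients, which is the main appeal of neuroevolution when the network or loss is non-differentiable.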

3. Integration:
 Architecture Search: Genetic algorithms can help search for the optimal architecture of a neural network.
 Feature Selection: Genetic algorithms can be used to select the most relevant features or inputs for a neural network.
 Ensemble Learning: Genetic algorithms can create an ensemble of neural networks with diverse architectures or
initializations.
 Transfer Learning: Genetic algorithms can optimize the transfer of knowledge from pre-trained neural networks to new
tasks.
 Neural Network Optimization: Genetic algorithms can optimize the weights and biases of a neural network to fine-tune
its performance for specific tasks.

4. Applications:
 Financial Early Warning System: Genetic algorithms and neural networks can be integrated to build financial early-warning systems.
 Traveling Salesperson Problem (TSP): Genetic algorithms can be used to efficiently address the TSP.

5. Advantages:
 Global Search: Genetic algorithms can perform a global search of the solution space, which can help in finding the
optimal solution.
 Efficient Exploration: Genetic algorithms can efficiently explore large solution spaces, making them particularly valuable
for tasks like parameter optimization, search, and machine learning model selection.
 Diversity: Genetic algorithms can maintain diversity in the population, which can help in avoiding local optima and
finding better solutions.

6. Challenges:
 Complexity: The integration of GAs with NNs can be complex and require careful tuning of parameters.
 Computational Cost: The computational cost of integrating GAs with NNs can be high, especially for large-scale
problems.

7. Future Directions:
 Hybrid Approaches: Hybrid approaches that combine GAs with other optimization techniques, such as particle swarm
optimization or simulated annealing, can be explored.
 Large-Scale Problems: Genetic algorithms can be used to solve large-scale problems, such as those involving high-
dimensional or non-convex parameter spaces.

8. Conclusion:
 Integration of GAs with NNs: The integration of genetic algorithms with neural networks can help in optimizing and
enhancing the performance of neural network-based systems.
 Advantages and Challenges: The integration of GAs with NNs has several advantages, including global search, efficient
exploration, and diversity, but also challenges, such as complexity and computational cost.
 Future Directions: Future directions include hybrid approaches, large-scale problems, and exploring the potential of GAs
in other domains.

The integration of genetic algorithms (GAs) with fuzzy logic :-

The integration of genetic algorithms (GAs) with fuzzy logic in soft computing involves combining the
strengths of both techniques to optimize and enhance the performance of fuzzy logic-based systems. Here are
some key points about the integration of GAs with fuzzy logic:
1. Fuzzy Genetic Algorithms (FGAs):
 FGAs combine fuzzy logic and genetic algorithms to improve the performance of GAs.
 Fuzzy logic is used to handle imprecise variables with membership degrees between 0 and 1, while GAs use techniques like selection, crossover, and mutation to evolve solutions.

2. Integration Approaches:
 There are two main ways to integrate fuzzy logic and GAs:
1. Using GAs to optimize the parameters of a fuzzy logic system
2. Using fuzzy logic to enhance the performance of GAs

3. Optimizing Fuzzy Logic Systems with GAs:


 GAs can be used to optimize the membership functions, rules, and other parameters of a fuzzy logic system.
 This can help in automating the design of fuzzy logic systems and improving their performance.
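Optimizing membership functions with a GA can be sketched as follows. This is an illustrative toy example: the triangular membership function, the target data points, the parameter ranges, and the elitist loop are all assumptions, not a standard recipe:

```python
import random

def triangular(x, a, b, c):
    # Triangular membership function with feet at a and c, peak at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical target membership values the GA should reproduce.
TARGET = [(2.0, 0.0), (4.0, 0.5), (6.0, 1.0), (8.0, 0.5), (10.0, 0.0)]

def error(params):
    # Sum of squared errors against the target (to be minimized).
    a, b, c = params
    return sum((triangular(x, a, b, c) - mu) ** 2 for x, mu in TARGET)

def mutate(p, sigma=0.4):
    a, b, c = (g + random.gauss(0, sigma) for g in p)
    return tuple(sorted((a, b, c)))      # keep a <= b <= c valid

# Elitist loop: keep the 10 best triples, refill with mutants.
pop = [tuple(sorted(random.uniform(0, 12) for _ in range(3)))
       for _ in range(30)]
for _ in range(60):
    pop.sort(key=error)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

print(min(error(p) for p in pop))   # residual fitting error
```

The same encoding idea scales up: a full fuzzy system would concatenate the parameters of all membership functions (and possibly rule weights) into one chromosome.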

4. Enhancing GAs with Fuzzy Logic:


 Fuzzy logic can be used to enhance the performance of GAs by:
 Handling imprecise variables in the GA
 Improving the selection, crossover, and mutation operators
 Handling constraints and multi-objective optimization problems

5. Applications:
 FGAs have been applied to a wide range of problems in engineering, economics, and other domains.
 Examples include optimization problems, control systems, and decision support systems.

6. Advantages:
 FGAs can handle imprecise and uncertain information better than traditional GAs.
 They can improve the performance of GAs by handling constraints and multi-objective optimization problems
more effectively.

7. Challenges:
 Integrating fuzzy logic and GAs can be complex and require careful tuning of parameters.
 The computational cost of FGAs can be higher than traditional GAs, especially for large-scale problems.

8. Future Directions:
 Exploring new ways to integrate fuzzy logic and GAs, such as using fuzzy logic to guide the search process of
GAs.
 Applying FGAs to solve large-scale and real-world problems in various domains.

9. Conclusion:
 The integration of genetic algorithms with fuzzy logic can enhance the performance of fuzzy logic-based
systems and improve their ability to handle imprecise and uncertain information.
 FGAs have been successfully applied to a wide range of problems and have shown promising results.
 However, there are still challenges in terms of complexity and computational cost that need to be addressed.

Issues related to GAs:-

Genetic Algorithms (GAs) are a type of optimization technique used to find the best solution among a set of possible
solutions. Here are some key issues related to GAs:
1. Premature Convergence: GAs can sometimes stop searching for better solutions too early, missing the best solution.
2. Parameter Tuning: The performance of GAs depends on several parameters, such as population size, crossover rate, and
mutation rate. Finding the right combination of these parameters can be challenging.
3. Computational Complexity: GAs can be computationally expensive, especially for large-scale problems.
4. Representation and Encoding: The way the problem is represented and encoded can significantly impact the
performance of GAs.
5. Diversity Maintenance: Maintaining diversity in the population is important to prevent premature convergence.
6. Hybridization: Combining GAs with other optimization techniques can improve their performance.
7. Theoretical Analysis: Understanding the behavior and convergence properties of GAs is an active area of research.
8. Real-World Applications: Applying GAs to real-world problems can be challenging due to issues such as noisy or
incomplete data, constraints, and multi-objective optimization.
