Unit 0: Parameter Estimation and Tuning
Unit 1: Exact Optimization Techniques
Chapter 1: Dynamic Programming
Chapter 2: Branch and Bound
Chapter 3: Gomory’s Cutting Plane Method
Chapter 4: Branch and Cut Algorithm
Chapter 5: Column Generation Technique
Chapter 6: Branch and Price Algorithm
Unit 2: Heuristics
Chapter 7: Greedy Algorithms
Chapter 8: Local Search Algorithms
Chapter 9: Constructive Heuristics
Chapter 10: Randomized Heuristics
Chapter 11: Nearest Neighbor Heuristics
Chapter 12: Divide and Conquer Heuristics
Unit 3: Meta-heuristics
Chapter 13: Genetic Algorithms
Chapter 14: Simulated Annealing
Chapter 15: Tabu Search
Chapter 16: Ant Colony Optimization
Chapter 17: Particle Swarm Optimization
Chapter 18: Differential Evolution
Chapter 19: Artificial Bee Colony
Chapter 20: Harmony Search
Unit 4: Hybrid Algorithms
Chapter 21: Memetic Algorithms
Chapter 22: Metaheuristic – Exact Algorithms
Chapter 23: Metaheuristic – Heuristic Algorithms
Chapter 24: Hyper-Heuristics
Chapter 25: Hybrid Evolutionary Algorithms
Optimization algorithms have become fundamental to a wide range of disciplines, including computer
science, operational research, economics, engineering, and artificial intelligence. These algorithms are
designed to find the best solution from a set of feasible solutions, often under a given set of constraints.
The problem introduced in this paper is a customer-oriented, preference-based facility location-allocation
problem, a critical area in operational research and a sub-category of supply chain management. Over the
years, various optimization algorithms have been developed to solve such problems. The main classes of
optimization algorithms are explained as follows:
1.1 Exact Algorithms
Exact algorithms are optimization algorithms that converge to the optimal solution. They guarantee
that the solution produced is the best possible one. These algorithms explore the entire solution space
and thus ensure optimality. Their key characteristics are optimality (a guarantee of finding the best
possible solution, known as the optimal solution), completeness (a systematic exploration of all possible
solutions), determinism (exact algorithms follow a predictable path and, for the same problem instance,
always produce the same solution), and complexity (they may take long computational times to converge
due to the exhaustive nature of the search process). The main exact algorithms are discussed below:
• Branch and Bound: The branch and bound (B & B) algorithm systematically explores the solution space
by dividing it into smaller sub-problems (branching) and then calculating bounds on the best possible
solution within each sub-problem. Sub-problems that cannot produce better solutions than the current
best solution are discarded (bounding). This algorithm performs particularly well on pure integer
programming problems.
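To make the branching and bounding steps concrete, a minimal Python sketch for the 0/1 knapsack problem is given below; the bound used is the fractional (LP-relaxation) bound, and all names and the instance sizes are illustrative, not taken from the problem studied in this paper.

```python
# Minimal branch-and-bound sketch for the 0/1 knapsack problem.
# Branching: include/exclude each item in turn.
# Bounding: fill remaining capacity fractionally (LP relaxation).
def knapsack_bb(values, weights, capacity):
    n = len(values)
    # Sort items by value density so the fractional bound is tight.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]

    def bound(k, cap, val):
        # Optimistic bound: take items greedily, last one fractionally.
        for i in range(k, n):
            if w[i] <= cap:
                cap -= w[i]
                val += v[i]
            else:
                return val + v[i] * cap / w[i]
        return val

    best = 0

    def branch(k, cap, val):
        nonlocal best
        if val > best:
            best = val
        if k == n or bound(k, cap, val) <= best:
            return  # prune: this sub-problem cannot beat the incumbent
        if w[k] <= cap:          # branch: include item k
            branch(k + 1, cap - w[k], val + v[k])
        branch(k + 1, cap, val)  # branch: exclude item k

    branch(0, capacity, 0)
    return best
```

Because the pruning test only discards sub-problems whose optimistic bound cannot beat the incumbent, the returned value is guaranteed optimal, illustrating the optimality property described above.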
• Cutting Plane Method: The cutting plane method starts with a relaxed version of the problem (ignoring
some constraints, mainly the integrality restrictions) and iteratively adds linear constraints (cuts) that
exclude portions of the search space without excluding any feasible integer points. This algorithm is
primarily used to solve integer programming and combinatorial optimization problems.
• Branch and Cut: Branch and Cut (B & C) is a hybrid of the branch and bound algorithm and the cutting
plane method. It branches on the possible solutions and uses cutting planes to reduce the feasible region
in which the solution is searched. This algorithm is useful for solving large-scale integer programming
problems.
• Column Generation Method: The column generation algorithm is used to solve large-scale linear
programming problems. Instead of considering all possible variables (columns) at once, the algorithm
begins with a manageable subset of columns and iteratively adds new columns that can improve the
objective, identified by solving a pricing sub-problem.
• Branch and Price: The Branch and Price (B & P) algorithm combines branch and bound with column
generation. It decomposes the problem into a master problem and sub-problems and drives the solution
to optimality. This algorithm is commonly used for large-scale integer programming problems,
especially vehicle routing and crew scheduling problems.
• Enumeration Algorithm: This algorithm is mainly used to solve binary integer programming problems.
It explicitly generates and evaluates all possible solutions; owing to the combinatorial growth in the
number of possible solutions, the method is feasible only for small-scale combinatorial problems.
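The exhaustive character of enumeration can be sketched for a small binary knapsack: with n items the loop below visits all 2^n assignments, which is exactly why the method only scales to small instances. The instance is illustrative.

```python
from itertools import product

# Explicit enumeration for a binary knapsack: evaluate every 0/1 assignment.
# The number of assignments doubles with each additional item (2^n total).
def knapsack_enumerate(values, weights, capacity):
    n = len(values)
    best = 0
    for x in product((0, 1), repeat=n):      # all 2^n binary vectors
        weight = sum(w * xi for w, xi in zip(weights, x))
        if weight <= capacity:               # keep only feasible solutions
            best = max(best, sum(v * xi for v, xi in zip(values, x)))
    return best
```

On the same instance, enumeration and branch and bound must agree, since both are exact; branch and bound simply avoids visiting assignments that the bound proves useless.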
Exact algorithms are powerful tools for finding optimal solutions to optimization problems. Due to their
computational intensity, however, they have the following limitations: (I) Scalability - as the problem size
increases, the number of potential solutions can grow exponentially, making exact algorithms
computationally infeasible for large-scale problems. (II) Computational Resources - exact algorithms often
require significant computational power and memory, which can be a limiting factor for very large and
complex problems. (III) Time Constraints - for real-time applications, exact algorithms might be too slow
to deliver solutions within the required time-frame. For large-scale optimization problems, heuristics or
meta-heuristics are used as alternatives, trading off optimality for speed and feasibility.
1.2 Heuristics
Heuristic algorithms are optimization methods that employ practical, rule-of-thumb strategies to find
near-optimal solutions within a reasonable time-frame. Unlike exact algorithms, heuristics do not
guarantee to find the optimal solution. Their key characteristics are efficiency (designed to find good
solutions quickly with significantly less computational time), approximation (they provide approximate
solutions, and solution quality can vary depending on the problem and the specific heuristic used),
simplicity (they are simpler to implement and understand, making them accessible to a wide range of
applications), and flexibility (heuristics can be adapted and modified for different types of problems,
making them versatile tools in optimization and problem-solving). The common types of heuristics are
explained below:
1. Greedy Algorithms: A greedy algorithm makes a series of decisions, choosing the best available option
at each step. It builds a solution piece by piece, always selecting the next piece that offers the most
immediate benefit. Prim's and Kruskal's algorithms for the minimum spanning tree problem and
Dijkstra's algorithm for the shortest path problem are classic examples. Greedy algorithms can be
shortsighted, producing suboptimal solutions when the best local choice does not lead to the best
global solution.
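The shortsightedness of greedy choices can be seen in a small coin-change sketch: the greedy rule is optimal for canonical coin systems but fails on others. The coin sets below are illustrative.

```python
# Greedy coin change: repeatedly take the largest coin that still fits.
def greedy_coin_change(coins, amount):
    used = []
    for c in sorted(coins, reverse=True):   # locally best (largest) coin first
        while amount >= c:
            amount -= c
            used.append(c)
    return used if amount == 0 else None    # None if no exact change was made
```

For the canonical system [25, 10, 5, 1] the greedy result is optimal, but for coins [4, 3, 1] and amount 6 it returns [4, 1, 1] (three coins) although [3, 3] uses only two, illustrating exactly the local-versus-global gap described above.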
2. Local Search Algorithms: Local search algorithms start with an initial solution and iteratively
improve it by making small changes, such as swapping elements or adjusting parameters. The search
continues until no further improvement can be made or a stopping criterion is met. Local search
techniques are used for large-scale problems such as vehicle routing and job scheduling. However, these
algorithms can get stuck in local optima, where no small change leads to a better solution even though
a better global solution exists.
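A minimal hill-climbing sketch over the integers illustrates both the iterative-improvement loop and how the search stops at a local optimum; the objective functions are toy examples.

```python
# Hill climbing over the integers with a +/-1 neighbourhood: keep taking the
# best small change while it improves the objective, then stop.
def local_search(f, x0, steps=1000):
    x = x0
    for _ in range(steps):
        best = min((x - 1, x + 1), key=f)   # best neighbouring solution
        if f(best) >= f(x):                 # no improving neighbour: stop
            return x
        x = best
    return x

f1 = lambda x: (x - 3) ** 2                           # single basin
f2 = lambda x: min((x + 5) ** 2 + 1, (x - 5) ** 2)    # two basins
```

Starting f2 at x = -10, the search stops at the local optimum x = -5 (value 1) even though the global optimum is x = 5 (value 0), which is precisely the limitation noted above.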
3. Constructive Heuristics: Constructive heuristics build a solution from scratch by adding components
one by one, based on a specific criterion, until a complete solution is formed. These heuristics are used
in scheduling, clustering, and routing problems. The final solution quality depends heavily on the order
in which components are added, which may lead to sub-optimal solutions.
4. Randomized Heuristics: These algorithms introduce randomness into the search process, either by
randomly generating initial solutions or by making random choices during the search. This helps in
exploring a broader solution space and avoiding local optima. Examples include random walks, Monte
Carlo methods, and certain randomized graph algorithms. While randomness can help escape local
optima, it may also lead to less predictable performance and inconsistent solution quality.
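A random-restart hill climber is one of the simplest randomized heuristics: randomness enters only through the initial solutions, and repeated restarts let the search escape poor basins. The objective and bounds below are illustrative.

```python
import random

# Random-restart hill climbing: run a greedy +/-1 descent from several
# random starting points and keep the best local optimum found.
def random_restart_hill_climb(f, lo, hi, restarts=20, seed=6):
    rng = random.Random(seed)
    best, fbest = None, float("inf")
    for _ in range(restarts):
        x = rng.randint(lo, hi)                 # random initial solution
        while True:
            step = min((x - 1, x + 1), key=f)   # greedy local move
            if f(step) >= f(x):
                break                           # local optimum reached
            x = step
        if f(x) < fbest:
            best, fbest = x, f(x)
    return best, fbest
```

On a two-basin function a single descent may end in the worse basin, but with enough random restarts at least one start lands in the global basin, trading determinism for robustness.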
5. Nearest Neighbor Heuristic: Starting from an initial point, the algorithm repeatedly selects the
nearest unvisited point as the next step in building the solution. It can result in poor-quality solutions,
especially when early decisions limit the effectiveness of later ones.
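The rule can be sketched in a few lines for a Euclidean travelling-salesman instance; `points` is any list of 2-D coordinates, and the example is illustrative.

```python
import math

# Nearest-neighbour tour construction for a Euclidean TSP instance:
# from the current city, always visit the closest unvisited city next.
def nearest_neighbor_tour(points, start=0):
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(last, points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour
```

Each step is cheap, but the final edge back to the start is never considered while building the tour, which is one way early decisions degrade the end result.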
6. Divide and Conquer Heuristics: This approach breaks a large problem into smaller sub-problems,
solves each sub-problem individually, and then combines their solutions to form a complete solution. It
is used in sorting algorithms (e.g., merge sort and quicksort) and matrix multiplication. The
divide-and-conquer approach can lead to sub-optimal solutions if the sub-problems are not independent
or if combining their solutions is complex.
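Merge sort is the textbook instance of this pattern: split the input, solve each half recursively, then combine the two sorted halves.

```python
# Divide and conquer illustrated by merge sort.
def merge_sort(xs):
    if len(xs) <= 1:
        return xs                                  # base case
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])                    # divide + conquer
    right = merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):        # combine step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```

Here the combine step is cheap and the sub-problems are independent, so the result is exact; as the text notes, when sub-problems interact the same pattern only yields a heuristic.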
Heuristic algorithms are valuable tools for solving complex and large-scale optimization problems where
exact methods are impractical. They strike a balance between solution quality and computational
efficiency, making them useful in many real-world applications. Heuristics nevertheless have some
limitations: (I) they may not converge to an optimal solution, and both solution quality and computational
time are problem- and heuristic-specific. (II) Many heuristics are tailored to specific types of problems,
and their effectiveness may diminish when applied to different problem domains. (III) Unlike exact
algorithms, heuristics typically lack theoretical guarantees on the performance or accuracy of the
solution, making it challenging to assess solution quality. (IV) Heuristics often get trapped in local
optima, particularly in complex, multi-modal search spaces. (V) Each heuristic requires problem-specific
adjustments and parameter tuning to obtain solutions efficiently.
1.3 Meta-Heuristics
Meta-heuristics are high-level problem-solving strategies designed to find near-optimal solutions for
complex
optimization problems. Meta-heuristics guide and enhance the performance of heuristics to explore the
solution space more effectively, often combining multiple heuristics or incorporating elements like
randomness,
memory, and learning. The key characteristics of meta-heuristics are: (I) Guidance of Search -
meta-heuristics provide a framework for exploring the search space and help avoid local optima by
searching the
solution space more broadly. (II) Flexibility - They are adaptable to a wide range of optimization
problems
without needing significant modifications, making them versatile tools in optimization. (III) Balance
Between
Exploration and Exploitation - Meta-heuristics typically balance exploration (searching new areas of the
solution space) and exploitation (refining known good solutions) to find high-quality solutions efficiently.
(IV)
No Guarantee of Optimality - Like heuristics, meta-heuristics do not guarantee finding the global
optimum,
but they often deliver good solutions within a reasonable time. (V) Stochastic Nature - many
meta-heuristics incorporate randomness in their search process, leading to variability in the solutions
found across different runs. The common types of meta-heuristics are discussed below:
1. Genetic Algorithm: The genetic algorithm (GA) is inspired by the process of natural selection. It
starts with a population of candidate solutions (individuals) and evolves this population over successive
generations. Individuals are selected based on their fitness, and new individuals are created through
crossover (combining parts of two solutions) and mutation (random changes). GA is applied across
almost all disciplines, as it is effective at exploring large solution spaces and finding global solutions,
especially for complex, multi-modal problems; however, it requires careful tuning of parameters such as
population size, crossover rate, and mutation rate, and can be computationally expensive.
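A minimal GA sketch on the OneMax problem (maximise the number of 1-bits) shows the selection, crossover, and mutation loop; the population size, rates, and tournament size are illustrative settings, not recommendations.

```python
import random

# Minimal genetic algorithm for OneMax: fitness = number of 1-bits.
def genetic_algorithm(n_bits=20, pop_size=30, generations=60,
                      mutation_rate=0.02, seed=0):
    rng = random.Random(seed)
    fitness = sum
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ 1 if rng.random() < mutation_rate else b
                     for b in child]                    # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

OneMax is deliberately easy; the point of the sketch is the evolutionary loop itself, and the same skeleton applies to any bit-encoded fitness function.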
2. Simulated Annealing: Simulated annealing is inspired by the annealing process in metallurgy. The
algorithm starts with an initial solution and explores the solution space by making small random changes.
A worse solution may be accepted with a certain probability, which decreases over time (simulating
cooling), allowing the algorithm to escape local optima. It is used in continuous optimization problems,
neural network training, and inventory management. Its performance is sensitive to the cooling schedule
(the rate at which the probability of accepting worse solutions decreases), and it may require many
iterations to converge to a good solution.
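The acceptance rule and cooling schedule can be sketched on a small multi-modal function; the initial temperature, cooling factor, and step size below are illustrative choices.

```python
import math
import random

# Simulated annealing minimising f(x) = x^4 - 3x^2 + x, which has two basins
# with the global minimum near x = -1.3. Geometric cooling is used.
def simulated_annealing(f, x0, temp=10.0, cooling=0.99, steps=2000, seed=1):
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for _ in range(steps):
        cand = x + rng.gauss(0, 0.5)          # small random change
        fc = f(cand)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        temp *= cooling                       # geometric cooling schedule
    return best, fbest
```

Early on, the high temperature lets the walk cross the barrier between basins; as the temperature drops the rule degenerates into pure hill climbing, which is why the cooling rate matters so much.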
3. Tabu Search: Tabu search is a local search algorithm that uses memory structures to avoid cycling back
to previously visited solutions. It maintains a list of tabu (forbidden) solutions or moves that cannot be
revisited for a certain number of iterations. This encourages the exploration of new areas of the solution
space and helps avoid local optima. The algorithm requires careful parameter tuning and can be complex
to implement.
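A minimal tabu-search sketch over bit strings: the neighbourhood flips one bit, and recently flipped positions stay tabu for a fixed tenure, which forces the search away from just-visited solutions even when the move is worsening. Tenure and iteration counts are illustrative.

```python
from collections import deque

# Tabu search maximising f over bit strings with a one-bit-flip neighbourhood.
def tabu_search(f, n_bits, iters=100, tenure=5):
    x = [0] * n_bits
    best, fbest = x[:], f(x)
    tabu = deque(maxlen=tenure)          # positions we may not flip yet
    for _ in range(iters):
        def flipped(i):
            y = x[:]
            y[i] ^= 1
            return y
        candidates = [i for i in range(n_bits) if i not in tabu]
        # Take the best non-tabu move, even if it worsens the solution.
        i = max(candidates, key=lambda i: f(flipped(i)))
        x = flipped(i)
        tabu.append(i)                   # oldest entry expires automatically
        if f(x) > fbest:
            best, fbest = x[:], f(x)
    return best, fbest
```

Accepting the best non-tabu move unconditionally is what lets the method leave a local optimum; the best-so-far record is kept separately because the current solution may move downhill.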
4. Ant Colony Optimization: Ant Colony Optimization (ACO) is inspired by the foraging behavior of ants;
it uses a colony of artificial ants to explore the solution space. Ants deposit pheromone trails along the
paths they take, and the intensity of these trails influences the paths chosen by subsequent ants. Over
time, the trails guide the colony towards high-quality solutions. Common applications include vehicle
routing, network optimization, and scheduling problems. ACO may converge prematurely if the
pheromone trail becomes too concentrated on sub-optimal paths.
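A compact ACO sketch for a tiny symmetric TSP shows the probabilistic path construction and the pheromone update; the deposit and evaporation rates are illustrative, and `dist` is any matrix of pairwise distances.

```python
import math
import random

# Ant colony optimisation for a small symmetric TSP.
def aco_tsp(dist, n_ants=10, iters=50, alpha=1.0, beta=2.0,
            evaporation=0.5, seed=5):
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]           # pheromone levels
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in tour]
                # Attractiveness = pheromone^alpha * (1/distance)^beta.
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in choices]
                tour.append(rng.choices(choices, weights)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporate, then deposit pheromone inversely proportional to length.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= 1 - evaporation
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += 1.0 / length
                tau[j][i] += 1.0 / length
    return best_tour, best_len
```

Shorter tours deposit more pheromone per edge, so good edges become progressively more attractive; the evaporation term is what limits the premature-convergence risk mentioned above.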
5. Particle Swarm Optimization: Particle Swarm Optimization (PSO) is inspired by the social behavior of
bird flocks and fish schools. A population of particles (candidate solutions) moves through the solution
space, with each particle adjusting its position based on its own experience and that of neighboring
particles. Particles are influenced by their best-known position and the best-known positions of their
neighbors. PSO is used in continuous optimization problems, neural network training, and function
optimization; its performance depends on the balance between exploration and exploitation.
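The velocity and position update can be sketched for minimising the 2-D sphere function; the inertia weight and the cognitive/social coefficients below are common illustrative values, not tuned settings.

```python
import random

# Global-best PSO minimising a function f over a real-valued search space.
def pso(f, dim=2, n_particles=20, iters=200, seed=2):
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social weights
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's personal best
    gbest = min(pbest, key=f)[:]         # swarm's global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest, f(gbest)
```

The inertia term sustains exploration while the two attraction terms pull particles toward known good regions; shifting their relative weights shifts the exploration/exploitation balance discussed above.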
6. Differential Evolution: Differential Evolution (DE) is an evolutionary algorithm that operates on a
population of candidate solutions. New solutions are generated by combining existing solutions using a
differential mutation strategy, then evaluated and selected based on their fitness. DE is applied in
continuous optimization, engineering design, and parameter estimation problems. Its performance can be
sensitive to the choice of mutation strategy and control parameters, and it may struggle with highly
complex or multi-modal landscapes.
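A DE/rand/1/bin sketch on the sphere function illustrates differential mutation, binomial crossover, and greedy selection; F and CR below are typical illustrative settings.

```python
import random

# Differential evolution (DE/rand/1/bin) minimising f over real vectors.
def differential_evolution(f, dim=2, pop_size=20, iters=150,
                           F=0.8, CR=0.9, seed=3):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        for i in range(pop_size):
            # Three distinct vectors other than the target.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)       # ensures at least one mutated gene
            trial = [a[d] + F * (b[d] - c[d])           # differential mutation
                     if d == j_rand or rng.random() < CR
                     else pop[i][d]                      # binomial crossover
                     for d in range(dim)]
            if f(trial) <= f(pop[i]):                    # greedy selection
                pop[i] = trial
    best = min(pop, key=f)
    return best, f(best)
```

The mutation step size scales with the spread of the population, so DE automatically takes smaller steps as the population converges, which is part of its appeal for parameter estimation.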
7. Artificial Bee Colony: The Artificial Bee Colony (ABC) algorithm is inspired by the foraging behavior of
honey bees. It involves three types of bees: employed bees, onlookers, and scouts. Employed bees search
for food sources (solutions) and share this information with onlookers, who then choose food sources to
exploit. Scouts search for new food sources when the current ones are exhausted. ABC is used in
combinatorial and continuous optimization problems such as scheduling, clustering, and function
optimization. It may require many evaluations to converge, and its performance can be sensitive to the
choice of parameters.
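The three bee roles can be sketched for the 2-D sphere function; the colony size and the abandonment limit are illustrative, and the onlooker weights are recomputed only once per cycle to keep the sketch short.

```python
import random

# Artificial bee colony minimising the sphere function f(p) = sum(x^2).
def abc_sphere(dim=2, n_sources=10, iters=100, limit=20, seed=7):
    rng = random.Random(seed)
    f = lambda p: sum(x * x for x in p)
    src = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_sources)]
    trials = [0] * n_sources
    cur = min(src, key=f)
    best, fbest = cur[:], f(cur)

    def try_neighbor(i):
        # Perturb one dimension relative to a random other source.
        k = rng.choice([j for j in range(n_sources) if j != i])
        d = rng.randrange(dim)
        cand = src[i][:]
        cand[d] += rng.uniform(-1, 1) * (src[i][d] - src[k][d])
        if f(cand) < f(src[i]):
            src[i], trials[i] = cand, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_sources):                  # employed-bee phase
            try_neighbor(i)
        qual = [1.0 / (1.0 + f(s)) for s in src]    # onlooker phase
        for _ in range(n_sources):
            i = rng.choices(range(n_sources), qual)[0]
            try_neighbor(i)
        cur = min(src, key=f)
        if f(cur) < fbest:
            best, fbest = cur[:], f(cur)
        for i in range(n_sources):                  # scout phase
            if trials[i] > limit:
                src[i] = [rng.uniform(-5, 5) for _ in range(dim)]
                trials[i] = 0
    return best, fbest
```

The trial counters are what implement "exhausted" sources: a source that resists improvement for `limit` attempts is abandoned and replaced by a scout's random exploration.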
8. Harmony Search: Inspired by the improvisation process of musicians, Harmony Search generates new
solutions by combining existing solutions (like harmonies in music). Solutions are selected based on a
probability rule, and new harmonies are generated by memory consideration, pitch adjustment, or
random selection. It is applied in engineering optimization, resource allocation, and design problems.
Performance can be sensitive to the choice of parameters such as harmony memory size and pitch
adjustment rate.
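The three generation rules (memory consideration, pitch adjustment, random selection) can be sketched as follows for minimising the sphere function; HMCR, PAR, and the pitch bandwidth are typical illustrative values.

```python
import random

# Harmony search minimising the sphere function over [-5, 5]^dim.
def harmony_search(dim=2, hms=10, iters=2000, hmcr=0.9, par=0.3,
                   bw=0.1, seed=9):
    rng = random.Random(seed)
    f = lambda p: sum(x * x for x in p)
    memory = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:              # memory consideration
                v = rng.choice(memory)[d]
                if rng.random() < par:           # pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                                # random selection
                v = rng.uniform(-5, 5)
            new.append(v)
        # The new harmony replaces the worst member if it is better.
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):
            memory[worst] = new
    best = min(memory, key=f)
    return best, f(best)
```

Because only the worst harmony is ever replaced, the memory improves monotonically; the pitch bandwidth then controls how finely the best harmonies are refined.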
Meta-heuristic algorithms are powerful tools for tackling complex and large-scale optimization problems,
particularly when exact methods are impractical. They are highly flexible and can be applied to a wide
range of problem types; however, they require careful tuning and do not always deliver the best possible
solution. The limitations of meta-heuristics are: (I) No Guarantee of Optimality - meta-heuristics are
designed to find good solutions, but they do not guarantee finding the global optimum. (II) Parameter
Sensitivity - their performance often depends heavily on the choice of parameters, which may require
extensive tuning. (III) Computational Cost - while generally more efficient than exact algorithms for large
problems, meta-heuristics can still be computationally expensive, particularly when many iterations are
needed to converge to a good solution. (IV) Complexity - some meta-heuristics can be complex to
implement and require a deep understanding of both the algorithm and the problem domain.
1.4 Hybrid Algorithms
Hybrid algorithms combine elements from different optimization techniques—such as exact algorithms,
heuristics, and meta-heuristics—to create a more robust and effective problem-solving approach. The
goal of
hybrid algorithms is to leverage the strengths of each method while mitigating their weaknesses,
resulting in
a more powerful and flexible solution approach. The key characteristics of hybrid algorithms are: (I)
Combination of Methods - hybrid algorithms integrate multiple optimization strategies; for example, a
hybrid algorithm might combine the global search capabilities of a meta-heuristic with the precise local
search of a heuristic or exact method. (II) Enhanced Performance - by combining different techniques,
hybrid algorithms aim to improve the overall performance in terms of solution quality, speed, and
robustness compared to using a single method alone. (III) Flexibility - these algorithms are often more
adaptable to a variety of problem types and can be tailored to the specific needs or constraints of the
problem. (IV) Complexity - while hybrid algorithms can offer better performance, they are often more
complex to design and implement due to the integration of multiple methods. The different types of
hybrid algorithms are as follows:
1. Hybrid Meta-heuristic - Exact Algorithms: These algorithms combine the exploration abilities of a
meta-heuristic (such as Genetic Algorithms or Simulated Annealing) with the precision of exact
algorithms
(like Branch and Bound or Dynamic Programming). The meta-heuristic is used to explore the solution
space broadly, while the exact algorithm refines the best solutions found. For example, a Genetic
Algorithm could be used to find a good initial solution, which is then fine-tuned with a Branch and Bound
method to reach optimality. This hybrid is common in combinatorial optimization problems such as
vehicle routing, scheduling, and network design, where finding an exact solution is computationally
challenging but still desirable.
2. Hybrid Meta-heuristic - Heuristic Algorithms: These hybrids typically use a meta-heuristic to
explore the global solution space and a heuristic to perform a more detailed local search. The heuristic is
often applied to refine solutions found by the meta-heuristic or to escape local optima. In Tabu Search, a
local search heuristic might be used in conjunction with the tabu list management to escape local
optima,
with additional elements of Simulated Annealing to probabilistically accept worse solutions temporarily.
Suitable for problems like the Traveling Salesman Problem (TSP), where a good global search combined
with an effective local improvement can yield high-quality solutions.
3. Memetic Algorithms: Memetic algorithms are a specific type of hybrid meta-heuristic that combines
Genetic Algorithms with local search techniques. Each individual in the population undergoes a local
search process after the standard genetic operations (crossover, mutation) to improve its fitness before
the next generation is produced. Memetic algorithms are used in various optimization problems,
including job scheduling, machine learning, and combinatorial optimization; they effectively combine
the global search capabilities of Genetic Algorithms with the precision of local search, leading to faster
convergence to high-quality solutions.
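The GA-plus-local-search structure can be sketched on OneMax; the local search here is deliberately trivial (any single-bit improvement) so the example stays short. The point is where the refinement step sits in the loop, not the problem itself; all parameter values are illustrative.

```python
import random

# Memetic algorithm: a small GA whose offspring are refined by a
# first-improvement bit-flip local search before joining the population.
def memetic_onemax(n_bits=16, pop_size=8, generations=5, seed=4):
    rng = random.Random(seed)
    f = sum                                   # OneMax fitness

    def local_search(x):
        # Flip bits while any single flip improves fitness.
        improved = True
        while improved:
            improved = False
            for i in range(n_bits):
                y = x[:]
                y[i] ^= 1
                if f(y) > f(x):
                    x, improved = y, True
        return x

    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            p1 = max(rng.sample(pop, 2), key=f)       # tournament selection
            p2 = max(rng.sample(pop, 2), key=f)
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]               # crossover
            if rng.random() < 0.2:
                child[rng.randrange(n_bits)] ^= 1     # mutation
            nxt.append(local_search(child))           # memetic refinement
        pop = nxt
    best = max(pop, key=f)
    return best, f(best)
```

In a realistic memetic algorithm the local search is the expensive part, so it is often applied only to some offspring or truncated after a few moves; the skeleton stays the same.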
4. Hyper-Heuristics: Hyper-heuristics are high-level search methods that select or generate heuristics
(low-level heuristics) to solve optimization problems. Instead of solving the problem directly, a
hyper-heuristic chooses the best heuristic or combination of heuristics during the search process. For
example, a hyper-heuristic could dynamically select among a set of heuristics (such as Greedy, Local
Search, or Simulated Annealing) based on their performance during the search, adapting to the problem's
characteristics as the search progresses. Hyper-heuristics are often used in scheduling, bin packing, and
timetabling problems where no single heuristic performs well across all problem instances.
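A toy selection hyper-heuristic illustrates the idea: two low-level move operators compete, and the one that has produced more improvements is chosen more often. The problem (OneMax), the operators, and the credit scheme are all illustrative.

```python
import random

# Selection hyper-heuristic: choose between two low-level heuristics with
# probability proportional to their accumulated improvement credit.
def hyper_heuristic(n_bits=10, steps=2000, seed=8):
    rng = random.Random(seed)
    f = sum                                  # OneMax objective

    def flip_one(x):                         # low-level heuristic 1
        y = x[:]
        y[rng.randrange(n_bits)] ^= 1
        return y

    def flip_two(x):                         # low-level heuristic 2
        y = x[:]
        for i in rng.sample(range(n_bits), 2):
            y[i] ^= 1
        return y

    heuristics = [flip_one, flip_two]
    scores = [1.0, 1.0]                      # credit for past improvements
    x = [0] * n_bits
    for _ in range(steps):
        idx = rng.choices(range(2), scores)[0]   # pick a low-level heuristic
        y = heuristics[idx](x)
        if f(y) > f(x):                      # accept only improvements
            x = y
            scores[idx] += 1.0               # reward the heuristic that helped
    return x, f(x)
```

The hyper-heuristic never inspects the problem itself, only the performance of its operators, which is what makes the approach transferable across problem domains.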
5. Hybrid Evolutionary Algorithms: These algorithms combine evolutionary algorithms (such as Genetic
Algorithms or Evolution Strategies) with other optimization techniques to improve performance. The
hybridization can occur at different stages of the evolutionary process, such as selection, crossover, or
mutation; for example, Differential Evolution can be combined with a local search algorithm that refines
the solutions generated by the evolutionary process. These hybrids are used in complex optimization
problems in engineering, machine learning, and economic modeling.
Hybrid algorithms represent a powerful approach to solving complex optimization problems by combining
the strengths of multiple optimization techniques. Hybrid algorithms offer various benefits: (i) Improved
Solution Quality - by combining different methods, hybrid algorithms often produce higher-quality
solutions than a single technique. (ii) Faster Convergence - the complementary strengths of different
techniques can lead to faster convergence towards good solutions, particularly in large or complex search
spaces. (iii) Greater Robustness - hybrid algorithms are often more robust, performing well across a wider
range of problem instances. (iv) Flexibility and Adaptability - they can be tailored to specific problems,
making them versatile for various applications. However, they also face several challenges: (I) Complexity
of Design - designing an effective hybrid algorithm requires careful integration of different methods,
which can be complex and time-consuming. (II) Parameter Tuning - hybrid algorithms often require
extensive parameter tuning to balance the contributions of the different methods effectively. (III)
Computational Overhead - while hybrid algorithms can offer improved performance, they may also
introduce additional computational overhead, particularly if the methods being combined are
computationally intensive.