Heuristics
Solution approaches for COPs can be split into different groups, but an
immediate, though rough, classification divides these methods into two
main categories:
• exact algorithms
• heuristic algorithms.
Exact algorithms
Pros.
• a proven optimal (or no feasible) solution can be obtained for every
finite size instance
• information on upper/lower bounds on the optimal solution can be
drawn at any moment
• general modeling framework and general purpose open-source and
commercial solvers
Cons.
• the size of the instances that are practically solvable is limited
• the best exact algorithms are problem specific and often require
significant software development time
Heuristic algorithms
Pros.
• heuristic algorithm of reasonable performance can be typically
developed rather quickly
• can examine a huge number of possible solutions in short
computational time
• many general solution frameworks and methods are available
Cons.
• cannot prove optimality
• typically do not provide theoretical bounds on the quality of the
solutions they return
• no general problem modeling framework nor general solvers
•Improved heuristics
– Metaheuristics
• Iterated Local Search
• Variable neighborhood search
• Tabu Search, GRASP, Simulated Annealing
• Genetic algorithms
• …
– Matheuristics
•“Randomization”:
– randomly make one of the best k choices;
– make a random choice among the choices that are at
most p% worse than the best choice (according to the
myopic rule).
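Both randomization rules can be sketched as follows; `candidates`, `score`, and the parameter names are illustrative, and the p% rule assumes non-negative scores under a minimizing myopic rule.

```python
import random

def pick_k_best(candidates, score, k=3):
    """Randomly make one of the best k choices (lower score = better)."""
    ranked = sorted(candidates, key=score)
    return random.choice(ranked[:k])

def pick_within_p(candidates, score, p=10.0):
    """Random choice among candidates at most p% worse than the best
    choice according to the myopic rule (assumes non-negative scores)."""
    best = min(score(c) for c in candidates)
    pool = [c for c in candidates if score(c) <= best * (1 + p / 100.0)]
    return random.choice(pool)
```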
Multi-start
BEAM SEARCH
(Truncated Branch and Bound)
•Search tree (as for Branch and Bound)
•Breadth First strategy
•At each level of the search tree keep only the
best k nodes (k = Beam width).
•How to evaluate the nodes:
1) Greedy algorithms;
2) Optimal solution of a relaxed problem;
3) Weighted sum of 1) and 2).
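The scheme can be sketched as follows, assuming `children(node)` enumerates the successors of a partial solution (empty for complete solutions) and `evaluate(node)` is one of the node evaluations above (lower is better):

```python
import heapq

def beam_search(root, children, evaluate, beam_width=3):
    """Truncated breadth-first Branch and Bound: at each level of the
    search tree keep only the best `beam_width` nodes."""
    level, best_leaf = [root], None
    while level:
        next_level = []
        for node in level:
            kids = children(node)
            if not kids:                      # complete solution reached
                if best_leaf is None or evaluate(node) < evaluate(best_leaf):
                    best_leaf = node
            next_level.extend(kids)
        level = heapq.nsmallest(beam_width, next_level, key=evaluate)
    return best_leaf
```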
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
•Problem representation.
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(SYMMETRIC TSP)
[figure: a tour on the six cities 1–6 and its array representation |1|2|3|4|5|6|]
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(SYMMETRIC TSP)
[figure: the six-city tour after an exchange move]
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(SYMMETRIC TSP)
|1|2|3|4|5|6| → |1|2|5|4|3|6|
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(SYMMETRIC TSP)
|1|2|3|4|5|6| → |1|2|5|4|3|6|
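On the array representation, the move above is a segment reversal (a 2-opt exchange); a minimal sketch:

```python
def two_opt_move(tour, i, j):
    """2-opt neighbor: reverse the segment tour[i..j] (inclusive),
    which removes two edges of the tour and reconnects it the
    other way round."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
```

For instance, `two_opt_move([1, 2, 3, 4, 5, 6], 2, 4)` reproduces the move shown above.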
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(0/1 KNAPSACK)
•Solution representation:

max Σ_{j=1}^{n} c_j x_j
s.t. Σ_{j=1}^{n} w_j x_j ≤ W
x_j ∈ {0,1}, j = 1, …, n

A solution is a 0/1 vector:
x1 x2 x3 x4 x5 x6 x7 x8
0  0  1  0  1  1  0  1
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(0/1 KNAPSACK)
•Neighborhood:
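One plausible choice (the slide's own definition is not preserved here, so this is an assumption) is the set of feasible solutions obtained by flipping a single variable:

```python
def flip_neighborhood(x, weights, capacity):
    """Assumed 0/1 knapsack neighborhood: all feasible solutions
    obtained from x by flipping a single variable x_j."""
    used = sum(w for w, xi in zip(weights, x) if xi)
    neigh = []
    for j, (w, xi) in enumerate(zip(weights, x)):
        # dropping an item is always feasible; adding one must fit
        if xi == 1 or used + w <= capacity:
            y = list(x)
            y[j] = 1 - y[j]
            neigh.append(y)
    return neigh
```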
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(0/1 KNAPSACK)
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(0/1 KNAPSACK)
x1 = 1 IF w1 ≤ W ELSE x1 = 0
x2 = 1 IF w1 x1 + w2 ≤ W ELSE x2 = 0
xi = 1 IF w1 x1 + ... + wi-1 xi-1 + wi ≤ W ELSE xi = 0
etc.
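The greedy construction above, as a sketch (items are scanned in the given order; sorting them first, e.g. by profit/weight ratio, is a common refinement):

```python
def greedy_fill(weights, capacity):
    """Greedy constructive heuristic for the 0/1 knapsack: scan the
    items in the given order and select each item that still fits
    within the residual capacity."""
    x, used = [], 0
    for w in weights:
        if used + w <= capacity:
            x.append(1)
            used += w
        else:
            x.append(0)
    return x
```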
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(0/1 KNAPSACK)
•New neighborhood:
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(0/1 KNAPSACK)
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(K-TSP)
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(K-TSP – first representation)
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(K-TSP – second representation)
•Another representation:
– introduce k fictitious cities, all with the same location (the
departure city of the salesmen), and solve a unique aggregated
TSP
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(K-TSP – second representation)
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(Graph Coloring)
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(Graph Coloring)
•Solution representation:
– The Graph Coloring problem can be tackled by
iterating on the number J of available colors and
searching for a feasible solution for each value of J.
– For any given value of J
• search for a J-partition of G minimizing the total
number of crossing edges (edges whose endpoints
belong to the same class of the partition);
• every J-partition with no crossing edges is a J-
coloring of G;
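The objective of this J-partition formulation can be evaluated as follows; `edges` and `color` are illustrative names (edge list and vertex-to-class map):

```python
def crossing_edges(edges, color):
    """Objective for the J-partition formulation of Graph Coloring:
    count the edges whose endpoints lie in the same class."""
    return sum(1 for u, v in edges if color[u] == color[v])
```

A J-partition is a feasible J-coloring exactly when this count is zero.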
LOCAL SEARCH:
NEIGHBORHOOD GENERATION
(Graph Coloring)
Metaheuristics
Iterated Local Search
• Iteratively applies neighborhood search steps in order to reach
different local minima.
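A minimal sketch of the idea, assuming a `perturb` function that modifies a local minimum and an acceptance rule that simply continues from the last local minimum (one of several possible ILS acceptance criteria):

```python
def iterated_local_search(x0, local_search, perturb, cost, n_iter=20):
    """ILS sketch: repeatedly perturb the current local minimum and
    re-apply local search, keeping the best local minimum found."""
    best = x = local_search(x0)
    for _ in range(n_iter):
        x = local_search(perturb(x))    # jump to a nearby local minimum
        if cost(x) < cost(best):
            best = x
    return best
```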
Metaheuristics
Variable Neighborhood Search
Metaheuristics
Variable Neighborhood Search
• The local search step is typically of a first
improvement type.
1. Initialization:
– Select an initial solution x1 ∈ X;
– Select a set of neighborhood structures Nk with
k=1,…, kmax to be used in the procedure;
– k = 1; x* = current solution = x1.
2. Repeat:
– Generate randomly a solution x’ belonging to the
neighborhood Nk of the current solution x*;
– Apply a local search procedure to solution x’
yielding a new local minimum x’’;
– Acceptance test: IF x’’ improves upon x*, THEN
x*=x’’; k = 1; LOOP;
– ELSE k = k +1; LOOP;
Until k = kmax OR stopping test is satisfied.
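The steps above can be sketched as follows, with each entry of `neighborhoods` acting as a shaking function that returns a random solution in N_k of the current one (names are illustrative):

```python
def vns(x0, neighborhoods, local_search, cost, max_iter=100):
    """VNS sketch: `neighborhoods[k](x)` returns a random solution in
    N_k(x) (shaking); `local_search` descends to a local minimum."""
    best = local_search(x0)
    it = 0
    while it < max_iter:
        k = 0
        while k < len(neighborhoods) and it < max_iter:
            x1 = neighborhoods[k](best)     # shake: random point of N_k
            x2 = local_search(x1)           # descend to a local minimum
            it += 1
            if cost(x2) < cost(best):
                best, k = x2, 0             # improvement: back to N_1
            else:
                k += 1                      # try a larger neighborhood
    return best
```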
Metaheuristics
Variable Neighborhood Descent
Metaheuristics
Greedy Randomized Adaptive Search Procedure
2. FOR k = 1,…,kmax DO
• Generate a solution xk by means of a greedy randomized
constructive procedure;
• Apply a local search procedure to the initial solution xk
yielding solution xk’;
• If xk’ is better than x*, then x* = xk’.
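The loop above can be sketched as follows; `greedy_randomized()` stands for a randomized constructive procedure (e.g. picking among the best k choices at each step) and is an assumption of this sketch:

```python
def grasp(greedy_randomized, local_search, cost, k_max=20):
    """GRASP sketch: repeat a greedy randomized construction followed
    by local search, keeping the best local minimum found."""
    best = None
    for _ in range(k_max):
        x = local_search(greedy_randomized())
        if best is None or cost(x) < cost(best):
            best = x
    return best
```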
Metaheuristics
Simulated Annealing
• First Improvement like Local Search: tries to avoid local minima.
• Initially both better and worse solutions (w.r.t. the current solution)
are accepted (so-called phase of “high temperature”) and this allows a
good exploration of the solutions space.
Main steps of
Simulated Annealing
1) Initialization:
Select an initial solution x1∈Χ
Set the current solution: x’ = x1
Set to 0 the iterations counter: I = 0.
2) Neighborhood generation N(x’) of the current solution.
3) Select randomly a solution x’’ from N(x’);
Update the iterations counter: I = I + 1;
IF x’’ is better than or equal to x’ THEN x’ = x’’,
ELSE:
Compute the acceptance probability p = e^(−(F(x’’) − F(x’)) / T(I))
Select a random real value r ∈ [0,1]
IF r < p
THEN x’ = x’’,
ELSE keep the current solution unchanged
4) Stopping test: IF the stopping test is positive STOP,
ELSE GO TO 2.
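The steps above can be sketched as follows, assuming a geometric cooling schedule T_k = α^k · T0 with plateaus of length L (the parameter names are illustrative):

```python
import math
import random

def simulated_annealing(x0, neighbor, cost, t0=10.0, alpha=0.98,
                        plateau=50, max_iter=5000):
    """SA sketch: accept improving moves always, worsening moves with
    probability exp(-delta / T), under geometric cooling."""
    x = best = x0
    for it in range(max_iter):
        t = t0 * alpha ** (it // plateau)   # current temperature T(I)
        x1 = neighbor(x)                    # random neighbor of x
        delta = cost(x1) - cost(x)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = x1
        if cost(x) < cost(best):            # track the best visited
            best = x
    return best
```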
[figure: temperature profile T(I), constant on plateaus of length L and dropping from T0 to T1, T2, T3 at I = L, 2L, 3L, …]
• T(k⋅L) = Tk = α^k ⋅ T0
• T0 = initial temperature: usually computed so as to obtain a given
(high) acceptance probability w.r.t. an average neighbor solution.
• α ∈[0,1] = cooling coefficient (e.g. α = 0.98)
• L = length of the “plateau” (typically proportional to the size of N()).
Stopping test
Typically the following alternatives are considered:
– The solution did not improve more than ε1% over a given
number of consecutive iterations (e.g. k⋅L).
– The total number of accepted moves is below ε2% in the
last k2⋅L iterations.
– A given threshold S* on the solution value has been reached.
– An overall CPU time limit τ* has been reached.
Metaheuristics
Tabu Search
• Steepest-descent-like local search: tries to avoid
local minima.
• At each iteration all neighbor solutions (or a proper
subset of the considered neighborhood) are
evaluated.
• All decisions are deterministic.
• Main features
– “Tabu moves”
– Tabu List
– Aspiration Criteria (e.g.: an overall improving solution is always
accepted)
– Intensification / diversification in the solutions space
Implementation of a
Tabu Search procedure
To implement a Tabu Search procedure the following
items must be considered:
– Definition of a neighborhood or of a subset of such
neighborhood and evaluation of all its components.
– Selection of the moves to be inserted in the Tabu List and
representation of such moves.
– Length of the Tabu List.
– Selection of the aspiration criterion (criteria).
– Selection of some stopping test.
– Intensification / diversification techniques.
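The items above can be combined into a minimal sketch; here `neighbors(x)` yields (move, solution) pairs, moves themselves are stored in the tabu list, and the aspiration criterion accepts a tabu move that improves on the best solution found so far (all names are illustrative):

```python
from collections import deque

def tabu_search(x0, neighbors, cost, tabu_len=7, max_iter=100):
    """Tabu Search sketch: always move to the best non-tabu neighbor
    (even if worse), with an aspiration criterion on tabu moves."""
    x = best = x0
    tabu = deque(maxlen=tabu_len)           # fixed-length tabu list
    for _ in range(max_iter):
        candidates = [(m, y) for m, y in neighbors(x)
                      if m not in tabu or cost(y) < cost(best)]
        if not candidates:
            break
        m, x = min(candidates, key=lambda c: cost(c[1]))
        tabu.append(m)                      # forbid the reverse oscillation
        if cost(x) < cost(best):
            best = x
    return best
```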
Genetic algorithms
– selection
– reproduction
– mutation
7. Some of the old individuals are eliminated from the population to allow
the entrance of the new ones.
8. The new individuals' fitness is computed and they are included in the
population. This step marks the end of a generation.
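A minimal end-to-end generation loop along these lines can be sketched as follows (binary coding, tournament selection, one-point crossover, bit-flip mutation, elitist replacement; all parameter choices are illustrative):

```python
import random

def genetic_algorithm(pop_size, n_bits, fitness, generations=50,
                      p_mut=0.05):
    """Minimal generational GA sketch (maximizes `fitness`)."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = [max(pop, key=fitness)]        # elitism: best survives
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = random.randint(1, n_bits - 1)   # one-point crossover
            child = [1 - g if random.random() < p_mut else g
                     for g in p1[:cut] + p2[cut:]]  # bit-flip mutation
            children.append(child)
        pop = children                             # end of a generation
    return max(pop, key=fitness)
```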
• Coding
• Initial population
• Fitness function definition
• Genetic operators
– Crossover
– Mutation
– Inversion
• Genetic operators selection
• Reproduction
• Parameters calibration
Coding
• strictly depends on the problem. For a given problem many coding
options are possible (traditionally a binary alphabet is used for the
chromosomes), but the definition of the genetic operators is linked to
the coding used.
Initial population
Fitness
• STANDARD CROSSOVER
Operates on a couple of chromosomes: the “children” are obtained from the
“parents” by swapping the genes between two randomly chosen cutting
sites.
parent 1 1 0 1 | 0 1 1 1 | 1 0
parent 2 0 0 0 | 1 1 0 1 | 0 1
^ ^ cutting sites
child 1 1 0 1 | 1 1 0 1 | 1 0
child 2 0 0 0 | 0 1 1 1 | 0 1
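The example above, as code (cut site c1 inclusive, c2 exclusive):

```python
def two_point_crossover(p1, p2, c1, c2):
    """Standard crossover: swap the genes between cutting sites c1
    and c2, producing two children."""
    child1 = p1[:c1] + p2[c1:c2] + p1[c2:]
    child2 = p2[:c1] + p1[c1:c2] + p2[c2:]
    return child1, child2
```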
ORDER CROSSOVER
Parent 1 = (1 2 3 | 5 4 6 7 | 8 9) Parent 2 = (4 5 2 | 1 8 7 6 | 9 3)
Child 1 = (x x x | 5 4 6 7 | x x) Child 2 = (x x x | 1 8 7 6 | x x)
Starting after the second cut point of parent 2 and skipping the cities
already in the segment, the remaining cities of parent 2 give the partial
list 9 3 2 1 8.
Insert this partial list after the second cut point of child 1, obtaining
Child 1 = (2 1 8 | 5 4 6 7 | 9 3). Similarly, child 2 = (3 5 4 | 1 8 7 6 | 9 2).
• MUTATION
A randomly chosen gene is flipped:
1 0 1 0 1 1 1 1 0
    ^
1 0 0 0 1 1 1 1 0
• INVERSION
The genes between two cutting sites are reversed:
1 0 1 | 0 1 1 1 | 1 0
^ ^
1 0 1 | 1 1 1 0 | 1 0
Reproduction
• Generational replacement (the new generation fully substitutes the old
one).
• Elitist technique (the best element is always kept in the new generation;
as a variant several “best elements” are kept).
• Steady state technique (most of the elements are kept and just a few
are substituted; typically, the worst elements are eliminated with
higher probability).
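The elitist technique, sketched (the `n_elite` best old individuals always survive; names are illustrative):

```python
def elitist_replacement(old_pop, children, fitness, n_elite=1):
    """Elitist replacement: keep the n_elite best old individuals and
    fill the rest of the new generation with children."""
    elite = sorted(old_pop, key=fitness, reverse=True)[:n_elite]
    return elite + children[:len(old_pop) - n_elite]
```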
Parameters calibration
• Various parameters determine the behavior of a genetic algorithm. The
main ones are the following:
• Population size;
• Number of generations;
Hybridization
Matheuristics
• General-purpose (exact) MIP solvers are very sophisticated tools, but in some hard cases they
are not adequate even after clever tuning
• One is therefore tempted to quit the MIP framework and to design ad-hoc heuristics for the
specific problem at hand, thus losing the advantage of working in a generic MIP framework
• As a matter of fact, too often a MIP model is developed only “to better describe the problem” or,
in the best case, to compute bounds for benchmarking the proposed ad-hoc heuristics
Federico Della Croce Heuristics 77 / 106
COMBINATORIAL OPTIMIZATION: heuristic approaches
Matheuristics
We teach engineers to use MIP models for solving their difficult problems
(telecom, network design, scheduling, etc.)
Be smart as an engineer!
Model the most critical steps in the design of your own algorithm
through MIP models, and solve them (even heuristically) through a
general-purpose MIP solver…
Framework Description
Notice that other exact methods can also be applied
Matheuristic applications
MARCH 2016 1 2 3 4 5 6 7 8 9 10 11 12 13 14
Tu We Th Fr Sa Su Mo Tu We Th Fr Sa Su Mo
HEAD NURSE H H O O O O
Nurse 1 N N O O M A N N O O A A A N
Nurse 2 A O M M O M M N N O O M M A
Nurse 3 H A H N H H M A N N O O O O
Nurse 4 O A A A N O O M M N N O O M
Nurse 5 O M N O O M A O A A O N O O
Nurse 6 N N O O M A A A O M M A N O
Nurse 7 M A O A A N N O O M M A A O
Nurse 8 M O N N O O A A A O M N N O
Nurse 9 O M A O A A O M M A A O O M
Nurse 10 (50%) M O O O N N O O O O O M M N
Nurse 11 A O M M O M M N N O O M M A
Nurse 12 N O O A A N N O O A A O A A
Nurse 13 O M A N N O O M A N N O O M
Nurse 14 (Mat.) O O M M M O O H H H H H N N
Nurse 15 H H H H H O O H H H H H O O
Nurse 16 A N N O H O O H M M N N O O
LEGEND:
M = Morning; A = Afternoon; N = Night
H = Holiday; O = Off/Rest;
Solution Approach.
• Solution algorithm.
• minimum personnel
Σ_{i=1}^{n} x_{i,j,k} ≥ min_pers_{j,k}  ∀ j = 1, . . . , m, k = 1, . . . , 3
• holiday requests
x_{i,j,H} = 1  ∀ (i, j) ∈ holiday_requests
• The first attempt is done to verify the problem model against the
real–life situation so as to check coherence and correctness.
• We can model and solve the problem by means of a MIP solver.
Solving....
• How is it possible?
• This means that the head nurse violates some given constraints while
generating rosters.
Relax means:
• accept (small) violations
ex. no more than 4 consecutive working days but a fifth day is accepted once a
month.
• enlarge feasible solution space
ex. minimum number of Afternoons shifts from 4 to 3.
• We want to improve solutions quality given the same CPU time (1 h.)
• Matheuristic algorithm. This means that:
• we can reuse the MIP model
• we can take advantage of solvers
• we have to design neighborhoods and implement a local search phase.
Begin
(1) choose an initial feasible solution z
(2) initialize parameters h1, h2 and h3
(3) iter := 1
(4) while iter <= 3 do
(5)   add the neighborhood constraint N_iter to the model
(6)   fix the variables not belonging to the considered neighborhood
      to their corresponding values in z
(7)   solve the subproblem within the local time limit
(8)   if f(x) < f(z) then
(9)     update the solution: z := x
(10)    iter := 1
(11)  else
(12)    iter := iter + 1
(13)  end if
(14)  if the overall CPU time is greater than 3600 s then
(15)    EXIT
(16)  end if
(17) end-do
End
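The loop above can be sketched solver-agnostically; `solve_subproblem` stands for a call to a MIP solver on the model with the neighborhood constraint and variable fixings added, and is an assumption of this sketch:

```python
import time

def matheuristic(z0, neighborhoods, solve_subproblem, f,
                 time_limit=3600.0):
    """Local-search matheuristic sketch: iterate over progressively
    larger MIP-defined neighborhoods of the incumbent z; restart from
    the first neighborhood on every improvement."""
    z, start = z0, time.time()
    it = 0
    while it < len(neighborhoods):
        # solve the subproblem restricted to neighborhood N_it of z
        x = solve_subproblem(z, neighborhoods[it])
        if x is not None and f(x) < f(z):
            z, it = x, 0                 # improvement: restart from N_1
        else:
            it += 1                      # enlarge the neighborhood
        if time.time() - start > time_limit:
            break
    return z
```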
NRP Conclusions
• MKP: a well-known, tough NP-hard model that represents several real
problems.
• There exist plenty of heuristic and metaheuristic as well as exact
approaches for this problem.
• OR-Library benchmark problems with 100 - 250 - 500 variables and 5
- 10 - 30 constraints.
• Typically, 0/1 multi-dimensional knapsack problems with 30 - 35
variables can be solved to optimality (by XPRESS, CPLEX, etc.)
within a few seconds.
• Most of the benchmark problems with 500 variables and 30
constraints are currently open with respect to optimality
max Σ_{j=1}^{n} p_j x_j
s.t. Σ_{j=1}^{n} w_{ij} x_j ≤ c_i, i = 1, . . . , m
x_j ∈ {0, 1}, j = 1, . . . , n
that is
• a set of n items with profits pj > 0
• a set of m resources with capacities ci > 0 ;
• each item j consumes an amount wij ≥ 0 from each resource i ;
• the selected items must not exceed the resource capacities ci ;
• the 0 − 1 decision variables xj indicate which items are selected ;
• the goal is to choose a subset of items with maximum total profit.
• Compute Z*.
• Fix the n − k non-basic xj variables with largest |rj| to their xj* value.
• Solve to optimality (or with a time limit) the remaining core problem
with k variables, where no more than γ variables are such that
xj = 1 − xj*.
• Fair values of k and γ are 50 and 10.