Dynamic Programming
C Patvardhan
Professor, Electrical Engineering
Dayalbagh Educational Institute, Agra
Algorithm types
Algorithm types we will consider include:
Simple recursive algorithms
Backtracking algorithms
Divide and conquer algorithms
Dynamic programming algorithms
Greedy algorithms
Branch and bound algorithms
Brute force algorithms
Randomized algorithms
Counting coins
To find the minimum number of US coins to make any amount,
the greedy method always works
At each step, just choose the largest coin that does not overshoot the
desired amount: 31¢ = 25¢ + 5¢ + 1¢
The greedy method would not work if we did not have 5¢ coins
For 31 cents, the greedy method gives seven coins (25+1+1+1+1+1+1),
but we can do it with four (10+10+10+1)
The greedy method also would not work if we had a 21¢ coin
For 63 cents, the greedy method gives six coins (25+25+10+1+1+1), but
we can do it with three (21+21+21)
How can we find the minimum number of coins for any given
coin set?
Coin set for examples
For the following examples, we will assume coins in the
following denominations:
1¢ 5¢ 10¢ 21¢ 25¢
We’ll use 63¢ as our goal
For each value i < K,
Find the minimum number of coins needed to make i cents
Find the minimum number of coins needed to make K − i cents
Choose the i that minimizes this sum
This algorithm can be viewed as divide-and-conquer, or as brute
force
This solution is very recursive
It requires exponential work
It is infeasible to solve for 63¢
Another solution
We can reduce the problem recursively by choosing the
first coin, and solving for the amount that is left
For 63¢:
One 1¢ coin plus the best solution for 62¢
One 5¢ coin plus the best solution for 58¢
One 10¢ coin plus the best solution for 53¢
One 21¢ coin plus the best solution for 42¢
One 25¢ coin plus the best solution for 38¢
Choose the best solution from among the 5 given above
Instead of solving 62 recursive problems, we solve 5
This is still a very expensive algorithm
A dynamic programming solution
Idea: Solve first for one cent, then two cents, then three cents,
etc., up to the desired amount
Save each answer in an array!
For each new amount N, compute all the possible pairs of
previous answers which sum to N
For example, to find the solution for 13¢,
First, solve for all of 1¢, 2¢, 3¢, ..., 12¢
Next, choose the best solution among:
Solution for 1¢ + solution for 12¢
Solution for 2¢ + solution for 11¢
...
Solution for 6¢ + solution for 7¢
The longest simple path (path not containing a cycle) from A
to D is A–B–C–D
However, the subpath A–B is not the longest simple path
from A to B (A–C–B is longer)
The principle of optimality is not satisfied for this problem
Hence, the longest simple path problem cannot be solved by
a dynamic programming approach
The 0-1 knapsack problem
A thief breaks into a house, carrying a knapsack...
He can carry up to 25 pounds of loot
He has to choose which of N items to steal
Each item has some weight and some value
“0-1” because each item is stolen (1) or not stolen (0)
He has to select the items to steal in order to maximize the value of his
loot, but cannot exceed 25 pounds
A greedy algorithm does not find an optimal solution
A dynamic programming algorithm works well
This is similar to, but not identical to, the coins problem
In the coins problem, we had to make an exact amount of change
In the 0-1 knapsack problem, we can’t exceed the weight limit, but the
optimal solution may be less than the weight limit
The dynamic programming solution is similar to that of the coins problem
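A sketch of one such dynamic programming solution, with a table indexed by remaining capacity much like the coins table; the item list here is hypothetical:

```python
# A sketch of a 0-1 knapsack DP; the loot list below is invented.
def knapsack(items, capacity):
    """items: list of (weight, value) pairs; returns max value within capacity."""
    # best[w] = maximum value achievable with total weight <= w
    best = [0] * (capacity + 1)
    for weight, value in items:
        # scan capacities downward so each item is taken at most once
        # (this is the "0-1" part)
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]

# Hypothetical loot: (weight in pounds, value)
loot = [(10, 60), (20, 100), (12, 45)]
print(knapsack(loot, 25))  # 105: take the 10-lb and 12-lb items
```

Note that the optimal load weighs 22 pounds, less than the 25-pound limit, which is exactly how this problem differs from the coins problem.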
Comments
Dynamic programming relies on working “from the bottom up”
and saving the results of solving simpler problems
These solutions to simpler problems are then used to compute the solution
to more complex problems
Dynamic programming solutions can often be quite complex and
tricky
Dynamic programming is used for optimization problems,
especially ones that would otherwise take exponential time
Only problems that satisfy the principle of optimality are suitable for
dynamic programming solutions
Since exponential time is unacceptable for all but the smallest
problems, dynamic programming is sometimes essential
Longest Common Subsequence
Problem: Given 2 sequences, X = 〈x1,...,xm〉 and
Y = 〈y1,...,yn〉, find a common subsequence whose
length is maximum.
Notation:
prefix Xi = 〈x1,...,xi〉 is the first i letters of X.
This says what any longest common subsequence must look like;
do you believe it?
Optimal Substructure
Theorem
Let Z = 〈z1, . . . , zk〉 be any LCS of X and Y.
1. If xm = yn, then zk = xm = yn and Zk−1 is an LCS of Xm−1 and Yn−1.
2. If xm ≠ yn, then either zk ≠ xm and Z is an LCS of Xm−1 and Y,
3. or zk ≠ yn and Z is an LCS of X and Yn−1.
Keep track of c[α, β] in a table of nm entries:
» top/down
» bottom/up
[Figure: the table c for X = “springtime”, Y = “printing”, with the letters of the two words labeling the rows and columns.]
Computing the length of an LCS
LCS-LENGTH (X, Y)
1. m ← length[X]
2. n ← length[Y]
3. for i ← 1 to m
4.     do c[i, 0] ← 0
5. for j ← 0 to n
6.     do c[0, j ] ← 0
7. for i ← 1 to m
8.     do for j ← 1 to n
9.         do if xi = yj
10.            then c[i, j ] ← c[i−1, j−1] + 1
11.                 b[i, j ] ← “↖”
12.            else if c[i−1, j ] ≥ c[i, j−1]
13.                 then c[i, j ] ← c[i−1, j ]
14.                      b[i, j ] ← “↑”
15.                 else c[i, j ] ← c[i, j−1]
16.                      b[i, j ] ← “←”
17. return c and b

b[i, j ] points to the table entry whose subproblem we used in solving the LCS of Xi and Yj.
c[m, n] contains the length of an LCS of X and Y.
Time: O(mn)
Constructing an LCS
PRINT-LCS (b, X, i, j)
1. if i = 0 or j = 0
2.     then return
3. if b[i, j ] = “↖”
4.     then PRINT-LCS(b, X, i−1, j−1)
5.          print xi
6. elseif b[i, j ] = “↑”
7.     then PRINT-LCS(b, X, i−1, j)
8. else PRINT-LCS(b, X, i, j−1)
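The two procedures above can be combined into one Python sketch; here the c table alone is enough to retrace an LCS, so the b table of arrows is folded into the walk-back:

```python
# A sketch combining LCS-LENGTH and PRINT-LCS (the b table is omitted;
# the walk-back re-derives each arrow from the c table).
def lcs(x, y):
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1      # the "↖" case
            elif c[i - 1][j] >= c[i][j - 1]:
                c[i][j] = c[i - 1][j]              # the "↑" case
            else:
                c[i][j] = c[i][j - 1]              # the "←" case
    # walk back from c[m][n], mirroring PRINT-LCS
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("springtime", "printing"))  # a common subsequence of length 6
```

Filling the table is O(mn); the walk-back is O(m + n).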
[Figure: a BST over k1,...,k5 with root k2; k1 and k5 are children of k2, k4 is a child of k5, and k3 is a child of k4.]

i    depthT(ki)    depthT(ki)·pi
1    1             0.25
2    0             0
3    3             0.15
4    2             0.4
5    1             0.3
Total:             1.10

Therefore, E[search cost] = 2.10.
[Figure: a candidate tree T′ with root kr, left subtree over ki,...,kr−1 and right subtree over kr+1,...,kj.]
To find an optimal BST:
» Examine all candidate roots kr, for i ≤ r ≤ j
» Determine all optimal BSTs containing ki,...,kr−1 and
containing kr+1,...,kj
Recursive Solution
Find optimal BST for ki,...,kj, where i ≥ 1, j ≤ n, j ≥ i−1.
When j = i−1, the tree is empty.
Define e[i, j ] = expected search cost of optimal BST for ki,...,kj.
from (15.16)
Time: O(n3)
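The recursion can be sketched bottom-up in Python. This is a simplified version using only the key probabilities p (no dummy-key probabilities q, unlike the full CLRS formulation), so costs are sums of (depth + 1)·pi; names are my own:

```python
# A bottom-up sketch of the O(n^3) optimal-BST recursion (keys only,
# no dummy keys - a simplification of the CLRS formulation).
def optimal_bst(p):
    n = len(p)
    # e[i][j] = expected search cost of an optimal BST for keys i..j
    # (1-based); e[i][i-1] = 0 encodes the empty tree j = i-1
    e = [[0.0] * (n + 2) for _ in range(n + 2)]
    w = [[0.0] * (n + 2) for _ in range(n + 2)]  # w[i][j] = p_i + ... + p_j
    for i in range(1, n + 1):
        w[i][i] = p[i - 1]
        for j in range(i + 1, n + 1):
            w[i][j] = w[i][j - 1] + p[j - 1]
    for length in range(1, n + 1):               # subproblem size j - i + 1
        for i in range(1, n - length + 2):
            j = i + length - 1
            # try every candidate root k_r; adding w[i][j] accounts for
            # every key in the subtree dropping one level deeper
            e[i][j] = w[i][j] + min(e[i][r - 1] + e[r + 1][j]
                                    for r in range(i, j + 1))
    return e[1][n]

print(optimal_bst([0.1, 0.8, 0.1]))  # ~1.2: root k2, with k1 and k3 as children
```

There are O(n²) subproblems (i, j) and O(n) candidate roots per subproblem, which is where the O(n³) time comes from.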
Elements of Dynamic Programming
Optimal substructure
Overlapping subproblems
Optimal Substructure
Show that a solution to a problem consists of making a
choice, which leaves one or more subproblems to solve.
Suppose that you are given this last choice that leads to an
optimal solution.
Given this choice, determine which subproblems arise and
how to characterize the resulting space of subproblems.
Show that the solutions to the subproblems used within
the optimal solution must themselves be optimal. Usually
use cut-and-paste.
Need to ensure that a wide enough range of choices and
subproblems are considered.
Optimal Substructure
Optimal substructure varies across problem domains:
» 1. How many subproblems are used in an optimal solution.
» 2. How many choices in determining which subproblem(s) to
use.
Informally, running time depends on (# of subproblems
overall) × (# of choices).
How many subproblems and choices do the examples
considered contain?
Dynamic programming uses optimal substructure bottom
up.
» First find optimal solutions to subproblems.
» Then choose which to use in optimal solution to the problem.
Optimal Substructure
Does optimal substructure apply to all optimization
problems? No.
Applies to determining the shortest path but NOT the
longest simple path of an unweighted directed graph.
Why?
» Shortest path has independent subproblems.
» Solution to one subproblem does not affect solution to another
subproblem of the same problem.
» Subproblems are not independent in longest simple path.
• Solution to one subproblem affects the solutions to other subproblems.
» Example:
Overlapping Subproblems
The space of subproblems must be “small”.
The total number of distinct subproblems is a polynomial
in the input size.
» A recursive algorithm is exponential because it solves the same
problems repeatedly.
» If divide-and-conquer is applicable, then each problem solved
will be brand new.
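The overlap can be made concrete by counting calls on the coins problem from earlier slides; this sketch (names mine) contrasts the naive recursion with a memoized version:

```python
# The naive coin-change recursion revisits the same amounts over and
# over; the memoized version solves each distinct amount exactly once.
from functools import lru_cache

COINS = (1, 5, 10, 21, 25)
calls = 0

def naive(n):
    global calls
    calls += 1
    if n == 0:
        return 0
    return min(naive(n - c) + 1 for c in COINS if c <= n)

@lru_cache(maxsize=None)
def memoized(n):
    if n == 0:
        return 0
    return min(memoized(n - c) + 1 for c in COINS if c <= n)

print(naive(30), calls)  # right answer (2), but thousands of recursive calls
print(memoized(63))      # 3, touching only the 64 distinct amounts 0..63
```

The subproblem space here is "small" in exactly the sense above: only amount + 1 distinct subproblems, a polynomial in the input, even though the naive recursion tree is exponential.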