Dynamic Programming: Algorithms and Complexity Analysis
Dynamic programming (DP) is both a mathematical optimization method and an algorithmic
paradigm developed by Richard Bellman in the 1950s. It simplifies complex problems by
breaking them down into simpler subproblems and solving them recursively. Two key attributes
make a problem suitable for dynamic programming: optimal substructure (an optimal solution to the
main problem can be constructed from optimal solutions of its subproblems) and overlapping
subproblems (the same subproblems recur many times, so each result can be computed once and reused) [1] [2].
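As a quick illustration of both attributes, a memoized Fibonacci function (a standard textbook example, not drawn from the cited sources) solves each subproblem only once and reuses the cached result:

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    # Overlapping subproblems: fib(n-1) and fib(n-2) share almost all of their work,
    # so the cache stores each result the first time it is computed.
    if n < 2:
        return n
    # Optimal substructure: fib(n) is assembled directly from the answers to smaller subproblems.
    return fib(n - 1) + fib(n - 2)

print(fib(40))  # 102334155, computed in O(n) calls instead of exponential time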
0/1 Knapsack Problem
Problem Definition
Given a set of items, each with a weight and a value, determine which items to include in a
collection so that the total weight is less than or equal to a given limit and the total value is
maximized. Unlike the fractional knapsack problem, items cannot be divided: each item is either
taken in its entirety or left out [4].
Algorithm
The standard solution fills a table dp[i][w], the maximum value achievable using the first i items within capacity w; each entry either skips item i or takes it when it fits, and the answer is dp[n][W].
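A short Python sketch of this tabulation; the function name and the capacity used in the demo are illustrative, with item weights and values taken from the example below:

def knapsack_01(weights, values, W):
    # dp[i][w] = best value using the first i items with capacity w
    n = len(weights)
    dp = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            dp[i][w] = dp[i - 1][w]                      # skip item i
            if weights[i - 1] <= w:                      # or take it, if it fits
                dp[i][w] = max(dp[i][w], dp[i - 1][w - weights[i - 1]] + values[i - 1])
    return dp[n][W]

# Items A-D from the example below; capacity 8 chosen purely for illustration
print(knapsack_01([1, 3, 5, 7], [2, 4, 7, 10], 8))  # 12 (items A and D: weight 1 + 7, value 2 + 10)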
Example
Consider the following items:

Item  Weight  Value
A     1       2
B     3       4
C     5       7
D     7       10
Complexity Analysis
Time Complexity: O(N*W) where N is the number of items and W is the knapsack
capacity [6]
Space Complexity: O(N*W) for storing the DP table [6]
Longest Common Subsequence (LCS)
Problem Definition
Given two sequences, find the longest subsequence common to both sequences. A
subsequence is a sequence whose elements appear in the same relative order but are not necessarily
contiguous [7].
Algorithm
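A short Python sketch of the standard LCS tabulation; the function name and sample strings are illustrative:

def lcs_length(a: str, b: str) -> int:
    m, n = len(a), len(b)
    # dp[i][j] = length of the LCS of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1              # extend the common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])   # drop a character from either string
    return dp[m][n]

print(lcs_length("ABCBDAB", "BDCAB"))  # 4 ("BCAB" or "BDAB")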
Complexity Analysis
Time Complexity: O(m*n) where m and n are the lengths of the two sequences [8]
Space Complexity: O(m*n) for storing the DP table [8]
Travelling Salesman Problem (TSP)
Problem Definition
Given a set of cities and the distance between every pair of cities, find the shortest possible
route that visits each city exactly once and returns to the origin city [9] .
Algorithm
function HeldKarpTSP(dist, start):
    dp[mask][end] = infinity for every subset mask and city end
    dp[1 << start][start] = 0
    for each subset mask containing start, in increasing order of size:
        for each city end in mask:
            // If only one vertex in subset, and it's not the start, invalid
            if mask == (1 << end) and end != start:
                continue
            // Update dp value by trying every possible previous city prev
            for each prev in mask with prev != end, where prev_mask = mask without end:
                dp[mask][end] = min(dp[mask][end], dp[prev_mask][prev] + dist[prev][end])
    return min over end != start of (dp[full_mask][end] + dist[end][start])
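For completeness, a compact Python version of the same recurrence; the function name and the symmetric 4-city distance matrix are illustrative rather than taken from the cited source:

from itertools import combinations

def held_karp(dist, start=0):
    n = len(dist)
    INF = float("inf")
    # dp[(mask, end)] = cheapest cost to leave `start`, visit exactly the cities in mask, and stop at `end`
    dp = {(1 << start, start): 0}
    for size in range(2, n + 1):
        for subset in combinations(range(n), size):
            if start not in subset:
                continue
            mask = sum(1 << c for c in subset)
            for end in subset:
                if end == start:
                    continue
                prev_mask = mask ^ (1 << end)
                dp[(mask, end)] = min(dp.get((prev_mask, prev), INF) + dist[prev][end]
                                      for prev in subset if prev != end)
    full = (1 << n) - 1
    return min(dp[(full, end)] + dist[end][start] for end in range(n) if end != start)

# Hypothetical 4-city instance
dist = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
print(held_karp(dist))  # 80 (e.g. the tour 0 -> 1 -> 3 -> 2 -> 0)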
Example
For a complete graph with 4 vertices, the algorithm computes the minimum cost path visiting all
vertices and returning to the start [9] .
Complexity Analysis
Time Complexity: O(n²*2ⁿ) where n is the number of cities [11]
Space Complexity: O(n*2ⁿ) due to the storage of states [11]
Bellman-Ford Algorithm
Problem Definition
Find the shortest paths from a source vertex to all other vertices in a weighted graph, even with
negative edge weights (but without negative cycles) [12] .
Algorithm
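A minimal Python sketch of the edge-relaxation procedure; the function name, the edge-list representation, and the sample graph are illustrative:

def bellman_ford(num_vertices, edges, source):
    # edges: list of (u, v, weight) tuples for a directed graph
    INF = float("inf")
    dist = [INF] * num_vertices
    dist[source] = 0
    # Relax every edge V - 1 times; after pass i, shortest paths using at most i edges are final
    for _ in range(num_vertices - 1):
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One extra pass: any further improvement means a negative cycle is reachable from the source
    for u, v, w in edges:
        if dist[u] != INF and dist[u] + w < dist[v]:
            raise ValueError("graph contains a negative-weight cycle")
    return dist

# Hypothetical graph with a negative edge but no negative cycle
edges = [(0, 1, 4), (0, 2, 5), (1, 3, 7), (2, 1, -2), (3, 2, 1)]
print(bellman_ford(4, edges, 0))  # [0, 3, 5, 10]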
Example
For a graph with negative edge weights but no negative cycles, Bellman-Ford correctly
computes the shortest paths from the source vertex to all other vertices [13].
Complexity Analysis
Time Complexity: O(V*E) where V is the number of vertices and E is the number of
edges [13]
Space Complexity: O(V+E) using adjacency list representation [14]
Floyd-Warshall Algorithm
Problem Definition
Find shortest paths between all pairs of vertices in a weighted graph, which may contain
negative edge weights but no negative cycles [15] .
Algorithm
function FloydWarshall(graph):
    // Initialize distance matrix with direct edges
    dist[1...V][1...V] = graph
    for k from 1 to V:            // allow vertex k as an intermediate vertex
        for i from 1 to V:
            for j from 1 to V:
                dist[i][j] = min(dist[i][j], dist[i][k] + dist[k][j])
    return dist
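The same triple loop as runnable Python, applied to a small directed graph invented purely for illustration:

def floyd_warshall(graph):
    # graph: V x V matrix of direct edge weights, with float("inf") where no edge exists
    V = len(graph)
    dist = [row[:] for row in graph]   # copy so the input matrix is left untouched
    for k in range(V):                 # allow vertex k as an intermediate stop
        for i in range(V):
            for j in range(V):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

INF = float("inf")
graph = [[0, 3, INF, 7],
         [8, 0, 2, INF],
         [5, INF, 0, 1],
         [2, INF, INF, 0]]
print(floyd_warshall(graph))  # e.g. the 0 -> 3 distance drops from 7 to 6 via the path 0 -> 1 -> 2 -> 3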
Example
For a directed weighted graph with 4 vertices, Floyd-Warshall computes the shortest distances
between all pairs of vertices [17] .
Complexity Analysis
Time Complexity: O(V³) due to three nested loops [18]
Space Complexity: O(V²) for storing the distance matrix [18]
Optimal Binary Search Tree
Problem Definition
Construct a binary search tree that minimizes the expected search cost, given the probabilities
of successful and unsuccessful searches for keys [19] .
Algorithm
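A Python sketch of the O(n³) tabulation over ranges of sorted keys. To keep it short, this version weights only successful searches; the worked example below also includes unsuccessful-search (dummy) probabilities, which add one more term to the same recurrence. The function name and sample frequencies are illustrative:

def optimal_bst_cost(p):
    # p[i] = probability that key i (keys assumed sorted) is searched for.
    # Returns the minimum expected number of comparisons over all BST shapes.
    n = len(p)
    cost = [[0.0] * n for _ in range(n)]     # cost[i][j] = optimal cost for keys i..j
    prefix = [0.0] * (n + 1)
    for i, pi in enumerate(p):
        prefix[i + 1] = prefix[i] + pi       # prefix sums give the weight of any range in O(1)
        cost[i][i] = pi                      # a single key is its own root
    for length in range(2, n + 1):
        for i in range(0, n - length + 1):
            j = i + length - 1
            weight = prefix[j + 1] - prefix[i]   # every key in i..j sits one level below the chosen root
            best = float("inf")
            for r in range(i, j + 1):            # try each key as the root of this subtree
                left = cost[i][r - 1] if r > i else 0.0
                right = cost[r + 1][j] if r < j else 0.0
                best = min(best, left + right)
            cost[i][j] = best + weight
    return cost[0][n - 1]

# Hypothetical frequencies for three sorted keys
print(optimal_bst_cost([0.5, 0.3, 0.2]))  # 1.7: the first key becomes the root, the rest chain to its right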
Example
For keys {10, 20, 30} with probabilities {0.5, 0.1, 0.05} and unsuccessful search probabilities
{0.15, 0.1, 0.05, 0.05}, the optimal BST has an expected cost of 1.75 [20] .
Complexity Analysis
Time Complexity: O(n³) in the basic form, which can be reduced to O(n²) with Knuth's optimization [19]
Space Complexity: O(n²) for the cost and root tables [21]
Coin Change Problem
Problem Definition
Given a set of coin denominations and a target amount, find the minimum number of coins
needed to make the amount (or the number of different ways to make the amount) [22] .
Algorithm
For the minimum-coins variant, a one-dimensional table dp[a] stores the fewest coins that sum to amount a; each entry is filled by trying every denomination, and the answer is dp[amount].
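A minimal Python sketch of this bottom-up tabulation (the function name is illustrative):

def min_coins(coins, amount):
    INF = float("inf")
    # dp[a] = minimum number of coins needed to make amount a (INF while unreachable)
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[amount] if dp[amount] != INF else -1

print(min_coins([1, 2, 5], 11))  # 3 (5 + 5 + 1), matching the example below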
Example
For coins {1, 2, 5} and amount 11, the minimum number of coins needed is 3 (5+5+1) [22] .
Complexity Analysis
Time Complexity: O(amount * number of coin types) [22]
Space Complexity: O(amount) [22]
Matrix Chain Multiplication
Problem Definition
Given a sequence of matrices, find the most efficient way to multiply these matrices together to
minimize the number of scalar multiplications [23] .
Algorithm
function MatrixChainOrder(p):       // matrix Ai has dimensions p[i-1] x p[i], for i = 1 .. n-1
    n = length(p)
    m[i][i] = 0 for all i
    // L is chain length
    for L from 2 to n-1:
        for i from 1 to n-L:
            j = i+L-1
            m[i][j] = infinity
            for k from i to j-1:     // try every split point (Ai..Ak)(Ak+1..Aj)
                cost = m[i][k] + m[k+1][j] + p[i-1]*p[k]*p[j]
                if cost < m[i][j]: m[i][j] = cost; s[i][j] = k
    return m[1][n-1], s
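The same recurrence as runnable Python, applied to the dimensions used in the example below (the function name is illustrative):

def matrix_chain_order(p):
    # p[i-1] x p[i] are the dimensions of matrix i; there are n-1 matrices in total
    n = len(p)
    m = [[0] * n for _ in range(n)]        # m[i][j] = minimum scalar multiplications for Ai..Aj
    s = [[0] * n for _ in range(n)]        # s[i][j] = split point that achieves that minimum
    for L in range(2, n):                  # L is chain length
        for i in range(1, n - L + 1):
            j = i + L - 1
            m[i][j] = float("inf")
            for k in range(i, j):          # try every split (Ai..Ak)(Ak+1..Aj)
                cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if cost < m[i][j]:
                    m[i][j], s[i][j] = cost, k
    return m[1][n - 1], s

cost, s = matrix_chain_order([23, 26, 27, 20])  # dimensions 23x26, 26x27, 27x20
print(cost)  # 26000, achieved by multiplying the 26x27 and 27x20 matrices first (s[1][3] == 1)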
Example
For matrices with dimensions 23×26, 26×27, and 27×20, the minimum number of scalar
multiplications is 26,000, achieved by multiplying the 26×27 and 27×20 matrices first [23].
Complexity Analysis
Time Complexity: O(n³) where n is the number of matrices [24]
Space Complexity: O(n²) for the m and s tables [24]
Conclusion
Dynamic programming is a powerful technique for solving complex optimization problems
efficiently. All the algorithms discussed share the core principle of breaking down problems into
overlapping subproblems and building solutions incrementally. The time and space complexities
of these algorithms make them practical for a variety of real-world applications, from resource
allocation (Knapsack) to route optimization (TSP) to sequence comparison (LCS).
By understanding these classic dynamic programming problems and their solutions, we can
develop the insights needed to approach new problems using similar principles of optimal
substructure and memoization.
⁂
1. https://en.wikipedia.org/wiki/Dynamic_programming
2. https://stackoverflow.blog/2022/01/31/the-complete-beginners-guide-to-dynamic-programming/
3. https://www.masaischool.com/blog/understanding-dynamic-programming-101/
4. https://www.gatevidyalay.com/0-1-knapsack-problem-using-dynamic-programming-approach/
5. https://www.tutorialspoint.com/data_structures_algorithms/01_knapsack_problem.htm
6. https://www.interviewbit.com/blog/0-1-knapsack-problem/
7. https://www.programiz.com/dsa/longest-common-subsequence
8. https://www.enjoyalgorithms.com/blog/longest-common-subsequence/
9. https://www.baeldung.com/cs/tsp-dynamic-programming
10. https://www.interviewbit.com/blog/travelling-salesman-problem/
11. https://blog.heycoach.in/time-complexity-of-tsp-algorithms/
12. https://www.shiksha.com/online-courses/articles/introduction-to-bellman-ford-algorithm/
13. https://www.scholarhat.com/tutorial/datastructures/bellman-fords-algorithm
14. https://blog.heycoach.in/bellman-ford-algorithm-and-space-complexity/
15. https://brilliant.org/wiki/floyd-warshall-algorithm/
16. https://www.tutorialspoint.com/data_structures_algorithms/floyd_warshall_algorithm.htm
17. https://www.shiksha.com/online-courses/articles/about-floyd-warshall-algorithm/
18. https://en.wikipedia.org/wiki/Floyd–Warshall_algorithm
19. https://www.scaler.com/topics/optimal-binary-search-tree/
20. https://codecrucks.com/optimal-binary-search-tree-how-to-solve-using-dynamic-programming/
21. https://blog.heycoach.in/space-complexity-of-optimal-bst-construction/
22. https://www.simplilearn.com/tutorials/data-structure-tutorial/coin-change-problem-with-dynamic-programming
23. https://www.tutorialspoint.com/data_structures_algorithms/matrix_chain_multiplication.htm
24. https://www.scaler.in/matrix-chain-multiplication/