Mod 3
#include <stdio.h>

// Greedy coin change: repeatedly take the largest coin that fits.
void coinChange(int coins[], int n, int amount) {
    for (int i = 0; i < n; i++)
        while (amount >= coins[i]) {
            printf("%d ", coins[i]);
            amount -= coins[i];
        }
    printf("\n");
}

int main() {
    int coins[] = {25, 10, 5, 1};
    int amount = 63;
    coinChange(coins, 4, amount);
    return 0;
}
2. Computing Minimum Spanning Trees
A Minimum Spanning Tree (MST) of a graph is a subset of edges that connects all vertices
with the minimum possible total edge weight.
Algorithms:
1. Kruskal’s Algorithm
2. Prim’s Algorithm
Kruskal’s Algorithm:
Sort all the edges in non-decreasing order of their weight.
Pick the smallest edge. Check if it forms a cycle with the spanning-tree formed so far.
If a cycle is not formed, include this edge. Else, discard it.
Repeat until there are (V-1) edges in the spanning tree.
Example Code (C):
#include <stdio.h>
#include <stdlib.h>
#define MAX 100

typedef struct {
    int u, v, w;
} Edge;

Edge edges[MAX];
int parent[MAX], rank[MAX];
int n, e;

int find(int i) {
    if (parent[i] != i)
        parent[i] = find(parent[i]);  /* path compression */
    return parent[i];
}

void union_set(int u, int v) {
    int ru = find(u), rv = find(v);
    if (rank[ru] < rank[rv])
        parent[ru] = rv;
    else if (rank[ru] > rank[rv])
        parent[rv] = ru;
    else {
        parent[rv] = ru;
        rank[ru]++;
    }
}

/* Sort edges by weight, ascending. */
int cmp(const void *a, const void *b) {
    return ((const Edge *)a)->w - ((const Edge *)b)->w;
}

void kruskal() {
    qsort(edges, e, sizeof(Edge), cmp);
    for (int i = 0; i < n; i++) {  /* each vertex starts in its own set */
        parent[i] = i;
        rank[i] = 0;
    }
    int mst_wt = 0;
    for (int i = 0; i < e; i++) {
        int u = edges[i].u;
        int v = edges[i].v;
        int w = edges[i].w;
        if (find(u) != find(v)) {  /* edge does not close a cycle */
            printf("%d - %d\n", u, v);
            mst_wt += w;
            union_set(u, v);
        }
    }
    printf("Minimum Spanning Tree weight: %d\n", mst_wt);
}
int main() {
printf("Enter the number of vertices and edges: ");
scanf("%d %d", &n, &e);
for (int i = 0; i < e; i++) {
printf("Enter edge %d (u, v, w): ", i+1);
scanf("%d %d %d", &edges[i].u, &edges[i].v, &edges[i].w);
}
kruskal();
return 0;
}
Prim’s Algorithm:
Initialize a tree with a single vertex, chosen arbitrarily from the graph.
Grow the tree by one edge: of the edges that connect the tree to vertices not yet in the
tree, find the minimum-weight edge, and transfer it to the tree.
Repeat until all vertices are in the tree.
Example Code (C):
#include <stdio.h>
#include <limits.h>
#define V 5

/* Prim's algorithm on an adjacency matrix; 0 means "no edge". */
void primMST(int graph[V][V]) {
    int parent[V], key[V], inMST[V];
    for (int i = 0; i < V; i++) {
        key[i] = INT_MAX;
        inMST[i] = 0;
    }
    key[0] = 0;      /* start the tree from vertex 0 */
    parent[0] = -1;  /* the root of the MST has no parent */
    for (int count = 0; count < V; count++) {
        /* pick the cheapest vertex not yet in the tree */
        int u = -1;
        for (int v = 0; v < V; v++)
            if (!inMST[v] && (u == -1 || key[v] < key[u]))
                u = v;
        inMST[u] = 1;
        /* update the keys of u's neighbours outside the tree */
        for (int v = 0; v < V; v++)
            if (graph[u][v] && !inMST[v] && graph[u][v] < key[v]) {
                parent[v] = u;
                key[v] = graph[u][v];
            }
    }
    /* print the MST edges */
    for (int i = 1; i < V; i++)
        printf("%d - %d (weight %d)\n", parent[i], i, graph[i][parent[i]]);
}
int main() {
int graph[V][V] = { {0, 2, 0, 6, 0},
{2, 0, 3, 8, 5},
{0, 3, 0, 0, 7},
{6, 8, 0, 0, 9},
{0, 5, 7, 9, 0} };
primMST(graph);
return 0;
}
3. Union-Find Algorithm & Its Applications
The Union-Find algorithm, also known as the Disjoint-Set Union (DSU) algorithm, is a data
structure that keeps track of a partition of a set into disjoint subsets.
Operations:
1. Find: Determine which subset a particular element is in.
2. Union: Join two subsets into a single subset.
Applications:
Finding cycles in a graph.
Network connectivity.
Kruskal’s algorithm for finding the MST.
Example Code (C):
#include <stdio.h>
#define MAX 100
int parent[MAX], rank[MAX];
void make_set(int n) {
for (int i = 0; i < n; i++) {
parent[i] = i;
rank[i] = 0;
}
}
int find(int i) {
if (parent[i] != i)
parent[i] = find(parent[i]);
return parent[i];
}
void union_set(int u, int v) {
int rootU = find(u);
int rootV = find(v);
if (rank[rootU] > rank[rootV])
parent[rootV] = rootU;
else if (rank[rootU] < rank[rootV])
parent[rootU] = rootV;
else {
parent[rootV] = rootU;
rank[rootU]++;
}
}
int main() {
int n = 5; // Number of elements
make_set(n);
union_set(0, 2);
union_set(4, 2);
union_set(3, 1);
if (find(4) == find(0))
printf("4 and 0 are in the same set\n");
else
printf("4 and 0 are in different sets\n");
if (find(1) == find(0))
printf("1 and 0 are in the same set\n");
else
printf("1 and 0 are in different sets\n");
return 0;
}
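Cycle detection, listed above as an application, follows directly from these two operations: an edge whose endpoints already share a root closes a cycle. A minimal Python sketch (the name has_cycle is illustrative, not from the notes):

```python
def has_cycle(n, edges):
    # Union-Find based cycle detection: if both endpoints of an edge
    # are already in the same set, adding the edge creates a cycle.
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return True
        parent[ru] = rv  # merge the two sets
    return False

print(has_cycle(3, [(0, 1), (1, 2), (2, 0)]))  # a triangle has a cycle
```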
4. The Relationship between Dijkstra’s and Prim’s Algorithms
Dijkstra’s Algorithm: Used for finding the shortest path from a single source to all
other vertices in a graph.
Prim’s Algorithm: Used for finding the Minimum Spanning Tree of a graph.
Similarity: Both algorithms use a priority queue to select the next vertex to be processed.
Difference:
Dijkstra’s algorithm maintains a set of vertices with known shortest path distances,
while Prim’s algorithm maintains a set of vertices that are included in the MST.
In Dijkstra’s, the priority queue is based on shortest path estimates, while in Prim’s, it
is based on edge weights.
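The relationship above can be made concrete: both algorithms are the same best-first loop and differ only in the priority placed on the heap. A sketch (not from the notes; best_first and key are illustrative names, and the graph is an adjacency list graph[u] = [(v, w), ...]):

```python
import heapq

def best_first(graph, source, key):
    # Generic best-first search: Dijkstra's and Prim's are the same loop
    # with different priorities. `key(d, w)` computes the priority of
    # reaching v over edge (u, v, w) when u was reached with priority d.
    dist = {source: 0}
    heap = [(0, source)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        for v, w in graph[u]:
            cand = key(d, w)  # the ONLY difference between the two
            if v not in done and cand < dist.get(v, float('inf')):
                dist[v] = cand
                heapq.heappush(heap, (cand, v))
    return dist

graph = {0: [(1, 2), (3, 6)], 1: [(0, 2), (2, 3), (3, 8), (4, 5)],
         2: [(1, 3), (4, 7)], 3: [(0, 6), (1, 8), (4, 9)],
         4: [(1, 5), (2, 7), (3, 9)]}
shortest = best_first(graph, 0, key=lambda d, w: d + w)  # Dijkstra: path length
mst_keys = best_first(graph, 0, key=lambda d, w: w)      # Prim: edge weight
```

Here shortest holds shortest-path distances from vertex 0, while mst_keys holds the weight of the edge that attaches each vertex to the MST.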
5. Use of Greedy Strategy in Algorithms for the Knapsack Problem and
Huffman Trees
Knapsack Problem (Fractional):
Items can be divided.
Sort items by value-to-weight ratio.
Add items to the knapsack starting with the highest ratio until the capacity is reached.
Example Code (C):
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    int weight, value;
} Item;

/* Sort by value-to-weight ratio, descending. */
int cmp(const void *a, const void *b) {
    double ra = (double)((const Item *)a)->value / ((const Item *)a)->weight;
    double rb = (double)((const Item *)b)->value / ((const Item *)b)->weight;
    return (rb > ra) - (rb < ra);
}

void fractionalKnapsack(Item items[], int n, int capacity) {
    qsort(items, n, sizeof(Item), cmp);
    double total = 0.0;
    for (int i = 0; i < n && capacity > 0; i++) {
        if (items[i].weight <= capacity) {  /* take the whole item */
            total += items[i].value;
            capacity -= items[i].weight;
        } else {                            /* take the fitting fraction */
            total += items[i].value * ((double)capacity / items[i].weight);
            capacity = 0;
        }
    }
    printf("Maximum value: %.2f\n", total);
}
int main() {
Item items[] = {{10, 60}, {20, 100}, {30, 120}};
int capacity = 50;
int n = sizeof(items) / sizeof(items[0]);
fractionalKnapsack(items, n, capacity);
return 0;
}
Huffman Trees:
Used for data compression.
Build a frequency table of characters.
Create a priority queue of nodes.
While there is more than one node in the queue:
o Extract two nodes with the lowest frequency.
o Create a new internal node with these two nodes as children and the sum of
their frequencies as the new frequency.
o Insert the new node into the priority queue.
The remaining node is the root of the Huffman tree.
Example Code (C):
#include <stdio.h>
#include <stdlib.h>
/* A node needs child pointers so the finished tree can be traversed
   to read off each character's code. */
typedef struct MinHeapNode {
    char data;
    unsigned freq;
    struct MinHeapNode *left, *right;
} MinHeapNode;

typedef struct {
    unsigned size;
    unsigned capacity;
    MinHeapNode** array;
} MinHeap;

/* Standard min-heap helpers (newNode, swapMinHeapNode, minHeapify,
   isSizeOne, extractMin, insertMinHeap, buildMinHeap,
   createAndBuildMinHeap) are omitted here for brevity. */

MinHeapNode* buildHuffmanTree(char data[], int freq[], int size) {
    MinHeapNode *left, *right, *top;
    MinHeap* minHeap = createAndBuildMinHeap(data, freq, size);
    while (!isSizeOne(minHeap)) {
        /* extract the two lowest-frequency nodes */
        left = extractMin(minHeap);
        right = extractMin(minHeap);
        /* merge them under a new internal node */
        top = newNode('$', left->freq + right->freq);
        top->left = left;
        top->right = right;
        insertMinHeap(minHeap, top);
    }
    return extractMin(minHeap);  /* the remaining node is the root */
}
int main() {
    char arr[] = {'a', 'b', 'c', 'd', 'e', 'f'};
    int freq[] = {5, 9, 12, 13, 16, 45};
    int size = sizeof(arr) / sizeof(arr[0]);
    MinHeapNode* root = buildHuffmanTree(arr, freq, size);
    /* walk the tree from root to print each character's code */
    return 0;
}
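The greedy merging loop above can be sketched compactly in Python with the standard heapq module (huffman_codes is an illustrative name, not from the notes; the counter in each heap entry only breaks ties):

```python
import heapq

def huffman_codes(freqs):
    """Build Huffman codes greedily: repeatedly merge the two
    lowest-frequency nodes. freqs maps symbol -> frequency."""
    # Each heap entry: (frequency, tie-breaker, tree) where tree is
    # either a symbol or a (left, right) pair of subtrees.
    heap = [(f, i, s) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}

    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"

    walk(heap[0][2], "")
    return codes

codes = huffman_codes({'a': 5, 'b': 9, 'c': 12, 'd': 13, 'e': 16, 'f': 45})
```

With these frequencies the most common symbol 'f' gets a 1-bit code and the rarest symbol 'a' a 4-bit code, as expected of an optimal prefix code.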
The greedy paradigm involves making locally optimal choices at each stage with the hope of
finding a global optimum solution.
Activity Selection Problem: Given a set of activities with start and finish times, select
the maximum number of activities that don’t overlap.
Greedy Choice: Select the activity that finishes first.
#include <stdio.h>
#include <stdlib.h>

struct Activity {
    int start, finish;
};

/* Sort by finish time so the greedy pick is the earliest finisher. */
int cmp(const void *a, const void *b) {
    return ((const struct Activity *)a)->finish -
           ((const struct Activity *)b)->finish;
}

void printMaxActivities(struct Activity acts[], int n) {
    qsort(acts, n, sizeof(struct Activity), cmp);
    printf("Selected activities:\n");
    int lastFinish = -1;
    for (int i = 0; i < n; i++)
        if (acts[i].start >= lastFinish) {  /* compatible with last pick */
            printf("(%d, %d) ", acts[i].start, acts[i].finish);
            lastFinish = acts[i].finish;
        }
    printf("\n");
}

// Example usage
int main() {
struct Activity activities[] = {{1, 2}, {3, 4}, {0, 6}, {5, 7}, {8, 9},
{5, 9}};
int n = sizeof(activities) / sizeof(activities[0]);
printMaxActivities(activities, n);
return 0;
}
Output:
Selected activities:
(1, 2) (3, 4) (5, 7) (8, 9)
Example:
Consider a graph with edges and weights, find MST using Kruskal's algorithm.
Prim's Algorithm:
Start with an arbitrary vertex, grow the MST by adding the cheapest edge that
connects the MST to a vertex not yet in the MST.
Union-Find (Disjoint Set) is used to efficiently manage a partition of a set into disjoint
subsets.
Operations: find (determine which subset holds an element) and union (merge two subsets).
Dijkstra's Algorithm: Finds the shortest path from a single source vertex to all other
vertices.
Prim's Algorithm: Constructs a minimum spanning tree from a weighted undirected
graph.
Knapsack Problem
Greedy algorithms can be used for the fractional knapsack problem, where items can be divided.
Huffman Trees
Constructing Huffman trees uses a greedy algorithm to build an optimal prefix tree for data
compression.
These examples and explanations cover the essentials you need for your exam preparation on
greedy paradigm and dynamic programming. If you need more detailed explanations or
additional examples, feel free to ask!
Greedy Paradigm
Definition: The greedy algorithm always makes the locally optimal choice at each
step with the hope of finding a global optimum.
Example: Greedy algorithm for the Minimum Spanning Tree (MST).
Union-Find (Disjoint Set)
Definition: Data structure to efficiently support the union and find operations.
Applications: Used in Kruskal’s algorithm to detect cycles.
Relationship between Dijkstra’s and Prim’s Algorithms
Both algorithms use a priority queue (or heap) to greedily select the next vertex to add
to the MST or shortest-path tree, respectively.
Use of Greedy Strategy in Algorithms for Knapsack Problem and Huffman Trees
Knapsack Problem: The greedy method can be used for fractional knapsack where
items can be broken down into fractions.
Huffman Trees: Greedy algorithm constructs an optimal prefix code (variable-length
codes) given a set of frequencies.
Dynamic Programming
Definition: Method for solving complex problems by breaking them down into
simpler subproblems.
Properties: Overlapping subproblems and optimal substructure.
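The "overlapping subproblems" property can be seen directly by counting calls (a sketch, not from the notes; the counter names are illustrative): naive recursive Fibonacci recomputes the same subproblems over and over, while memoization computes each one once.

```python
from functools import lru_cache

calls = 0

def fib(n):
    # Naive recursion: the same subproblems are solved repeatedly.
    global calls
    calls += 1
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(10)
naive_calls = calls  # 177 recursive calls for n = 10

@lru_cache(maxsize=None)
def fib_memo(n):
    # Memoized version: each subproblem is computed exactly once.
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

fib_memo(10)  # only 11 distinct subproblems (n = 0 .. 10)
```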
Problem: Given a sequence of matrices, find the most efficient way to multiply these
matrices together.
Approach: Dynamic programming approach computes the minimum number of
scalar multiplications needed to compute the product of matrices.
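A worked mini-example (not from the notes) shows why the multiplication order matters: for three matrices of dimensions 10x30, 30x5, and 5x60, the two parenthesizations differ by a factor of six in cost.

```python
# Cost of multiplying an (a x b) by a (b x c) matrix is a*b*c scalar
# multiplications; the DP picks the cheaper grouping.
cost_left = 10 * 30 * 5 + 10 * 5 * 60    # (A1 A2) A3 -> 1500 + 3000
cost_right = 30 * 5 * 60 + 10 * 30 * 60  # A1 (A2 A3) -> 9000 + 18000
```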
Problem: Given two sequences, find the length of the longest subsequence present in
both of them.
Approach: Dynamic programming approach computes the length of LCS and
optionally constructs the LCS itself.
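When only the length is needed, the DP table can be shrunk to a single row, since each cell depends only on the current and previous rows. A space-optimized sketch (lcs_len is an illustrative name, not from the notes):

```python
def lcs_len(X, Y):
    # Keep only the previous DP row; prev[j] is the LCS length of the
    # prefixes processed so far against Y[:j].
    prev = [0] * (len(Y) + 1)
    for x in X:
        cur = [0]
        for j, y in enumerate(Y, 1):
            cur.append(prev[j - 1] + 1 if x == y else max(prev[j], cur[j - 1]))
        prev = cur
    return prev[-1]

print(lcs_len("ABCBDAB", "BDCABA"))  # -> 4
```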
For both paradigms, understanding the underlying principles and applying them through
practice with examples and coding exercises will reinforce your understanding. Make sure to
practice writing and understanding the algorithms thoroughly to excel in your exam.
Greedy Paradigm
The greedy strategy involves making locally optimal choices at each stage with the hope of
finding a global optimum. It does not always guarantee an optimal solution but is often
efficient and easy to implement.
Problem: Given a set of activities with start and finish times, select the maximum
number of activities that can be performed by a single person, assuming that a person
can only work on one activity at a time.
def select_activities(start_times, finish_times):
    # Consider activities in order of finish time (the greedy choice).
    order = sorted(range(len(start_times)), key=lambda i: finish_times[i])
    activities = []
    last_finish = float('-inf')
    for i in order:
        if start_times[i] >= last_finish:  # compatible with the last pick
            activities.append(i)
            last_finish = finish_times[i]
    return activities
# Example usage:
start_times = [1, 3, 0, 5, 8, 5]
finish_times = [2, 4, 6, 7, 9, 9]
selected_activities = select_activities(start_times, finish_times)
print("Selected activities:", selected_activities) # Output: [0, 1, 3, 4]
Kruskal's Algorithm
Sort all the edges in non-decreasing order of their weight.
Pick the smallest edge that does not form a cycle with the spanning tree formed so far.
Repeat until there are V − 1 edges in the spanning tree.
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, u):
        if self.parent[u] != u:
            self.parent[u] = self.find(self.parent[u])  # path compression
        return self.parent[u]

    def union(self, u, v):
        root_u, root_v = self.find(u), self.find(v)
        if root_u != root_v:
            if self.rank[root_u] > self.rank[root_v]:
                self.parent[root_v] = root_u
            elif self.rank[root_u] < self.rank[root_v]:
                self.parent[root_u] = root_v
            else:
                self.parent[root_v] = root_u
                self.rank[root_u] += 1

def kruskal_mst(graph):
    edges = []
    for u in range(len(graph)):
        for v in range(u + 1, len(graph)):
            if graph[u][v] != float('inf'):
                edges.append((graph[u][v], u, v))
    edges.sort()
    uf = UnionFind(len(graph))
    mst = []
    for weight, u, v in edges:
        if uf.find(u) != uf.find(v):  # edge joins two components
            uf.union(u, v)
            mst.append((u, v, weight))
    return mst
# Example usage:
graph = [
[float('inf'), 2, float('inf'), 6, float('inf')],
[2, float('inf'), 3, 8, 5],
[float('inf'), 3, float('inf'), float('inf'), 7],
[6, 8, float('inf'), float('inf'), 9],
[float('inf'), 5, 7, 9, float('inf')]
]
mst = kruskal_mst(graph)
print("Minimum Spanning Tree (Kruskal's Algorithm):", mst)
Prim's Algorithm
import heapq

def prim_mst(graph):
    n = len(graph)
    visited = [False] * n
    mst = []
    heap = [(0, 0, None)]  # (weight, vertex, parent): seed with vertex 0
while heap:
weight, u, parent = heapq.heappop(heap)
if visited[u]:
continue
visited[u] = True
if parent is not None:
mst.append((parent, u, weight))
for v in range(n):
if not visited[v] and graph[u][v] != float('inf'):
heapq.heappush(heap, (graph[u][v], v, u))
return mst
# Example usage:
graph = [
[float('inf'), 2, float('inf'), 6, float('inf')],
[2, float('inf'), 3, 8, 5],
[float('inf'), 3, float('inf'), float('inf'), 7],
[6, 8, float('inf'), float('inf'), 9],
[float('inf'), 5, 7, 9, float('inf')]
]
mst = prim_mst(graph)
print("Minimum Spanning Tree (Prim's Algorithm):", mst)
Union-Find is a data structure that keeps track of a set of elements partitioned into a number
of disjoint subsets.
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, u):
        if self.parent[u] != u:
            self.parent[u] = self.find(self.parent[u])  # path compression
        return self.parent[u]

    def union(self, u, v):
        root_u, root_v = self.find(u), self.find(v)
        if root_u != root_v:
            if self.rank[root_u] > self.rank[root_v]:
                self.parent[root_v] = root_u
            elif self.rank[root_u] < self.rank[root_v]:
                self.parent[root_u] = root_v
            else:
                self.parent[root_v] = root_u
                self.rank[root_u] += 1
# Example usage:
uf = UnionFind(5)
uf.union(0, 1)
uf.union(2, 3)
uf.union(0, 2)
print("Parent array after union operations:", uf.parent)
Dynamic Programming
Dynamic Programming (DP) is a method for solving complex problems by breaking them
down into simpler subproblems and storing the results of subproblems to avoid redundant
computations.
The problem involves finding the most efficient way to multiply a given sequence of
matrices.
def matrix_chain_order(p):
    n = len(p) - 1  # number of matrices
    m = [[0] * n for _ in range(n)]  # m[i][j]: min scalar multiplications
    s = [[0] * n for _ in range(n)]  # s[i][j]: index of the optimal split
    for length in range(2, n + 1):          # chain length
        for i in range(n - length + 1):
            j = i + length - 1
            m[i][j] = float('inf')
            for k in range(i, j):           # try every split point
                cost = m[i][k] + m[k + 1][j] + p[i] * p[k + 1] * p[j + 1]
                if cost < m[i][j]:
                    m[i][j] = cost
                    s[i][j] = k
    return m, s

def print_optimal_parens(s, i, j):
    if i == j:
        print(f"A{i + 1}", end="")
    else:
        print("(", end="")
        print_optimal_parens(s, i, s[i][j])
        print_optimal_parens(s, s[i][j] + 1, j)
        print(")", end="")

# Example usage:
matrix_dimensions = [30, 35, 15, 5, 10, 20, 25]
m, s = matrix_chain_order(matrix_dimensions)
print("Minimum number of multiplications:", m[0][len(matrix_dimensions) - 2])
print("Optimal parenthesization:", end=" ")
print_optimal_parens(s, 0, len(matrix_dimensions) - 2)
print()
The LCS problem involves finding the longest subsequence that appears in both given
sequences.
def lcs_length(X, Y):
    m, n = len(X), len(Y)
    # c[i][j]: LCS length of X[:i] and Y[:j]; b records the direction
    # taken, so the subsequence itself can be reconstructed.
    c = [[0] * (n + 1) for _ in range(m + 1)]
    b = [[None] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
                b[i][j] = 'diag'
            elif c[i - 1][j] >= c[i][j - 1]:
                c[i][j] = c[i - 1][j]
                b[i][j] = 'up'
            else:
                c[i][j] = c[i][j - 1]
                b[i][j] = 'left'
    return c, b

def print_lcs(b, X, i, j):
    if i == 0 or j == 0:
        return ""
    if b[i][j] == 'diag':
        return print_lcs(b, X, i - 1, j - 1) + X[i - 1]
    if b[i][j] == 'up':
        return print_lcs(b, X, i - 1, j)
    return print_lcs(b, X, i, j - 1)
# Example usage:
X = "ABCBDAB"
Y = "BDCABA"
c, b = lcs_length(X, Y)
print("Length of Longest Common Subsequence:", c[len(X)][len(Y)])
print("Longest Common Subsequence:", print_lcs(b, X, len(X), len(Y)))
These examples cover the basic concepts and implementations for each topic. Feel free to ask
for further clarification or additional examples!