
Greedy Algorithms

Greedy Algorithm Introduction


Ø The greedy method builds a solution by choosing, at each step, the best option from among the currently available choices.
Ø In this method, we have to find the best option out of the many options present at each stage.
Ø In this approach, we focus on the current stage and decide the output without thinking about the future.
Cont…
Ø A greedy algorithm works if a problem exhibits the following two properties:
1. Greedy choice property: A globally optimal solution can be arrived at by making locally optimal choices. In other words, an optimal solution can be obtained by making "greedy" choices.
2. Optimal substructure: Optimal solutions contain optimal sub-solutions. In other words, the answers to the subproblems of an optimal solution are themselves optimal.
Examples:
§ Machine scheduling
§ Fractional knapsack problem
§ Minimum spanning tree problem
§ Huffman code
§ Job sequencing
§ Activity selection problem
Steps for achieving a Greedy Algorithm
Ø Feasible: Check whether the choice satisfies all the problem's constraints, so that it can be part of at least one solution.
Ø Locally optimal choice: The choice made should be the optimum among the currently available options.
Ø Unalterable: Once a decision is made, it is not altered at any subsequent step.
Knapsack Problem
You are given the following:
Ø A knapsack (a kind of shoulder bag) with a limited weight capacity.
Ø A few items, each having some weight and value.
The problem states:
Which items should be placed into the knapsack so that
§ the total value or profit obtained by putting the items into the knapsack is maximum,
§ and the total weight does not exceed the knapsack's weight limit.
0/1 Knapsack problem example
value[] = {60, 100, 120};
weight[] = {10, 20, 30};
W = 50;
Possible selections:
weight = 10; value = 60
weight = 20; value = 100
weight = 30; value = 120
weight = (20 + 10) = 30; value = (100 + 60) = 160
weight = (30 + 10) = 40; value = (120 + 60) = 180
weight = (30 + 20) = 50; value = (120 + 100) = 220
weight = (30 + 20 + 10) = 60 > 50, so this selection is infeasible
Solution: 220
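The enumeration above can be verified with a short brute-force sketch (not part of the slides) that tries every subset of items and keeps the best feasible one:

```python
from itertools import combinations

def knapsack_bruteforce(values, weights, capacity):
    """Try every subset of items; return the best total value whose weight fits."""
    n = len(values)
    best = 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            total_weight = sum(weights[i] for i in subset)
            if total_weight <= capacity:
                best = max(best, sum(values[i] for i in subset))
    return best

print(knapsack_bruteforce([60, 100, 120], [10, 20, 30], 50))  # 220
```

This confirms the answer 220; of course, trying all 2^n subsets is only practical for small n, which is why dynamic programming is used for the general 0/1 problem.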
Knapsack Problem Variants
Ø Knapsack problem has the following two variants
§ Fractional knapsack problem
§ 0/1 knapsack problem
Fractional knapsack problem
In the fractional knapsack problem,
Ø As the name suggests, items are divisible here.
Ø We can even put a fraction of an item into the knapsack if taking the complete item is not possible.
Ø It is solved using the Greedy Method.
Fractional knapsack problem using Greedy Method
The fractional knapsack problem is solved using the Greedy Method in the following steps:
1. For each item, compute its value/weight ratio.
2. Arrange all the items in decreasing order of their value/weight ratio.
3. Start putting items into the knapsack, beginning with the item with the highest ratio. Put in as many whole items as you can, and then put in as large a fraction of the next item as fits.
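The three steps above can be sketched in Python as follows (a minimal sketch, assuming items are given as (value, weight) pairs; the sample call reuses the values and weights from the earlier knapsack example):

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Returns the maximum total value."""
    # Step 1 & 2: sort items by value/weight ratio, highest first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    # Step 3: take whole items while they fit, then a fraction of the next one.
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item if it fits, else a fraction
        total += value * (take / weight)
        capacity -= take
    return total

print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))
```

With capacity 50, the greedy choice takes the first two items completely (weights 10 and 20) and 20/30 of the third, giving a total value of 240 — higher than the 220 achievable when items are indivisible.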
0/1 Knapsack Problem
In the 0/1 knapsack problem,
Ø As the name suggests, items are indivisible here.
Ø We cannot take a fraction of any item.
Ø We have to either take an item completely or leave it completely.
Ø It is solved using the dynamic programming approach.
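As an illustration of the dynamic programming approach, here is a common 1-D bottom-up formulation (a sketch, not taken from the slides), run on the earlier example:

```python
def knapsack_01(values, weights, capacity):
    """Bottom-up DP: dp[w] = best value achievable with total weight <= w."""
    dp = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Iterate weights downward so each item is used at most once (0/1 constraint).
        for w in range(capacity, weight - 1, -1):
            dp[w] = max(dp[w], dp[w - weight] + value)
    return dp[capacity]

print(knapsack_01([60, 100, 120], [10, 20, 30], 50))  # 220
```

The downward loop over weights is what distinguishes 0/1 from the unbounded variant: it prevents an item from being counted twice in the same solution.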
Minimum Spanning Tree (MST)
Ø A spanning tree is defined as a subgraph of a connected, undirected graph that is a tree and includes all the vertices of the graph.
Ø It is a subset of the edges of the graph that forms a tree (acyclic) in which every node of the graph is included.
Ø The minimum spanning tree has all the properties of a spanning tree, with the added constraint of having the minimum possible total weight among all possible spanning trees.
Properties of a Spanning Tree
§ The number of vertices (V) in the graph and in the spanning tree is the same.
§ The spanning tree has a fixed number of edges, equal to one less than the number of vertices (E = V - 1).
§ The spanning tree should not be disconnected; there should be only a single connected component, not more than that.
§ The spanning tree should be acyclic, which means there is no cycle in the tree.
§ The total cost (or weight) of the spanning tree is defined as the sum of the weights of all the edges of the spanning tree.
§ There can be many possible spanning trees for a graph.
Minimum Spanning Tree
Ø A minimum spanning tree (MST) is defined as a spanning tree that has the minimum total weight among all the possible spanning trees.
Ø Like spanning trees, there can also be many possible MSTs for a graph.
Cont…
Kruskal's Algorithm
Ø is a minimum spanning tree algorithm that takes a graph as input and finds the subset of the
edges of that graph which
§ form a tree that includes every vertex
§ has the minimum sum of weights among all the trees that can be formed from the graph
How Kruskal's algorithm works
Ø It falls under the class of greedy algorithms, which find a local optimum in the hope of finding a global optimum.
Ø We start from the edge with the lowest weight and keep adding edges until we reach our goal.
Ø The steps for implementing Kruskal's algorithm are as follows:
1. Sort all the edges from low weight to high.
2. Take the edge with the lowest weight and add it to the spanning tree. If adding the edge creates a cycle, reject the edge.
3. Keep adding edges until all vertices are connected.
Example of Kruskal's algorithm
Example 2
The edges of the graph, sorted in increasing order of weight:

Weight  Source  Destination
1       7       6
2       8       2
2       6       5
4       0       1
4       2       5
6       8       6
7       2       3
7       7       8
8       0       7
8       1       2
9       3       4
10      5       4
11      1       7
14      3       5

Now pick all edges one by one from the sorted list of edges.
Step 1: Pick edge 7-6. No cycle is formed, include it.
Step 2: Pick edge 8-2. No cycle is formed, include it.
Step 3: Pick edge 6-5. No cycle is formed, include it.
Step 4: Pick edge 0-1. No cycle is formed, include it.
Step 5: Pick edge 2-5. No cycle is formed, include it.
Step 6: Pick edge 8-6. Since including this edge results in a cycle, discard it. Pick edge 2-3: no cycle is formed, include it.
Step 7: Pick edge 7-8. Since including this edge results in a cycle, discard it. Pick edge 0-7: no cycle is formed, include it.
Step 8: Pick edge 1-2. Since including this edge results in a cycle, discard it. Pick edge 3-4: no cycle is formed, include it.
Kruskal’s Algorithm Pseudocode
KRUSKAL(G):
    A = ∅
    for each vertex v ∈ G.V:
        MAKE-SET(v)
    for each edge (u, v) ∈ G.E, taken in increasing order of weight w(u, v):
        if FIND-SET(u) ≠ FIND-SET(v):
            A = A ∪ {(u, v)}
            UNION(u, v)
    return A
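The pseudocode above can be sketched in Python, using a simple union-find structure for MAKE-SET / FIND-SET / UNION and the edge list from Example 2 (this is an illustrative implementation, not from the slides):

```python
def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v). Returns (mst_edges, total_weight)."""
    parent = list(range(num_vertices))   # MAKE-SET for every vertex

    def find(x):
        """FIND-SET with path halving."""
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for weight, u, v in sorted(edges):   # edges in increasing order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                     # different sets: no cycle, accept the edge
            parent[ru] = rv              # UNION
            mst.append((u, v, weight))
            total += weight
    return mst, total

# Edge list of the 9-vertex graph from Example 2, as (weight, u, v).
edges = [(4, 0, 1), (8, 0, 7), (8, 1, 2), (11, 1, 7), (7, 2, 3),
         (2, 2, 8), (4, 2, 5), (9, 3, 4), (14, 3, 5), (10, 4, 5),
         (2, 5, 6), (1, 6, 7), (6, 6, 8), (7, 7, 8)]
mst, total = kruskal(9, edges)
print(total)  # 37
```

Running this reproduces the accept/reject decisions in steps 1-8 above: 8 edges are accepted (V - 1 = 8) with total weight 37.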
Prim's Algorithm
Ø is a minimum spanning tree algorithm that takes a graph as input and finds the subset of the
edges of that graph which
§ form a tree that includes every vertex
§ has the minimum sum of weights among all the trees that can be formed from the graph
How Prim's algorithm works
Ø It falls under the class of greedy algorithms, which find a local optimum in the hope of finding a global optimum.
Ø We start from one vertex and keep adding the lowest-weight edges until we reach our goal.
Ø The steps for implementing Prim's algorithm are as follows:
1. Initialize the minimum spanning tree with a vertex chosen at random.
2. Find all the edges that connect the tree to new vertices, pick the minimum-weight one, and add it to the tree.
3. Keep repeating step 2 until we get a minimum spanning tree.
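The steps above can be sketched in Python with a priority queue (a common heap-based formulation, not from the slides; the edge list is the same 9-vertex graph used in the Kruskal example):

```python
import heapq

def prim(num_vertices, edges, start=0):
    """edges: list of (weight, u, v) for an undirected graph.
    Returns the total weight of the MST grown from `start`."""
    adj = [[] for _ in range(num_vertices)]
    for w, u, v in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))

    visited = [False] * num_vertices
    heap = [(0, start)]                  # (weight of connecting edge, vertex)
    total = 0
    while heap:
        w, u = heapq.heappop(heap)       # cheapest edge reaching a new vertex
        if visited[u]:
            continue                     # vertex already in the tree: skip
        visited[u] = True
        total += w
        for weight, v in adj[u]:
            if not visited[v]:
                heapq.heappush(heap, (weight, v))
    return total

edges = [(4, 0, 1), (8, 0, 7), (8, 1, 2), (11, 1, 7), (7, 2, 3),
         (2, 2, 8), (4, 2, 5), (9, 3, 4), (14, 3, 5), (10, 4, 5),
         (2, 5, 6), (1, 6, 7), (6, 6, 8), (7, 7, 8)]
print(prim(9, edges))  # 37
```

Starting from vertex 0, this follows the same greedy growth as the worked example below and yields the same total MST weight, 37.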
Example of Prim's algorithm
Step 1: Firstly, we select an arbitrary vertex that acts as the starting vertex of the
Minimum Spanning Tree. Here we have selected vertex 0 as the starting vertex.
Step 2: All the edges connecting the incomplete MST and other vertices
are the edges {0, 1} and {0, 7}. Between these two the edge with
minimum weight is {0, 1}. So include the edge and vertex 1 in the MST.
Step 3: The edges connecting the incomplete MST to other vertices are {0, 7}, {1, 7}
and {1, 2}. Among these edges the minimum weight is 8 which is of the edges {0, 7}
and {1, 2}. Let us here include the edge {0, 7} and the vertex 7 in the MST. [We could
have also included edge {1, 2} and vertex 2 in the MST].
Step 4: The edges that connect the incomplete MST with the fringe
vertices are {1, 2}, {7, 6} and {7, 8}. Add the edge {7, 6} and the vertex 6
in the MST as it has the least weight (i.e., 1).
Step 5: The connecting edges now are {7, 8}, {1, 2}, {6, 8} and {6, 5}.
Include edge {6, 5} and vertex 5 in the MST as the edge has the minimum
weight (i.e., 2) among them.
Step 6: Among the current connecting edges, the edge {5, 2} has the
minimum weight. So include that edge and the vertex 2 in the MST.
Step 7: The connecting edges between the incomplete MST and the
other edges are {2, 8}, {2, 3}, {5, 3} and {5, 4}. The edge with minimum
weight is edge {2, 8} which has weight 2. So include this edge and the
vertex 8 in the MST.
Step 8: The edges {7, 8} and {2, 3} both have the same minimum weight (7). But both endpoints of {7, 8} are already part of the MST, so adding it would create a cycle. So we consider the edge {2, 3} and include that edge and vertex 3 in the MST.
Step 9: Only the vertex 4 remains to be included. The minimum
weighted edge from the incomplete MST to 4 is {3, 4}.
The final structure of the MST is as follows and the weight of the edges
of the MST is (4 + 8 + 1 + 2 + 4 + 2 + 7 + 9) = 37.
Note: If we had selected the edge {1, 2} in the third step then the MST
would look like the following.
