9 Greedy Algorithm
Greedy Algorithm
Greedy algorithms are simple and straightforward. Most greedy algorithms are easy to develop, implement, and time efficient.
An optimization problem requires either a minimum or a maximum resource; its answer is an optimal solution.
Greedy Method
Problem: Travel from school to market (an optimization problem: it requires either a minimum or a maximum resource).
Constraint: Reach the destination within 2 hours only!
Feasible solutions: the routes that satisfy the constraint.
Remember: the answer must be the one with minimum cost.
On each step, the greedy choice must be:
• Feasible - it has to satisfy the problem’s constraint
• Locally optimal - it has to be the best local choice among all feasible choices available on a specific step
• Irrevocable - once made, it cannot be changed on subsequent steps of the algorithm
Greedy Method
Greedy algorithms try to solve a problem by always making the choice that looks best at the moment. Once the choice is made, it is not taken back even if a better choice is found later.
The greedy method is a general algorithm design paradigm, built on the following elements:
• Configurations (or states): different choices, collections, or values to find
• Objective function: a score assigned to configurations, which we want to either maximize or minimize
Making Change
Problem: Return a peso amount using a collection of coin amounts.
Configuration: A peso amount yet to return to a customer plus the coins already returned.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the highest-value coin you can.
At each step, we use the highest possible coin from the denominations. In the end, we are able to reach the value of 93 using just 5 coins.
Making Change
Issue with Greedy Algorithm Approach
Problem: Return a peso amount using a collection of coin amounts.
Configuration: A peso amount yet to return to a customer plus the coins already returned.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the highest-value coin you can.
Using the greedy algorithm, we end up with the denominations 9, 1, 1 (3 coins) to reach the value of 11. However, there is a more optimal solution: using the denominations 5 and 6, we can reach 11 with only 2 coins.
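To make the greedy choice concrete, here is a minimal Python sketch of the change-making strategy above. The denomination set {9, 6, 5, 1} is an assumption inferred from the slide's example, and the function name is illustrative.

```python
def greedy_change(amount, denominations):
    """Repeatedly return the highest-value coin that still fits."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount >= coin:
            amount -= coin
            coins.append(coin)
    return coins

# Assumed denomination set from the slide's example.
print(greedy_change(11, [9, 6, 5, 1]))  # [9, 1, 1] -> 3 coins
# The optimal answer for 11 is [6, 5] -> only 2 coins, which greedy misses.
```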
Fractional Knapsack Problem
• Given: A set S of n items, with each item i having
   - b_i – positive benefit of all of item i available
   - w_i – positive weight of all of item i available
• Goal: Choose items with maximum total benefit but with weight at most W.
• If we are allowed to take fractional amounts, then this is the Fractional Knapsack Problem.
   - Let x_i ≤ w_i denote the amount we take of item i
   - Objective: maximize Σ_{i∈S} b_i (x_i / w_i)
   - Constraint: Σ_{i∈S} x_i ≤ W
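The standard greedy strategy for the fractional case is to take items in decreasing order of benefit-to-weight ratio. A minimal sketch, with illustrative item values that are not from the slides:

```python
def fractional_knapsack(items, capacity):
    """items: list of (benefit, weight) pairs; capacity: W.
    Returns the maximum total benefit when fractions are allowed."""
    # Greedy choice: highest benefit per unit of weight first.
    items = sorted(items, key=lambda bw: bw[0] / bw[1], reverse=True)
    total_benefit = 0.0
    for benefit, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)              # x_i <= w_i and total taken <= W
        total_benefit += benefit * (take / weight)
        capacity -= take
    return total_benefit

# Illustrative data: three items and capacity W = 10.
print(fractional_knapsack([(60, 5), (40, 4), (30, 6)], 10))  # 105.0
```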
Huffman Tree
Huffman Encoding
Character   Frequency   Code
a           3           11
b           3           10
_ (space)   2           00
c           1           010
e           1           011
Using this map, we can encode the text/file into a shorter binary representation. The text “ab ab cabe” would be encoded as 1110001110000101110011.
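A short sketch of the encoding step, using the codebook from the table above (only the codes themselves come from the slides; the helper name is illustrative):

```python
# Codebook from the table above; '_' in the table stands for the space character.
codes = {'a': '11', 'b': '10', ' ': '00', 'c': '010', 'e': '011'}

def encode(text, codebook):
    """Concatenate the prefix-free code of each character."""
    return ''.join(codebook[ch] for ch in text)

encoded = encode("ab ab cabe", codes)
print(encoded)       # 1110001110000101110011
print(len(encoded))  # 22 bits, versus 80 bits in 8-bit ASCII
```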
Weight of a tree
• Given a weighted connected graph, an MST is one that connects all nodes with the least total weight
Examples
• Circuitry – shortest length of wiring needed to connect pins and make them electrically equivalent; weights are lengths
• Airports – least cost of inter-city flights to reach all cities; weight of each edge is ticket price
Minimum Spanning Trees
Two MST Greedy Algorithms
• Kruskal’s Algorithm
• Prim’s Algorithm
Kruskal’s Algorithm
Cycle Property
• Grows the MST by adding a least-weighted edge that will not cause a cycle
• This is a greedy algorithm, taking the cheapest available edge at any time that will not cause a cycle
The least cost is 2 and the edges involved are (B, D) and (D, T). The next least cost is 3 and the associated edges are (A, C) and (C, D).
The next cost in the table is 4. However, adding it to the tree will create a cycle, so just ignore it.
We can also observe that the edges with costs 5 and 6 will create cycles, so just ignore them and move on.
To complete the spanning tree, only one more edge must be added to connect all the vertices. Between the two edges available (with costs 7 and 8), add the one with the minimum cost, which is the edge with cost 7.
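A sketch of Kruskal's algorithm with a simple union-find to detect cycles. The edge list below is partly an assumption: the slides name only the cost-2 and cost-3 edges and mention costs 4 through 8, so the remaining endpoints are illustrative guesses chosen to stay consistent with the walkthrough.

```python
def kruskal(vertices, edges):
    """edges: list of (weight, u, v). Returns the list of MST edges."""
    parent = {v: v for v in vertices}

    def find(v):                                # root of v's component
        while parent[v] != v:
            parent[v] = parent[parent[v]]       # path halving
            v = parent[v]
        return v

    mst = []
    for weight, u, v in sorted(edges):          # consider cheapest edges first
        root_u, root_v = find(u), find(v)
        if root_u != root_v:                    # adding the edge creates no cycle
            parent[root_u] = root_v
            mst.append((u, v, weight))
    return mst

# (weight, u, v); only the cost-2 and cost-3 edges are named in the slides.
edges = [(2, 'B', 'D'), (2, 'D', 'T'), (3, 'A', 'C'), (3, 'C', 'D'),
         (4, 'B', 'C'), (5, 'A', 'B'), (6, 'C', 'T'), (7, 'S', 'A'), (8, 'S', 'C')]
print(kruskal(['S', 'A', 'B', 'C', 'D', 'T'], edges))
# [('B', 'D', 2), ('D', 'T', 2), ('A', 'C', 3), ('C', 'D', 3), ('S', 'A', 7)]
```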
Prim’s Algorithm
While the tree does not contain all vertices in the graph, find the shortest edge leaving the tree and add it to the tree.
An example to better understand Prim’s algorithm:
Step 1: After choosing the root node S, we see that (S, A) and (S, C) are two edges with weights 7 and 8, respectively. We choose the edge (S, A) as its weight is less than the other.
Step 2: Check all edges going out from the chosen vertices (S and A) and select the one edge which has the lowest cost; include it in the tree.
Step 3: Check all edges going out from the chosen vertices (S, A, and C) and select the one edge which has the lowest cost; include it in the tree.
Step 4: After adding the node D to the spanning tree, there are two edges going out of it with the same cost of 2. Adding these two edges results in the final spanning tree.
MST Total Weight: 2 + 2 + 3 + 3 + 7 = 17
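A sketch of Prim's algorithm using a priority queue of edges leaving the tree; the adjacency list reuses the same illustrative graph assumed in the Kruskal sketch, with S as the root as in the walkthrough.

```python
import heapq

def prim(graph, root):
    """graph: {vertex: [(weight, neighbor), ...]}.
    Returns the MST edges and the total weight."""
    visited = {root}
    frontier = [(w, root, v) for w, v in graph[root]]   # edges leaving the root
    heapq.heapify(frontier)
    mst, total = [], 0
    while frontier and len(visited) < len(graph):
        w, u, v = heapq.heappop(frontier)   # cheapest edge leaving the tree
        if v in visited:
            continue                        # it would create a cycle
        visited.add(v)
        mst.append((u, v, w))
        total += w
        for nw, nv in graph[v]:
            if nv not in visited:
                heapq.heappush(frontier, (nw, v, nv))
    return mst, total

# Same assumed edges as in the Kruskal sketch, as an undirected adjacency list.
graph = {
    'S': [(7, 'A'), (8, 'C')],
    'A': [(7, 'S'), (3, 'C'), (5, 'B')],
    'B': [(5, 'A'), (4, 'C'), (2, 'D')],
    'C': [(8, 'S'), (3, 'A'), (4, 'B'), (3, 'D'), (6, 'T')],
    'D': [(3, 'C'), (2, 'B'), (2, 'T')],
    'T': [(2, 'D'), (6, 'C')],
}
print(prim(graph, 'S'))   # total weight 17, matching the slide's 2 + 2 + 3 + 3 + 7
```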
Shortest Path Problems
Given a weighted directed graph G(V, E):
Problem: Find the shortest paths from f to every vertex in the graph.
Dijkstra’s Algorithm
Given: a weighted directed graph G(V, E).
Problem: Find the shortest paths from f to every vertex in the graph.
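A sketch of Dijkstra's greedy strategy with a priority queue: repeatedly settle the unvisited vertex with the smallest known distance. The graph below is illustrative, since the slide's figure is not reproduced in the text; only the source vertex f comes from the slide.

```python
import heapq

def dijkstra(graph, source):
    """graph: {vertex: [(weight, neighbor), ...]} with non-negative weights.
    Returns the shortest-path distance from source to every vertex."""
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                    # stale entry; a shorter path was found already
        for w, v in graph[u]:
            if d + w < dist[v]:         # relaxation: greedily improve the estimate
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# Illustrative directed graph; vertex names other than f and all weights are assumptions.
graph = {
    'f': [(2, 'a'), (5, 'b')],
    'a': [(1, 'b'), (4, 'c')],
    'b': [(2, 'c')],
    'c': [],
}
print(dijkstra(graph, 'f'))   # {'f': 0, 'a': 2, 'b': 3, 'c': 5}
```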
Q&A
Any Questions?
CP9 Greedy Algorithm