Application of Greedy Method
An algorithm is a precisely defined procedure that solves a problem in a finite number of steps, and many such algorithms are used to solve optimization problems.
Most optimization algorithms make choices based on a global overview of all current and future possibilities, aiming to reach the single global optimum solution.
Greedy algorithms, in contrast, make the choice that looks best at the present moment.
Compared to algorithms that guarantee a global optimum solution, greedy algorithms have several advantages:
they are easier to implement, they require far fewer computing resources, and they execute much faster.
Their only disadvantage is that they do not always reach the global optimum solution;
even when the global optimum is not reached, the sub-optimal solution they produce is usually a very good one. Moreover, several classical optimization problems, such as the minimum spanning tree and optimal prefix codes for data compression, are solved to global optimality by greedy algorithms.
A tree T is said to be a spanning tree of a connected graph G if T is a subgraph of G and T contains all vertices of G. Since a spanning tree keeps the vertices of G just barely connected, it forms a sort of skeleton of the original graph G; for this reason a spanning tree is also called a skeleton or scaffolding of G. Among the spanning trees of a weighted graph G, one with the smallest total weight is called a minimal spanning tree, also known as a shortest spanning tree or shortest-distance spanning tree.
Steps in Kruskal's algorithm: list all edges of G in order of increasing weight. Select a smallest edge of G. At each successive step, select (from all remaining edges of G) another smallest edge that makes no circuit with the previously selected edges. Continue until n-1 edges have been selected.
At each stage, the algorithm chooses an edge to add to its current partial solution. To do so, it must test each candidate edge (u, v) to check whether the endpoints u and v lie in different components; otherwise the edge would produce a cycle.
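One standard way to carry out this component test is a disjoint-set (union-find) structure. The C++ sketch below is illustrative and not taken from these notes (the names DisjointSet, find and unite are ours): find returns the representative of a vertex's component, and unite merges the smaller cluster into the larger one and reports whether the two endpoints were in different components.

    // Illustrative disjoint-set (union-find) sketch; names are ours, not from the notes.
    #include <utility>
    #include <vector>
    using namespace std;

    struct DisjointSet {
        vector<int> parent, sz;                          // sz[r] = size of the cluster rooted at r
        DisjointSet(int n) : parent(n), sz(n, 1) {
            for (int v = 0; v < n; ++v) parent[v] = v;   // each vertex starts in its own component
        }
        int find(int v) {                                // representative of v's component
            while (parent[v] != v) { parent[v] = parent[parent[v]]; v = parent[v]; }
            return v;
        }
        bool unite(int u, int v) {                       // merge components; false if already joined
            u = find(u); v = find(v);
            if (u == v) return false;                    // edge (u, v) would create a cycle
            if (sz[u] < sz[v]) swap(u, v);               // attach the smaller cluster to the larger
            parent[v] = u;
            sz[u] += sz[v];
            return true;
        }
    };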
[Figure: example weighted graph used to trace Kruskal's Algorithm.]
enqueue the edges of G in a queue in increasing order of cost;
T = ∅;
while (queue is not empty) {
    dequeue an edge e;
    if (e does not create a cycle with the edges in T)
        add e to T;
}
return T;
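Building on this pseudocode, a concrete C++ sketch might look as follows; it assumes the DisjointSet structure sketched earlier, represents each edge as a (cost, u, v) triple, and uses a sorted vector in place of the queue (the names kruskalMST and Edge are ours, not from the notes).

    // Kruskal's algorithm sketch; assumes the DisjointSet structure sketched earlier.
    #include <algorithm>
    #include <tuple>
    #include <vector>
    using namespace std;

    using Edge = tuple<int, int, int>;            // (cost, u, v)

    vector<Edge> kruskalMST(int n, vector<Edge> edges) {
        sort(edges.begin(), edges.end());         // process edges in increasing order of cost
        DisjointSet ds(n);                        // one component per vertex to start
        vector<Edge> T;                           // edges of the spanning tree built so far
        for (const Edge& e : edges) {
            if (ds.unite(get<1>(e), get<2>(e)))   // e joins two different components: no cycle
                T.push_back(e);
            if ((int)T.size() == n - 1) break;    // n-1 edges selected: the tree is complete
        }
        return T;
    }

Because unite already reports whether its arguments lay in different components, no separate cycle check is needed: rejected edges are simply skipped.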
[Figure sequence: Kruskal's Algorithm traced step by step on the example graph; edges are selected in increasing order of weight, one candidate edge is rejected because it would create a cycle, and the resulting minimum spanning tree is shown last.]
[Figure sequence: a second, larger worked example of Kruskal's Algorithm on a graph with vertices a through i, traced edge by edge.]
Running time of Kruskal's algorithm:
Initialization takes O(V) time.
Sorting the edges takes Θ(E lg E) = Θ(E lg V) time, because E ≤ V² implies lg E ≤ 2 lg V.
There are O(E) calls to FindSet.
Union costs: let t(v) be the number of times vertex v is moved to a new cluster. Each time a vertex is moved to a new cluster, the size of the cluster containing it at least doubles, so t(v) ≤ lg V. The total time spent doing Unions, summed over all v in V, is therefore at most V lg V.
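Putting these costs together gives the overall bound; this is only a summary sketch, assuming the graph is connected so that V - 1 ≤ E:

    O(V) + Θ(E lg E) + O(E) + O(V lg V) = O(E lg V)

so Kruskal's algorithm runs in O(E lg V) time, dominated by sorting the edges.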
As a practical application, a transport operator that uses such a greedy (shortest-path) method can find the minimum-distance path to the destination much faster, and will never circulate its buses through unwanted towns and villages.