
Graphs

Graphs: Adjacency Matrix
● Example:
[Figure: a directed graph on vertices 1, 2, 3, 4 with four edges labelled a, b, c, d (1→2, 1→3, 2→3, 4→3), and an empty 4×4 adjacency matrix A to be filled in]
Graphs: Adjacency Matrix
● Example (the same directed graph, with the matrix filled in):
  A   1  2  3  4
  1   0  1  1  0
  2   0  0  1  0
  3   0  0  0  0
  4   0  0  1  0
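A minimal Python sketch (mine, not from the slides) of building this adjacency matrix, assuming vertices are numbered 1..n and the edge list above:

    def adjacency_matrix(n, edges):
        # A[i][j] = 1 if there is an edge i -> j, else 0 (1-indexed vertices)
        A = [[0] * (n + 1) for _ in range(n + 1)]
        for (i, j) in edges:
            A[i][j] = 1
        return A

    # The example graph: 1->2, 1->3, 2->3, 4->3
    A = adjacency_matrix(4, [(1, 2), (1, 3), (2, 3), (4, 3)])
    # A[1][2] == 1, A[3][1] == 0, ...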
Graphs: Adjacency Matrix
● How much storage does the adjacency matrix
require?
● A: O(V²)
● Undirected graph ⇒ matrix is symmetric
● No self-loops ⇒ don't need the diagonal
Graphs: Adjacency Matrix
● The adjacency matrix is a dense representation
■ Usually too much storage for large graphs
■ But can be very efficient for small graphs
● Most large interesting graphs are sparse
■ For this reason the adjacency list is often a more
appropriate representation
Graphs: Adjacency List
● Adjacency list: for each vertex v ∈ V, store a list of the vertices adjacent to v
● Example (the same graph as above):
  ■ Adj[1] = {2, 3}
  ■ Adj[2] = {3}
  ■ Adj[3] = {}
  ■ Adj[4] = {3}
● Variation: can also keep a list of the edges coming into each vertex
Graphs: Adjacency List
● So: Adjacency lists take O(V+E) storage
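A minimal Python sketch (an assumption of mine, not the slides' code) of the adjacency-list representation for the same example, using a dictionary of lists:

    def adjacency_list(vertices, edges):
        # adj[v] is the list of vertices adjacent to v
        adj = {v: [] for v in vertices}
        for (u, v) in edges:
            adj[u].append(v)
        return adj

    adj = adjacency_list([1, 2, 3, 4], [(1, 2), (1, 3), (2, 3), (4, 3)])
    # adj == {1: [2, 3], 2: [3], 3: [], 4: [3]}   # O(V + E) storage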
Graph Searching or Traversal
● Given: a graph G = (V, E), directed or
undirected
● Goal: methodically explore every vertex and
every edge
● Ultimately: build a tree on the graph
■ Pick a vertex as the root
■ Choose certain edges to produce a tree
■ Note: might also build a forest if graph is not
connected
Breadth-First Search
● “Explore” a graph, turning it into a tree
■ One vertex at a time
■ Expand frontier of explored vertices across the
breadth of the frontier
● Builds a tree over the graph
■ Pick a source vertex to be the root
■ Find (“discover”) its children, then their children,
etc.
Breadth-First Search
● Will associate vertex “colors” to guide the
algorithm
■ White vertices have not been discovered
○ All vertices start out white
■ Grey vertices are discovered but not fully explored
○ They may be adjacent to white vertices
■ Black vertices are discovered and fully explored
○ They are adjacent only to black and grey vertices
● Explore vertices by scanning adjacency list of
grey vertices
Breadth-First Search
BFS(G, s) {
    initialize vertices;
    Q = {s};                          // Q is a queue; initialize it to s
    while (Q not empty) {
        u = Remove(Q);
        for each v ∈ adj[u] {
            if (color[v] == WHITE) {
                color[v] = GREY;
                d[v] = d[u] + 1;      // What does d[v] represent?
                p[v] = u;             // What does p[v] represent?
                Enqueue(Q, v);
            }
        }
        color[u] = BLACK;
    }
}
Note: The notation color[v] indicates the value of the color attribute of node v. Similar notation is used throughout these slides on graphs.
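A runnable Python sketch of the BFS above, keeping color, d and p in dictionaries (the names are mine, not the slides'):

    from collections import deque

    WHITE, GREY, BLACK = 0, 1, 2

    def bfs(adj, s):
        # adj: dict mapping each vertex to its adjacency list; s: source vertex
        color = {v: WHITE for v in adj}
        d = {v: float('inf') for v in adj}   # d[v] = distance (in edges) from s
        p = {v: None for v in adj}           # p[v] = parent of v in the BFS tree
        color[s], d[s] = GREY, 0
        Q = deque([s])
        while Q:
            u = Q.popleft()
            for v in adj[u]:
                if color[v] == WHITE:
                    color[v] = GREY
                    d[v] = d[u] + 1
                    p[v] = u
                    Q.append(v)
            color[u] = BLACK
        return d, p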
Breadth-First Search: Example
[Figures: a BFS run on a graph with vertices r, s, t, u (top row) and v, w, x, y (bottom row), starting from source s. Successive slides show the discovered distances and the queue contents:
  Q: s        d[s]=0
  Q: w r      d[w]=1, d[r]=1
  Q: r t x    d[t]=2, d[x]=2
  Q: t x v    d[v]=2
  Q: x v u    d[u]=3
  Q: v u y    d[y]=3
  Q: u y
  Q: y
  Q: Ø
Final distances: d[r]=1, d[s]=0, d[t]=2, d[u]=3, d[v]=2, d[w]=1, d[x]=2, d[y]=3.]
Breadth-First Search: Properties
● BFS calculates the shortest-path distance from the source node to every reachable vertex
■ Shortest-path distance δ(s,v) = minimum number of edges from s to v, or ∞ if v is not reachable from s
● BFS builds a breadth-first tree, in which paths to the root represent shortest paths in G
Depth-First Search
● Depth-first search is another strategy for
exploring a graph
■ Explore “deeper” in the graph whenever possible
■ Edges are explored out of the most recently
discovered vertex v that still has unexplored edges
■ When all of v’s edges have been explored,
backtrack to the vertex from which v was
discovered
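The slides that follow show the DFS code and a worked example only as figures; here is a sketch of a standard recursive DFS matching the description above (naming is mine, not the slides'):

    def dfs(adj):
        # adj: dict mapping each vertex to its adjacency list
        color = {v: 'WHITE' for v in adj}
        p = {v: None for v in adj}
        d, f = {}, {}                    # discovery / finishing times
        time = [0]
        def visit(u):
            time[0] += 1
            d[u] = time[0]
            color[u] = 'GREY'
            for v in adj[u]:
                if color[v] == 'WHITE':
                    p[v] = u
                    visit(v)             # explore deeper before backtracking
            color[u] = 'BLACK'
            time[0] += 1
            f[u] = time[0]
        for u in adj:
            if color[u] == 'WHITE':
                visit(u)
        return d, f, p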
Depth-First Search: The Code
DFS Example
[The DFS code and its worked example appear only as figures in the original slides; a Python sketch is given above.]
Minimum Spanning Trees
Spanning Tree
● A spanning tree of a graph, G, is a set of |V|-1
edges that connect all vertices of the graph.
Thus a spanning tree for G is a subgraph of G,
T = (V’, E’) with the following properties:
■ V’ = V
■ T is connected
■ T is acyclic.
Problem: Laying Telephone Wire
[Figure: a central office and a set of customers to be connected]

Wiring: Naïve Approach
[Figure: the central office wired directly to each customer]
Expensive!

Wiring: Better Approach
[Figure: the central office and customers connected with far less wire]
Minimize the total length of wire connecting the customers
Minimum Spanning Tree
● Problem: given a connected, undirected, weighted graph, find a spanning tree using edges that minimize the total weight
[Figure: an example weighted graph with edge weights 2, 3, 4, 5, 6, 8, 9, 10, 14, 15]
Minimum Spanning Tree
● Which edges form the minimum spanning tree (MST) of the graph below?
[Figure: a weighted graph on vertices A, B, C, D, E, F, G, H with edge weights 2, 3, 4, 5, 6, 8, 9, 10, 14, 15]
Minimum Spanning Tree
● Answer:
[Figure: the same graph with the MST edges highlighted]
Generic MST
A = ∅
while A does not form a spanning tree
    do find an edge (u, v) that is safe for A
       A = A ∪ {(u, v)}
return A
Kruskal’s Algorithm
Kruskal()
{
    T = ∅;
    sort E by increasing edge weight w;
    for each (u,v) ∈ E (in sorted order)
        if adding (u,v) to T does not form a cycle
            T = T ∪ {(u,v)};
}
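A Python sketch of Kruskal's algorithm using a simple union-find structure for the cycle test (a sketch under my own naming, not the slides' code):

    def kruskal(vertices, edges):
        # edges: list of (weight, u, v) tuples; returns the list of MST edges
        parent = {v: v for v in vertices}
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x
        T = []
        for w, u, v in sorted(edges):           # increasing edge weight
            ru, rv = find(u), find(v)
            if ru != rv:                         # adding (u,v) does not form a cycle
                parent[ru] = rv
                T.append((u, v, w))
        return T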
Kruskal’s Algorithm
Run the algorithm:
[Figures: a sequence of slides running Kruskal's algorithm on an example graph whose edge weights are 1, 2, 5, 8, 9, 13, 14, 17, 19, 21, 25. The edges are examined in increasing order of weight; each edge is added to T unless it would create a cycle with the edges already chosen, and the run ends once a spanning tree has been built.]
Prim’s Algorithm
MST-Prim(G, w, r)
    Q = V[G];
    for each u ∈ Q
        key[u] = ∞;
    key[r] = 0;
    p[r] = NULL;
    while (Q not empty)
        u = ExtractMin(Q);
        for each v ∈ Adj[u]
            if (v ∈ Q and w(u,v) < key[v])
                p[v] = u;
                key[v] = w(u,v);
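A Python sketch of MST-Prim using a heap in place of ExtractMin/decrease-key (a lazy-deletion variant, so an assumption rather than a literal translation of the pseudocode above):

    import heapq

    def mst_prim(adj, r):
        # adj: dict mapping u to a list of (v, weight) pairs (undirected graph)
        # returns parent pointers p defining the minimum spanning tree rooted at r
        key = {v: float('inf') for v in adj}
        p = {v: None for v in adj}
        key[r] = 0
        in_mst = set()
        heap = [(0, r)]
        while heap:
            k, u = heapq.heappop(heap)
            if u in in_mst:
                continue                          # stale heap entry, skip it
            in_mst.add(u)
            for v, w in adj[u]:
                if v not in in_mst and w < key[v]:
                    key[v] = w
                    p[v] = u
                    heapq.heappush(heap, (w, v))  # lazy decrease-key
        return p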
Prim’s Algorithm
Run on example graph:
[Figures: a sequence of slides running MST-Prim on the example weighted graph (edge weights 2, 3, 4, 5, 6, 8, 9, 10, 14, 15). A start vertex r is picked; all keys begin at ∞ except key[r] = 0. Red vertices mark vertices already removed from Q, and red arrows indicate parent pointers. At each step the vertex u with the smallest key is extracted, and for each neighbour v still in Q with w(u,v) < key[v], key[v] is lowered to w(u,v) and p[v] is set to u.]
Simplified Algorithm
1. Initialize a tree with a single vertex, chosen arbitrarily from the graph.
2. Grow the tree by one edge: of the edges that connect the tree to vertices not yet in the tree, find the minimum-weight edge, and transfer it to the tree.
3. Repeat step 2 until all vertices are in the tree.
Example
● Use Kruskal’s algorithm to find a minimum
spanning tree in the following weighted graph.
Use alphabetical order to break ties.
Example (contd.)
● Solution: Kruskal’s algorithm will proceed as follows.
■ First we add edge {d, e} of weight 1.
■ Next, we add edge {c, e} of weight 2.
■ Next, we add edge {d, z} of weight 2.
■ Next, we add edge {b, e} of weight 3.
■ And finally, we add edge {a, b} of weight 2.
■ This produces a minimum spanning tree of weight 10, shown in the figure on the original slide.
Example – Find MST using Prim’s
Algorithm (r = D)
Difference between Kruskal’s and Prim’s Algorithm
● Prim’s algorithm is initialized with a vertex, whereas Kruskal’s algorithm is initialized with an edge.
● Kruskal’s algorithm builds a minimum spanning tree by adding one edge at a time: the next edge added is always the shortest (minimum-weight) edge that does NOT create a cycle. Prim’s builds a minimum spanning tree by adding one vertex at a time: the next vertex added is always the one nearest to a vertex already in the tree.
● Prim’s algorithm maintains a single tree at all stages, while Kruskal’s algorithm maintains a forest.
Single-Source Shortest Path

Dijkstra’s Algorithm
Shortest Paths Problems
[Figure: a weighted directed graph on vertices u, v, w, x, y, z, t]
● Given a weighted directed graph:
  <u,v,t,x,z> is a path of weight 29 from u to z.
  <u,v,w,x,y,z> is another path from u to z; it has weight 16 and is the shortest path from u to z.
Variants of Shortest Paths
Problems
A. Single pair shortest path problem
○ Given s and d, find shortest path from s to d.
B. Single source shortest paths problem
○ Given s, for each d find shortest path from s to d.
(Dijkstra’s algorithm)
C. All-pairs shortest paths problem
○ For each ordered pair (s, d), find the shortest path. (Floyd-Warshall algorithm)
● (A) and (B) seem to have the same asymptotic complexity.
Single-Source Shortest Path
● Problem: given a weighted directed graph G,
find the minimum-weight path from a given
source vertex s to another vertex v
■ “Shortest-path” = minimum weight
■ Weight of path is sum of edges
■ E.g., a road map: what is the shortest path from
Chapel Hill to Charlottesville?
Relaxation
● A key technique in shortest path algorithms is
relaxation
■ Idea: for all v, maintain upper bound d[v] on (s,v)
Relax(u,v,w) {
    if (d[v] > d[u]+w) then d[v] = d[u]+w;
}
[Figure: two examples of relaxing an edge of weight 2 out of a vertex u with d[u] = 5: if d[v] = 9, relaxation lowers d[v] to 7; if d[v] = 6, relaxation leaves it unchanged.]
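In code, relaxation is just a comparison and an update; a minimal Python sketch (names are mine):

    def relax(u, v, w, d, p):
        # w: weight of edge (u, v); d: tentative distances; p: predecessors
        if d[v] > d[u] + w:
            d[v] = d[u] + w
            p[v] = u

    d = {'u': 5, 'v': 9}
    p = {'u': None, 'v': None}
    relax('u', 'v', 2, d, p)   # d['v'] becomes 7, as in the figure above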
Dijkstra’s Algorithm
Dijkstra(G)
    for each v ∈ V
        d[v] = ∞;
    d[s] = 0; S = ∅; Q = V;
    while (Q ≠ ∅)
        u = ExtractMin(Q);
        S = S ∪ {u};
        for each v ∈ Adj[u]
            if (d[v] > d[u]+w(u,v))
                d[v] = d[u]+w(u,v);    // Relaxation step
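A Python sketch of Dijkstra's algorithm, again with a lazy heap standing in for ExtractMin (my adaptation of the pseudocode above, not the slides' code):

    import heapq

    def dijkstra(adj, s):
        # adj: dict mapping u to a list of (v, weight) pairs; s: source vertex
        d = {v: float('inf') for v in adj}
        d[s] = 0
        S = set()                                   # settled vertices
        heap = [(0, s)]
        while heap:
            du, u = heapq.heappop(heap)
            if u in S:
                continue                            # stale heap entry, skip it
            S.add(u)
            for v, w in adj[u]:
                if d[v] > d[u] + w:                 # relaxation step
                    d[v] = d[u] + w
                    heapq.heappush(heap, (d[v], v))
        return d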
Example
[Figures: a run of Dijkstra's algorithm on a weighted directed graph with source s and vertices u, v, x, y. Successive slides show the tentative distances d[]:
  d[s]=0, d[u]=∞,  d[v]=∞,  d[x]=∞, d[y]=∞
  d[s]=0, d[u]=10, d[v]=∞,  d[x]=5, d[y]=∞
  d[s]=0, d[u]=8,  d[v]=14, d[x]=5, d[y]=7
  d[s]=0, d[u]=8,  d[v]=13, d[x]=5, d[y]=7
  d[s]=0, d[u]=8,  d[v]=9,  d[x]=5, d[y]=7   (final)]
Example
[Figure: a weighted graph on vertices A, B, C, D with source A and edge weights 1, 2, 3, 4, 5, 10]
Ex: run the algorithm
All Pairs Shortest Path

Floyd-Warshall Algorithm
Intermediate Vertices
Without loss of generality, we will assume that V = {1, 2, …, n}, i.e., that the vertices of the graph are numbered from 1 to n.
Given a path p = (v1, v2, …, vm) in the graph, we call the vertices vk with k in {2, …, m-1} the intermediate vertices of p.
Intermediate Vertices
Consider a shortest path p from i to j such that
the intermediate vertices are from the set {1,
…,k}.
● If the vertex k is not an intermediate vertex on
p, then dij(k) = dij(k-1)
● If the vertex k is an intermediate vertex on p,
then dij(k) = dik(k-1) + dkj(k-1)
Interestingly, in either case the subpaths contain only vertices from {1, …, k-1}.
Shortest Path
Therefore, we can conclude that
    dij(k) = min{ dij(k-1), dik(k-1) + dkj(k-1) }
where the intermediate vertices come from {1, 2, …, k}.
Recursive Formulation
If we do not use intermediate vertices, i.e., when k = 0, then
    dij(0) = wij
If k > 0, then
    dij(k) = min{ dij(k-1), dik(k-1) + dkj(k-1) }
Floyd-Warshall Algorithm
● Input: a digraph G = (V, E), where |V| = n, with edge-weight function w : E → W.
  We assume the input is represented by a weight matrix W = (wij), i, j ∈ V, defined by
    wij = 0        if i = j
    wij = w(i,j)   if i ≠ j and (i,j) ∈ E
    wij = ∞        if i ≠ j and (i,j) ∉ E
● Output: an n × n matrix of shortest-path lengths δ(i, j) for all i, j ∈ V.
  If the graph has n vertices, we return a distance matrix (dij), where dij is the length of the shortest path from i to j.
The Floyd-Warshall Algorithm
Floyd-Warshall(W)
    n = # of rows of W;
    D(0) = W;
    for k = 1 to n do
        for i = 1 to n do
            for j = 1 to n do
                dij(k) = min{ dij(k-1), dik(k-1) + dkj(k-1) };
    return D(n);
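A Python sketch of the algorithm, updating a single distance matrix in place (0-based indices; my code, not the slides'):

    INF = float('inf')

    def floyd_warshall(W):
        # W: n x n weight matrix with W[i][i] = 0 and INF for missing edges
        n = len(W)
        D = [row[:] for row in W]                 # D starts as D(0) = W
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    if D[i][k] + D[k][j] < D[i][j]:
                        D[i][j] = D[i][k] + D[k][j]
        return D

    # The 3-vertex example below:
    W = [[0, 8, 5],
         [3, 0, INF],
         [INF, 2, 0]]
    # floyd_warshall(W) == [[0, 7, 5], [3, 0, 8], [5, 2, 0]]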
Example
[Figure: a directed graph on vertices 1, 2, 3 with edges 1→2 (weight 8), 1→3 (weight 5), 2→1 (weight 3), 3→2 (weight 2)]
D(0):
        1   2   3
    1   0   8   5
    2   3   0   ∞
    3   ∞   2   0
● At each step k, dij satisfies the following criteria:
  ■ the path begins at i and ends at j
  ■ intermediate vertices on the path come from the set {1, 2, …, k}
Example
● Step 1: consider all paths which contain 1 as an intermediate vertex
  ■ 2,1,3 is the only such path
  ■ dij(1) = min{ dij(0), di1(0) + d1j(0) }
  ■ d23(1) = min{ d23(0), d21(0) + d13(0) } = min{ ∞, 3 + 5 } = 8
D(1):
        1   2   3
    1   0   8   5
    2   3   0   8
    3   ∞   2   0
Example
● Step 2: consider all paths which contain 2 as an intermediate vertex
  ■ 3,2,1 is the only such path
  ■ dij(2) = min{ dij(1), di2(1) + d2j(1) }
  ■ d31(2) = min{ d31(1), d32(1) + d21(1) } = min{ ∞, 2 + 3 } = 5
D(2):
        1   2   3
    1   0   8   5
    2   3   0   8
    3   5   2   0
Example
● Step 3: consider all paths which contain 3 as an intermediate vertex
  ■ 1,3,2 is the only such path
  ■ dij(3) = min{ dij(2), di3(2) + d3j(2) }
  ■ d12(3) = min{ d12(2), d13(2) + d32(2) } = min{ 8, 5 + 2 } = 7
D(3):
        1   2   3
    1   0   7   5
    2   3   0   8
    3   5   2   0
Example
Final Result
D(3):
        1   2   3
    1   0   7   5
    2   3   0   8
    3   5   2   0
● We have obtained the value of an optimal solution: D(3) holds the shortest-path lengths between every pair of vertices.
Transitive closure of a directed graph
● The transitive closure of G is defined as the
graph G* = (V, E*), where E* = { (i, j) : there
is a path from vertex i to j in G}
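A minimal Python sketch (my own, not from the slides) computing the transitive closure with a Floyd-Warshall-style boolean triple loop; here every vertex is treated as reachable from itself:

    def transitive_closure(n, edges):
        # T[i][j] is True iff there is a path from i to j (vertices 0..n-1)
        T = [[i == j for j in range(n)] for i in range(n)]
        for (i, j) in edges:
            T[i][j] = True
        for k in range(n):
            for i in range(n):
                for j in range(n):
                    T[i][j] = T[i][j] or (T[i][k] and T[k][j])
        return T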
