Co SC 3131 Chap 3

Chapter Three

Greedy Model

1
Greedy method
 Is one of the strategies we can use to solve problems.
 Like divide and conquer, it is a design approach we can adopt to
solve many different problems.
 Is used for solving optimization problems.
 Optimization problems are problems that require either a
minimum result or a maximum result.
2
Example
 If you want to go from place A to place B (A->B). You can have different solutions.
1. You can go by walking
2. You can use bicycle
3. You can use car
4. You can use plane
5. You can use train
6. etc…
 But there are constraints/conditions that you have to obey. Suppose you have to get to place B within 6
hrs.
 Suppose walking, cycling and driving each take more than 6 hrs, but a plane or a train gets you there
within 6 hrs.
 So, using a plane or a train is a feasible solution for this problem.
 What if you also want to get to place B with minimum cost? Suppose using the train costs less than
using the plane. Then using the train is the optimal solution. (minimization problem)

3
Greedy method Cont…
For a given problem, there is only one optimal value (though several feasible solutions may achieve it).
In the greedy method a problem is solved in stages.
In each stage we consider one input from the given problem.
If that input is feasible, we include it in the solution.

4
Greedy algorithm
 A greedy algorithm always makes the choice that looks best at the moment.
 Greedy algorithms do not always yield optimal solutions, but for many problems
they do.
 The greedy method is quite powerful and works well for a wide range of problems.
 Greedy algorithms build up a solution piece by piece, always choosing the next piece
that offers the most obvious and immediate benefit.
 Although such an approach can be disastrous for some computational tasks, there are
many for which it is optimal.

5
Greedy method Cont…
 Example
Suppose you want to buy a car. Your constraint is to pick the car with the
best features.
Suppose you want to hire some people: at each stage you pick the best remaining candidate.

6
Greedy algorithm
 Disadvantages:
They do not always work.
Short-term choices may be disastrous in the long term.
Correctness is hard to prove.
 Advantages:
When they work, they work fast.
They are simple and easy to implement.
7
Job sequencing with deadlines
 The arrangement of jobs on a single processor under a deadline constraint is
called job sequencing with deadlines.
 Here:
 You are given a set of jobs.
 Each job has a defined deadline and some profit associated with it.
 The profit of a job is earned only if that job is completed within its deadline.
 Only one processor/machine is available for processing all the jobs.
 The processor takes one unit of time to complete each job.

8
Job sequencing with deadlines Cont.…
 We earn the profit if and only if the job is completed by its deadline
 “Profit” can be the priority of the task in a real time system that discards tasks that cannot
be completed by their deadline
 We want to find the subset of jobs that maximizes our profit
 This is a restricted version of a general job scheduling problem, which is an integer
programming problem.
 Example uses: telecom engineering and construction scheduling.
 Many small jobs, with “profit” proportional to customers served.
 This is then combined with an integer programming solution for big jobs.
 Greedy is also used in “how many machines/people” problems.
9
Job sequencing with deadlines Cont.…

 Problem: Consider the following set of jobs, each with the profit obtained by executing
it and the deadline (time in ms) by which it must be completed.
 It is assumed that only one machine is available for processing jobs, and completing a job
requires processing it on the machine for one unit of time.
 The deadline should not be confused with the processing time.
 In this problem, the processing time for each job is 1 ms, and the deadline limits
the processor's turnaround time for that job.
 If the turnaround time exceeds the prescribed deadline, the job cannot be processed
anymore.

10
Job sequencing with deadlines Cont.…
 The problem states:
 “How can the total profit be maximized if only one job can be completed at a time?”

 Approach to Solution:
 A feasible solution would be a subset of jobs where each job of the subset gets
completed within its deadline.
 Value of the feasible solution would be the sum of profit of all the jobs contained in
the subset.
 An optimal solution of the problem would be a feasible solution which gives the
maximum profit.

11
Job sequencing with deadlines Cont.…
 Greedy Algorithm:
 Greedy Algorithm is adopted to determine how the next job is selected for an optimal solution.
 The greedy algorithm described below always gives an optimal solution to the job sequencing problem:
 Step-01:

 Sort all the given jobs in decreasing order of their profit.

 Step-02:

 Check the value of maximum deadline.

 Draw a Gantt chart where maximum time on Gantt chart is the value of maximum deadline.

12
Job sequencing with deadlines Cont.…

 Step-03:

Pick up the jobs one by one.


Put the job on Gantt chart as far as possible from 0 ensuring that the job gets
completed before its deadline.

 Example: we are given the jobs, their deadlines and associated
profits as shown below:
Jobs J1 J2 J3 J4 J5 J6

Deadlines 5 3 3 2 4 2

Profits 200 180 190 300 120 100

13
Example Cont.…

 Answer the following questions-


1. Write the optimal schedule that gives maximum profit.
2. Are all the jobs completed in the optimal schedule?
3. What is the maximum earned profit?
 Solution:

 Step-01:

Sort all the given jobs in decreasing order of their profit-


14
Example Cont.…
Jobs J4 J1 J3 J2 J5 J6

Deadlines 2 5 3 3 4 2

Profits 300 200 190 180 120 100

Step-02:
Value of maximum deadline = 5.
So, draw a Gantt chart with maximum time on Gantt chart = 5 units as shown:

15
Example Cont.…

 Now,
We take each job one by one in the order they appear in Step-01.
We place the job on Gantt chart as far as possible from 0.
 Step-03:
We take job J4.
Since its deadline is 2, we place it in the latest empty cell on or
before deadline 2, i.e. cell 2:

 Step-04:
 We take job J1.
Since its deadline is 5, we place it in the latest empty cell on or
before deadline 5, i.e. cell 5:
16
Example Cont.…

Step-05:
We take job J3.
Since its deadline is 3, we place it in the latest empty cell on or
before deadline 3, i.e. cell 3:

 Step-06:
We take job J2.
Since its deadline is 3, we try to place it in the latest empty cell on or
before deadline 3.
Since the third and second cells are already filled, we place job
J2 in the first cell:
17
Example Cont.…
 Step-07:
 Now, we take job J5.
 Since its deadline is 4, we place it in the latest empty cell
on or before deadline 4, i.e. cell 4:

18
Example Cont.…

 Now,
 The only job left is job J6 whose deadline is 2.

 All the slots before deadline 2 are already occupied.

 Thus, job J6 cannot be completed.

 Now, the given questions may be answered as:


 Part-01: Answer for Question 1
 The optimal schedule is:

 J2 , J4 , J3 , J5 , J1

 This is the required order in which the jobs must be completed in order to obtain the
maximum profit.
19
Example Cont.…
 Part-02: Answer for Question 2
Not all the jobs are completed in the optimal schedule.
This is because job J6 could not be completed within its deadline.
 Part-03: Answer for Question 3
 Maximum earned profit = Sum of profit of all the jobs in optimal
schedule
= Profit of job J2 + Profit of job J4 + Profit of job J3 + Profit of job
J5 + Profit of job J1
= 180 + 300 + 190 + 120 + 200
=990

20
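The Gantt-chart procedure above can be sketched in Python (a minimal sketch; the latest-free-slot placement follows the steps shown, and the job data are those of the worked example):

```python
def job_sequencing(jobs):
    """jobs: list of (name, deadline, profit) with unit processing times.
    Returns (schedule ordered by time slot, total profit)."""
    # Step-01: sort all jobs in decreasing order of profit.
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)
    max_deadline = max(d for _, d, _ in jobs)
    slots = [None] * (max_deadline + 1)   # slots[1..max_deadline]; index 0 unused
    for name, deadline, profit in jobs:
        # Step-03: latest empty slot on or before the job's deadline.
        for t in range(min(deadline, max_deadline), 0, -1):
            if slots[t] is None:
                slots[t] = (name, profit)
                break                     # jobs that find no free slot are dropped
    schedule = [s[0] for s in slots[1:] if s]
    total = sum(s[1] for s in slots[1:] if s)
    return schedule, total

jobs = [("J1", 5, 200), ("J2", 3, 180), ("J3", 3, 190),
        ("J4", 2, 300), ("J5", 4, 120), ("J6", 2, 100)]
schedule, profit = job_sequencing(jobs)
print(schedule, profit)   # ['J2', 'J4', 'J3', 'J5', 'J1'] 990
```

Job J6 finds no free slot before its deadline and is dropped, matching the answer above.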
Job sequencing with deadlines Example

 Consider the following example:


Jobs          J0  J1  J2  J3  J4  J5
Profit        10  34  67  45  23  97
Deadline (ms)  2   3   1   4   5   3

 In this problem, an optimal solution is a feasible solution
which gives the maximum total profit from the executed jobs.
21
Solution
 Step 1) Re-arrange the jobs in non-increasing order of profit:

 Jobs          J5  J2  J3  J1  J4  J0
 Profit        97  67  45  34  23  10
 Deadline (ms)  3   1   4   3   5   2

 Step 2) Let J be the set of jobs that can be completed by their deadlines; we build it into an optimal solution.
 Starting from the highest profit earning job, J is initialized as J = {J5}.
 Step 3) Now, at this step, we must formulate an optimization measure to determine how the next job is
chosen.
 The next highest profit earning job is chosen, i.e. J2, with deadline 1 ms.
 At this point we cannot process the job, as 1 ms has already been spent processing job J5.

22
Solution
If the previously executed job has a greater deadline than the
one currently being analyzed, the issue can be resolved.
Since, in our example, the preceding job in the solution set has a
greater deadline, re-arranging the order of processing of the jobs
in the solution set lets us execute both jobs J5 and J2.
So the new optimized solution set obtained is J = {J2, J5}.
23
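As a check on the re-ordering argument, here is a minimal Python sketch of the same greedy slot-filling applied to this instance (job data from the table above):

```python
def schedule_jobs(jobs):
    # jobs: (name, deadline, profit); greedy by decreasing profit, each job
    # goes into the latest free unit-time slot on or before its deadline.
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)
    horizon = max(d for _, d, _ in jobs)
    slot = {}
    for name, d, p in jobs:
        for t in range(min(d, horizon), 0, -1):
            if t not in slot:
                slot[t] = (name, p)
                break
    order = [slot[t][0] for t in sorted(slot)]
    return order, sum(p for _, p in slot.values())

jobs = [("J0", 2, 10), ("J1", 3, 34), ("J2", 1, 67),
        ("J3", 4, 45), ("J4", 5, 23), ("J5", 3, 97)]
order, total = schedule_jobs(jobs)
print(order, total)   # ['J2', 'J1', 'J5', 'J3', 'J4'] 266
```

Both J5 and J2 appear in the schedule, with J2 run first, which is exactly the re-arrangement described above; only J0 is dropped.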
Optimal merge pattern

 Optimal merge pattern is a pattern that relates to merging two or more sorted
files into a single sorted file.
 This type of merging can be done by the two-way merging method.
 If we have two sorted files containing n and m records respectively, they can
be merged into one sorted file in O(n + m) time.
 There are many ways in which pairwise merging can be done to get a single sorted file.
 Different pairings require different amounts of computing time.
 The main goal is to pairwise merge the n sorted files so that the number of
comparisons is minimized.

24
Optimal merge pattern Cont.…
The formula for the external merging cost is:

cost = Σ (i = 1 to n) f(i) · d(i)

where f(i) represents the number of records in the i-th file
and d(i) represents the depth of that file's node in the merge tree.

25
Algorithm for optimal merge pattern
Algorithm Tree(n)
// list is a global list of n single-node binary trees
{
    for i := 1 to n - 1 do
    {
        // get a new tree node
        pt := new treenode;
        // merge the two trees with smallest weight
        (pt -> lchild) := Least(list);
        (pt -> rchild) := Least(list);
        (pt -> weight) := ((pt -> lchild) -> weight) + ((pt -> rchild) -> weight);
        Insert(list, pt);
    }
    // the single tree left in list is the merge tree
    return Least(list);
}
26
Algorithm for optimal merge pattern
 An optimal merge pattern corresponds to a binary merge tree with minimum weighted
external path length.
Function Tree uses the greedy rule to get a two-way merge tree for n files.
 The algorithm's input is a list of n trees.
 There are three fields, lchild, rchild, and weight, in each node of the tree.
 Initially, each tree in the list contains just one node.
 This external node has its lchild and rchild fields set to zero, whereas its weight is the length of one of
the n files to be merged.
 For any tree in the list with root node t, t -> weight gives the length of the merged file.

27
Algorithm for optimal merge pattern
 There are two functions, Least(list) and Insert(list, t), in function Tree.

 Least(list) finds a tree in the list whose root has the least weight and returns a pointer to this tree.

 This tree is deleted from the list.

 Function Insert(list, t) inserts the tree with root t into the list.

 The main for loop in this algorithm is executed n - 1 times.

 If the list is kept in increasing order according to the weight value in the roots, then Least(list) needs only O(1) time and
Insert(list, t) can be performed in O(n) time.

 Hence, the total time taken is O(n^2).

 If the list is represented as a min-heap, in which the root value is less than or equal to the values of its children, then
Least(list) and Insert(list, t) can be done in O(log n) time.

 In this case, the computing time for Tree is O(n log n).

28
Example
 Given a set of unsorted file lengths: 5, 3, 2, 7, 9, 13.
Now, arrange these elements in ascending order: 2, 3, 5, 7, 9, 13.
After this, repeatedly pick the two smallest numbers and merge them,
until we are left with only one number.
 Now follow following steps:
Step 1: Insert 2, 3

Step 2:

29
Example Cont…
Step 3: Insert 5

Step 4: Insert 13

30
Example Cont…
Step 5: Insert 7 and 9
Step 6: Merge both:

So, the merging cost = 5 + 10 + 16 + 23 + 39 = 93


31
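The pick-the-two-smallest loop above is naturally implemented with a min-heap; a minimal Python sketch for the file lengths in this example:

```python
import heapq

def optimal_merge_cost(lengths):
    """Total record moves when we repeatedly merge the two smallest files."""
    heap = list(lengths)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)       # smallest remaining file
        b = heapq.heappop(heap)       # second smallest
        cost += a + b                 # merging them moves a + b records
        heapq.heappush(heap, a + b)   # the merged file goes back into the pool
    return cost

print(optimal_merge_cost([5, 3, 2, 7, 9, 13]))   # 93
```

With the min-heap, each of the n - 1 merges costs O(log n), giving O(n log n) overall.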
Optimal merge pattern Cont.…
 When more than two sorted files are to be merged together the merge
can be accomplished by repeatedly merging sorted files in pairs.
 If files X1, X2, X3 and X4 are to be merged, we could first merge X1
and X2 to get a file Y1.
 Then we could merge Y1 and X3 to get Y2.
 Finally, Y2 and X4 could be merged to obtain the desired sorted file.

32
Optimal merge pattern Cont..
 Alternatively, we could first merge X1 and X2 getting Y1, then merge X3
and X4 getting Y2, and finally merge Y1 and Y2 to get the desired sorted file.
 Given n sorted files there are many ways in which to pairwise merge
them into a single sorted file.
 Different pairings require differing amounts of computing time.
 The problem we shall address ourselves to now is that of determining an
optimal (i.e. one requiring the fewest comparisons) way to pairwise
merge n sorted files together.
33
Example
 X1, X2 and X3 are three sorted files of 30, 20 and 10 records respectively.
 Merging X1 and X2 requires 50 record moves.
 Merging the result with X3 requires another 60 moves.
 The total number of record moves required to merge the three files this
way is 110.
 Y1 = X1 + X2 = 30 + 20 = 50 moves
 Y2 = Y1 + X3 = 50 + 10 = 60 moves
 Total number of records moved = 50 + 60 = 110

34
Example Cont.…
 If instead we first merge X2 and X3 (taking 30 moves) and
then merge the result with X1 (taking 60 moves), the total number
of record moves is only 90.
 Hence, the second merge pattern is faster than the first.
Z1 = X2 + X3 = 20 + 10 = 30 moves
Z2 = Z1 + X1 = 30 + 30 = 60 moves
Total number of records moved = 30 + 60 = 90 moves
35
Minimum spanning trees
 A graph can be represented by: G=(V,E)
 A spanning tree of the above graph can be represented by: G’=(V’,E’)
 What is the relation between the graph and its spanning tree?
 V’ = V (every vertex in the graph must be in the spanning tree)
 E’ ⊆ E, with |E’| = |V| - 1
 A spanning tree:
 removes cycle edges,
 but must not disconnect the graph.
 So the solution must be connected and acyclic.

36
What is a minimum spanning tree
 The particular spanning tree with minimum total weight is known as the
minimum spanning tree.
 If the graph G = (V, E) has edge weights, a spanning tree with
minimum total edge weight is called a minimum spanning tree.

Graph spanning trees


37
Minimum spanning tree
 Minimum spanning trees:

Graph minimum spanning trees

38
Minimum spanning trees

 Let G = ( V, E) be an undirected connected graph.


 A subgraph T = ( V, E') of G is a spanning tree of G iff T is a
tree.
 The following figure shows the complete graph on 4 nodes
together with three of its spanning trees.

39
Properties of minimum spanning tree

 Removing one edge from a spanning tree will make it disconnected.
 Adding one edge to a spanning tree will create a cycle.
 If each edge has a distinct weight, then there is exactly one, unique
MST.
 A complete undirected graph on n vertices has n^(n-2) spanning trees (Cayley's formula).
 Every connected, undirected graph has at least one spanning tree.
 A disconnected graph has no spanning tree.
 To construct a spanning tree from a connected graph we remove at most e - n + 1 edges:
Example: e = 3, n = |V| = 3, so we remove at most 3 - 3 + 1 = 1 edge.

 So we can remove a maximum of 1 edge from the above graph.


40
Minimum spanning tree using Prim’s and Kruskal’s algorithms

Prim’s algorithm is a “greedy” algorithm.
Greedy algorithms find solutions based on a sequence of choices
which are “locally” optimal at each step.
Nevertheless, Prim’s greedy strategy produces a globally optimal
solution!
The edges in set A always form a single tree.
It starts from an arbitrary “root”: VA = {a}
At each step:
 Find a light edge crossing the cut (VA, V - VA)
 Add this edge to A
 Repeat until the tree spans all vertices
41
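The light-edge steps above can be sketched as follows (a minimal sketch; the adjacency-map graph `g` is a hypothetical example, not a graph from the slides):

```python
import heapq

def prim_mst(graph, root):
    """graph: {u: {v: weight}}, undirected. Returns (total_weight, tree_edges)."""
    in_tree = {root}                            # VA, the vertices already spanned
    edges = [(w, root, v) for v, w in graph[root].items()]
    heapq.heapify(edges)
    total, tree = 0, []
    while edges and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(edges)          # light edge crossing (VA, V - VA)
        if v in in_tree:
            continue                            # both endpoints already in the tree
        in_tree.add(v)
        total += w
        tree.append((u, v, w))
        for x, wx in graph[v].items():          # new edges now crossing the cut
            if x not in in_tree:
                heapq.heappush(edges, (wx, v, x))
    return total, tree

# hypothetical example graph
g = {'a': {'b': 4, 'c': 2}, 'b': {'a': 4, 'c': 1, 'd': 5},
     'c': {'a': 2, 'b': 1, 'd': 8}, 'd': {'b': 5, 'c': 8}}
total, tree = prim_mst(g, 'a')
print(total)   # 8
```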
Minimum spanning tree using Prim’s and Kruskal’s algorithms
Kruskal’s algorithm: how is it different from Prim’s
algorithm?
Prim’s algorithm grows one tree the whole time.
Kruskal’s algorithm:
Grows multiple trees (i.e., a forest) at the same time.
Trees are merged together using safe edges.
Since an MST has exactly |V| - 1 edges, after |V| - 1 merges we
would have only one component.

42
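Kruskal’s forest growth and safe-edge merging can be sketched with a union-find structure (a minimal sketch; the edge list is a hypothetical example):

```python
def kruskal_mst(n, edges):
    """n vertices 0..n-1; edges: list of (weight, u, v). Returns (total, tree)."""
    parent = list(range(n))

    def find(x):                      # root of x's tree, with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, tree = 0, []
    for w, u, v in sorted(edges):     # consider edges in increasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                  # safe edge: merges two trees of the forest
            parent[ru] = rv
            total += w
            tree.append((u, v, w))
        if len(tree) == n - 1:        # |V| - 1 merges -> single component
            break
    return total, tree

edges = [(4, 0, 1), (2, 0, 2), (1, 1, 2), (5, 1, 3), (8, 2, 3)]
total, tree = kruskal_mst(4, edges)
print(total)   # 8
```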
Single source shortest paths
 Graphs may be used to represent the highway structure of a
state or country with vertices representing cities and edges
representing sections of highway.
 The edges may then be assigned weights which might be
either the distance between the two cities connected by the
edge or the average time to drive along that section of
highway.
43
Single source shortest paths Cont.…

A motorist wishing to drive from city A to city B would be
interested in answers to the following questions:
i. Is there a path from A to B?
ii. If there is more than one path from A to B, which is the
shortest path?

 The problems defined by (i) and (ii) above are special cases
of the path problem we shall study in this section.
44
Single source shortest paths Cont.…
 The length of a path is now defined to be the sum of the weights of the edges on that
path.
 The starting vertex of the path is referred to as the source and the last vertex as the
destination.
 The graphs will be digraphs, to allow for one-way streets.
 In the problem we shall consider, we are given a directed graph G = (V, E), a
weighting function c(e) for the edges of G, and a source vertex v0.
 The problem is to determine the shortest paths from v0 to all the remaining vertices of G.
 It is assumed that all the weights are positive.

45
Single source shortest paths Example:

If v0 is the source vertex, then the shortest path from v0 to v1 is v0 v2 v3 v1.

The length of this path is 10 + 15 + 20 = 45.
Even though there are three edges on this path, it is shorter than the path v0 v1,
which is of length 50. There is no path from v0 to v5.

46
Dijkstra’s algorithm: single source shortest paths
 Given a directed graph, we have to find the shortest path from some
starting vertex to all the other vertices.
 Since we have to find shortest paths, it is a minimization problem.

 Let’s say we select vertex 1 as the starting point; then we have to find the
shortest paths to all the vertices.
 A path can be direct or can pass through other vertices.
47
Dijkstra algorithm Cont.…
For direct edges, we always select the shortest one.
For paths that go through other vertices, relaxation is
applied to update the distances.
E.g.

Whenever we find a shorter path, we relax (update) the affected vertices.

48
Dijkstra example
 Step1: Consider vertex 1 as starting
/source vertex
 Step2: Give the distances for all the
vertices.
From 1 to 1: distance = 0
From 1 to 2: distance = 2
From 1 to 3: distance = 4
From 1 to 4: distance = ∞ (there is no direct edge from 1 to 4)
From 1 to 5: distance = ∞ (there is no direct edge from 1 to 5)
From 1 to 6: distance = ∞ (there is no direct edge from 1 to 6)
49
Dijkstra example
Step 3: is repeated. We select the shortest
distance among all the tentative paths.
The path from 1 to 2 is the shortest, so we select it. Once we
select the path from 1 to 2, we perform relaxation.
Vertices 3 and 4 are connected to vertex 2. The distance
from vertex 2 to 3 is 1 and the distance from vertex 1 to 2 is 2.
Adding these distances together (2 + 1 = 3) gives a result less than
the current distance from vertex 1 to vertex 3, which is 4. In this case
we perform relaxation: we update the
distance from vertex 1 to vertex 3 to 3, passing through
vertex 2.
50
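The selection-and-relaxation loop can be sketched in Python. Edges 1–2 (weight 2), 1–3 (weight 4) and 2–3 (weight 1) come from the walkthrough above; the remaining edges and weights are assumptions, since the slide's figure is not reproduced here:

```python
import heapq

def dijkstra(graph, src):
    """graph: {u: {v: weight}}; returns dict of shortest distances from src."""
    dist = {v: float('inf') for v in graph}
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                      # stale queue entry; skip it
        for v, w in graph[u].items():     # relaxation step
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist

# edges 1-2, 1-3, 2-3 are from the example; the rest are assumed
g = {1: {2: 2, 3: 4}, 2: {3: 1, 4: 7}, 3: {5: 3},
     4: {6: 1}, 5: {4: 2, 6: 5}, 6: {}}
dist = dijkstra(g, 1)
print(dist[3])   # 3 (relaxed via vertex 2, as in the walkthrough)
```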
Dijkstra algorithm complexity
The time complexity is O(n^2), since we have to find the path for
all the vertices and relax all the vertices, as in the case of a
complete graph.

51
Drawback
Consider the following example.
Dijkstra’s algorithm does not handle negative edge weights.
What if the distance between vertex 3 and vertex
2 were -10?

52
Example 2
Starting vertex: v0
Selected vertex   v1   v2   v3   v4   v5
v2                50   10    ∞   45    ∞
v3                50   10   25   45    ∞
v1                45   10   25   45    ∞
v4                45   10   25   45    ∞
(v5 unreachable)  45   10   25   45    ∞

53
