Greedy Algorithm 3

The document provides an introduction to greedy algorithms, focusing on Kruskal's algorithm for minimum spanning trees and Huffman encoding for data compression. It discusses the coin changing problem, explaining how to use a greedy approach to minimize the number of coins needed for a given amount, along with examples and properties of optimal solutions. Additionally, it touches on activity scheduling as another application of greedy algorithms.


Introduction to Algorithms

Class 11: Greedy Algorithm 3

Jianxi Gao

Department of Computer Science


Rensselaer Polytechnic Institute
www.gaojianxi.com
Kruskal's algorithm

Procedure Kruskal(G(V, E), w)
    for all vertices v ∈ V:
        makeset(v)
    X = {}
    Sort the edges of E by weight
    for all edges {u, v} ∈ E, in increasing order of weight:
        if find(u) ≠ find(v):
            add edge {u, v} to X
            union(u, v)

Procedure makeset(x)
    π(x) = x                     (parent pointer)
    rank(x) = 0

Procedure find(x)                (find the root of x)
    while x ≠ π(x): x = π(x)
    return x

Procedure union(x, y)
    rx = find(x)
    ry = find(y)
    if rx = ry: return
    if rank(rx) > rank(ry):
        π(ry) = rx
    else:
        π(rx) = ry
        if rank(rx) = rank(ry):
            rank(ry) = rank(ry) + 1
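The pseudocode above can be rendered as a short Python sketch. The graph representation (a list of `(weight, u, v)` edge tuples) and the function names are illustrative choices, not the lecture's notation:

```python
# Kruskal's algorithm with union-find (union by rank), mirroring the
# pseudocode above. Edges are (weight, u, v) tuples.
def kruskal(vertices, edges):
    parent = {v: v for v in vertices}   # makeset: each vertex is its own root
    rank = {v: 0 for v in vertices}

    def find(x):                        # follow parent pointers to the root
        while x != parent[x]:
            x = parent[x]
        return x

    def union(x, y):                    # union by rank
        rx, ry = find(x), find(y)
        if rx == ry:
            return
        if rank[rx] > rank[ry]:
            parent[ry] = rx
        else:
            parent[rx] = ry
            if rank[rx] == rank[ry]:
                rank[ry] += 1

    X = []
    for w, u, v in sorted(edges):       # edges in increasing order of weight
        if find(u) != find(v):
            X.append((u, v, w))
            union(u, v)
    return X

edges = [(1, 'A', 'B'), (4, 'A', 'C'), (3, 'B', 'C'), (2, 'C', 'D')]
mst = kruskal('ABCD', edges)
print(sum(w for _, _, w in mst))  # 6: edges A-B (1), C-D (2), B-C (3)
```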
Huffman encoding -- example

Letter frequencies (leaves in increasing order: D=2, C=3, A=4, E=4, B=7):

letter | frequency | code
A      | 4         | 00
B      | 7         | 11
C      | 3         | 101
D      | 2         | 100
E      | 4         | 01

(Figure: the Huffman tree of total weight 20. The root splits into
subtrees of weight 8 and 12; the 8 splits into A (4) and E (4); the 12
splits into a 5 and B (7); the 5 splits into D (2) and C (3). Labeling
edges 0/1 gives the codes in the table.)
Table size = 5 × 8 + (2+2+3+3+2) = 52 bits (five 8-bit letters plus their code lengths)

Message size = 4×2 + 7×2 + 3×3 + 2×3 + 4×2 = 45 bits

Total size = Table size + Message size = 97 bits


Greedy Algorithms

A greedy algorithm is an algorithm that constructs an object X one step
at a time, at each step choosing the locally best option.

In some cases, greedy algorithms construct the globally best object by
repeatedly choosing the locally best option.

Dijkstra's algorithm and the two minimum spanning tree algorithms are
greedy algorithms.
Coin changing

Suppose you go to the market to buy a box of cookies. The price of the
cookies is $9.75. How much change will you get if you pay with a $10 bill?

Q1: Why do you always receive 1 quarter instead of 25 pennies?

A1: The quarter is the change with the fewest number of coins.
Coin changing

Goal: Given currency denominations 1¢, 5¢, 10¢, 25¢, and $1, devise a
method to pay an amount to the customer using the fewest number of coins.

Example: 34¢.

Cashier's algorithm: At each iteration, add the coin of the largest
value that does not take us past the amount to be paid.

Example: $2.89.
General problem statement

Given coin denominations C = {ci} (where i ∈ [0, m − 1]), make change
for a given amount A with the minimum number of coins.

Input: C = {ci} and A
Output: the minimum number of coins.

Example: We have an infinite supply of C = {1¢, 5¢, 10¢, 25¢, $1}
valued coins. We want to buy a product of value A = $2.57. What is the
minimum number of coins needed to pay?

What is your algorithm?

I guess you would give me the right solution:
2 × $1 + 2 × 25¢ + 1 × 5¢ + 2 × 1¢ (7 coins)

What is the answer if you buy another product with value $3.33?
Greedy algorithm.

Answer: 3 × $1 + 1 × 25¢ + 1 × 5¢ + 3 × 1¢ (8 coins)


Greedy algorithm

1) Initialize the result as empty: S = {}.
2) Find the largest denomination Ck that is smaller than or equal to A.
3) Add the found denomination to S. Subtract its value from A.
4) If A becomes 0, return the length of S. Else repeat steps 2 and 3
for the new value of A.

Example: We have an infinite supply of C = {$1, $5, $10, $20, $100}
valued coins. We want to buy a product of value A = $257. What is the
minimum number of coins/notes needed to pay?
Coin changing: Greedy Algorithm

Sort coin denominations by value: c0 < c1 < … < cm-1.

S ← ∅                                (coins selected)
while (A > 0) {
    let k be the largest integer such that ck ≤ A
    if (no such k)
        return "no solution found"
    A ← A − ck
    S ← S ∪ {k}
}
return S
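The pseudocode above translates to a short Python sketch. The function name `cashier` and the list-of-coins return value are illustrative choices:

```python
# Cashier's (greedy) algorithm: repeatedly take the largest coin that fits.
# Returns the list of coins used, or None if the greedy choice gets stuck.
def cashier(A, denominations):
    S = []
    for c in sorted(denominations, reverse=True):  # largest coin first
        while A >= c:          # take coin c as long as it fits in A
            A -= c
            S.append(c)
    return S if A == 0 else None

print(cashier(257, [1, 5, 10, 25, 100]))  # [100, 100, 25, 25, 5, 1, 1]: 7 coins
```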

Q. Is cashier's algorithm optimal?

A: Yes, greedy algorithms are always optimal.
B: Yes, for any denominations c1, c2, …, cn with c1 = 1.
C: Yes, because of special properties of US coins.
D: No.
Properties of optimal solutions

Property. Number of pennies ≤ 4.
Pf. Replace 5 pennies with 1 nickel.

Property. Number of nickels ≤ 1.
Pf. Replace 2 nickels with 1 dime.

Property. Number of quarters ≤ 3.
Pf. Replace 4 quarters with 1 dollar.

What is the property of dimes?

Property. Number of nickels + number of dimes ≤ 2.
Pf. Replace 3 dimes and 0 nickels with 1 quarter and 1 nickel;
replace 2 dimes and 1 nickel with 1 quarter.
Recall: at most 1 nickel.
Coin Changing: Analysis of Greedy Algorithm

Theorem. Greedy is optimal for U.S. coinage: 1, 5, 10, 25, 100.
Pf. (by induction on A)
– Consider an optimal way to change ck ≤ A < ck+1: greedy takes coin k.
– We claim that any optimal solution must also take coin k.
  • If not, it needs enough coins of type c1, …, ck-1 to add up to A.
  • The table below indicates that no optimal solution can do this.
– The problem reduces to coin-changing A − ck cents, which, by induction,
  is optimally solved by the greedy algorithm.

k | ck  | All optimal solutions must satisfy | Max value of coins 1, …, k−1 in any OPT
1 | 1   | P ≤ 4                              | –
2 | 5   | N ≤ 1                              | 4
3 | 10  | N + D ≤ 2                          | 4 + 5 = 9
4 | 25  | Q ≤ 3                              | 20 + 4 = 24
5 | 100 | no limit                           | 75 + 24 = 99
Coin Changing: Analysis of Greedy Algorithm

Q1: What is the smallest number of quarters, dimes, nickels, and pennies
one can carry while still being able to give exact change for any amount
from 1¢ to 99¢?
3 × 25¢ + 2 × 10¢ + 1 × 5¢ + 4 × 1¢ (10 coins)

Q2: What amount of change requires the largest number of coins?
99¢ = 3 × 25¢ + 2 × 10¢ + 1 × 5¢ + 4 × 1¢
Q3: Can you answer the question for a different system of coins? For example, I am
currently spending the summer in Cambridge, England, where coins are worth 1, 2, 5,
10, 20, and 50 pence. What if you also include the 1- and 2-pound (100 and 200
pence) coins, and want to be able to make change for every amount up to 5 pounds
(the smallest note)?
499p = £2 + £2 + 50p + 20p + 20p + 5p + 2p + 2p = 8 coins.
Q4. If you got to design your own system of coins with whatever denominations
you wanted, how would you design it so that the minimum number of coins
needed to make all amounts between 1 and 99 cents is as small as possible?
98 = 27+27+27+9+3+3+1+1 (8 coins total).
10 mins
Cashier's algorithm for other denominations

Q: Is cashier's algorithm optimal for any set of denominations?

A: No.

Example 1: Consider U.S. postage: 1, 10, 21, 34, 70, 100, 350, 1225, 1500.
Suppose you need 140¢.
Cashier's algorithm: 140 = 100 + 34 + 1 + 1 + 1 + 1 + 1 + 1 (8 stamps)
The optimal: 140 = 70 + 70 (2 stamps)

Example 2: It may not even lead to a feasible solution. For example, with
denominations ck = 7, 8, 9:
Cashier's algorithm: 15 = 9 + ??? The optimal: 15 = 7 + 8.
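Both failure modes can be checked with a self-contained sketch; the helper `greedy_coins` is illustrative, not part of the lecture:

```python
# Greedy coin/stamp selection: largest denomination first.
# Returns the denominations taken, or None when greedy gets stuck.
def greedy_coins(A, denoms):
    S = []
    for c in sorted(denoms, reverse=True):
        while A >= c:
            A -= c
            S.append(c)
    return S if A == 0 else None

postage = [1, 10, 21, 34, 70, 100, 350, 1225, 1500]
print(greedy_coins(140, postage))   # [100, 34, 1, 1, 1, 1, 1, 1]: 8 stamps, not 2
print(greedy_coins(15, [7, 8, 9]))  # None: greedy takes 9 and gets stuck at 6
```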
Other algorithms

Example 2: It may not even lead to a feasible solution. For example, with
ck = 7, 8, 9:
Cashier's algorithm: 15 = 9 + ??? The optimal: 15 = 7 + 8.

Instead, try every coin at each step. (Figure: a search tree rooted at 15;
taking a 7, 8, or 9 leaves remainders 8, 7, or 6, and recursing on these
remainders uncovers the solution 15 = 7 + 8.)
We will learn all the related algorithms in the dynamic programming chapter.
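As a brief preview of that dynamic-programming fix, a minimal sketch (the function name and table layout are illustrative): instead of committing to the largest coin, try every coin at every amount.

```python
# Minimum coins via dynamic programming: best[a] = fewest coins summing to a.
def min_coins(A, denoms):
    INF = float('inf')
    best = [0] + [INF] * A
    for a in range(1, A + 1):
        for c in denoms:                # try every coin at amount a
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[A] if best[A] < INF else None  # None: A is unreachable

print(min_coins(15, [7, 8, 9]))  # 2  (15 = 7 + 8)
```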
Another Problem: Activity Scheduling

Problem statement

Given n activities with their start and finish times, select the
maximum number of activities that can be performed by a single person,
assuming that a person can only work on a single activity at a time.
Activity Scheduling

(Figure: a timeline labeled 3 through 12 and 1, with twelve overlapping
activities: Llama Hugging, Salsa Dancing, Night Snorkeling, Skydiving,
Bonfire, Gardening, Fancy Dinner, Navel Gazing, Jazz Concert, Tree
Climbing, Bar Crawling, and Evening Hike.)
Activity Scheduling: Not feasible

(Same chart: a selection containing overlapping activities is
highlighted; it is not feasible, since the person can only do one
activity at a time.)
Greedy algorithms

• Earliest Start: Choose activities in ascending order of start times.
• Shortest First: Choose activities in ascending order of length.
• Finish Fast: Choose activities in ascending order of end times.
• Fewest Conflicts: Choose activities starting with the ones with the
  fewest conflicts.
Earliest start

(Frames step through the chart, repeatedly selecting the available
activity with the earliest start time and discarding everything that
conflicts with it. The final frame shows three activities: Llama
Hugging, Bonfire, and Navel Gazing.)
Shortest first

(Frames step through the chart, repeatedly selecting the shortest
remaining activity and discarding its conflicts. The final frame shows
only two activities: Gardening and Fancy Dinner.)
Greedy algorithms

• Earliest Start: Choose activities in ascending order of start times.
• Shortest First: Choose activities in ascending order of length.
• Finish Fast: Choose activities in ascending order of end times.
• Fewest Conflicts: Choose activities starting with the ones with the
  fewest conflicts.

10 mins
Finish fast

(Same activity chart as above, for applying the finish-fast rule.)
Fewest conflicts

(Same activity chart as above, for applying the fewest-conflicts rule.)
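Among the four rules, finish fast (earliest finish time) is the classic optimal choice for activity selection. A minimal sketch; the intervals are made up for illustration:

```python
# Earliest-finish-time ("finish fast") activity selection.
# Activities are (start, finish) pairs.
def max_activities(intervals):
    chosen = []
    last_finish = float('-inf')
    for s, f in sorted(intervals, key=lambda it: it[1]):  # by finish time
        if s >= last_finish:       # compatible with everything chosen so far
            chosen.append((s, f))
            last_finish = f
    return chosen

acts = [(3, 6), (5, 8), (4, 7), (6, 9), (8, 11), (9, 12)]
print(max_activities(acts))  # [(3, 6), (6, 9), (9, 12)]: 3 activities
```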
Greedy algorithms

• A greedy algorithm is an algorithm that constructs an object X one
  step at a time, at each step choosing the locally best option.
• In some cases, greedy algorithms construct the globally best object
  by repeatedly choosing the locally best option.
Greedy advantages

Greedy algorithms have several advantages over other algorithmic
approaches:
• Simplicity: Greedy algorithms are often easier to describe and code
  up than other algorithms.
• Efficiency: Greedy algorithms can often be implemented more
  efficiently than other algorithms.

Greedy Challenges

Greedy algorithms have several drawbacks:
• Hard to design: Once you have found the right greedy approach,
  designing the algorithm can be easy; however, finding the right
  approach can be hard.
• Hard to verify: Showing a greedy algorithm is correct often requires
  a nuanced argument.
Interval partitioning

• Interval partitioning.
  – Lecture j starts at sj and finishes at fj.
  – Goal: find the minimum number of classrooms to schedule all lectures
    so that no two occur at the same time in the same room.

• Ex: This schedule uses 4 classrooms to schedule 10 lectures.

Q: Can you find a better one?

(Figure: a timeline from 9 to 4:30 with lectures a–j stacked in 4 rows:
row 4: e, j; row 3: c, d, g; row 2: b, h; row 1: a, f, i.)
Interval partitioning

• Ex: This schedule uses only 3.

(Figure: the same 10 lectures rearranged into 3 rows:
row 3: c, d, f, j; row 2: b, g, i; row 1: a, e, h.)
Interval partitioning

Greedy template. Consider lectures in some natural order. Assign each


lecture to an available classroom (which one?); allocate a new
classroom if none are available.

• [Earliest start time] Consider lectures in ascending order of sj.

• [Earliest finish time] Consider lectures in ascending order of fj.

• [Shortest interval] Consider lectures in ascending order of fj – sj.


• [Fewest conflicts] For each lecture j, count the number of
conflicting lectures cj. Schedule in ascending order of cj.
Interval partitioning: earliest start time

EARLIEST-START-TIME-FIRST (n, s1, s2, …, sn, f1, f2, …, fn)

SORT lectures by start time so that s1 ≤ s2 ≤ … ≤ sn.
d ← 0                       (number of allocated classrooms)
FOR j = 1 TO n
    IF lecture j is compatible with some classroom
        Schedule lecture j in any such classroom k.
    ELSE
        Allocate a new classroom d + 1.
        Schedule lecture j in classroom d + 1.
        d ← d + 1
RETURN schedule.
Interval partitioning: earliest start time

(Frames step through the 10-lecture example, assigning each lecture in
order of start time to any free classroom and opening a new classroom
when none is free. The final allocation uses 3 classrooms:
classroom 3: c, e, h; classroom 2: b, g, j; classroom 1: a, d, f, i.)
Interval partitioning: analysis of earliest start time

Observation. The earliest-start-time-first algorithm never schedules two
incompatible lectures in the same classroom.

Theorem. The earliest-start-time-first algorithm is optimal.

Pf.
・Let d = number of classrooms that the algorithm allocates.
・Classroom d is opened because we needed to schedule a lecture, say j,
  that is incompatible with a lecture in each of the other d − 1 classrooms.
・These d − 1 lectures each end after sj.
・Since we sorted by start time, all these incompatibilities are caused
  by lectures that start no later than sj.
・Thus, we have d lectures overlapping at time sj + ε.
・Key observation ⟹ all schedules use ≥ d classrooms.
Interval partitioning: analysis of earliest start time

Proposition. The earliest-start-time-first algorithm can be implemented in


O(n log n) time.

Pf. Store classrooms in a priority queue (key = finish time of its last lecture).
・To determine whether lecture j is compatible with some classroom,
compare sj to key of min classroom k in priority queue.
・ To add lecture j to classroom k, increase key of classroom k to fj.
・Total number of priority queue operations is O(n).
・Sorting by start time takes O(n log n) time.

Remark. This implementation chooses the classroom k whose finish time


of its last lecture is the earliest.
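The priority-queue implementation described above can be sketched with Python's `heapq`, which stands in for the priority queue; the lecture list used in the demo is made up:

```python
# Interval partitioning with a min-heap keyed on each classroom's
# last finish time, as in the proposition above.
import heapq

def min_classrooms(lectures):
    rooms = []                                  # heap of (finish_time, room_id)
    for s, f in sorted(lectures):               # by start time
        if rooms and rooms[0][0] <= s:          # min-key room is free at time s
            _, k = heapq.heappop(rooms)
            heapq.heappush(rooms, (f, k))       # reuse room k, bump its key to f
        else:
            heapq.heappush(rooms, (f, len(rooms)))  # allocate a new classroom
    return len(rooms)

lectures = [(9, 11), (9, 12), (11, 13), (13, 15), (12, 14)]
print(min_classrooms(lectures))  # 2
```

Each lecture causes O(1) heap operations, so with the initial sort the total running time is O(n log n), matching the proposition.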
Frog Jumping

(Figure: a river with positions 0 through 10; lilypads sit at some of
the positions. Max jump size: 3.)
Frog Jumping

The frog begins at position 0 in the river. Its goal is to get to
position n.

There are lilypads at various positions. There is always a lilypad at
position 0 and position n.

The frog can jump at most r units at a time.

Goal: Find the path the frog should take to minimize jumps, assuming a
solution exists.

Q: What is the condition that a solution exists?

A: The distance between any two adjacent lilypads is at most r.
As a graph

(Figure: the same positions 0 through 10 drawn as a graph, with an edge
from each lilypad to every lilypad at most 3 units ahead of it.)
A leap of faith

Algorithm: Always jump as far forward as possible.

(Figures: the rule applied on the 0–10 river, once with max jump size 2
and once with max jump size 4.)
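The "jump as far forward as possible" rule can be sketched as follows; the pad positions in the demo are made up for illustration:

```python
# Greedy frog jumping: from each pad, hop to the farthest pad within
# reach. Returns the number of jumps, or None if a gap exceeds r.
def frog_jumps(pads, r):
    pads = sorted(pads)
    pos, jumps = pads[0], 0
    goal = pads[-1]
    while pos < goal:
        reachable = [p for p in pads if pos < p <= pos + r]
        if not reachable:           # gap wider than r: no solution exists
            return None
        pos = max(reachable)        # greedy choice: farthest reachable pad
        jumps += 1
    return jumps

print(frog_jumps([0, 2, 3, 5, 6, 8, 10], 3))  # 4 jumps: 0 -> 3 -> 6 -> 8 -> 10
```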
