Lecture 3 - Recurrences

The document discusses the divide-and-conquer algorithm paradigm, detailing its three main steps: dividing the problem, conquering subproblems recursively, and combining solutions. It explains how to analyze the running time of algorithms using recurrences and presents various methods for solving these recurrences, including the iteration method, substitution method, recursion tree method, and master method. Examples such as binary search and Fibonacci numbers illustrate the application of these concepts.


Recurrences

CMPS 211 revision


Divide-and-Conquer
• Divide the problem into a number of subproblems
  – Similar sub-problems of smaller size
• Conquer the sub-problems
  – Solve the sub-problems recursively
  – If a sub-problem is small enough, solve it in a straightforward manner
• Combine the solutions to the sub-problems
  – Obtain the solution for the original problem

Analyzing Divide-and-Conquer Algorithms
• The recurrence is based on the three steps of the paradigm:
  – T(n) – running time on a problem of size n
  – Divide the problem of size n into a subproblems: takes D(n) time
  – Conquer (solve) the a subproblems, each of size n/b: takes aT(n/b) time
  – Combine the solutions back: takes C(n) time

      T(n) = Θ(1)                          if n ≤ c
      T(n) = aT(n/b) + D(n) + C(n)         otherwise

Recurrences
Def.: A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs, plus one or more base cases.

E.g.: Fibonacci numbers:
• Recurrence: F(n) = F(n-1) + F(n-2)
• Boundary conditions: F(1) = 0, F(2) = 1
• Compute: F(3) = 1, F(4) = 2, F(5) = 3, and so on

In many cases, the running time of an algorithm is expressed as a recurrence!

BINARY-SEARCH
• For a sorted array A, determines whether x occurs in A[lo…hi]

Alg.: BINARY-SEARCH(A, lo, hi, x)
    if (lo > hi)
        return FALSE
    mid ← ⌊(lo + hi)/2⌋
    if (x = A[mid])
        return TRUE
    if (x < A[mid])
        return BINARY-SEARCH(A, lo, mid-1, x)
    if (x > A[mid])
        return BINARY-SEARCH(A, mid+1, hi, x)

(Figure: a sorted array a1 … a8 indexed 1–8, with lo, mid, and hi marked.)

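Below is a minimal runnable Python sketch of the pseudocode above; it uses 0-based indexing (the slides use 1-based indexing), and the function name and driver lines are my own additions.

    def binary_search(A, lo, hi, x):
        """Return True if x occurs in the sorted slice A[lo..hi] (inclusive, 0-based)."""
        if lo > hi:
            return False
        mid = (lo + hi) // 2
        if x == A[mid]:
            return True
        if x < A[mid]:
            return binary_search(A, lo, mid - 1, x)
        return binary_search(A, mid + 1, hi, x)

    A = [1, 2, 3, 4, 5, 7, 9, 11]
    print(binary_search(A, 0, len(A) - 1, 7))   # True  (found, as in the first example below)
    print(binary_search(A, 0, len(A) - 1, 6))   # False (not found, as in the second example)
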
Example
• A[8] = {1, 2, 3, 4, 5, 7, 9, 11}
– lo = 1 hi = 8 x = 7
1 2 3 4 5 6 7 8

1 2 3 4 5 7 9 11 mid = 4, lo = 5, hi = 8

1 2 3 4 5 7 9 11 mid = 6, A[mid] = x
Found!

Example
• A[8] = {1, 2, 3, 4, 5, 7, 9, 11}
– lo = 1 hi = 8 x = 6
1 2 3 4 5 6 7 8

1 2 3 4 5 7 9 11 mid = 4, lo = 5, hi = 8

1 2 3 4 5 7 9 11 mid = 6, A[6] = 7, lo = 5, hi = 5

1 2 3 4 5 7 9 11 mid = 5, A[5] = 5, lo = 6, hi = 5
NOT FOUND!

Analysis of BINARY-SEARCH
Alg.: BINARY-SEARCH(A, lo, hi, x)
    if (lo > hi)                                     constant time: c1
        return FALSE
    mid ← ⌊(lo + hi)/2⌋                              constant time: c2
    if (x = A[mid])                                  constant time: c3
        return TRUE
    if (x < A[mid])
        return BINARY-SEARCH(A, lo, mid-1, x)        same problem of size n/2
    if (x > A[mid])
        return BINARY-SEARCH(A, mid+1, hi, x)        same problem of size n/2

• T(n) = c + T(n/2)
  – T(n) – running time for an array of size n

Recurrences and Running Time
• Recurrences arise when an algorithm contains
recursive calls to itself

• What is the actual running time of the algorithm?


• Need to solve the recurrence
– Find an explicit formula of the expression (the generic
term of the sequence)

Example Recurrences
• T(n) = T(n-1) + n                Θ(n²)
  – Recursive algorithm that loops through the input to eliminate one item
• T(n) = T(n/2) + c                Θ(lg n)
  – Recursive algorithm that halves the input in one step
• T(n) = T(n/2) + n                Θ(n)
  – Recursive algorithm that halves the input but must examine every item in the input
• T(n) = 2T(n/2) + 1               Θ(n)
  – Recursive algorithm that splits the input into 2 halves and does a constant amount of other work

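As a quick sanity check on these orders, the sketch below (my own addition, assuming a base case of 1 for each recurrence) evaluates each recurrence exactly and divides by the claimed bound; the ratios stay bounded as n grows.

    import math

    def t1(n):  # T(n) = T(n-1) + n        -> claimed Θ(n²)
        return 1 if n == 1 else t1(n - 1) + n

    def t2(n):  # T(n) = T(n/2) + 1        -> claimed Θ(lg n)
        return 1 if n == 1 else t2(n // 2) + 1

    def t3(n):  # T(n) = T(n/2) + n        -> claimed Θ(n)
        return 1 if n == 1 else t3(n // 2) + n

    def t4(n):  # T(n) = 2T(n/2) + 1       -> claimed Θ(n)
        return 1 if n == 1 else 2 * t4(n // 2) + 1

    n = 2 ** 9   # keeps the depth of t1 below Python's default recursion limit
    print(t1(n) / n ** 2, t2(n) / math.log2(n), t3(n) / n, t4(n) / n)
    # ratios stay near constants (≈0.5, ≈1.1, ≈2, ≈2), consistent with the stated orders
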
Methods for Solving Recurrences

• Iteration method

• Substitution method

• Recursion tree method

• Master method

The Iteration Method
T(n) = c + T(n/2)
     = c + c + T(n/4)                      since T(n/2) = c + T(n/4)
     = c + c + c + T(n/8)                  since T(n/4) = c + T(n/8)
     …
Assume n = 2^k:
T(n) = c + c + … + c + T(1)                (k times)
     = c·lg n + T(1)
     = Θ(lg n)

Iteration Method – Example (2)
T(n) = n + 2T(n/2)                         Assume n = 2^k
     = n + 2(n/2 + 2T(n/4))                since T(n/2) = n/2 + 2T(n/4)
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     …
     = i·n + 2^i·T(n/2^i)
     = k·n + 2^k·T(1)
     = n·lg n + n·T(1) = Θ(n lg n)

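A short Python sketch (my own, assuming T(1) = 1) confirming the closed form n·lg n + n·T(1) exactly for powers of two:

    def T(n):
        # T(n) = n + 2 T(n/2), with the assumed base case T(1) = 1
        return 1 if n == 1 else n + 2 * T(n // 2)

    for k in range(1, 11):
        n = 2 ** k
        assert T(n) == n * k + n * 1      # n·lg n + n·T(1)
    print("closed form verified for n = 2, 4, ..., 1024")
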
The recursion-tree method

Convert the recurrence into a tree:
  – Each node represents the cost incurred at the corresponding level of recursion
  – Sum up the costs of all levels

Used to “guess” a solution for the recurrence

Example 1
W(n) = 2W(n/2) + n²

• Subproblem size at level i is: n/2^i
• Subproblem size hits 1 when 1 = n/2^i ⇒ i = lg n
• Cost of the problem at level i = n²/2^i        No. of nodes at level i = 2^i
• Total cost:

  W(n) = Σ_{i=0}^{lg n – 1} n²/2^i + 2^(lg n)·W(1)
       ≤ n² · Σ_{i=0}^{∞} (1/2)^i + O(n)
       = n² · 1/(1 – 1/2) + O(n)
       = 2n² + O(n)

⇒ W(n) = O(n²)

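A small numeric sketch (my own, assuming W(1) = 1) that sums the per-level costs of this tree and checks them against the exact recurrence and the 2n² bound:

    def W(n):
        # W(n) = 2 W(n/2) + n², with the assumed base case W(1) = 1
        return 1 if n == 1 else 2 * W(n // 2) + n * n

    k = 10
    n = 2 ** k
    level_costs = sum(n * n / 2 ** i for i in range(k))   # internal levels: n²/2^i each
    leaves = n                                            # 2^(lg n) leaves, each costing W(1) = 1
    print(W(n), level_costs + leaves)                     # identical: the tree accounts for every term
    print(W(n) <= 2 * n * n)                              # True, consistent with W(n) = O(n²)
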
Example 2
T(n) = 3T(n/4) + cn²

• Subproblem size at level i is: n/4^i
• Cost of a node at level i = c(n/4^i)²
• Number of nodes at level i = 3^i
• Cost at level i is 3^i · c(n²/4^(2i)) = c(3/16)^i·n²
• Subproblem size hits 1 when 1 = n/4^i ⇒ i = log_4 n
• Last level has 3^(log_4 n) = n^(log_4 3) nodes

Example 2 (cont.)
T(n) = 3T(n/4) + cn²

• Total cost:

  T(n) = Σ_{i=0}^{log_4 n – 1} (3/16)^i·cn² + Θ(n^(log_4 3))
       ≤ cn² · Σ_{i=0}^{∞} (3/16)^i + Θ(n^(log_4 3))
       = cn² · 1/(1 – 3/16) + Θ(n^(log_4 3))
       = (16/13)·cn² + Θ(n^(log_4 3))

⇒ T(n) = O(n²)

Example 2 – Substitution
T(n) = 3T(n/4) + cn²
• Guess: T(n) = O(n²)
  – Induction goal: T(n) ≤ dn², for some d and n ≥ n0
  – Induction hypothesis: T(n/4) ≤ d(n/4)²

• Proof of induction goal:
  T(n) = 3T(n/4) + cn²
       ≤ 3d(n/4)² + cn²
       = (3/16)dn² + cn²
       = ((3/16)d + c)n²
       ≤ dn²               if (3/16)d + c ≤ d, i.e., d ≥ (16/13)c

• Therefore: T(n) = O(n²)

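A brief sketch (my own, taking c = 1 and an assumed base case T(1) = 1) checking that the constant d = (16/13)c from the proof indeed bounds the recurrence:

    def T(n):
        # T(n) = 3 T(n/4) + n² (c = 1), with the assumed base case T(1) = 1
        return 1 if n == 1 else 3 * T(n // 4) + n * n

    d = 16 / 13
    for k in range(1, 9):
        n = 4 ** k
        assert T(n) <= d * n * n
    print("T(n) <= (16/13) n² holds for n = 4, 16, ..., 4^8")
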
Example 3
W(n) = W(n/3) + W(2n/3) + O(n)

• The longest path from the root to a leaf is:
  n → (2/3)n → (2/3)²n → … → 1
• Subproblem size hits 1 when 1 = (2/3)^i·n ⇒ i = log_(3/2) n
• Cost of the problem at each level ≤ n
• Total cost (summing n over all levels):

  W(n) ≤ Σ_{i=0}^{log_(3/2) n} n = n·(log_(3/2) n + 1) = n·lg n / lg(3/2) + n = O(n lg n)

⇒ W(n) = O(n lg n)

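A numeric sketch (my own, assuming W(n) = n for n < 3 as the base case) that evaluates this recurrence and watches W(n)/(n lg n); the ratio stays bounded, consistent with W(n) = O(n lg n):

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def W(n):
        # W(n) = W(n/3) + W(2n/3) + n, with the assumed base case W(n) = n for n < 3
        if n < 3:
            return n
        third = n // 3
        return W(third) + W(n - third) + n

    for n in (10, 100, 1000, 10000, 100000):
        print(n, W(n) / (n * math.log2(n)))   # the ratio stays bounded as n grows
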
CMPS 256 new material on solving recurrences
The substitution method

1. Guess a solution
2. Use induction to prove that the solution works

Substitution method
• Guess a solution
  – T(n) = O(g(n))
  – Induction goal: apply the definition of the asymptotic notation
      T(n) ≤ d·g(n), for some d > 0 and n ≥ n0
  – Induction hypothesis: T(k) ≤ d·g(k) for all k < n

• Prove the induction goal
  – Use the induction hypothesis to find some values of the constants d and n0 for which the induction goal holds

Example: Binary Search
T(n) = c + T(n/2)
• Guess: T(n) = Θ(lg n)
• First, show T(n) = O(lg n)
  – Induction goal: T(n) ≤ d·lg n, for some d and n ≥ n0
  – Induction hypothesis: T(n/2) ≤ d·lg(n/2)

• Proof of induction goal:
  T(n) = T(n/2) + c ≤ d·lg(n/2) + c
       = d·lg n – d + c ≤ d·lg n
  if: –d + c ≤ 0, i.e., d ≥ c

Example: Binary Search (cont.)
T(n) = c + T(n/2)
• Boundary conditions:
  – Base case n0 = 1: T(1) = c has to verify the condition:
      T(1) ≤ d·lg 1 ⇒ c ≤ d·lg 1 = 0 – contradiction
  – Choose n0 = 2: T(2) = 2c has to verify the condition:
      T(2) ≤ d·lg 2 ⇒ 2c ≤ d·lg 2 = d ⇒ choose d ≥ 2c

• We can similarly prove that T(n) = Ω(lg n) and therefore: T(n) = Θ(lg n)

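A quick sketch (my own, taking c = 1, so T(1) = 1 and d = 2c = 2) confirming the proved bound numerically for n ≥ n0 = 2:

    import math

    def T(n):
        # T(n) = T(n/2) + c with c = 1 and T(1) = c
        return 1 if n == 1 else T(n // 2) + 1

    for n in range(2, 4096):
        assert T(n) <= 2 * math.log2(n)   # d = 2c
    print("T(n) <= 2c·lg n holds for 2 <= n < 4096")
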
Example 2
T(n) = T(n-1) + n
• Guess: T(n) = Θ(n²)
• We first prove T(n) = O(n²)
  – Induction goal: T(n) ≤ cn², for some c and n ≥ n0
  – Induction hypothesis: T(k) ≤ ck² for all k < n

• Proof of induction goal:
  T(n) = T(n-1) + n ≤ c(n-1)² + n
       = cn² – (2cn – c – n) ≤ cn²
  if: 2cn – c – n ≥ 0 ⇒ c ≥ n/(2n-1) ⇒ c ≥ 1/(2 – 1/n)
  – For n ≥ 1, 2 – 1/n ≥ 1, so any c ≥ 1 will work

Example 2 (cont.)
T(n) = T(n-1) + n
• Boundary conditions:
  – Base case n0 = 1: T(1) = 1 has to verify the condition:
      T(1) ≤ c·1² ⇒ 1 ≤ c ⇒ OK!

• We can similarly prove that T(n) = Ω(n²) and therefore: T(n) = Θ(n²)

Example 3
T(n) = 2T(n/2) + n
• Guess: T(n) = Θ(n lg n)
• First prove T(n) = O(n lg n)
  – Induction goal: T(n) ≤ cn·lg n, for some c and n ≥ n0
  – Induction hypothesis: T(n/2) ≤ c(n/2)·lg(n/2)

• Proof of induction goal:
  T(n) = 2T(n/2) + n ≤ 2c(n/2)·lg(n/2) + n
       = cn·lg n – cn + n ≤ cn·lg n
  if: –cn + n ≤ 0 ⇒ c ≥ 1

Example 3 (cont.)
T(n) = 2T(n/2) + n
• Boundary conditions:
  – Base case n0 = 1: T(1) = 1 has to verify the condition:
      T(1) ≤ c·n0·lg n0 ⇒ 1 ≤ c·1·lg 1 = 0 – contradiction
  – Choose n0 = 2: T(2) = 4 has to verify the condition:
      T(2) ≤ c·2·lg 2 ⇒ 4 ≤ 2c ⇒ choose c = 2

• We can similarly prove that T(n) = Ω(n lg n) and therefore: T(n) = Θ(n lg n)

Changing variables
T(n) = 2T(√n) + lg n
  – Rename: m = lg n ⇒ n = 2^m
      T(2^m) = 2T(2^(m/2)) + m
  – Rename: S(m) = T(2^m)
      S(m) = 2S(m/2) + m ⇒ S(m) = O(m lg m)    (demonstrated before)
      T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n · lg lg n)

Idea: transform the recurrence into one that you have seen before

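A numeric sketch (my own, with an assumed base case T(2) = 1, which the slides leave unspecified) showing that T(n) = 2T(√n) + lg n indeed tracks lg n · lg lg n:

    import math

    def T(n):
        # T(n) = 2 T(√n) + lg n, with the assumed base case T(n) = 1 for n <= 2
        if n <= 2:
            return 1
        return 2 * T(int(round(math.sqrt(n)))) + math.log2(n)

    for j in range(1, 6):
        n = 2 ** (2 ** j)                       # n of the form 2^(2^j), so √n stays exact
        m = math.log2(n)
        print(n, T(n), m * math.log2(m))        # T(n) stays within a constant factor of lg n · lg lg n
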
Revisiting an Earlier Recurrence
W(n) = W(n/3) + W(2n/3) + O(n)
• Guess: W(n) = O(n lg n)
  – Induction goal: W(n) ≤ dn·lg n, for some d and n ≥ n0
  – Induction hypothesis: W(k) ≤ dk·lg k for any k < n (in particular k = n/3 and k = 2n/3)
• Proof of induction goal:
  Try it out as an exercise!!
• W(n) = O(n lg n)

Master’s method
• “Cookbook” for solving recurrences of the form:

      T(n) = aT(n/b) + f(n)

  where a ≥ 1, b > 1, and f(n) > 0

Case 1: if f(n) = O(n^(log_b a – ε)) for some ε > 0, then: T(n) = Θ(n^(log_b a))

Case 2: if f(n) = Θ(n^(log_b a)), then: T(n) = Θ(n^(log_b a) · lg n)

Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if
        a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n (the regularity condition), then:
        T(n) = Θ(f(n))

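A small Python sketch (my own) of the three cases for the common special case f(n) = Θ(n^k); the function name `master` and the restriction to plain polynomial f are my simplifications, not part of the theorem’s statement:

    import math

    def master(a, b, k):
        """Classify T(n) = a·T(n/b) + Θ(n^k), where f(n) is a plain polynomial n^k."""
        crit = math.log(a, b)                      # critical exponent log_b a
        if k < crit:
            return f"Theta(n^{crit:g})"            # Case 1: f is polynomially smaller
        if k == crit:
            return f"Theta(n^{crit:g} · lg n)"     # Case 2: f matches n^(log_b a)
        return f"Theta(n^{k:g})"                   # Case 3: f dominates; regularity holds since a/b^k < 1

    print(master(2, 2, 1))     # T(n) = 2T(n/2) + n      -> Theta(n^1 · lg n)
    print(master(2, 2, 2))     # T(n) = 2T(n/2) + n²     -> Theta(n^2)
    print(master(2, 2, 0.5))   # T(n) = 2T(n/2) + √n     -> Theta(n^1)
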
Why n^(log_b a)?

T(n) = aT(n/b) + f(n)

Iterating the recursive term:
  aT(n/b) → a²T(n/b²) → a³T(n/b³) → … → a^i·T(n/b^i)

• Assume n = b^k ⇒ k = log_b n
• At the end of the iteration, i = k:
      a^i·T(n/b^i) = a^(log_b n)·T(1) = Θ(a^(log_b n)) = Θ(n^(log_b a))

• Case 1: if f(n) is dominated by n^(log_b a): T(n) = Θ(n^(log_b a))
• Case 2: if f(n) = Θ(n^(log_b a)): T(n) = Θ(n^(log_b a) · lg n)
• Case 3: if f(n) dominates n^(log_b a): T(n) = Θ(f(n))

Examples
T(n) = 2T(n/2) + n
  a = 2, b = 2, log_2 2 = 1
  Compare n^(log_2 2) = n with f(n) = n
  ⇒ f(n) = Θ(n) ⇒ Case 2
  ⇒ T(n) = Θ(n lg n)

Examples
T(n) = 2T(n/2) + n²
  a = 2, b = 2, log_2 2 = 1
  Compare n with f(n) = n²
  ⇒ f(n) = Ω(n^(1+ε)) ⇒ Case 3 ⇒ verify the regularity condition:
      a·f(n/b) ≤ c·f(n)
      2·(n²/4) ≤ c·n² ⇒ c = ½ is a solution (c < 1)
  ⇒ T(n) = Θ(n²)

Examples (cont.)
T(n) = 2T(n/2) + √n
  a = 2, b = 2, log_2 2 = 1
  Compare n with f(n) = n^(1/2)
  ⇒ f(n) = O(n^(1–ε)) ⇒ Case 1
  ⇒ T(n) = Θ(n)

Examples
T(n) = 3T(n/4) + n·lg n
  a = 3, b = 4, log_4 3 ≈ 0.793
  Compare n^0.793 with f(n) = n·lg n
  ⇒ f(n) = Ω(n^(log_4 3 + ε)) ⇒ Case 3
  Check the regularity condition:
      3·(n/4)·lg(n/4) ≤ (3/4)·n·lg n = c·f(n), with c = 3/4
  ⇒ T(n) = Θ(n·lg n)

Examples
T(n) = 2T(n/2) + n·lg n
  a = 2, b = 2, log_2 2 = 1
• Compare n with f(n) = n·lg n
  – seems like Case 3 should apply
• But f(n) must be polynomially larger, i.e., larger by a factor of n^ε for some ε > 0
• Here it is larger only by a factor of lg n, so the master method does not apply

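The slides stop here; a recursion-tree argument (the tree has lg n levels, each costing at most n·lg n) still gives T(n) = Θ(n·lg²n) for this recurrence. A quick numeric sketch of that claim, with an assumed base case T(1) = 1:

    import math

    def T(n):
        # T(n) = 2 T(n/2) + n·lg n, with the assumed base case T(1) = 1
        return 1 if n == 1 else 2 * T(n // 2) + n * math.log2(n)

    for k in (4, 8, 12, 16):
        n = 2 ** k
        print(n, T(n) / (n * math.log2(n) ** 2))   # the ratio settles toward 1/2, consistent with Θ(n lg² n)
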
