Ad Endsem Imp
Master Theorem
The Master Theorem applies to recurrences of the following form:
       T(n) = aT(n/b) + f(n),
where a ≥ 1 and b > 1 are constants and f(n) is an asymptotically positive function.
There are 3 cases:
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a) · log^k n) for some constant k ≥ 0, then T(n) = Θ(n^(log_b a) · log^(k+1) n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition, then T(n) = Θ(f(n)).
Regularity condition: af(n/b) ≤ cf(n) for some constant c < 1 and all sufficiently large n.
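The three cases can be mechanized for the common situation where f(n) = n^d · log^k n. The sketch below is a hypothetical helper, not part of the notes; for driving functions of this shape the Case 3 regularity condition holds automatically, so only the exponent comparison matters.

```python
import math

def master_case(a, b, d, k=0):
    """Classify T(n) = a*T(n/b) + f(n), where f(n) = n^d * (log n)^k.

    Hypothetical helper covering only polynomial-times-polylog f(n); for
    such f(n) the Case 3 regularity condition a*f(n/b) <= c*f(n) holds
    automatically. Returns the case number, or 0 if no case applies.
    """
    crit = math.log(a, b)              # critical exponent log_b(a)
    if math.isclose(d, crit):
        return 2 if k >= 0 else 0      # k = -1 (e.g. n/log n) is not covered
    if d < crit:
        return 1                       # polynomial gap below n^(log_b a)
    return 3                           # polynomial gap above; regularity OK
```

For example, `master_case(4, 2, 2)` reports Case 2 for T(n) = 4T(n/2) + n^2, while `master_case(2, 2, 1, -1)` returns 0, matching the fact that the theorem does not apply to T(n) = 2T(n/2) + n/log n.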
Practice Problems
For each of the following recurrences, give an expression for the runtime T (n) if the recurrence can be
solved with the Master Theorem. Otherwise, indicate that the Master Theorem does not apply.
1. T(n) = 3T(n/2) + n^2
2. T(n) = 4T(n/2) + n^2
3. T(n) = T(n/2) + 2^n
4. T(n) = 2^n T(n/2) + n^n
7. T(n) = 2T(n/2) + n/log n
11. T(n) = √2 T(n/2) + log n
13. T(n) = 3T(n/3) + √n
Solutions
1. T(n) = 3T(n/2) + n^2 ⟹ T(n) = Θ(n^2) (Case 3)
7. T(n) = 2T(n/2) + n/log n ⟹ does not apply (non-polynomial difference between f(n) and n^(log_b a))
19. T(n) = 64T(n/8) − n^2 log n ⟹ does not apply (f(n) is not positive)
22. T(n) = T(n/2) + n(2 − cos n) ⟹ does not apply. This looks like Case 3, but the regularity condition is violated: consider n = 2πk, where k is odd and arbitrarily large. For any such choice of n, one can show that the condition forces c ≥ 3/2, contradicting the requirement c < 1.
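The violation in problem 22 can be checked numerically. The sketch below (illustrative values only) evaluates the regularity ratio af(n/b)/f(n) for f(n) = n(2 − cos n) with a = 1, b = 2 at n = 2πk for odd k; the ratio sits at 3/2, so no constant c < 1 can work.

```python
import math

# f(n) = n * (2 - cos n) from problem 22; a = 1, b = 2.
f = lambda n: n * (2 - math.cos(n))

# Sample n = 2*pi*k for odd k: cos(n) = 1 and cos(n/2) = -1 there,
# so the ratio a*f(n/b)/f(n) = (n/2)*3 / (n*1) = 3/2.
ratios = [f(math.pi * k) / f(2 * math.pi * k) for k in (101, 1001, 10001)]
print(ratios)   # each ratio is about 1.5
```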
Recurrence Relations
Introduction
Determining the running time of a recursive algorithm often requires one to determine the big-O
growth of a function T(n) that is defined in terms of a recurrence relation. Recall that a recurrence
relation for a sequence of numbers is simply an equation that provides the value of the nth number
in terms of an algebraic expression involving n and one or more of the previous numbers of the
sequence. The most common recurrence relation we will encounter in this course is the uniform
divide-and-conquer recurrence relation, or uniform recurrence for short.
       T(n) = aT(n/b) + f(n),
where a > 0 and b > 1 are integer constants. This equation is explained as follows.
T (n): T (n) denotes the number of steps required by some divide-and-conquer algorithm A on a
problem instance having size n.
Conquer Each subproblem instance has size n/b and hence is conquered in T (n/b) steps by making
a recursive call to A.
Combine f (n) represents the number of steps needed to both divide the original problem instance
and combine the a solutions into a final solution for the original problem instance.
For example,
T(n) = 7T(n/2) + n^2,
is a uniform divide-and-conquer recurrence with a = 7, b = 2, and f(n) = n^2.
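As a quick numeric sanity check (with an assumed base case T(1) = 1), unrolling this recurrence on powers of two shows T(n)/n^(log_2 7) approaching a constant, consistent with the Θ(n^(log_2 7)) growth the Master Theorem predicts here:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(1) = 1 is an assumed base case for illustration.
    return 1 if n == 1 else 7 * T(n // 2) + n * n

# The ratio T(n) / n^(log_2 7) settles toward the constant 7/3.
for i in (5, 10, 15):
    n = 2 ** i
    print(n, T(n) / n ** math.log2(7))
```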
It should be emphasized that not every divide-and-conquer algorithm produces a uniform divide-and-conquer recurrence. For example, the Median-of-Five Find Statistic algorithm described in the next lecture produces the recurrence
       T(n) = T(n/5) + T(7n/10) + Θ(n),
which is not uniform.
The complexity analysis of a divide-and-conquer algorithm often reduces to determining the big-O
growth of a solution T (n) to a divide-and-conquer recurrence. In this lecture we examine three
different ways of solving such recurrences, which are summarized as follows.
Master Theorem The Master Theorem provides a solution T (n) to a uniform recurrence, pro-
vided a, b, and f (n) satisfy certain conditions.
Recursion Tree Given a recursive algorithm, a recursion tree is a tree whose nodes are in one-
to-one correspondence with the subproblem instances that are created by the algorithm at each
depth of the recursion. Moreover, each node of the tree is labeled with either i) the number
of algorithm steps that are needed to divide up the corresponding subproblem instance into
further subproblems and to combine their solutions to obtain a solution to the subproblem, or
ii) the number of steps needed to directly solve the subproblem instance if it represents a base
case (in this case the node is a leaf of the recursion tree).
Substitution Method The substitution method uses mathematical induction to prove that
some candidate function T (n) is a solution to a given divide-and-conquer recurrence. The
candidate solution is usually obtained by making an educated guess, or by analyzing the asso-
ciated recursion tree.
The Master theorem and substitution method represent proof methods, meaning that the application
of either method will yield a solution beyond doubt. On the other hand, the recursion-tree method
represents an exploratory method whose solution must be further verified using another method, such
as the substitution method.
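To make the recursion-tree method concrete, the following sketch (an illustrative instance, not from the lecture) tabulates the per-level work for T(n) = 2T(n/2) + n: depth j holds 2^j subproblems of size n/2^j, so every level contributes 2^j · (n/2^j) = n.

```python
n = 1024                          # illustrative power-of-two input
levels = []
size, count = n, 1
while size >= 1:
    levels.append(count * size)   # total divide/combine work at this depth
    size, count = size // 2, count * 2
print(levels[:3], len(levels))    # every level contributes n; log2(n)+1 levels
```

Summing roughly log2(n) levels of n work each reproduces the familiar Θ(n log n) bound for this recurrence.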
Recursion Trees and The Master Theorem
Master Theorem. Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by
       T(n) = aT(n/b) + f(n).
Then T(n) can be bounded asymptotically as follows.
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if af(n/b) ≤ cf(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
We prove a relaxed version of the Master theorem by assuming that n is always a power of b, instead
of always being an integer. The Master Theorem then follows from this theorem by more analysis
involving floors and ceilings, which shows that their inclusion does not affect the order of growth of
T (n).
Lemma 1. Let a ≥ 1 and b > 1 be constants, and let f (n) be a nonnegative function defined on
exact powers of b. Define T (n) on exact powers of b by the recurrence relation
       T(n) = Θ(1)                 if n = 1
       T(n) = aT(n/b) + f(n)       if n = b^i for some positive integer i.
Then
       T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{log_b n − 1} a^j f(n/b^j).
Proof of Lemma 1. Notice that, since n is a power of b, there are log_b n + 1 levels of recursion: 0, 1, . . . , log_b n. Moreover, level j, for 0 ≤ j ≤ log_b n − 1, contributes a total of a^j f(n/b^j) to the total value of T(n), while level log_b n contributes Θ(1) · a^(log_b n) = Θ(n^(log_b a)) (see the general recursion tree in Figure 1 below). Hence,
       T(n) = Θ(n^(log_b a)) + Σ_{j=0}^{log_b n − 1} a^j f(n/b^j).
QED
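Lemma 1's unrolled formula can be checked exactly on a concrete instance. The sketch below uses the illustrative choice a = 3, b = 2, f(n) = n^2 with an assumed base case T(1) = 1, so the Θ(n^(log_b a)) term is exactly a^(log_b n):

```python
import math

a, b = 3, 2
f = lambda n: n * n

def T(n):
    # Direct evaluation of the recurrence, with T(1) = 1 assumed.
    return 1 if n == 1 else a * T(n // b) + f(n)

n = 2 ** 10
L = round(math.log(n, b))                        # log_b n = 10 levels
unrolled = a ** L * 1 + sum(a ** j * f(n // b ** j) for j in range(L))
print(T(n), unrolled)   # both values agree: the unrolling matches
```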
Figure 1: The general recursion tree for uniform divide-and-conquer recurrences
Lemma 2. Let
       g(n) = Σ_{j=0}^{log_b n − 1} a^j f(n/b^j).
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then g(n) = O(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then g(n) = Θ(n^(log_b a) · log n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if af(n/b) ≤ cf(n) for some constant c < 1, then g(n) = Θ(f(n)).
The proof of Lemma 2 bounds g(n) under varying assumptions about f(n). In Case 1, replace f(n/b^j) with (n/b^j)^(log_b a − ε). In Case 2, replace f(n/b^j) with (n/b^j)^(log_b a). And in Case 3, replace a^j f(n/b^j) with c^j f(n). In all cases, the geometric-series formula
       Σ_{j=0}^{k} r^j = (r^(k+1) − 1)/(r − 1)
is used to evaluate the resulting sum.
Example 1. Prove Case 1 of Lemma 2.
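A possible proof sketch (my own write-up, applying the geometric-series formula with r = b^ε): since f(n) = O(n^(log_b a − ε)),

```latex
\begin{align*}
g(n) &= \sum_{j=0}^{\log_b n - 1} a^j f(n/b^j)
      = O\Big(\sum_{j=0}^{\log_b n - 1} a^j \big(n/b^j\big)^{\log_b a - \epsilon}\Big) \\
     &= O\Big(n^{\log_b a - \epsilon} \sum_{j=0}^{\log_b n - 1} (b^{\epsilon})^{j}\Big)
      && \text{since } a^j \big(b^{-j}\big)^{\log_b a - \epsilon} = (b^{\epsilon})^{j} \\
     &= O\Big(n^{\log_b a - \epsilon} \cdot \frac{n^{\epsilon} - 1}{b^{\epsilon} - 1}\Big)
      = O\big(n^{\log_b a}\big).
\end{align*}
```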
Once Lemma 2 is proved, the proof of the relaxed version of the Master Theorem becomes readily
available.
Example 2. Finish the proof of Case 1 of the relaxed version of the Master Theorem.
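One way to finish (a sketch of my own): combining Lemma 1 with Case 1 of Lemma 2,

```latex
T(n) = \Theta\big(n^{\log_b a}\big) + g(n)
     = \Theta\big(n^{\log_b a}\big) + O\big(n^{\log_b a}\big)
     = \Theta\big(n^{\log_b a}\big).
```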
Example 3. Determine the order of growth of the solutions to the following recurrence relations.
2. T (n) = T (n/5) + 20
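For this recurrence, n^(log_5 1) = n^0 = 1 and f(n) = 20 = Θ(1), so Case 2 gives Θ(log n). A quick numeric unrolling (with the assumed base case T(n) = 1 for n < 5) agrees:

```python
import math

def T(n):
    # T(n) = T(n/5) + 20, with T(n) = 1 for n < 5 (assumed base case).
    return 1 if n < 5 else T(n // 5) + 20

# Each of the ~log_5(n) levels adds the constant 20.
for n in (5 ** 4, 5 ** 8):
    print(n, T(n), 20 * math.log(n, 5) + 1)
```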
Example 4. Use a recursion tree to estimate the big-O growth of T (n) if it satisfies T (n) =
T (n − 2) + n.
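Here the tree is a single chain: depth i costs n − 2i until the argument bottoms out, which is an arithmetic series. A small numeric sketch (assuming the chain runs down to 0):

```python
n = 1000
# Work at depth i is n - 2i; summing the chain gives roughly n^2 / 4.
total = sum(n - 2 * i for i in range(n // 2 + 1))
print(total)   # close to n*n/4 = 250000, i.e. Theta(n^2)
```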
Example 5. Use a recursion tree to estimate the big-O growth of T (n) if it satisfies T (n) =
T (n/2) + 2T (n/4) + n.
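For this non-uniform recurrence, one can tabulate per-level work: a size-s node spawns one size-s/2 child and two size-s/4 children, and the subproblem sizes at each level again sum to about n. The sketch below (base-case threshold 4 assumed) shows the first few level totals; with roughly log n such levels, this suggests Θ(n log n).

```python
n = 4096
level, sums = [n], []
while level:
    sums.append(sum(level))          # total work at this depth
    nxt = []
    for s in level:
        if s >= 4:                   # assumed base-case threshold
            nxt.extend([s // 2, s // 4, s // 4])
    level = nxt
print(sums[:4])   # early levels each contribute exactly n
```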
Substitution Method
The substitution method is an inductive method for proving the big-O growth of a function T (n)
that satisfies some divide-and-conquer recurrence. It requires that we already have a candidate
function g(n) for representing the growth of T (n). For example, suppose we desire to show that
T(n) = O(g(n)). Then we perform the following two steps.
Inductive Assumption Assume that T (k) ≤ Cg(k), for all k < n and for some constant C > 0.
Prove Inductive Step Show that T (n) ≤ Cg(n). Do this by replacing any term aT (n/b) of the
recurrence with aCg(n/b), and show that the resulting non-recurrence expression is bounded
above by Cg(n).
Notice that there is no basis step to the above proof method. This is because we only care about the
growth of T (n) for sufficiently large n.
Example 6a. Show that if T (n) = 2T (n/2) + 3n, then T (n) = O(n log n).
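A sketch of how the argument can go (my own write-up, base-2 logs assumed): assume T(k) ≤ Ck log k for all k < n. Then

```latex
T(n) \le 2C\frac{n}{2}\log\frac{n}{2} + 3n
     = Cn(\log n - 1) + 3n
     = Cn\log n - (C - 3)n
     \le Cn\log n \quad \text{for } C \ge 3 .
```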
Example 6b. Similarly, show that T (n) = Ω(n log n), where T (n) satisfies the recurrence from
Example 6a.
Sometimes it is necessary to add lesser-degree terms to the inductive assumption. For example,
instead of T(k) ≤ Ck^2, one may instead use T(k) ≤ Ck^2 + Dk + E, for constants C, D, E.
This is sometimes necessary in order to cancel lesser-degree terms that occur in the recurrence. The
following example illustrates this.
Example 7. Show that if T(n) = 2T(n/2) + n log n, then T(n) = O(n log^2 n).
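A sketch (my own): assume T(k) ≤ Ck log^2 k for all k < n. Then, with base-2 logs,

```latex
T(n) \le 2C\frac{n}{2}\log^2\frac{n}{2} + n\log n
     = Cn(\log n - 1)^2 + n\log n
     = Cn\log^2 n - (2C - 1)n\log n + Cn ,
```

and the last expression is at most Cn log^2 n whenever (2C − 1) log n ≥ C, which holds for all C ≥ 1 and n ≥ 2.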
Example 8. Prove that if T (n) = 2T (n/2) + 7 then T (n) = O(n).
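A sketch (my own): the trick is to subtract a constant in the hypothesis. Assume T(k) ≤ Ck − D for all k < n. Then

```latex
T(n) \le 2\Big(C\frac{n}{2} - D\Big) + 7 = Cn - 2D + 7 \le Cn - D
\quad \text{whenever } D \ge 7 ,
```

so T(n) = O(n).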
Exercises
1. Use the Master Theorem to give tight asymptotic bounds for the following recurrences.
a. T(n) = 2T(n/4) + 1
b. T(n) = 2T(n/4) + √n
c. T(n) = 2T(n/4) + n
d. T(n) = 2T(n/4) + n^2
2. Use the Master Theorem to show that the solution to the binary-search recurrence T (n) =
T (n/2) + a, where a > 0 is a constant, is T (n) = Θ(log n).
3. Use the Master Theorem to determine the big-O growth of T (n) if it satisfies the recurrence
T (n) = 3T (n/2) + n.
4. Explain why the Master Theorem cannot be applied to the recurrence T(n) = 4T(n/2) + n^2 log n.
5. Use the Master Equation to estimate the growth of T (n) which satisfies the recurrence from
Exercise 4. Note: you should use the substitution method to verify that the estimate is in fact
the exact big-O growth of T (n).
6. Solve the recurrence T(n) = 2T(√n) + log n. Hint: Let S(k) = T(2^k) and write a divide-and-conquer recurrence for S(k).
7. Use a recursion tree to show that the solution of T (n) = T (n − 1) + n is T (n) = O(n2 ).
8. Use a recursion tree to solve the recurrence T(n) = T(n − 2) + n^2, providing tight upper and lower bounds.
9. Use a recursion tree to estimate the big-O growth of T(n) if it satisfies the recurrence T(n) = 2T(n − 1) + 1.
10. Use a recursion tree to show that T(n) = Θ(n) if it satisfies the recurrence T(n) = T(n/2) + n. Verify your answer using the Master Theorem.
11. Use a recursion tree to estimate the big-O growth of T(n) if it satisfies the recurrence T(n) = T(n/2) + n^2. Verify your answer using the Master Theorem.
12. Argue that the solution to the recurrence T (n) = T (n/3) + T (2n/3) + an, where a > 0 is a
constant, is T (n) = Θ(n log n), by using an appropriate recursion tree.
13. Suppose T(n) and g(n) are positive functions that satisfy T(n) ≤ Cg(n) for all n ≥ k and some constant C > 0. Prove that there is a constant C′ > 0 for which T(n) ≤ C′g(n), for all n ≥ 0.
14. Use the substitution method to prove that T (n) = 2T (n/2) + an, a > 0 a constant, has solution
T (n) = Θ(n log n).
15. Use the substitution method to prove that T (n) = 4T (n/2) + n has solution T (n) = O(n2 ).
16. Use the substitution method to prove that T (n) = T (an)+T (bn)+n has solution T (n) = O(n),
where a and b are positive constants, with a + b < 1.
Exercise Solutions
1. Use the Master Theorem to give tight asymptotic bounds for the following recurrences.
a. Case 1: T(n) = Θ(√n)
b. Case 2: T(n) = Θ(√n log n)
c. Case 3: T(n) = Θ(n)
d. Case 3: T(n) = Θ(n^2)
2. Use Case 2 of the Master Theorem: T (n) = Θ(log n).
3. Use the Master Theorem to determine the big-O growth of T(n) if it satisfies T(n) = 3T(n/2) + n. Use Case 1: T(n) = Θ(n^(log_2 3)).
4. n^(log_b a) = n^2 and f(n) = n^2 log n = ω(n^2). Hence, Cases 1 and 2 do not apply. Moreover, n^2 log n ≠ Ω(n^(2+ε)) for any constant ε > 0. Therefore, Case 3 cannot be applied.
5. From the Master Equation we have
       Σ_{j=0}^{log_b n − 1} a^j f(n/b^j) = Σ_{j=0}^{log_2 n − 1} 4^j (n/2^j)^2 log(n/2^j)
     = n^2 Σ_{j=0}^{log_2 n − 1} (log n − j) = n^2 (log^2 n − log n (log n − 1)/2) = Θ(n^2 log^2 n).
6. Letting S(k) = T(2^k) gives the recurrence S(k) = 2S(k/2) + k, which, by the Master Theorem, yields S(k) = Θ(k log k). Finally, letting n = 2^k, we have T(n) = Θ(log n · log log n).
7. The recursion tree consists of a single branch of length n, where the work done at depth i is n − i. Hence, T(n) = O(Σ_{i=0}^{n} (n − i)) = O(n^2).
8. The recursion tree consists of a single branch of length n/2, where the work done at depth i is (n − 2i)^2. Hence,
       T(n) = Θ(Σ_{i=0}^{n/2} (n − 2i)^2) = Θ(n^3).
9. The recursion tree is a perfect binary tree having depth n. Thus, it has 2^(n+1) − 1 nodes. Moreover, since the work at each node is always 1, we see that T(n) = Θ(2^n).
10. Assuming n is a power of 2, the total work from the recursion tree is
       n + n/2 + n/4 + · · · + 1 = n Σ_{j=0}^{log_2 n} (1/2)^j.
Factoring out n and summing the geometric series yields a sum that is Θ(n).
11. The recursion tree has a single branch, where the work done at depth i is (n/2^i)^2 = n^2/4^i. Moreover, since Σ_{i=0}^{∞} 1/4^i = 4/3, we see that T(n) = Θ(n^2).
12. The recursion tree remains perfect all the way down to depth ⌊log_3 n⌋. Moreover, for each depth i = 0, 1, . . . , ⌊log_3 n⌋, the work done at that depth always adds to n. Hence, the total work (i.e. T(n)) must be Ω(n log n). Also, the longest branch has a depth not exceeding ⌊log_{3/2} n⌋. Moreover, the amount of work at each depth does not exceed n. Hence T(n) = O(n log n). Therefore, T(n) = Θ(n log n).
13. Let C_i, i = 0, 1, . . . , k − 1, be a constant for which T(i) ≤ C_i g(i). Since T(i) and g(i) are both positive numbers, we know that such a C_i exists. Then choose C′ = max(C_0, . . . , C_{k−1}, C). Then T(n) ≤ C′g(n), for all n ≥ 0.
14. Assume T(k) ≤ ck log k for all k < n, where c > 0 is a constant. Then T(n) ≤ 2c(n/2) log(n/2) + an = cn log n − cn + an, which is at most cn log n iff c ≥ a. Therefore by induction, T(n) = O(n log n). T(n) = Ω(n log n) is proved similarly.
15. The hypothesis T(k) ≤ Ck^2, for all k < n, is not sufficient, since it leads to the impossible inequality n ≤ 0. Using T(k) ≤ Ck^2 + Dk instead yields the requirement
       Dn + n ≤ 0 ⟺ D ≤ −1.
Therefore, the inequality is established so long as we choose D ≤ −1 and C > 0.
16. Assume T(k) ≤ Ck, for all k < n and some constant C > 0. Then
       T(n) ≤ Can + Cbn + n = (C(a + b) + 1)n,
which is at most Cn provided C ≥ 1/(1 − (a + b)), a valid choice of constant since a + b < 1. Therefore T(n) = O(n).