Recurrence-Relations Time Complexity
Recurrence Relations
In previous lectures we discussed asymptotic analysis of algorithms and various properties
associated with asymptotic notation. As many algorithms are recursive in nature, it is natural
to analyze them using recurrence relations. A recurrence relation is a mathematical model
that captures the underlying time complexity of an algorithm. In this lecture, we shall look at
three methods, namely the substitution method, the recurrence tree method, and the Master
theorem, to analyze recurrence relations. Solutions to recurrence relations yield the time
complexity of the underlying algorithms.
1 Substitution method
Consider a computational problem P and an algorithm that solves P. Let T(n) be the worst-case
time complexity of the algorithm, with n being the input size. Let us discuss a few examples
to appreciate how this method works. For searching and sorting, T(n) denotes the number of
comparisons incurred by an algorithm on an input of size n.
Binary search
Input: A sorted array A of size n, and an element x to be searched.
Question: Is x ∈ A?
Approach: Check whether A[n/2] = x. If x > A[n/2], then prune the lower half of the array,
A[1, . . . , n/2]; otherwise, prune the upper half. Thus, pruning happens at every iteration,
and after each iteration the problem size (the array size under consideration) reduces by half.
The recurrence relation is T(n) = T(n/2) + O(1), where T(n) is the time required for binary search in
an array of size n. Unrolling the recurrence k times gives
T(n) = T(n/2^k) + 1 + · · · + 1 (k ones).
Since T(1) = 1, when n = 2^k, T(n) = T(1) + k = 1 + log2(n).
Since log2(n) ≤ 1 + log2(n) ≤ 2 log2(n) for all n ≥ 2,
T(n) = Θ(log2(n)).
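As an illustration, here is a minimal iterative sketch of the pruning step in Python (the function name and the 0-based indexing are our own conventions, not from the notes):

    def binary_search(A, x):
        """Search x in sorted array A; the range is halved every iteration."""
        lo, hi = 0, len(A) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if A[mid] == x:
                return True
            if x > A[mid]:
                lo = mid + 1  # prune the lower half
            else:
                hi = mid - 1  # prune the upper half
        return False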
Similar to binary search, ternary search compares x with A[n/3] and A[2n/3], and the problem
size reduces to n/3 for the next iteration. Therefore, the recurrence relation is T(n) = T(n/3) + 2,
with T(2) = 2. Note that two comparisons are done at each iteration, due to which the additive
factor '2' appears in T(n).
T(n) = T(n/3) + 2; ⇒ T(n/3) = T(n/9) + 2
⇒ T(n) = T(n/3^k) + 2 + 2 + · · · + 2 (2 appears k times)
When n = 3^k, T(n) = T(1) + 2 log3(n) = Θ(log3(n))
Further, we highlight that for k-way search, T(n) = T(n/k) + (k − 1), where T(k − 1) = k − 1. Solving
this, T(n) = Θ(log_k(n)).
It is important to highlight that, in the asymptotic sense, binary search, ternary search and k-way
search (for fixed k) have the same complexity, as log2(n) = log2(3) · log3(n) and log2(n) = log2(k) · log_k(n).
So, Θ(log2(n)) = Θ(log3(n)) = Θ(log_k(n)), for fixed k.
In the above analysis of binary and ternary search, we assumed that n = 2^k (respectively, n = 3^k). Is this
a valid assumption? Will the above analysis hold good if n ≠ 2^k? There is a simple fix to handle
inputs whose size is not 2^k for any k: we augment the array with the least number of dummy values
so that its size becomes a power of 2. By doing so, we increase the size to at most 2n; for ternary
search, the array size increases to at most 3n. This does not change our asymptotic analysis, as the
search time on the padded array exceeds the actual search time by at most one iteration
(log2(2n) = log2(n) + 1).
Sorting
To sort an array of n elements using find-max (returns maximum) as a black box.
Approach: Repeatedly find the maximum element and remove it from the array. The order in which
the maximum elements are extracted is the sorted sequence (in decreasing order). The recurrence for
the above algorithm is,
T(n) = T(n − 1) + (n − 1) = T(n − 2) + (n − 2) + (n − 1) = · · · = T(1) + 1 + 2 + · · · + (n − 1) = n(n − 1)/2
T(n) = Θ(n^2)
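A sketch of this approach in Python, with find_max as the assumed black box (the names are illustrative):

    def find_max(A):
        """Black box: index of the maximum, using len(A) - 1 comparisons."""
        m = 0
        for i in range(1, len(A)):
            if A[i] > A[m]:
                m = i
        return m

    def sort_with_find_max(A):
        """Repeatedly extract the maximum: T(n) = T(n-1) + (n-1) comparisons."""
        result = []
        while A:
            result.append(A.pop(find_max(A)))  # remove the current maximum
        result.reverse()  # maxima come out in decreasing order
        return result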
Merge Sort
Approach: Divide the array into two equal sub-arrays and sort each sub-array recursively. Do the
sub-division operation recursively till the array size becomes one. Trivially, a problem of size one
is sorted, and when the recursion bottoms out, two subproblems of size one are combined to get a
sorted sequence of size two; further, two subproblems of size two (each one sorted) are combined
to get a sorted sequence of size four, and so on. We shall see a detailed description of merge sort
when we discuss the divide and conquer paradigm. The recurrence for merge sort is,
T(n) = 2T(n/2) + n − 1 = 2[2T(n/2^2) + n/2 − 1] + n − 1
⇒ T(n) = 2^k T(n/2^k) + (n − 2^{k−1}) + (n − 2^{k−2}) + · · · + (n − 2^0).
When n = 2^k, T(n) = 2^k T(1) + (n + · · · + n) − [2^{k−1} + · · · + 2^0]  (k copies of n)
Note that 2^{k−1} + · · · + 2^0 = (2^k − 1)/(2 − 1) = 2^k − 1 = n − 1.
Also, T(1) = 0, as there is no comparison required if n = 1. Therefore, T(n) = n log2(n) − n + 1 =
Θ(n log2(n))
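A compact Python sketch of this scheme (a simplified version of what we shall see in the divide and conquer lecture):

    def merge_sort(A):
        """T(n) = 2T(n/2) + (n - 1): two recursive halves plus a merge."""
        if len(A) <= 1:
            return A  # T(1) = 0: no comparison needed
        mid = len(A) // 2
        left, right = merge_sort(A[:mid]), merge_sort(A[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):  # at most n - 1 comparisons
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]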
Heap sort
This sorting algorithm is based on the max-heap data structure, which we shall discuss in detail in
a later chapter. Creation of a max-heap can be done in linear time. The approach is to delete the
maximum element repeatedly and re-adjust the heap to satisfy the max-heap property. This
maintenance incurs O(log n) time per deletion. The order in which the elements are deleted gives the
sorted sequence. The number of comparisons needed for deleting an element is at most the height of
the max-heap, which is log2(n). Therefore the recurrence for heap sort is,
T(n) = T(n − 1) + log2(n), where T(1) = 0 ⇒ T(n) = (T(n − 2) + log2(n − 1)) + log2(n)
By substituting further, ⇒ T(n) = T(1) + log 2 + log 3 + log 4 + · · · + log n
⇒ T(n) = log(2 · 3 · 4 · · · n) ⇒ T(n) = log(n!)
Now log(n!) ≤ n log n since n! ≤ n^n, and by Stirling's approximation log(n!) = Θ(n log n).
⇒ T(n) = Θ(n log2(n))
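A short Python sketch using the standard library heap (heapq is a min-heap, so it repeatedly deletes the minimum instead of the maximum; the comparison count is the same):

    import heapq

    def heap_sort(A):
        """Heap construction is O(n); each of the n deletions costs O(log n)."""
        heapq.heapify(A)  # linear-time heap construction
        return [heapq.heappop(A) for _ in range(len(A))]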
Problem: 1 T(n) = 2 × T(√n) + 1 and T(1) = 1
Introduce a change of variable by letting n = 2^m.
⇒ T(2^m) = 2 × T(√(2^m)) + 1
⇒ T(2^m) = 2 × T(2^{m/2}) + 1
Let us introduce another change by letting S(m) = T(2^m)
⇒ S(m) = 2 × S(m/2) + 1
⇒ S(m) = 2 × (2 × S(m/4) + 1) + 1
⇒ S(m) = 2^2 × S(m/2^2) + 2 + 1
By substituting further,
⇒ S(m) = 2^k × S(m/2^k) + 2^{k−1} + · · · + 2 + 1 = 2^k × S(m/2^k) + 2^k − 1
When m = 2^k, S(m) = m · S(1) + m − 1 = Θ(m).
We now have S(m) = T(2^m) = Θ(m). Since m = log2(n), T(n) = Θ(log n).
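The linear growth of S(m) can be checked numerically; a small sketch (taking S(1) = 1 for concreteness):

    def S(m):
        """S(m) = 2*S(m/2) + 1 with S(1) = 1, for m a power of 2."""
        return 1 if m == 1 else 2 * S(m // 2) + 1

    for k in range(1, 8):
        m = 2 ** k
        print(m, S(m))  # S(m) = 2m - 1 = Theta(m), so T(n) = Theta(log n)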
Problem: 2 T(n) = 2 × T(√n) + n and T(1) = 1
Let n = 2^m ⇒ T(2^m) = 2 × T(√(2^m)) + 2^m
⇒ T(2^m) = 2 × T(2^{m/2}) + 2^m
Let S(m) = T(2^m) ⇒ S(m) = 2 × S(m/2) + 2^m
From the first term of the above expression, it is clear that S(m) ≥ 2^m, therefore T(n) = Ω(n).
Unrolling, S(m) = 2^m + 2 · 2^{m/2} + 2^2 · 2^{m/4} + · · ·, where the first term dominates the
(geometrically decreasing) sum, so S(m) = Θ(2^m) and hence T(n) = Θ(n).
Problem: 3 T(n) = 2 × T(√n) + log n and T(1) = 1
Let n = 2^m
⇒ T(2^m) = 2 × T(√(2^m)) + log(2^m)
⇒ T(2^m) = 2 × T(2^{m/2}) + m
Let S(m) = T(2^m) ⇒ S(m) = 2 × S(m/2) + m
⇒ S(m) = 2^2 × S(m/2^2) + m + m
By substituting further,
⇒ S(m) = 2^k × S(m/2^k) + k · m
When m = 2^k, S(m) = m · S(1) + m log2(m) = Θ(m log m).
Converting back, with m = log2(n), T(n) = Θ(log n · log log n).
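Again, the Θ(m log m) growth can be verified numerically (S(1) = 1 assumed):

    import math

    def S(m):
        """S(m) = 2*S(m/2) + m with S(1) = 1, for m a power of 2."""
        return 1 if m == 1 else 2 * S(m // 2) + m

    for k in range(2, 12):
        m = 2 ** k
        print(m, S(m), round(S(m) / (m * math.log2(m)), 3))  # ratio tends to 1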
While the substitution method works well for many recurrence relations, it is not a convenient
technique for recurrence relations that model divide and conquer based algorithms. The recursion
tree method is a popular technique for solving such recurrence relations, in particular for solving
unbalanced recurrence relations. For example, in the case of modified merge sort, to solve a problem
of size n (to sort an array of size n), the problem is divided into two subproblems of size n/3 and
2n/3. This is done recursively until the problem size becomes 1.
1. T(n) = 2T(n/2) + 1, T(1) = 1
Each internal node of the recursion tree contributes 1, and each of the n leaves contributes T(1) = 1.
The number of internal nodes is
Σ_{i=0}^{log2(n)−1} 2^i = 2^{log2(n)} − 1 = n − 1
Therefore, the total time = n + n − 1 = 2n − 1 = Θ(n).
2. T(n) = 2T(n/2) + 1, T(1) = log n
The internal nodes again contribute n − 1 in total, but now each of the n leaves contributes log n.
Therefore, the total time = n log n + n − 1 = Θ(n log n).
[Fig. 2: Input size reduction tree and the corresponding computation tree]
5. T(n) = T(n/10) + T(9n/10) + n
Note that the leaves of the computation tree are found between levels log_{10}(n) and log_{10/9}(n).
Assume all the leaves are at level log_{10}(n). Then T(n) ≥ n log_{10}(n) ⇒ T(n) = Ω(n log n).
Assume all the leaves are at level log_{10/9}(n). Then T(n) ≤ n log_{10/9}(n) ⇒ T(n) = O(n log n).
Therefore, T(n) = Θ(n log n).
Solution using the guess method
One can guess the solution to a recurrence relation T(n) and verify the guess by simple
substitution.
For T(n) = T(n/3) + T(2n/3) + cn, guess T(n) ≤ dn log n.
Substituting the guess (with logarithms to base 2), we get
T(n) ≤ d(n/3) log(n/3) + d(2n/3) log(2n/3) + cn
= d(n/3)[log n − log 3] + d(2n/3)[log(2n) − log 3] + cn
= dn log n − dn log 3 + 2dn/3 + cn
Since T(n) is at most dn log n, we need
dn log n − dn log 3 + 2dn/3 + cn ≤ dn log n ⇒ cn ≤ dn(log 3 − 2/3)
⇒ c ≤ d(log 3 − 2/3). Choose 'c' and 'd' such that the inequality is respected.
∴ T(n) ≤ dn log n = O(n log n)
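The guess can also be sanity-checked numerically; a sketch with integer subproblem sizes (the rounding and the base case T(1) = 1 are our own choices):

    import math
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def T(n):
        """T(n) = T(n/3) + T(2n/3) + n with T(1) = 1."""
        if n <= 1:
            return 1
        return T(n // 3) + T(n - n // 3) + n

    for n in [10, 100, 1000, 10000]:
        print(n, round(T(n) / (n * math.log2(n)), 3))  # ratio stays bounded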
6. T(n) = 2T(n/2) + n, T(1) = 1
Solution: Guess T(n) = O(n^2). T(n) ≤ 2 · c(n/2)^2 + n = cn^2/2 + n ≤ cn^2 (possible)
Therefore, T(n) = O(n^2). Note that this guess only certifies an upper bound; we saw above that this
recurrence in fact solves to Θ(n log n).
7. T(n) = 2T(n/2) + 1
Guess: T(n) = O(log n)
T(n) ≤ 2c log(n/2) + 1
= 2c log n − 2c + 1
2c log n − 2c + 1 ≤ c log n (not possible for large n)
Therefore, T(n) ≠ O(log n) by this guess.
Guess: T(n) ≤ n − d
T(n) ≤ 2(n/2 − d) + 1 = n − 2d + 1
n − 2d + 1 ≤ n − d (for d ≥ 1)
Therefore, T(n) ≤ n − d = O(n)
We shall now look at the 'Master theorem', which is a 'cook book' for many well-known recurrence
relations. It presents a framework and formulae using which solutions to many recurrence relations
can be obtained very easily. Almost all recurrences of the type T(n) = aT(n/b) + f(n) can be solved
by a simple check that identifies one of the three cases mentioned in the following theorem. By
comparing n^{log_b a} (the number of leaves of the recursion tree) with f(n), one can decide upon
the time complexity of the algorithm.
Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a non-negative function,
and let T(n) be defined on the non-negative integers by the recurrence
T(n) = aT(n/b) + f(n)
where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) has the following asymptotic bounds:
Case 1: If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
Case 2: If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} log n).
Case 3: If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if af(n/b) ≤ cf(n) for
some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
In the next section, we shall solve recurrence relations by applying the Master theorem. Later, we
discuss a proof of the Master theorem and some technicalities to be understood before applying it.
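For the common special case f(n) = Θ(n^k), the three checks can be mechanized. A small illustrative Python helper (our own sketch, not part of the notes; it ignores the gap cases involving log factors):

    import math

    def master(a, b, k):
        """Classify T(n) = a*T(n/b) + Theta(n^k) for polynomial f(n) = n^k."""
        c = math.log(a, b)  # critical exponent log_b(a)
        if abs(k - c) < 1e-9:
            return f"Theta(n^{k:g} log n)"  # case 2: all levels contribute equally
        if k < c:
            return f"Theta(n^{c:.3g})"      # case 1: leaves dominate
        return f"Theta(n^{k:g})"            # case 3: root dominates; regularity
                                            # holds automatically for f(n) = n^k

    print(master(9, 3, 1))  # Theta(n^2), as in example 1 below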
1. T(n) = 9T(n/3) + n
n^{log_b a} = n^{log_3 9} = n^2
f(n) = n
Comparing n^{log_b a} and f(n): n = O(n^{2−ε}) for ε = 1.
This satisfies Case 1 of the Master theorem.
That implies T(n) = Θ(n^{log_b a}) = Θ(n^2).
2. T(n) = 3T(n/4) + n log n
n^{log_b a} = n^{log_4 3} (≈ n^{0.793})
f(n) = n log n
Comparing n^{log_b a} and f(n): f(n) = Ω(n^{log_4 3 + ε}) for, say, ε = 0.2.
This satisfies Case 3 of the Master theorem.
Checking the regularity condition a · f(n/b) ≤ c · f(n) for some constant c < 1:
3 · (n/4) log(n/4) = (3/4)n[log n − log 4] ≤ (3/4)n log n, where c = 3/4.
That implies the regularity condition is satisfied.
That implies T(n) = Θ(f(n)) = Θ(n log n).
5. T(n) = 4T(n/2) + n
n^{log_b a} = n^{log_2 4} = n^2
f(n) = n
Comparing n^{log_b a} and f(n): n = O(n^{2−ε}) for ε = 1.
This satisfies Case 1 of the Master theorem.
That implies T(n) = Θ(n^{log_b a}) = Θ(n^2).
6. T(n) = 2T(n/2) + n log n
n^{log_b a} = n^{log_2 2} = n^1 = n
Comparing n^{log_b a} = n and f(n) = n log n: this satisfies neither case 1, 2 nor 3 of the Master
theorem. Case 3 requires f(n) to be polynomially larger, but here f(n) is asymptotically larger than
n^{log_b a} only by a factor of log n.
Note: if f(n) is polynomially larger than g(n), then f(n)/g(n) = n^ε for some ε > 0. In the
above recurrence, n log n is asymptotically larger than n^{log_b a} = n but not polynomially larger;
i.e., n = O(n log n), whereas (n log n)/n = log n ≠ n^ε for any ε > 0.
Similarly, for T(n) = 4T(n/2) + n^2 log n:
n^{log_b a} = n^{log_2 4} = n^2
Comparing n^{log_b a} = n^2 and f(n) = n^2 log n: again neither case 1, 2 nor 3 of the Master
theorem applies. Case 3 requires f(n) to be polynomially larger, but here it is asymptotically larger
than n^{log_b a} only by a factor of log n.
If a recurrence relation falls into the gap between case 1 and case 2, or between case 2 and case 3,
the Master theorem cannot be applied; such recurrences can be solved using the recurrence tree method.
Technicalities:
– From the above examples, it is clear that in each of the three cases, we compare the function
f(n) with the function n^{log_b a}. Intuitively, the larger of the two functions determines the
solution to the recurrence. If, as in case 1, the function n^{log_b a} is the larger, then the
solution is T(n) = Θ(n^{log_b a}).
– If, as in case 3, the function f(n) is the larger, then the solution is T(n) = Θ(f(n)).
– If, as in case 2, the two functions are the same size, we multiply by a logarithmic factor, which
comes from the height of the tree, and the solution is T(n) = Θ(n^{log_b a} log n).
– Also, in the first case, not only must f(n) be smaller than n^{log_b a}, it must be polynomially
smaller. That is, f(n) must be asymptotically smaller than n^{log_b a} by a factor of n^ε for some
constant ε > 0.
– In the third case, not only must f(n) be larger than n^{log_b a}, it also must be polynomially larger
and in addition satisfy the regularity condition that af(n/b) ≤ cf(n) for some constant c < 1.
– Note that the three cases do not cover all the possibilities for f(n). There is a gap between cases
1 and 2 when f(n) is smaller than n^{log_b a} but not polynomially smaller.
– Similarly, there is a gap between cases 2 and 3 when f(n) is larger than n^{log_b a} but not
polynomially larger. If the function f(n) falls into one of these gaps, or if the regularity condition in
case 3 fails to hold, you cannot use the master method to solve the recurrence.
– Here are recurrence relations which appear to satisfy case 3 of the Master theorem but fail the
regularity condition.
1. T(n) = 2T(n/2) + n^2(1 + sin n)
Here f(n) = n^2(1 + sin n). Since 0 ≤ 1 + sin n ≤ 2, f(n) ≤ c · n^{log_2 2 + ε} with ε = 1, so case 3
is the candidate. Now we check the regularity condition:
2 · (n/2)^2 (1 + sin(n/2)) ≤ c · n^2 (1 + sin n).
Note that for every odd m, sin((2m+1)π/2) = −1, and for suitable such m, sin((2m+1)π/4) = −1/√2.
When n = (2m+1)π/2 for such m, the condition requires (1/2) · n^2 (1 − 1/√2) ≤ c · n^2 (1 − 1) = 0,
and such a c does not exist.
2. T(n) = 2T(n/2) + n^2 · 2^{−n}
Here f(n) = n^2 · 2^{−n}. Checking the regularity condition:
2 · (n/2)^2 · 2^{−n/2} ≤ c · n^2 · 2^{−n} requires (1/2) · 2^{−n/2} ≤ c · 2^{−n}, i.e.,
c ≥ (1/2) · 2^{n/2}, which is not possible as c < 1.
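The failure can also be seen numerically: the ratio a·f(n/b)/f(n) grows without bound, so no constant c < 1 works. A quick check:

    def f(n):
        return n**2 * 2.0**(-n)

    for n in [4, 8, 16, 32, 64]:
        # regularity would need 2*f(n/2) <= c*f(n) with a constant c < 1
        print(n, 2 * f(n // 2) / f(n))  # ratio = (1/2) * 2^(n/2), unbounded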
Lemma 2: Let a ≥ 1 and b > 1 be constants, and let f(n) be a non-negative function defined on exact
powers of b. Define T(n) on exact powers of b by the recurrence
T(n) = Θ(1) if n = 1, and T(n) = aT(n/b) + f(n) if n = b^i.
Then, T(n) = Θ(n^{log_b a}) + Σ_{j=0}^{log_b n − 1} a^j f(n/b^j)
At level i, there are a^i subproblems, each of complexity f(n/b^i).
[Fig. 4: Recursion tree]
T(n) = Θ(n^{log_b a}) + Σ_{j=0}^{log_b n − 1} a^j f(n/b^j)
Let g(n) = Σ_{j=0}^{log_b n − 1} a^j f(n/b^j); each case of the theorem follows by bounding g(n).
Case 1: f(n) = O(n^{log_b a − ε}). Substituting this bound into g(n) and summing the resulting
geometric series,
g(n) = O(n^{log_b a − ε} · [((b^ε)^{log_b n} − 1)/(b^ε − 1)]) ⇒ g(n) = O(n^{log_b a − ε} · [(n^ε − 1)/(b^ε − 1)]) = O(n^{log_b a}),
and therefore T(n) = Θ(n^{log_b a}).
Case 2: f(n) = Θ(n^{log_b a})
Recall, g(n) = Σ_{j=0}^{log_b n − 1} a^j f(n/b^j).
Since f(n/b^j) = Θ((n/b^j)^{log_b a}), g(n) = Θ(Σ_{j=0}^{log_b n − 1} a^j (n/b^j)^{log_b a})
⇒ g(n) = Θ(Σ_{j=0}^{log_b n − 1} a^j n^{log_b a}/(b^j)^{log_b a}) ⇒ g(n) = Θ(n^{log_b a} Σ_{j=0}^{log_b n − 1} a^j/(b^{log_b a})^j)
⇒ g(n) = Θ(n^{log_b a} Σ_{j=0}^{log_b n − 1} a^j/a^j) ⇒ g(n) = Θ(n^{log_b a} Σ_{j=0}^{log_b n − 1} 1) = Θ(n^{log_b a} log_b n)
Therefore, T(n) = Θ(n^{log_b a} log n).
Not all recurrence relations can be solved using the recurrence tree method, the Master theorem, or
the substitution method. We mention here a method from difference equations to solve homogeneous
recurrence relations, that is, recurrence relations which depend on r previous terms; particularly,
recurrence relations of the form T(n) = c_1 T(n−1) + c_2 T(n−2) + · · · + c_r T(n−r). In the next
section, we shall discuss a method for solving well-known recurrences like the 'Fibonacci series'
using a 'characteristic equation' based approach.
2. a_n − 4a_{n−1} + 4a_{n−2} = 2^n, i.e., T(n) − 4T(n−1) + 4T(n−2) = 2^n
The characteristic equation is x^2 − 4x + 4 = 0 and the roots are x = 2, 2.
The complementary function (C.F.) is c_1 · 2^n + c_2 · n · 2^n.
For the P.I., note that the guesses d · 2^n and d · n · 2^n will not work, as the roots and the base of
the exponent in the non-homogeneous term are the same. Therefore we guess d · n^2 · 2^n:
d · n^2 · 2^n − 4d(n−1)^2 · 2^{n−1} + 4d(n−2)^2 · 2^{n−2} = 2^n
Dividing by 2^n: d · n^2 − 2d(n^2 − 2n + 1) + d(n^2 − 4n + 4) = 1
Simplifying, we get d = 1/2.
Therefore T(n) = c_1 · 2^n + c_2 · n · 2^n + n^2 · 2^n / 2.
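The particular integral can be verified by direct substitution; a small check (with c_1 = c_2 = 0):

    def T(n):
        """Particular integral alone: T(n) = n^2 * 2^n / 2."""
        return n**2 * 2**n // 2

    for n in range(2, 10):
        print(n, T(n) - 4*T(n-1) + 4*T(n-2), 2**n)  # the two columns agree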