CSE-304
Design & Analysis of Algorithms
Asymptotic Notation II
and Recurrence Relations
Asymptotic notations
• O-notation
• Intuitively: O(g(n)) = the set of functions with the same or a smaller order of growth than g(n)
Asymptotic notations (cont.)
• Ω-notation
• Intuitively: Ω(g(n)) = the set of functions with the same or a larger order of growth than g(n)
Asymptotic notations (cont.)
• Θ-notation
• Intuitively: Θ(g(n)) = the set of functions with the same order of growth as g(n)
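For reference, the formal definitions behind these three intuitions (the standard CLRS-style definitions, stated here for completeness):

\begin{align*}
O(g(n))      &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0 \,\} \\
\Omega(g(n)) &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } 0 \le c\,g(n) \le f(n) \text{ for all } n \ge n_0 \,\} \\
\Theta(g(n)) &= \{\, f(n) : \exists\, c_1, c_2, n_0 > 0 \text{ s.t. } 0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0 \,\}
\end{align*}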
Exercise on O-notation
• Show that 3n^2 + 2n + 5 = O(n^2)
10n^2 = 3n^2 + 2n^2 + 5n^2
      ≥ 3n^2 + 2n + 5 for n ≥ 1
c = 10, n0 = 1
Exercise on O-notation
• f1(n) = 10n + 25n^2          • O(n^2)
• f2(n) = 20n log n + 5n       • O(n log n)
• f3(n) = 12n log n + 0.05n^2  • O(n^2)
• f4(n) = n^(1/2) + 3n log n   • O(n log n)
Big O Fact
• A polynomial of degree k is O(n^k)
• Proof:
– Suppose f(n) = b_k n^k + b_(k-1) n^(k-1) + … + b_1 n + b_0
• Let a_i = |b_i|
– f(n) ≤ a_k n^k + a_(k-1) n^(k-1) + … + a_1 n + a_0
– ≤ (a_k + a_(k-1) + … + a_0) n^k for n ≥ 1, since n^i ≤ n^k for i ≤ k; so c = a_k + … + a_0 and n0 = 1 witness f(n) = O(n^k)
Comparisons of Functions
• Transitivity:
– f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
– Same for O and Ω
• Reflexivity:
– f(n) = Θ(f(n))
– Same for O and Ω
• Symmetry:
– f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
– f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
Simplifying Assumptions
1. If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
2. If f(n) = O(kg(n)) for any k > 0, then f(n) = O(g(n))
3. If f1(n) = O(g1(n)) and f2(n) = O(g2(n)),
then f1(n) + f2(n) = O(max (g1(n), g2(n)))
4. If f1(n) = O(g1(n)) and f2(n) = O(g2(n)),
then f1(n) * f2(n) = O(g1(n) * g2(n))
Manipulating Asymptotic Notation
• c·O(f(n)) = O(f(n))
• O(O(f(n))) = O(f(n))
• O(f(n)) O(g(n)) = O(f(n) g(n))
• O(f(n) g(n)) = f(n) O(g(n))
• O(f(n) + g(n)) = O(max(f(n), g(n)))
More Examples
(a) 0.5n^2 - 5n + 2 = Ω(n^2).
Let c = 0.25 and n0 = 25.
0.5n^2 - 5n + 2 ≥ 0.25n^2 for all n ≥ 25
(b) 0.5n^2 - 5n + 2 = O(n^2).
Let c = 0.5 and n0 = 1.
0.5n^2 ≥ 0.5n^2 - 5n + 2 for all n ≥ 1
(c) 0.5n^2 - 5n + 2 = Θ(n^2)
from (a) and (b) above.
Use n0 = 25, c1 = 0.25, c2 = 0.5 in the definition.
GENERAL RULES FOR ANALYSIS (1/5)
1. Consecutive statements
– The largest cost is the one counted,
e.g. a fragment with a single for-loop followed by a double for-loop is O(n^2).
Block #1 takes t1 and Block #2 takes t2: t1 + t2 = O(max(t1, t2)), as in the sketch below.
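For instance, a short C sketch of this rule (our own example, not from the slides): an O(n) loop followed by an O(n^2) double loop, so the whole fragment is O(n^2).

/* Sketch: consecutive statements -- total cost is dominated by the
   largest block.  O(n) loop + O(n^2) double loop ==> O(n^2). */
#include <stdio.h>

int main(void) {
    int n = 100;
    long sum = 0;

    for (int i = 0; i < n; i++)          /* Block #1: O(n)   */
        sum += i;

    for (int i = 0; i < n; i++)          /* Block #2: O(n^2) */
        for (int j = 0; j < n; j++)
            sum += (long)i * j;

    printf("%ld\n", sum);                /* whole fragment: O(n^2) */
    return 0;
}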
GENERAL RULES FOR ANALYSIS (2/5)
2. If/Else
if (cond) S1 else S2
– Running time is at most the time of the condition test plus the larger branch: if Block #1 takes t1 and Block #2 takes t2, the cost is at most test + max(t1, t2).
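A small illustrative C fragment (ours, with a hypothetical function name): the else branch dominates, so the worst case is O(n).

/* Sketch: if/else costs at most test + max(t1, t2).
   Here the else branch dominates, so the worst case is O(n). */
int branchy(const int *a, int n, int flag) {
    if (flag)                    /* branch 1: O(1) */
        return a[0];
    int sum = 0;                 /* branch 2: O(n) */
    for (int i = 0; i < n; i++)
        sum += a[i];
    return sum;
}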
GENERAL RULES FOR ANALYSIS (3/5)
3. For Loops
• The running time of a for-loop is at most the running time of the statements inside the for-loop times the number of iterations
for (i = sum = 0; i < n; i++)
    sum += a[i];
• The loop iterates n times and executes two assignments per iteration (sum += a[i] and i++) ==> asymptotic complexity O(n)
GENERAL RULES FOR ANALYSIS (4/5)
4. Nested For-Loops
• Analyze inside out. Total running time is the running time of the innermost statement multiplied by the product of the sizes of all the for-loops, e.g.:
for (i = 0; i < n; i++) {
    for (j = 0, sum = a[0]; j <= i; j++)
        sum += a[j];
    printf("sum for subarray 0 through %d is %d\n", i, sum);
}
• The inner loop runs i + 1 times for each i, so the total work is 1 + 2 + … + n = n(n+1)/2 = O(n^2)
GENERAL RULES FOR ANALYSIS (5/5)
Recurrence Relations (1/2)
• A recurrence relation is an equation which is
defined in terms of itself.
• Why are recurrences good things?
– Many natural functions are easily expressed as
recurrences:
• a_n = a_(n-1) + 1, a_1 = 1  -->  a_n = n (polynomial)
• a_n = 2a_(n-1), a_1 = 1  -->  a_n = 2^(n-1) (exponential)
• a_n = n·a_(n-1), a_1 = 1  -->  a_n = n! (weird function)
• It is often easy to find a recurrence as the solution of a counting problem
Recurrence Relations (2/2)
• In each case we have general and boundary conditions, with the general condition breaking the problem into smaller and smaller pieces.
• The initial or boundary condition terminates the recursion (base case).
Recurrence Equations
• A recurrence equation defines a function, say T(n). The function is defined recursively, that is, T(·) appears in its own definition (recall recursive function calls). The recurrence equation must have a base case.
For example:
T(n) = T(n-1) + T(n-2), if n > 1
T(n) = 1, if n = 1 or n = 0   <-- base case
For convenience, we sometimes write the recurrence equation as:
T(n) = T(n-1) + T(n-2)
T(0) = T(1) = 1.
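A minimal C sketch (ours, not from the slides) of a recursive function whose call structure mirrors this recurrence:

/* T(n) = T(n-1) + T(n-2), T(0) = T(1) = 1 -- the same shape as a
   naive recursive Fibonacci-style function. */
long T(int n) {
    if (n == 0 || n == 1)        /* base case terminates the recursion */
        return 1;
    return T(n - 1) + T(n - 2);  /* general condition: smaller pieces */
}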
Recurrences
• A recurrence is an equation that describes a function in terms of its value on smaller inputs, such as T(n) = 2T(n/2) + n.
More Recurrence Equations
T(n) = 2T(n/2) + 1,      T(1) = 1   (base case / initial condition)
T(n) = T(n-1) + n,       T(1) = 1   Selection Sort
T(n) = 2T(n/2) + n,      T(1) = 1   Merge Sort, Quick Sort
T(n) = 2T(n/2) + log n,  T(1) = 1   Heap Construction
T(n) = T(n/2) + 1,       T(1) = 0   Binary Search
Methods for Solving Recurrences
• Iteration method
• Substitution method
• Recursion tree method
• Master method
Simplifications
• There are two simplifications we apply that won't affect the asymptotic analysis:
– ignore floors and ceilings (justification in the text)
– assume base cases are constant, i.e., T(n) = Θ(1) for small enough n
Solving Recurrences: Iteration
(convert to summation)
• Expand the recurrence
• Work some algebra to express as a summation
• Evaluate the summation
The Iteration Method
T(n) = c + T(n/2)
     = c + c + T(n/4)          since T(n/2) = c + T(n/4)
     = c + c + c + T(n/8)      since T(n/4) = c + T(n/8)
Assume n = 2^k:
T(n) = c + c + … + c + T(1)    (k terms of c)
     = c·k + T(1)
     = c·lg n + T(1)
     = Θ(lg n)
• s(n) = c + s(n-1)
       = c + c + s(n-2) = 2c + s(n-2)
       = 2c + c + s(n-3) = 3c + s(n-3)
       …
       = kc + s(n-k)
• So far, for n ≥ k, we have
– s(n) = ck + s(n-k)
• What if k = n?
– s(n) = cn + s(0) = cn
• Thus in general
– s(n) = cn
Solving Recurrences: Iteration (convert to summation)
Example: T(n) = 4T(n/2) + n
T(n) = 4T(n/2) + n                        /** expand **/
     = 4(4T(n/4) + n/2) + n               /** simplify **/
     = 16T(n/4) + 2n + n                  /** expand **/
     = 16(4T(n/8) + n/4) + 2n + n
     = 4^3 T(n/2^3) + 4n + 2n + n
     …
     = 4^(lg n) T(1) + … + 4n + 2n + n    /** #levels = lg n **/
     = c·4^(lg n) + n(1 + 2 + 4 + … + 2^(lg n - 1))   /** convert to summation, c = T(1) **/
     = c·n^2 + n(2^(lg n) - 1)            /** 4^(lg n) = n^(lg 4) = n^2 **/
     = c·n^2 + n(n - 1)                   /** 2^(lg n) = n **/
     = c·n^2 + n^2 - n
     = Θ(n^2)
Fact used above: a^(log b) = b^(log a).
Proof: let K = a^(log b) and K1 = b^(log a). Then log K = log b · log a and log K1 = log a · log b, so log K = log K1, and hence K = K1.
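As a sanity check (our own sketch, not part of the slides), one can evaluate the recurrence directly for powers of two and watch T(n)/n^2 settle toward a constant:

/* Sketch: evaluate T(n) = 4T(n/2) + n, T(1) = 1, for n = 2^k and
   print T(n)/n^2; the ratio tending to a constant is consistent
   with T(n) = Theta(n^2). */
#include <stdio.h>

long long T(long long n) {
    if (n == 1) return 1;          /* base case */
    return 4 * T(n / 2) + n;       /* recurrence */
}

int main(void) {
    for (long long n = 2; n <= (1LL << 20); n *= 2)
        printf("n = %8lld   T(n)/n^2 = %.4f\n",
               n, (double)T(n) / ((double)n * n));
    return 0;
}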
The substitution method
1. Guess a solution
2. Use induction to prove that the
solution works
Substitution method
• Guess a solution
– T(n) = O(g(n))
– Induction goal: apply the definition of the asymptotic notation
• T(n) ≤ d g(n), for some d > 0 and n ≥ n0
– Induction hypothesis: T(k) ≤ d g(k) for all k < n
• Prove the induction goal
– Use the induction hypothesis to find some values of the
constants d and n0 for which the induction goal holds
Example: Binary Search
T(n) = c + T(n/2)
• Guess: T(n) = O(lg n)
– Induction goal: T(n) ≤ d·lg n, for some d and n ≥ n0
– Induction hypothesis: T(n/2) ≤ d·lg(n/2)
• Proof of induction goal:
T(n) = T(n/2) + c ≤ d·lg(n/2) + c
     = d·lg n - d + c ≤ d·lg n
if -d + c ≤ 0, i.e., d ≥ c
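For concreteness, a minimal recursive binary search in C whose running time satisfies T(n) = T(n/2) + c (our sketch, assuming a sorted array; the name bsearch_rec is ours):

/* Sketch: each call does constant work and recurses on half of the
   (sorted) range, so T(n) = T(n/2) + c = O(lg n).
   Returns an index of key in a[lo..hi], or -1 if absent. */
int bsearch_rec(const int *a, int lo, int hi, int key) {
    if (lo > hi) return -1;            /* base case: empty range */
    int mid = lo + (hi - lo) / 2;      /* constant work: c */
    if (a[mid] == key) return mid;
    if (a[mid] < key)
        return bsearch_rec(a, mid + 1, hi, key);  /* right half */
    return bsearch_rec(a, lo, mid - 1, key);      /* left half  */
}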
Example 2
T(n) = T(n-1) + n
• Guess: T(n) = O(n^2)
– Induction goal: T(n) ≤ c·n^2, for some c and n ≥ n0
– Induction hypothesis: T(k) ≤ c·k^2 for all k < n (used with k = n-1)
• Proof of induction goal:
T(n) = T(n-1) + n ≤ c(n-1)^2 + n
     = c·n^2 - (2cn - c - n) ≤ c·n^2
if 2cn - c - n ≥ 0 ⇔ c ≥ n/(2n-1) ⇔ c ≥ 1/(2 - 1/n)
– For n ≥ 1 ⇒ 2 - 1/n ≥ 1 ⇒ any c ≥ 1 will work
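The table earlier tagged T(n) = T(n-1) + n as Selection Sort; here is a minimal C sketch of that algorithm (our illustration): one scan of about n elements to find the minimum, then the same problem on the remaining n-1.

/* Sketch: selection sort.  Each outer step scans the unsorted suffix
   to find the minimum, then sorts the remaining n-1 elements:
   T(n) = T(n-1) + n = O(n^2). */
void selection_sort(int *a, int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;                       /* scan costs n - i steps */
        for (int j = i + 1; j < n; j++)
            if (a[j] < a[min]) min = j;
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
    }
}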
Example 3
T(n) = 2T(n/2) + n
• Guess: T(n) = O(n lg n)
– Induction goal: T(n) ≤ c·n·lg n, for some c and n ≥ n0
– Induction hypothesis: T(n/2) ≤ c·(n/2)·lg(n/2)
• Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c(n/2)·lg(n/2) + n
     = c·n·lg n - cn + n ≤ c·n·lg n
if -cn + n ≤ 0 ⇒ c ≥ 1
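This is the Merge Sort recurrence from the table above; a compact C sketch (our own, using a scratch buffer) is:

/* Sketch: merge sort.  Two half-size recursive calls plus a linear
   merge give T(n) = 2T(n/2) + n = O(n lg n). */
#include <stdlib.h>
#include <string.h>

static void merge(int *a, int *tmp, int lo, int mid, int hi) {
    int i = lo, j = mid, k = lo;
    while (i < mid && j < hi)              /* linear merge: the +n term */
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo) * sizeof *a);
}

static void msort(int *a, int *tmp, int lo, int hi) {
    if (hi - lo < 2) return;               /* base case: T(1) = 1 */
    int mid = lo + (hi - lo) / 2;
    msort(a, tmp, lo, mid);                /* T(n/2) */
    msort(a, tmp, mid, hi);                /* T(n/2) */
    merge(a, tmp, lo, mid, hi);            /* + n    */
}

void merge_sort(int *a, int n) {
    int *tmp = malloc((size_t)n * sizeof *tmp);
    if (!tmp) return;                      /* allocation failed; give up */
    msort(a, tmp, 0, n);
    free(tmp);
}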
Changing variables
T(n) = 2T(√n) + lg n
– Rename: m = lg n ⇒ n = 2^m
T(2^m) = 2T(2^(m/2)) + m
– Rename: S(m) = T(2^m)
S(m) = 2S(m/2) + m ⇒ S(m) = O(m lg m) (demonstrated before)
T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n · lg lg n)
Idea: transform the recurrence into one that you have seen before
Evaluate a recursive equation using a Recursion Tree
• Evaluate: T(n) = T(n/2) + T(n/2) + n
– Work copy: T(k) = T(k/2) + T(k/2) + k
– For k = n/2: T(n/2) = T(n/4) + T(n/4) + (n/2)
• Each tree node is labeled [size | cost]
Recursion-tree method
• A recursion tree models the costs (time) of
a recursive execution of an algorithm.
• The recursion tree method is good for
generating guesses for the substitution
method.
• The recursion-tree method can be
unreliable.
• The recursion-tree method promotes
intuition, however.
Recursion Tree e.g.
• To evaluate the total cost of the recursion tree:
– sum all the non-recursive costs of all nodes
– = Sum(rowSum(cost of all nodes at the same depth))
• Determine the maximum depth of the recursion tree:
– For our example, at tree depth d the size parameter is n/2^d
– the size parameter shrinks to the base case, i.e. size 1
– so n/2^d = 1 at depth d = lg n
– The rowSum for each row is n
• Therefore, the total cost T(n) = n lg n
Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n^2:
The root costs n^2; its children cost (n/4)^2 and (n/2)^2; their children cost (n/16)^2, (n/8)^2, (n/8)^2, (n/4)^2; and so on down to Θ(1) leaves.
Row sums:
level 0: n^2
level 1: (n/4)^2 + (n/2)^2 = (5/16)·n^2
level 2: (n/16)^2 + (n/8)^2 + (n/8)^2 + (n/4)^2 = (5/16)^2·n^2
…
Total = n^2·(1 + 5/16 + (5/16)^2 + …), a geometric series with ratio 5/16 < 1,
so Total ≤ n^2 · 1/(1 - 5/16) = Θ(n^2)
The Master Method
• Based on the Master theorem.
• “Cookbook” approach for solving recurrences of
the form
T(n) = aT(n/b) + f(n)
• a ≥ 1, b > 1 are constants.
• f(n) is asymptotically positive.
• n/b may not be an integer, but we ignore floors and ceilings.
• Requires memorization of three cases.
The Master Theorem
Theorem 4.1
Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and
let T(n) be defined on the nonnegative integers by the recurrence
T(n) = aT(n/b) + f(n), where we can replace n/b by ⌊n/b⌋ or ⌈n/b⌉.
T(n) can be bounded asymptotically in three cases:
1. If f(n) = O(n^((log_b a) - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · lg n).
3. If f(n) = Ω(n^((log_b a) + ε)) for some constant ε > 0,
and if, for some constant c < 1 and all sufficiently large n,
we have a·f(n/b) ≤ c·f(n), then T(n) = Θ(f(n)).
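As a small illustration (our own sketch, for the common special case f(n) = n^d): comparing d against log_b a picks the case, and for f(n) = n^d the case-3 regularity condition holds automatically, since a·(n/b)^d = (a/b^d)·n^d with a/b^d < 1 whenever d > log_b a.

/* Sketch: master theorem for T(n) = a*T(n/b) + n^d.
   d < log_b(a) -> case 1;  d == log_b(a) -> case 2;
   d > log_b(a) -> case 3 (regularity automatic for f(n) = n^d). */
#include <math.h>
#include <stdio.h>

void master(double a, double b, double d) {
    double p = log(a) / log(b);          /* p = log_b(a) */
    const double EPS = 1e-9;             /* tolerance for d == p */
    if (d < p - EPS)
        printf("case 1: T(n) = Theta(n^%.3f)\n", p);
    else if (d > p + EPS)
        printf("case 3: T(n) = Theta(n^%.3f)\n", d);
    else
        printf("case 2: T(n) = Theta(n^%.3f lg n)\n", p);
}

int main(void) {
    master(16, 4, 1);    /* T(n) = 16T(n/4) + n   -> Theta(n^2)    */
    master(2, 2, 1);     /* T(n) = 2T(n/2) + n    -> Theta(n lg n) */
    master(3, 4, 2);     /* T(n) = 3T(n/4) + n^2  -> Theta(n^2)    */
    return 0;
}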
Master Method - Examples
• T(n) = 16T(n/4) + n
– a = 16, b = 4, n^(log_b a) = n^(log_4 16) = n^2.
– f(n) = n = O(n^((log_b a) - ε)) = O(n^(2-ε)), where ε = 1 ⇒ Case 1.
– Hence, T(n) = Θ(n^(log_b a)) = Θ(n^2).
• T(n) = T(3n/7) + 1
– a = 1, b = 7/3, and n^(log_b a) = n^(log_(7/3) 1) = n^0 = 1
– f(n) = 1 = Θ(n^(log_b a)) ⇒ Case 2.
– Therefore, T(n) = Θ(n^(log_b a) · lg n) = Θ(lg n)
Master Method - Examples
• T(n) = 3T(n/4) + n lg n
– a = 3, b = 4, thus n^(log_b a) = n^(log_4 3) = O(n^0.793)
– f(n) = n lg n = Ω(n^((log_4 3) + ε)), where ε ≈ 0.2, and the regularity condition holds: a·f(n/b) = 3(n/4)lg(n/4) ≤ (3/4)·n lg n = c·f(n) with c = 3/4 < 1 ⇒ Case 3.
– Therefore, T(n) = Θ(f(n)) = Θ(n lg n).
• T(n) = 2T(n/2) + n lg n
– a = 2, b = 2, f(n) = n lg n, and n^(log_b a) = n^(log_2 2) = n
– n lg n ≠ O(n^(1-ε)), so Case 1 does not apply
– n lg n ≠ Θ(n), so Case 2 does not apply
– n lg n ≠ Ω(n^(1+ε)) for any ε > 0, so Case 3 does not apply
– f(n) is asymptotically larger than n^(log_b a), but not polynomially larger: the ratio lg n is asymptotically less than n^ε for any positive ε. Thus the Master Theorem doesn't apply here.
Summary
Summary of the master method for T(n) = aT(n/b) + f(n):
f(n) = O(n^((log_b a) - ε))  ⇒  T(n) = Θ(n^(log_b a))
f(n) = Θ(n^(log_b a))  ⇒  T(n) = Θ(n^(log_b a) · lg n)
f(n) = Ω(n^((log_b a) + ε)), with the condition a·f(n/b) ≤ c·f(n) for some c < 1  ⇒  T(n) = Θ(f(n))