Asymptotic Notation II
and Recurrence Relations
Asymptotic notations
• O-notation
• Ω-notation
• Θ-notation
Exercise on O-notation
• Show that 3n^2 + 2n + 5 = O(n^2)
– Take c = 10, n0 = 1: 3n^2 + 2n + 5 ≤ 3n^2 + 2n^2 + 5n^2 = 10n^2 for all n ≥ 1
Exercise on O-notation
• f1(n) = 10n + 25n^2 → O(n^2)
• f2(n) = 20n log n + 5n → O(n log n)
• f3(n) = 12n log n + 0.05n^2 → O(n^2)
• f4(n) = n^(1/2) + 3n log n → O(n log n)
Big O Fact
• A polynomial of degree k is O(n^k)
• Proof:
– Suppose f(n) = b_k n^k + b_(k-1) n^(k-1) + … + b_1 n + b_0
• Let a_i = |b_i|
– f(n) ≤ a_k n^k + a_(k-1) n^(k-1) + … + a_1 n + a_0 ≤ (a_k + a_(k-1) + … + a_0) n^k = c n^k for n ≥ 1
Comparisons of Functions
• Transitivity:
– f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
– Same for O and Ω
• Reflexivity:
– f(n) = Θ(f(n))
– Same for O and Ω
• Symmetry:
– f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
– f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
Simplifying Assumptions
1. If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
2. If f(n) = O(k·g(n)) for any constant k > 0, then f(n) = O(g(n))
3. If f1(n) = O(g1(n)) and f2(n) = O(g2(n)),
then f1(n) + f2(n) = O(max(g1(n), g2(n)))
4. If f1(n) = O(g1(n)) and f2(n) = O(g2(n)),
then f1(n) * f2(n) = O(g1(n) * g2(n))
Manipulating Asymptotic Notation
• c·O(f(n)) = O(f(n))
• O(O(f(n))) = O(f(n))
• O(f(n))·O(g(n)) = O(f(n)·g(n))
• O(f(n)·g(n)) = f(n)·O(g(n))
• O(f(n) + g(n)) = O(max(f(n), g(n)))
More Examples
(a) 0.5n^2 - 5n + 2 = Ω(n^2).
Let c = 0.25 and n0 = 25.
0.5n^2 - 5n + 2 ≥ 0.25n^2 for all n ≥ 25
GENERAL RULES FOR ANALYSIS (1/5)
1. Consecutive statements
– The statement with the maximum cost is the one counted
e.g. a fragment with a single for-loop followed by a double for-loop is O(n^2).
Block #1 takes t1; Block #2 takes t2: O(t1 + t2) = O(max(t1, t2))
GENERAL RULES FOR ANALYSIS (2/5)
2. If/Else
if cond then S1
else S2
Block #1 (S1) takes t1; Block #2 (S2) takes t2
Running time ≤ time of the test + max(t1, t2)
GENERAL RULES FOR ANALYSIS (3/5)
3. For Loops
• Running time of a for-loop is at most the running time of the statements inside the for-loop times the number of iterations
for (i = sum = 0; i < n; i++)
    sum += a[i];
• The for loop iterates n times and executes 2 assignment statements each iteration ==> asymptotic complexity of O(n)
GENERAL RULES FOR ANALYSIS (4/5)
4. Nested For-Loops
• Analyze inside-out. Total running time is the running time of the statement multiplied by the product of the sizes of all the for-loops
e.g. for (i = 0; i < n; i++) {
         for (j = 0, sum = a[0]; j <= i; j++)
             sum += a[j];
         printf("sum for subarray - through %d is %d\n", i, sum);
     }
• The inner loop runs i+1 times for i = 0, …, n-1, so the inner statement executes 1 + 2 + … + n = n(n+1)/2 times ==> O(n^2)
GENERAL RULES FOR ANALYSIS (5/5)
[slide content was a figure and is not recoverable from the source]
Recurrence Relations (1/2)
• A recurrence relation is an equation that is defined in terms of itself.
• Why are recurrences good things?
– Many natural functions are easily expressed as recurrences:
• a_n = a_(n-1) + 1, a_1 = 1 ==> a_n = n (polynomial)
Recurrence Equations
• Recurrence: an equation that describes a function in terms of its value on smaller inputs
[the example recurrence equations on these slides appeared as figures and are not recoverable from the source]
More Recurrence Equations
• T(n) = 2T(n/2) + 1, T(1) = 1    (T(1) = 1 is the base case / initial condition)
• T(n) = T(n-1) + n, T(1) = 1    Selection Sort
• T(n) = 2T(n/2) + n, T(1) = 1    Merge Sort, Quick Sort
Methods for Solving Recurrences
• Iteration method
• Substitution method
• Master method
Simplifications
• There are two simplifications we apply that won't affect asymptotic analysis
– ignore floors and ceilings (justification in text)
– assume base cases are constant, i.e., T(n) = Θ(1) for n small enough
Solving Recurrences: Iteration
(convert to summation)
• Expand the recurrence
• Work some algebra to express as a summation
• Evaluate the summation
The Iteration Method
T(n) = c + T(n/2)
     = c + c + T(n/4)        /** T(n/2) = c + T(n/4) **/
     = c + c + c + T(n/8)    /** T(n/4) = c + T(n/8) **/
Assume n = 2^k
T(n) = c + c + … + c + T(1)  (k times)
     = c lg n + T(1)
     = Θ(lg n)
• s(n) = c + s(n-1)
       = c + c + s(n-2) = 2c + s(n-2)
       = 2c + c + s(n-3) = 3c + s(n-3)
       …
       = kc + s(n-k)
• So far, for n ≥ k, we have
– s(n) = ck + s(n-k)
• What if k = n?
– s(n) = cn + s(0) = cn
• Thus in general
– s(n) = cn
Solving Recurrences: Iteration (convert to summation)
Example: T(n) = 4T(n/2) + n
T(n) = 4T(n/2) + n                        /** expand **/
     = 4(4T(n/4) + n/2) + n               /** T(n/2) = 4T(n/4) + n/2 **/
     = 16T(n/4) + 2n + n                  /** simplify **/
     = 4^3 T(n/2^3) + 4n + 2n + n         /** expand again **/
     = …
     = 4^(lg n) T(1) + … + 4n + 2n + n    /** #levels = lg n **/
     = n^2 T(1) + n(2^(lg n) - 1)         /** 4^(lg n) = n^(lg 4) = n^2 **/
     = cn^2 + n(n - 1)                    /** 2^(lg n) = n^(lg 2) = n; let c = T(1) **/
     = cn^2 + n^2 - n
     = Θ(n^2)
Proof that a^(log b) = b^(log a):
Let K = a^(log b); then log K = log b · log a.
Let K1 = b^(log a); then log K1 = log a · log b.
So log K = log K1, and hence K = K1, i.e., a^(log b) = b^(log a).
The Substitution Method
1. Guess a solution
2. Use induction to prove that the guess works
Substitution method
• Guess a solution
– T(n) = O(g(n))
– Induction goal: apply the definition of the asymptotic notation
Example: Binary Search
T(n) = c + T(n/2)
• Guess: T(n) = O(lg n)
– Induction goal: T(n) ≤ d lg n, for some d and n ≥ n0
– Induction hypothesis: T(n/2) ≤ d lg(n/2)
– Inductive step: T(n) = c + T(n/2) ≤ c + d lg(n/2) = c + d lg n - d ≤ d lg n whenever d ≥ c
Example 2
T(n) = T(n-1) + n
• Guess: T(n) = O(n^2)
– Induction goal: T(n) ≤ cn^2, for some c and n ≥ n0
– Induction hypothesis: T(k) ≤ ck^2 for all k < n
– Inductive step: T(n) = T(n-1) + n ≤ c(n-1)^2 + n = cn^2 - (2c - 1)n + c ≤ cn^2 for c ≥ 1 and n ≥ 1
Example 3
T(n) = 2T(n/2) + n
• Guess: T(n) = O(n lg n)
– Induction goal: T(n) ≤ cn lg n, for some c and n ≥ n0
– Induction hypothesis: T(n/2) ≤ c(n/2) lg(n/2)
– Inductive step: T(n) ≤ 2c(n/2) lg(n/2) + n = cn lg n - cn + n ≤ cn lg n whenever c ≥ 1
Changing Variables
T(n) = 2T(√n) + lg n
– Rename: m = lg n ⇒ n = 2^m
T(2^m) = 2T(2^(m/2)) + m
– Rename: S(m) = T(2^m)
S(m) = 2S(m/2) + m ⇒ S(m) = O(m lg m) (demonstrated before)
T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n lg lg n)
Idea: transform the recurrence into one that you have seen before
Evaluate Recursive Equation Using a Recursion Tree
• Evaluate: T(n) = T(n/2) + T(n/2) + n
– Work copy: T(k) = T(k/2) + T(k/2) + k
– For k = n/2: T(n/2) = T(n/4) + T(n/4) + (n/2)
• Each tree node records [size | cost]
Recursion-tree method
• A recursion tree models the costs (time) of
a recursive execution of an algorithm.
• The recursion tree method is good for
generating guesses for the substitution
method.
• The recursion-tree method can be
unreliable.
• The recursion-tree method promotes
intuition, however.
Recursion Tree e.g.
• To evaluate the total cost of the recursion tree
– sum all the non-recursive costs of all nodes
– = Sum(rowSum(cost of all nodes at the same depth))
• Determine the maximum depth of the recursion tree:
– For our example, at tree depth d the size parameter is n/2^d
– the size parameter reaches the base case when n/2^d = 1, i.e. d = lg n
– The rowSum for each row is n
• Therefore, the total cost, T(n) = n lg n
Example of Recursion Tree
[recursion-tree figure, built up over several slides: the leaves cost Θ(1), the per-level costs form a geometric series, and the total is Θ(n^2)]
The Master Method
• Based on the Master theorem.
• “Cookbook” approach for solving recurrences of
the form
T(n) = aT(n/b) + f(n)
• a ≥ 1, b > 1 are constants.
• f(n) is asymptotically positive.
• n/b may not be an integer, but we ignore floors and ceilings.
• Requires memorization of three cases.
The Master Theorem
Theorem 4.1
Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and
let T(n) be defined on the nonnegative integers by the recurrence
T(n) = aT(n/b) + f(n), where we can replace n/b by ⎣n/b⎦ or ⎡n/b⎤.
T(n) can be bounded asymptotically in three cases:
1. If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0,
and if, for some constant c < 1 and all sufficiently large n,
we have a·f(n/b) ≤ c·f(n), then T(n) = Θ(f(n)).
Master Method – Examples
• T(n) = 16T(n/4) + n
– a = 16, b = 4, n^(log_b a) = n^(log_4 16) = n^2.
– f(n) = n = O(n^(log_b a - ε)) = O(n^(2-ε)), where ε = 1 ⇒ Case 1.
– Hence, T(n) = Θ(n^(log_b a)) = Θ(n^2).
• T(n) = T(3n/7) + 1
– a = 1, b = 7/3, and n^(log_b a) = n^(log_(7/3) 1) = n^0 = 1
– f(n) = 1 = Θ(n^(log_b a)) ⇒ Case 2.
– Therefore, T(n) = Θ(n^(log_b a) lg n) = Θ(lg n)
Master Method – Examples
• T(n) = 3T(n/4) + n lg n
– a = 3, b = 4, thus n^(log_b a) = n^(log_4 3) = O(n^0.793)
– f(n) = n lg n = Ω(n^(log_4 3 + ε)) where ε ≈ 0.2 ⇒ Case 3.
– Regularity: a·f(n/b) = 3(n/4) lg(n/4) ≤ (3/4) n lg n = c·f(n) for c = 3/4 < 1.
– Therefore, T(n) = Θ(f(n)) = Θ(n lg n).
• T(n) = 2T(n/2) + n lg n — here n^(log_b a) = n and the master method does not apply:
– n lg n ≠ O(n^(1-ε)), so T(n) is not Θ(n) by Case 1
– n lg n ≠ Θ(n), so T(n) is not Θ(n lg n) by Case 2
– n lg n ≠ Ω(n^(1+ε)), so T(n) is not Θ(f(n)) by Case 3