Chapter 2: Fundamentals of the Analysis of Algorithm Efficiency
Approaches:
• theoretical analysis
• empirical analysis
Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “Introduction to the Design & Analysis of Algorithms,” 2nd ed., Ch. 2 2-2
Theoretical analysis of time efficiency
Time efficiency is analyzed by determining the number of
repetitions of the basic operation as a function of input size:
T(n) ≈ c_op · C(n)
where T(n) is the running time, c_op is the execution time (cost)
of one basic operation, and C(n) is the number of times the basic
operation is executed.
Example: for a typical graph problem, the input size is the number of
vertices and/or edges, and the basic operation is visiting a vertex or
traversing an edge.
Empirical analysis of time efficiency
• Select a specific (typical) sample of inputs
Best-case, average-case, worst-case
Example: Sequential search
• Worst case: n key comparisons
• Best case: 1 comparison
• Average case: (n+1)/2 comparisons, assuming the search key K is in the list A
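A minimal Python sketch of sequential search instrumented to count key comparisons (function and variable names are mine, not from the slides):

```python
def sequential_search(A, K):
    """Return (index of K in A, or -1, and the number of key comparisons made)."""
    comparisons = 0
    for i, item in enumerate(A):
        comparisons += 1          # one key comparison per loop iteration
        if item == K:
            return i, comparisons
    return -1, comparisons
```

Searching for a missing key (or the last element) costs n comparisons, the worst case; finding the first element costs 1, the best case.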
Types of formulas for basic operation’s count
Exact formula
e.g., C(n) = n(n-1)/2
Order of growth
Most important: Order of growth within a constant multiple
as n→∞
Example:
• How much faster will the algorithm run on a computer that is
twice as fast?
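The answer follows directly from T(n) ≈ c_op · C(n): a machine twice as fast halves c_op, so the running time halves for every n, regardless of the order of growth. A quick numerical check in Python (the quadratic count C(n) = n(n-1)/2 is only an illustrative assumption):

```python
def running_time(n, c_op):
    # T(n) ≈ c_op * C(n), here with C(n) = n(n-1)/2 as an example
    return c_op * n * (n - 1) / 2

# Halving c_op (a computer twice as fast) halves T(n) for any n:
ratio = running_time(10**6, 1e-9) / running_time(10**6, 0.5e-9)
```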
Values of some important functions as n → ∞
Asymptotic order of growth
A way of comparing functions that ignores constant factors and
small input sizes (because?)
Big-oh
O(g(n)): the class of functions that grow no faster than g(n)
Big-omega
Ω(g(n)): the class of functions that grow at least as fast as g(n)
Big-theta
Θ(g(n)): the class of functions that grow at the same rate as g(n)
Establishing order of growth using the definition
Examples:
• 10n is in O(n^2)
• 5n+20 is in O(n)
Ω-notation
Formal definition
• A function t(n) is said to be in Ω(g(n)), denoted t(n) ∈
Ω(g(n)), if t(n) is bounded below by some positive constant
multiple of g(n) for all large n, i.e., if there exist some positive
constant c and some nonnegative integer n0 such that
t(n) ≥ c·g(n) for all n ≥ n0
Θ-notation
Formal definition
• A function t(n) is said to be in Θ(g(n)), denoted t(n) ∈ Θ(g(n)),
if t(n) is bounded both above and below by some positive
constant multiples of g(n) for all large n, i.e., if there exist some
positive constants c1 and c2 and some nonnegative integer n0
such that
c2·g(n) ≤ t(n) ≤ c1·g(n) for all n ≥ n0
• ≥ : Ω(g(n)), functions that grow at least as fast as g(n)
• = : Θ(g(n)), functions that grow at the same rate as g(n)
• ≤ : O(g(n)), functions that grow no faster than g(n)
Theorem
If t1(n) ∈ O(g1(n)) and t2(n) ∈ O(g2(n)), then
t1(n) + t2(n) ∈ O(max{g1(n), g2(n)}).
• The analogous assertions are true for the Ω-notation
and Θ-notation.
Note also that f(n) ∈ O(f(n)).
Establishing order of growth using limits
Examples:
• 10n vs. n^2
• n(n+1)/2 vs. n^2
L’Hôpital’s rule and Stirling’s formula
Example: 2^n vs. n!
Orders of growth of some important functions
All logarithmic functions log_a n belong to the same class
Θ(log n) no matter what the logarithm's base a > 1 is,
because log_a n = log_b n / log_b a
All polynomials of the same degree k belong to the same class:
a_k n^k + a_{k-1} n^{k-1} + … + a_0 ∈ Θ(n^k)
order log n < order n^ε (ε > 0) < order a^n < order n! < order n^n
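The base-change identity can be checked numerically; this small Python snippet compares log_2 n computed directly against log_10 n / log_10 2:

```python
import math

n = 1_000_000
direct = math.log(n, 2)                        # log_2 n
converted = math.log(n, 10) / math.log(2, 10)  # log_10 n / log_10 2
# The two agree, so logarithms of different bases differ
# only by a constant factor and share the class Θ(log n).
```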
Basic asymptotic efficiency classes
• 1 : constant
• log n : logarithmic
• n : linear
• n log n : n-log-n
• n^2 : quadratic
• n^3 : cubic
• 2^n : exponential
• n! : factorial
Time efficiency of nonrecursive algorithms
General Plan for Analysis
• Decide on a parameter n indicating input size
Useful summation formulas and rules
Σ_{i=l}^{n} 1 = 1 + 1 + … + 1 = n - l + 1
In particular, Σ_{i=1}^{n} 1 = n - 1 + 1 = n ∈ Θ(n)
Example 2: Element uniqueness problem
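A sketch of the brute-force algorithm for this problem in Python, comparing every pair of elements (names are mine):

```python
def unique_elements(A):
    """Return True iff all elements of A are distinct (brute force)."""
    n = len(A)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if A[i] == A[j]:       # basic operation: element comparison
                return False
    return True
```

In the worst case (all elements distinct) the comparison executes C(n) = (n-1) + (n-2) + … + 1 = n(n-1)/2 ∈ Θ(n^2) times.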
Example 1: Recursive evaluation of n!
Definition: n! = 1 · 2 · … · (n-1) · n for n ≥ 1, and 0! = 1
Size: n
Basic operation: multiplication
Recurrence relation: M(n) = M(n-1) + 1
M(0) = 0
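A Python version instrumented to report the multiplication count alongside the value (returning the pair is my addition, made to expose M(n)):

```python
def factorial(n):
    """Recursive n!; returns (value, number of multiplications performed)."""
    if n == 0:
        return 1, 0                   # M(0) = 0
    value, mults = factorial(n - 1)
    return n * value, mults + 1       # M(n) = M(n-1) + 1
```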
Solving the recurrence for M(n)
M(n) = M(n-1) + 1
= (M(n-2) + 1) + 1 = M(n-2) + 2
= (M(n-3) + 1) + 2 = M(n-3) + 3
…
= M(n-i) + i
= M(0) + n
= n
The method is called backward substitution.
Example 2: The Tower of Hanoi Puzzle
M(n) = 2M(n-1) + 1, M(1) = 1
= 2(2M(n-2) + 1) + 1 = 2^2*M(n-2) + 2^1 + 2^0
= 2^2*(2M(n-3) + 1) + 2^1 + 2^0
= 2^3*M(n-3) + 2^2 + 2^1 + 2^0
=…
= 2^(n-1)*M(1) + 2^(n-2) + … + 2^1 + 2^0
= 2^(n-1) + 2^(n-2) + … + 2^1 + 2^0
= 2^n -1
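The closed-form count 2^n - 1 can be confirmed by running the standard recursive solution and counting its moves (a sketch; peg names are arbitrary):

```python
def hanoi(n, source, target, aux):
    """Return the list of (from_peg, to_peg) moves for n disks."""
    if n == 0:
        return []
    moves = hanoi(n - 1, source, aux, target)   # n-1 disks out of the way
    moves.append((source, target))              # move the largest disk
    moves += hanoi(n - 1, aux, target, source)  # n-1 disks back on top
    return moves
```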
Tree of calls for the Tower of Hanoi Puzzle
[Figure: a binary tree of recursive calls. The root call for n disks spawns two calls for n-1 disks, and so on down to calls for 1 disk, for 2^n - 1 calls in all.]
Example 3: Counting #bits
A(n) = A(⌊n/2⌋) + 1, A(1) = 0
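A sketch in Python; the function returns the number of bits, which is one more than the addition count A(n) = ⌊log_2 n⌋ that the recurrence above solves to:

```python
def bit_count(n):
    """Number of binary digits of n >= 1, following the recurrence."""
    if n == 1:
        return 1
    return bit_count(n // 2) + 1   # one addition per halving step
```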
Fibonacci numbers
The Fibonacci numbers:
0, 1, 1, 2, 3, 5, 8, 13, 21, …
F(n) = F(n-1) + F(n-2), F(0) = 0, F(1) = 1
Solving aX(n) + bX(n-1) + cX(n-2) = 0
Set up the characteristic equation (quadratic)
ar^2 + br + c = 0
Application to the Fibonacci numbers
Characteristic equation: r^2 - r - 1 = 0, with roots r1,2 = (1 ± √5)/2
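With roots φ = (1 + √5)/2 and ψ = (1 - √5)/2, the general solution yields Binet's closed form F(n) = (φ^n - ψ^n)/√5, which can be checked numerically (rounding compensates for floating-point error):

```python
import math

def fib_binet(n):
    """Fibonacci via the roots of r^2 - r - 1 = 0 (Binet's formula)."""
    phi = (1 + math.sqrt(5)) / 2
    psi = (1 - math.sqrt(5)) / 2
    return round((phi**n - psi**n) / math.sqrt(5))
```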
Computing Fibonacci numbers
1. Definition-based recursive algorithm
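A sketch contrasting the definition-based recursive algorithm, which makes an exponential number of additions, with a simple bottom-up loop, which makes a linear number (both functions are my naming):

```python
def fib_recursive(n):
    """Definition-based algorithm: recomputes subproblems, Θ(φ^n) additions."""
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    """Bottom-up algorithm: each Fibonacci number computed once, Θ(n) additions."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```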
Important Recurrence Types
Decrease-by-one recurrences
• A decrease-by-one algorithm solves a problem by exploiting a relationship
between a given instance of size n and a smaller size n – 1.
• Example: n!
• The recurrence equation for investigating the time efficiency of such
algorithms typically has the form
T(n) = T(n-1) + f(n)
Decrease-by-a-constant-factor recurrences
• A decrease-by-a-constant-factor algorithm solves a problem by dividing its
given instance of size n into several smaller instances of size n/b, solving
each of them recursively, and then, if necessary, combining the solutions to
the smaller instances into a solution to the given instance.
• Example: binary search.
• The recurrence equation for investigating the time efficiency of such
algorithms typically has the form
T(n) = aT(n/b) + f(n)
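Binary search, the example named above, sketched iteratively in Python (the recursive form satisfies T(n) = T(n/2) + c, i.e. a = 1, b = 2):

```python
def binary_search(A, K):
    """Return an index of K in sorted list A, or -1 if absent."""
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # halve the remaining range each step
        if A[mid] == K:
            return mid
        elif A[mid] < K:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```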
Decrease-by-one Recurrences
Decrease-by-a-constant-factor recurrences –
The Master Theorem
For T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^k):
1. a < b^k : T(n) ∈ Θ(n^k)
2. a = b^k : T(n) ∈ Θ(n^k log n)
3. a > b^k : T(n) ∈ Θ(n^(log_b a))
Examples:
1. T(n) = T(n/2) + 1 : Θ(log n)
2. T(n) = 2T(n/2) + n : Θ(n log n)
3. T(n) = 3T(n/2) + n : Θ(n^(log_2 3))
4. T(n) = T(n/2) + n : Θ(n)
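The case a = b^k can be checked by evaluating the recurrence directly. For T(n) = 2T(n/2) + n with T(1) = 1 (my choice of initial condition), powers of two solve exactly to n·log_2 n + n, consistent with Θ(n log n):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2T(n/2) + n: the a = b^k case (a = 2, b = 2, k = 1)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n
```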