Analysis of Algorithms (Part 2)
Chapter 4
COMP53
Oct 3, 2007
Methods of Analysis
Experimental Studies: Run experiments on implementations of algorithms and record actual times or operation counts.
Theoretical Analysis: Determine time or operation counts from mathematical analysis of the algorithm; doesn't require an implementation (or even a computer).
Theoretical Analysis
Uses a high-level description of the algorithm instead of an implementation
Characterizes running time as a function of the input size, n
Takes into account all possible inputs
Allows us to evaluate the speed of an algorithm independent of the hardware/software environment
Growth Functions
Constant      1
Logarithmic   log n
Linear        n
N-Log-N       n log n
Quadratic     n^2
Cubic         n^3
Exponential   2^n
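As an illustration (not from the original slides), a short Java program can print these growth functions side by side for a few input sizes to show how quickly they pull apart:

// Sketch: tabulate the standard growth functions for a few values of n.
public class GrowthDemo {
    public static void main(String[] args) {
        int[] sizes = {8, 64, 512, 4096};
        System.out.printf("%8s %10s %12s %14s %16s %14s%n",
                "n", "log n", "n log n", "n^2", "n^3", "2^n");
        for (int n : sizes) {
            double log2 = Math.log(n) / Math.log(2);          // log base 2
            System.out.printf("%8d %10.1f %12.1f %14d %16d %14.3e%n",
                    n, log2, n * log2, (long) n * n, (long) n * n * n, Math.pow(2, n));
        }
    }
}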
Reasonable Time
Assume GHz machine: 10^6 operations/second

Minute:  10^8 ops
Hour:    10^9 ops
Day:     10^11 ops
Month:   10^12 ops
Year:    10^13 ops
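These figures follow from the assumed rate of 10^6 operations/second: a minute is 60 × 10^6 = 6 × 10^7 operations, an hour about 3.6 × 10^9, a day about 8.6 × 10^10, a month about 2.6 × 10^12, and a year about 3.2 × 10^13, which round to the powers of ten shown above.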
Growth Functions
[Table: growth functions grouped by whether they complete within a day, within a month, or take more than a year at the rate above]
Big-Oh Notation
f(n) is O(g(n)) if there are positive constants c and n0 such that
    f(n) ≤ c·g(n) for n ≥ n0
Example: 2n + 10 is O(n)
    2n + 10 ≤ cn for c = 3 and n ≥ n0 = 10
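The algebra is direct: 2n + 10 ≤ 3n rearranges to n ≥ 10. As a quick sanity check (an illustration, not part of the slides), a loop can confirm the inequality over a finite range:

// Sketch: verify f(n) = 2n + 10 stays below c*g(n) = 3n once n >= n0 = 10.
public class BigOhCheck {
    public static void main(String[] args) {
        int c = 3, n0 = 10;
        boolean holds = true;
        for (int n = n0; n <= 1_000_000; n++) {
            if (2L * n + 10 > (long) c * n) {   // violation of f(n) <= c*g(n)?
                holds = false;
                break;
            }
        }
        System.out.println("2n + 10 <= 3n for 10 <= n <= 10^6: " + holds);
    }
}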
Big-Oh Example
Example: n^2 is not O(n)
    n^2 ≤ cn would require n ≤ c, and no constant c can satisfy this as n → ∞
Asymptotic Algorithm Analysis
The asymptotic analysis of an algorithm determines its running time in big-Oh notation.
Example:
    We determine that algorithm arrayMax executes at most 8n - 2 primitive operations.
    We say that algorithm arrayMax runs in O(n) time.
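For reference, here is a sketch of arrayMax in Java, assuming the usual find-the-maximum algorithm (the class wrapper and test values are illustrative); each element is examined once, which is why the operation count is linear in n:

// Sketch: scan the array once, keeping the largest element seen so far.
public class ArrayMaxDemo {
    public static int arrayMax(int[] A) {
        int currentMax = A[0];                 // start with the first element
        for (int i = 1; i < A.length; i++) {   // n - 1 iterations
            if (A[i] > currentMax) {
                currentMax = A[i];
            }
        }
        return currentMax;
    }

    public static void main(String[] args) {
        System.out.println(arrayMax(new int[]{3, 1, 4, 1, 5, 9, 2, 6}));  // prints 9
    }
}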
Prefix Averages
The i-th prefix average of an array X is the average of its first i + 1 elements:
    A[i] = (X[0] + X[1] + … + X[i]) / (i + 1)
[Figure: bar chart of an example input array X and its prefix averages A]
Algorithm 1 Analysis
basic operation: addition
for i ← 0 to n - 1 do
    s ← X[0]
    for j ← 1 to i do
        s ← s + X[j]
Number of additions is 0 + 1 + … + (n - 1) = n(n - 1)/2
Algorithm prefixAverages1 is O(n^2)
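A Java sketch of prefixAverages1 mirroring the pseudocode above (the class and method names are illustrative):

// Quadratic version: recompute the sum X[0] + ... + X[i] from scratch for every i.
public class PrefixAverages {
    public static double[] prefixAverages1(double[] X) {
        int n = X.length;
        double[] A = new double[n];
        for (int i = 0; i < n; i++) {
            double s = X[0];
            for (int j = 1; j <= i; j++) {
                s = s + X[j];               // i additions for index i
            }
            A[i] = s / (i + 1);
        }
        return A;
    }
}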
Algorithm 2 Analysis
basic operation: addition
for i ← 0 to n - 1 do
    s ← s + X[i]
Number of additions is 1 + 1 + … + 1 = n
Algorithm prefixAverages2 is O(n)
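A companion Java sketch of the linear version, which keeps a running sum instead of recomputing it (again illustrative, a method to sit alongside prefixAverages1 above):

    // Linear version: one addition per element, n additions in total.
    public static double[] prefixAverages2(double[] X) {
        int n = X.length;
        double[] A = new double[n];
        double s = 0;
        for (int i = 0; i < n; i++) {
            s = s + X[i];                   // running sum of X[0..i]
            A[i] = s / (i + 1);
        }
        return A;
    }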
Relatives of Big-Oh
big-Omega
    f(n) is Ω(g(n)) if there are positive constants c and n0 such that
    f(n) ≥ c·g(n) for n ≥ n0
big-Theta
    f(n) is Θ(g(n)) if there are positive constants c′, c″ and n0 such that
    c′·g(n) ≤ f(n) ≤ c″·g(n) for n ≥ n0
Intuition and examples:
Big-Oh: f(n) is O(g(n)) means f(n) is asymptotically less than or equal to g(n). Example: 5n^2 is O(n^2)
Big-Omega: f(n) is Ω(g(n)) means f(n) is asymptotically greater than or equal to g(n). Example: 5n^2 is Ω(n)
Big-Theta: f(n) is Θ(g(n)) means f(n) is asymptotically equal to g(n). Example: 5n^2 is Θ(n^2)
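Working out the constants (chosen here only for illustration): 5n^2 ≤ 5·n^2 for all n ≥ 1, so 5n^2 is O(n^2); 5n^2 ≥ 1·n for all n ≥ 1, so 5n^2 is Ω(n); and taking c′ = 1, c″ = 5 gives n^2 ≤ 5n^2 ≤ 5·n^2 for all n ≥ 1, so 5n^2 is Θ(n^2).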