Basics of Algorithm Analysis: What Is The Running Time of An Algorithm
big-O
O(g(n)) = {f(n) : there exist positive constants c and n0 such that
0 ≤ f(n) ≤ cg(n) for all n ≥ n0}.
Alternatively, we say f(n) = O(g(n)) to mean f(n) ∈ O(g(n)).
big-Ω
Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that
0 ≤ cg(n) ≤ f(n) for all n ≥ n0}.
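For example, 3n^2 + 10n = O(n^2): taking c = 4 and n0 = 10 works, since 10n ≤ n^2 for all n ≥ 10 and hence 3n^2 + 10n ≤ 4n^2 for all n ≥ 10.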
INFORMAL summary
• f(n) = O(g(n)) roughly means f(n) ≤ g(n)
• f(n) = Ω(g(n)) roughly means f(n) ≥ g(n)
• f(n) = Θ(g(n)) roughly means f(n) = g(n)
• f(n) = o(g(n)) roughly means f(n) < g(n)
• f(n) = ω(g(n)) roughly means f(n) > g(n)
We use these to classify algorithms into growth classes, e.g. n, n^2, n log n, 2^n.
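To make these classes concrete, here is a minimal Python sketch (the helper name growth_table is arbitrary) that tabulates how the common classes grow for a few values of n:

    # Illustrative only: print rough sizes of the common growth classes.
    import math

    def growth_table(sizes=(10, 100, 1000)):
        for n in sizes:
            print(f"n={n:5d}  log n={math.log2(n):7.1f}  "
                  f"n log n={n * math.log2(n):11.1f}  "
                  f"n^2={n ** 2:9d}  "
                  f"2^n has about {int(n * math.log10(2)) + 1} digits")

    growth_table()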
Arithmetic series
∑_{i=1}^{n} i = n(n+1)/2
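This sum shows up whenever the i-th step of a loop does about i units of work (e.g. the worst-case inner loop of insertion sort), for a total of Θ(n^2). A minimal Python sketch checking the closed form (the helper name triangular_work is arbitrary):

    # The i-th outer iteration does i units of work, so the total
    # equals the arithmetic series 1 + 2 + ... + n.
    def triangular_work(n):
        count = 0
        for i in range(1, n + 1):
            for _ in range(i):
                count += 1
        return count

    n = 100
    assert triangular_work(n) == n * (n + 1) // 2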
Geometric series
∑_{i=0}^{∞} a^i = 1/(1−a) for 0 < a < 1
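As a quick numerical sanity check in Python, the partial sums approach 1/(1−a):

    # Partial sums of sum_{i>=0} a^i approach 1 / (1 - a) when 0 < a < 1.
    def geometric_partial_sum(a, terms):
        return sum(a ** i for i in range(terms))

    a = 0.5
    print(geometric_partial_sum(a, 50), 1 / (1 - a))  # both are ~2.0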
Harmonic series
∑_{i=1}^{n} 1/i = ln n + O(1) = Θ(ln n)
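As a quick numerical sanity check in Python, the harmonic sum stays within a bounded distance of ln n; the gap tends to the Euler–Mascheroni constant ≈ 0.577, which is the O(1) term above:

    import math

    # The difference between the harmonic sum and ln n stays bounded.
    def harmonic(n):
        return sum(1 / i for i in range(1, n + 1))

    for n in (10, 1_000, 100_000):
        print(n, harmonic(n) - math.log(n))  # ~0.626, ~0.578, ~0.577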
Master Theorem for Recurrences. Let a ≥ 1 and b > 1 be constants, let f(n)
be a function, and let T(n) be defined on the nonnegative integers by the
recurrence
T(n) = aT(n/b) + f(n),
where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be
bounded asymptotically as follows.
1. If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
2. If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} lg n).
3. If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if af(n/b) ≤ cf(n) for
some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
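For example, three standard applications of the theorem:
• T(n) = 8T(n/2) + n: here n^{log_2 8} = n^3 and f(n) = n = O(n^{3−ε}), so case 1 gives T(n) = Θ(n^3).
• T(n) = 2T(n/2) + Θ(n), as in merge sort: n^{log_2 2} = n and f(n) = Θ(n), so case 2 gives T(n) = Θ(n lg n).
• T(n) = 2T(n/2) + n^2: f(n) = n^2 = Ω(n^{1+ε}), and the regularity condition holds since 2(n/2)^2 = n^2/2 ≤ (1/2)·n^2, so case 3 gives T(n) = Θ(n^2).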