Asymptotic Notations
Padmavathi
The efficiency of an algorithm depends on the amount of time,
storage and other resources required to execute the algorithm. The
efficiency is measured with the help of asymptotic notations.
An algorithm may not have the same performance for different types
of inputs. With the increase in the input size, the performance will
change.
The study of change in performance of the algorithm with the
change in the order of the input size is defined as asymptotic
analysis.
Asymptotic notations are the mathematical notations used to
describe the running time of an algorithm when the input tends
towards a particular value or a limiting value. For example, in
bubble sort, when the input array is already sorted, the time
taken by the algorithm is linear; this is the best case.
Asymptotic notations provide a mechanism to calculate
and represent the time and space complexity of any algorithm. There
are 3 types - Theta (Θ), Big O and Omega (Ω).
Big-Oh (O) Notation
Big-Oh (O) notation gives an upper bound for a function f(n) to within a constant factor.
f(n) = O(g(n)) if there are positive constants c and n0 such that, to the right of n0, f(n)
always lies on or below c*g(n).
O(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ f(n) ≤ c*g(n), for all n ≥ n0 }
Thus, it gives the worst-case complexity of an algorithm.
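The definition can be spot-checked numerically. A minimal Python sketch for f(n) = 2n² + 5 being O(n²); the witnesses c = 3 and n0 = 3 are illustrative choices (many other pairs also work) and are not part of the definition:

```python
# Witnesses c and n0 showing f(n) = 2n^2 + 5 is O(n^2):
# the definition requires 0 <= f(n) <= c*g(n) for all n >= n0.
def f(n):
    return 2 * n**2 + 5

def g(n):
    return n**2

c, n0 = 3, 3  # illustrative witness pair, not unique

# Check the inequality over a large range of n >= n0.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("0 <= f(n) <= 3*g(n) holds for all tested n >= 3")
```

A finite check like this cannot prove the bound for all n, but it makes the role of the witnesses c and n0 concrete.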
Big Omega (Ω) Notation
Big-Omega (Ω) notation gives a lower bound for a function f(n) to within a constant factor.
Thus, it provides the best case complexity of an algorithm.
f(n) = Ω(g(n)) if there are positive constants c and n0 such that, to the right of n0, f(n)
always lies on or above c*g(n).
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ c*g(n) ≤ f(n), for all n ≥ n0 }
Big-Theta(Θ) notation
• Big-Theta (Θ) notation gives a tight bound for a function f(n) to within constant factors.
• f(n) = Θ(g(n)) if there are positive constants n0, c1 and c2 such that, to the right of
n0, f(n) always lies between c1*g(n) and c2*g(n) inclusive.
• Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that
0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n), for all n ≥ n0 }
• Since it represents the upper and the lower bound of the running time of an algorithm, it is
used for analyzing the average-case complexity of an algorithm.
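The two-sided bound can also be checked numerically. A Python sketch for f(n) = 2n² + 5 = Θ(n²), where the witnesses c1 = 2, c2 = 3 and n0 = 3 are illustrative choices:

```python
# Theta witnesses for f(n) = 2n^2 + 5 = Theta(n^2):
# the definition requires 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
def f(n):
    return 2 * n**2 + 5

def g(n):
    return n**2

c1, c2, n0 = 2, 3, 3  # illustrative witnesses

# Lower bound c1*g(n) holds for all n; upper bound c2*g(n) needs n >= 3.
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("c1*g(n) <= f(n) <= c2*g(n) holds for all tested n >= 3")
```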
Properties of Asymptotic Notations
1. General Properties :
If f(n) is O(g(n)) then a*f(n) is also O(g(n)), where a is a constant.
Example: f(n) = 2n² + 5 is O(n²);
then 7*f(n) = 7(2n² + 5) = 14n² + 35 is also O(n²).
This property also holds for both the Θ and Ω notations:
If f(n) is Θ(g(n)) then a*f(n) is also Θ(g(n)), where a is a constant.
If f(n) is Ω(g(n)) then a*f(n) is also Ω(g(n)), where a is a constant.
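The reason the property holds is that scaling f by a constant a just scales the witness: if c and n0 witness f(n) = O(g(n)), then a*c and the same n0 witness a*f(n) = O(g(n)). A small Python sketch with the constants from the example above (a = 7, and an illustrative witness pair c = 3, n0 = 3):

```python
# If f(n) <= c*g(n) for all n >= n0, then a*f(n) <= (a*c)*g(n)
# for the same n0, so a*f(n) is O(g(n)) with witness a*c.
def f(n):
    return 2 * n**2 + 5

a = 7        # constant multiplier from the example
c, n0 = 3, 3  # illustrative witnesses for f(n) = O(n^2)

for n in range(n0, 5_000):
    assert a * f(n) <= (a * c) * n**2
print("a*f(n) <= (a*c)*n^2 holds for all tested n >= n0")
```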
2. Transitive Properties :
If f(n) is O(g(n)) and g(n) is O(h(n)) then f(n) = O(h(n)) .
Example: if f(n) = n, g(n) = n² and h(n)=n³
n is O(n²) and n² is O(n³)
then n is O(n³)
This property also holds for both the Θ and Ω notations:
If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) is Θ(h(n)).
If f(n) is Ω(g(n)) and g(n) is Ω(h(n)), then f(n) is Ω(h(n)).
3. Reflexive Properties :
Reflexive properties are easiest to understand after the transitive property.
If f(n) is given, then f(n) is O(f(n)), since the maximum value of f(n) is f(n) itself.
Hence f(n) and O(f(n)) are always tied together in a reflexive relation.
Example: f(n) = n²; then f(n) is O(n²), i.e. O(f(n)).
This property also holds for both the Θ and Ω notations:
If f(n) is given then f(n) is Θ(f(n)).
If f(n) is given then f(n) is Ω (f(n)).
4. Symmetric Properties :
If f(n) is Θ(g(n)) then g(n) is also Θ(f(n)).
Transpose symmetry: f(n) is O(g(n)) if and only if g(n) is Ω(f(n)).
Intuitively, O(g(n)) is the set of all functions whose rate of growth is no higher than that of g(n),
and Ω(g(n)) is the set of all functions whose rate of growth is no lower than that of g(n).
The notations compare growth rates the way ≤, ≥, =, <, > compare two numbers a and b:
f(n) = O(g(n)) ~ a ≤ b
f(n) = Ω(g(n)) ~ a ≥ b
f(n) = Θ(g(n)) ~ a = b
f(n) = o(g(n)) ~ a < b
f(n) = ω(g(n)) ~ a > b
o-notation (little-o):
– f(n) becomes insignificant relative to g(n) as n approaches
infinity:
lim n→∞ [f(n) / g(n)] = 0
ω-notation (little-omega):
– f(n) becomes arbitrarily large relative to g(n) as n approaches
infinity:
lim n→∞ [f(n) / g(n)] = ∞
Θ-notation (Big-Theta):
– f(n) relative to g(n) equals a constant c, greater than 0
and less than infinity, as n approaches infinity:
0 < lim n→∞ [f(n) / g(n)] < ∞
O-notation (Big-O):
– f(n) ∈ O(g(n)) ⇔ f(n) ∈ Θ(g(n)) or f(n) ∈ o(g(n))
– f(n) relative to g(n) equals some value less than infinity as n approaches infinity:
lim n→∞ [f(n) / g(n)] < ∞
Ω-notation (Big-Omega):
– f(n) ∈ Ω(g(n)) ⇔ f(n) ∈ Θ(g(n)) or f(n) ∈ ω(g(n))
– f(n) relative to g(n) equals some value greater than 0 as n approaches infinity:
0 < lim n→∞ [f(n) / g(n)]
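These limit characterizations can be approximated numerically by evaluating f(n)/g(n) at a large n. A rough Python sketch; the sample point n = 10⁸ and the cutoff thresholds are arbitrary heuristics for illustration, not part of the definitions:

```python
# Heuristic classification of f vs. g by estimating lim f(n)/g(n)
# at a single large n. This is a demonstration, not a proof.
def classify(f, g, n=10**8):
    r = f(n) / g(n)
    if r < 1e-6:
        return "f is o(g): ratio tends to 0"
    if r > 1e6:
        return "f is omega(g): ratio tends to infinity"
    return "f is Theta(g): ratio tends to a positive constant"

print(classify(lambda n: 5 * n, lambda n: n * n))      # ratio shrinks to 0
print(classify(lambda n: n * n, lambda n: n))          # ratio blows up
print(classify(lambda n: 3 * n + 7, lambda n: n))      # ratio approaches 3
```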
Use limit definitions to prove:
◦ 10n − 3n ∈ O(n²)
◦ 3n⁴ ∈ Ω(n³)
◦ n²/2 − 3n ∈ Θ(n²)
◦ 2^(2n) ∈ ω(2^n)
A better algorithm for computing prefix averages:
Algorithm prefixAverages2(X):
Input: An n-element array X of numbers.
Output: An n -element array A of numbers such that A[i] is the average of
elements X[0], ... , X[i].
Let A be an array of n numbers.
s ← 0
for i ← 0 to n − 1 do
    s ← s + X[i]
    A[i] ← s/(i + 1)
return array A
Analysis: the loop runs n times and each iteration does O(1) work, so prefixAverages2 runs in O(n) time.
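The pseudocode translates directly to Python; a sketch of the same one-pass algorithm:

```python
# Linear-time prefix averages: a running sum s makes each A[i]
# an O(1) step instead of re-summing X[0..i] every iteration.
def prefix_averages2(X):
    A = [0.0] * len(X)
    s = 0
    for i in range(len(X)):
        s += X[i]            # s is now the sum of X[0..i]
        A[i] = s / (i + 1)   # average of the first i + 1 elements
    return A

print(prefix_averages2([1, 2, 3, 4]))  # [1.0, 1.5, 2.0, 2.5]
```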
Analysis of Algorithms 24
properties of logarithms:
log_b(xy) = log_b x + log_b y
log_b(x/y) = log_b x − log_b y
log_b(x^a) = a·log_b x
log_b a = log_x a / log_x b
properties of exponentials:
a^(b+c) = a^b · a^c
a^(bc) = (a^b)^c
a^b / a^c = a^(b−c)
b = a^(log_a b)
b^c = a^(c·log_a b)
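These identities can be spot-checked with Python's math module; the sample values b, x, y, a below are arbitrary:

```python
import math

b, x, y, a = 2.0, 8.0, 4.0, 3.0  # arbitrary sample values

# log_b(xy) = log_b x + log_b y
assert math.isclose(math.log(x * y, b), math.log(x, b) + math.log(y, b))
# log_b(x/y) = log_b x - log_b y
assert math.isclose(math.log(x / y, b), math.log(x, b) - math.log(y, b))
# log_b(x^a) = a * log_b x
assert math.isclose(math.log(x**a, b), a * math.log(x, b))
# change of base: log_b a = log_x a / log_x b
assert math.isclose(math.log(a, b), math.log(a, x) / math.log(b, x))
# b = a^(log_a b)
assert math.isclose(b, a ** math.log(b, a))
print("all logarithm/exponential identities hold for the sample values")
```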
Floor: ⌊x⌋ = the largest integer ≤ x
Ceiling: ⌈x⌉ = the smallest integer ≥ x
Summations: (see Appendix A, p.619)
Geometric progression: (see Appendix A, p.620)
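As a quick illustration of the geometric-progression identity 1 + r + r^2 + ... + r^(n-1) = (r^n − 1)/(r − 1) for r ≠ 1, a Python sketch comparing the closed form against the explicit sum:

```python
# Closed form for the sum of the first n terms of a geometric
# progression with ratio r (valid for r != 1).
def geometric_sum(r, n):
    return (r**n - 1) / (r - 1)

r, n = 2, 10  # sample values
closed = geometric_sum(r, n)
explicit = sum(r**i for i in range(n))  # 1 + 2 + 4 + ... + 512

assert closed == explicit
print(closed)  # 1023.0
```

This sum is why, for example, the total work of an algorithm that doubles its cost each round is dominated by the last round.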