
Asymptotic Notations

The document discusses the efficiency of algorithms, emphasizing the importance of asymptotic analysis and notations such as Big O, Big Omega, and Big Theta for measuring time and space complexity. It explains the properties of these notations, including reflexive, transitive, and symmetric properties, along with examples. Additionally, it introduces little-o and little-omega notations and provides algorithms for computing prefix averages, highlighting the significance of logarithmic and exponential properties in algorithm analysis.

Course Handler: S. Padmavathi
• The efficiency of an algorithm depends on the amount of time, storage and other resources required to execute the algorithm. Efficiency is measured with the help of asymptotic notations.
• An algorithm may not have the same performance for different types of inputs; its performance changes as the input size grows.
• The study of how the performance of an algorithm changes with the order of the input size is called asymptotic analysis.
• Asymptotic notations are the mathematical notations used to describe the running time of an algorithm when the input tends towards a particular (limiting) value. For example, in bubble sort, when the input array is already sorted, the time taken by the algorithm is linear; this is the best case.
• Asymptotic notations provide a mechanism to calculate and represent the time and space complexity of any algorithm. There are three main notations: Big O, Big Omega (Ω) and Big Theta (Θ).
• Big-Oh (O) notation gives an upper bound for a function f(n) to within a constant factor.
• f(n) = O(g(n)) if there are positive constants c and n0 such that, to the right of n0, f(n) always lies on or below c·g(n).
• O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n), for all n ≥ n0 }
Thus, it gives the worst-case complexity of an algorithm.
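The definition can be sanity-checked numerically for a concrete witness pair. The helper below is our own sketch (not part of the slides); the helper name and the finite range cap are arbitrary choices, and a finite check is evidence, not a proof.

```python
# Numerically check a Big-O witness pair (c, n0) for
# f(n) = 2n^2 + 5 against g(n) = n^2 over a finite range.
def holds_big_o(f, g, c, n0, up_to=10_000):
    """True if 0 <= f(n) <= c*g(n) for every n0 <= n <= up_to."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, up_to + 1))

f = lambda n: 2 * n**2 + 5
g = lambda n: n**2

print(holds_big_o(f, g, c=3, n0=3))   # True: 2n^2 + 5 <= 3n^2 once n >= 3
print(holds_big_o(f, g, c=2, n0=3))   # False: c = 2 is too small
```

Note that many witness pairs work; the definition only asks for one.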
Big Omega (Ω) Notation

Big-Omega (Ω) notation gives a lower bound for a function f(n) to within a constant factor.
Thus, it provides the best-case complexity of an algorithm.
f(n) = Ω(g(n)) if there are positive constants c and n0 such that, to the right of n0, f(n) always lies on or above c·g(n).
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n), for all n ≥ n0 }
Big-Theta(Θ) notation

• Big-Theta (Θ) notation bounds a function f(n) from above and below to within constant factors.
• f(n) = Θ(g(n)) if there are positive constants n0, c1 and c2 such that, to the right of n0, f(n) always lies between c1·g(n) and c2·g(n) inclusive.
• Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n), for all n ≥ n0 }
• Since it represents both the upper and the lower bound of the running time of an algorithm, it is used for analyzing the average-case complexity of an algorithm.
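A Θ witness can be checked the same way, with both inequalities verified past n0. This is a minimal sketch of ours; the function f(n) = n²/2 − 3n and the constants c1 = 1/4, c2 = 1/2, n0 = 12 are hand-picked, not from the slides.

```python
# Check a Theta witness (c1, c2, n0) for f(n) = n^2/2 - 3n against
# g(n) = n^2: the sandwich 0 <= c1*g(n) <= f(n) <= c2*g(n) must hold
# for every n from n0 up to the (finite) range cap.
def holds_theta(f, g, c1, c2, n0, up_to=10_000):
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n)
               for n in range(n0, up_to + 1))

f = lambda n: n * n / 2 - 3 * n
g = lambda n: n * n

print(holds_theta(f, g, c1=0.25, c2=0.5, n0=12))  # True
```

The lower bound c1·n² ≤ n²/2 − 3n first holds at n = 12, which is why n0 = 12 is needed here.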
Properties of Asymptotic Notations

1. General Properties :
If f(n) is O(g(n)) then a*f(n) is also O(g(n)) ; where a is a constant.
Example: f(n) = 2n²+5 is O(n²)
then 7*f(n) = 7(2n²+5) = 14n²+35 is also O(n²) .
This property also holds for both Θ and Ω notations.

We can say
If f(n) is Θ(g(n)) then a*f(n) is also Θ(g(n)) ; where a is a constant.
If f(n) is Ω (g(n)) then a*f(n) is also Ω (g(n)) ; where a is a constant.
Properties of Asymptotic Notations
2. Transitive Properties :
If f(n) is O(g(n)) and g(n) is O(h(n)) then f(n) = O(h(n)) .
Example: if f(n) = n, g(n) = n² and h(n)=n³
n is O(n²) and n² is O(n³)
then n is O(n³)
This property also holds for both Θ and Ω notations.
We can say
If f(n) is Θ(g(n)) and g(n) is Θ(h(n)) then f(n) = Θ(h(n)) .
If f(n) is Ω (g(n)) and g(n) is Ω (h(n)) then f(n) = Ω (h(n))
Properties of Asymptotic Notations

3. Reflexive Properties :
Reflexive properties are easy to understand after the transitive ones.
If f(n) is given, then f(n) is O(f(n)), since the maximum value of f(n) is f(n) itself; every function is in a reflexive relation with its own order class.
Example: f(n) = n²; then f(n) is O(n²), i.e. O(f(n)).
This property also holds for both Θ and Ω notations.
We can say that:
If f(n) is given then f(n) is Θ(f(n)).
If f(n) is given then f(n) is Ω (f(n)).
Properties of Asymptotic Notations
4. Symmetric Properties :

If f(n) is Θ(g(n)) then g(n) is Θ(f(n)) .

Example: f(n) = n² and g(n) = n²


then f(n) = Θ(n²) and g(n) = Θ(n²)

This property holds only for Θ notation.


5. Transpose Symmetric Properties :

If f(n) is O(g(n)) then g(n) is Ω (f(n)).

Example: f(n) = n , g(n) = n²


then n is O(n²) and n² is Ω (n)
This property holds only between the O and Ω notations.
Properties of Asymptotic Notations

Some More Properties :


1.) If f(n) = O(g(n)) and f(n) = Ω(g(n)) then f(n) = Θ(g(n))
2.) If f(n) = O(g(n)) and d(n)=O(e(n))
then f(n) + d(n) = O( max( g(n), e(n) ))
Example: f(n) = n i.e O(n)
d(n) = n² i.e O(n²)
then f(n) + d(n) = n + n² i.e O(n²)
3.) If f(n)=O(g(n)) and d(n)=O(e(n))
then f(n) * d(n) = O( g(n) * e(n) )
Example: f(n) = n i.e O(n)
d(n) = n² i.e O(n²)
then f(n) * d(n) = n * n² = n³ i.e O(n³)
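Both rules can be spot-checked with the slide's own example functions. The constants c = 2 and n0 = 1 below are our witness choices for the finite check.

```python
# Spot-check the sum rule  f + d = O(max(g, e))  and the product rule
# f * d = O(g * e)  using f(n) = n (O(n)) and d(n) = n^2 (O(n^2)).
f = lambda n: n          # O(n)
d = lambda n: n * n      # O(n^2)

c, n0 = 2, 1
sum_ok = all(f(n) + d(n) <= c * max(n, n * n) for n in range(n0, 1000))
prod_ok = all(f(n) * d(n) <= c * n**3 for n in range(n0, 1000))
print(sum_ok, prod_ok)   # True True: n + n^2 <= 2n^2 and n^3 <= 2n^3 for n >= 1
```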
Take functions f(n) & g(n), consider only the most significant
term and remove constant multipliers:
◦ 5n + 3 → n
◦ 7n + 0.5n² + 2000 → n²
◦ 300n + 12 + n log n → n log n
◦ −n → ??? A negative run-time?
Then compare the functions; if f(n) ≤ g(n) asymptotically, then f(n) is in O(g(n)).
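Why dropping lower-order terms is safe can be seen empirically: the ratio against the dominant term settles to a constant. This is an illustration of ours (not a proof), using the second example above.

```python
# For f(n) = 7n + 0.5n^2 + 2000, the ratio f(n)/n^2 settles near the
# constant 0.5 as n grows, so n^2 is the significant term.
f = lambda n: 7 * n + 0.5 * n * n + 2000
g = lambda n: n * n

for n in (10, 1000, 10**6):
    print(n, f(n) / g(n))   # ratio approaches 0.5
```

At small n the constant 2000 still dominates the ratio; asymptotic analysis is only about the large-n behaviour.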
Theorem: For any two functions f(n) and g(n),
f(n) = Θ(g(n)) iff
f(n) = O(g(n)) and f(n) = Ω(g(n)).

• Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

• In practice, asymptotically tight bounds are obtained
from asymptotic upper and lower bounds.
For a given function g(n), the set little-o:
o(g(n)) = { f(n) : for every constant c > 0, there exists a constant n0 > 0 such that
0 ≤ f(n) < c·g(n) for all n ≥ n0 }.

Intuitively: the set of all functions whose rate of growth is lower than that of g(n).

g(n) is an upper bound for f(n) that is not asymptotically tight.
For a given function g(n), the set little-omega:
ω(g(n)) = { f(n) : for every constant c > 0, there exists a constant n0 > 0 such that
0 ≤ c·g(n) < f(n) for all n ≥ n0 }.

Intuitively: the set of all functions whose rate of growth is higher than that of g(n).

g(n) is a lower bound for f(n) that is not asymptotically tight.
The asymptotic comparison of f and g behaves like the comparison of two real numbers a and b:

f(n) = O(g(n)) ~ a ≤ b
f(n) = Ω(g(n)) ~ a ≥ b
f(n) = Θ(g(n)) ~ a = b
f(n) = o(g(n)) ~ a < b
f(n) = ω(g(n)) ~ a > b
o-notation (little-o):
– f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim (n→∞) [f(n) / g(n)] = 0

ω-notation (little-omega):
– f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
lim (n→∞) [f(n) / g(n)] = ∞

Θ-notation (Big-Theta):
– f(n) relative to g(n) equals a constant c, greater than 0 and less than infinity, as n approaches infinity:
0 < lim (n→∞) [f(n) / g(n)] < ∞

O-notation (Big-O):
– f(n) ∈ O(g(n)) ⇔ f(n) ∈ Θ(g(n)) or f(n) ∈ o(g(n))
– f(n) relative to g(n) equals some value less than infinity as n approaches infinity:
lim (n→∞) [f(n) / g(n)] < ∞

Ω-notation (Big-Omega):
– f(n) ∈ Ω(g(n)) ⇔ f(n) ∈ Θ(g(n)) or f(n) ∈ ω(g(n))
– f(n) relative to g(n) equals some value greater than 0 as n approaches infinity:
0 < lim (n→∞) [f(n) / g(n)]
Use limit definitions to prove or disprove:

◦ 10n − 3n ∈ O(n²)

◦ 3n⁴ ∈ Ω(n³)

◦ n²/2 − 3n ∈ Θ(n²)

◦ 2²ⁿ ∈ Θ(2ⁿ)
Use limit definitions to prove or disprove:

◦ 10n − 3n ∈ O(n²) – Yes!
lim (n→∞) [(10n − 3n) / n²] = 0

◦ 3n⁴ ∈ Ω(n³) – Yes!
lim (n→∞) [3n⁴ / n³] = ∞

◦ n²/2 − 3n ∈ Θ(n²) – Yes!
lim (n→∞) [(n²/2 − 3n) / n²] = 1/2

◦ 2²ⁿ ∈ Θ(2ⁿ) – No!
lim (n→∞) [2²ⁿ / 2ⁿ] = lim (n→∞) 2ⁿ = ∞
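These limit arguments can be watched numerically: evaluate f(n)/g(n) at growing n and see whether the ratio heads to 0, to a constant, or blows up. A small sketch of ours:

```python
# Numeric companion to the limit proofs: sample the ratio f(n)/g(n)
# at increasing n and observe the trend.
def ratios(f, g, ns):
    return [f(n) / g(n) for n in ns]

print(ratios(lambda n: 10 * n - 3 * n, lambda n: n * n, (10, 100, 10**4)))
# heads to 0        -> O(n^2) holds
print(ratios(lambda n: n * n / 2 - 3 * n, lambda n: n * n, (10, 100, 10**4)))
# heads to 1/2      -> Theta(n^2) holds
print(ratios(lambda n: 2.0 ** (2 * n), lambda n: 2.0 ** n, (1, 10, 20)))
# grows like 2^n    -> not Theta(2^n)
```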
An algorithm for computing prefix averages
Algorithm prefixAverages1(X):
Input: An n-element array X of numbers.
Output: An n-element array A of numbers such that A[i] is the average of elements
X[0], ..., X[i].
Let A be an array of n numbers.
for i ← 0 to n − 1 do
    a ← 0
    for j ← 0 to i do
        a ← a + X[j]
    A[i] ← a / (i + 1)
return array A
• Analysis: for each i = 0, 1, ..., n − 1 the inner loop runs i + 1 times, so the total work is 1 + 2 + ... + n = n(n + 1)/2 steps, i.e. O(n²) time.

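The pseudocode above transcribes directly into Python (our transcription, following the slide line by line):

```python
# prefixAverages1: recomputes the prefix sum from scratch for every i,
# so the nested loops cost O(n^2) time overall.
def prefix_averages1(X):
    n = len(X)
    A = [0.0] * n
    for i in range(n):          # i = 0 .. n-1
        a = 0
        for j in range(i + 1):  # sum X[0] + ... + X[i] from scratch
            a += X[j]
        A[i] = a / (i + 1)
    return A

print(prefix_averages1([1, 2, 3, 4]))  # [1.0, 1.5, 2.0, 2.5]
```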
• A better algorithm for computing prefix averages:
Algorithm prefixAverages2(X):
Input: An n-element array X of numbers.
Output: An n-element array A of numbers such that A[i] is the average of
elements X[0], ..., X[i].
Let A be an array of n numbers.
s ← 0
for i ← 0 to n − 1 do
    s ← s + X[i]
    A[i] ← s / (i + 1)
return array A
• Analysis: the running sum s lets each element be processed exactly once, so the loop body executes n times and the algorithm runs in O(n) time.

Analysis of Algorithms 24
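The improved version in Python (again our own transcription) keeps the running sum and touches each element once:

```python
# prefixAverages2: maintain a running sum s of X[0..i] instead of
# recomputing it, reducing the cost to O(n) time.
def prefix_averages2(X):
    n = len(X)
    A = [0.0] * n
    s = 0
    for i in range(n):      # i = 0 .. n-1
        s += X[i]           # running sum of X[0..i]
        A[i] = s / (i + 1)
    return A

print(prefix_averages2([1, 2, 3, 4]))  # [1.0, 1.5, 2.0, 2.5]
```

Both versions produce identical output; only the running time differs.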
• Properties of logarithms:
log_b(xy) = log_b x + log_b y
log_b(x/y) = log_b x − log_b y
log_b(x^a) = a·log_b x
log_b a = log_x a / log_x b
• Properties of exponentials:
a^(b+c) = a^b · a^c
a^(bc) = (a^b)^c
a^b / a^c = a^(b−c)
b = a^(log_a b)
b^c = a^(c·log_a b)
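Each identity can be spot-checked numerically (the concrete values below are arbitrary choices of ours):

```python
import math

# Numeric spot-check of the logarithm and exponential identities above.
b, x, y, a, c = 2, 8.0, 4.0, 3.0, 5.0

assert math.isclose(math.log(x * y, b), math.log(x, b) + math.log(y, b))
assert math.isclose(math.log(x / y, b), math.log(x, b) - math.log(y, b))
assert math.isclose(math.log(x**a, b), a * math.log(x, b))
assert math.isclose(math.log(x, b), math.log(x, 10) / math.log(b, 10))  # base change
assert math.isclose(a**(b + c), a**b * a**c)
assert math.isclose((a**b)**c, a**(b * c))
assert math.isclose(b**c, a**(c * math.log(b, a)))
print("all identities hold")
```

The base-change rule log_b a = log_x a / log_x b is the one used most in analysis, since it shows all logarithm bases differ only by a constant factor.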

• Floor: ⌊x⌋ = the largest integer ≤ x
• Ceiling: ⌈x⌉ = the smallest integer ≥ x
• Summations: (see Appendix A, p.619)
• Geometric progression: (see Appendix A, p.620)

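As a brief illustration of these tools, floor and ceiling are built into Python's standard library, and the standard geometric-series identity sum_{i=0}^{n} r^i = (r^(n+1) − 1)/(r − 1) can be confirmed numerically:

```python
import math

# Floor and ceiling of a real number.
print(math.floor(3.7), math.ceil(3.2))    # 3 4

# Geometric progression: sum of r^0 + r^1 + ... + r^n for r = 2, n = 10.
r, n = 2, 10
lhs = sum(r**i for i in range(n + 1))
rhs = (r**(n + 1) - 1) // (r - 1)
print(lhs == rhs)                          # True (both are 2^11 - 1 = 2047)
```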
