
Asymptotic Notations

Introduction
• In mathematics, computer science, and related fields, big O notation describes the limiting behavior of a function when the argument tends towards a particular value or infinity, usually in terms of simpler functions.

• The time complexity of an algorithm quantifies the amount of time it takes to run as a function of the size of the input to the problem. When it is expressed using big O notation, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity.
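For example (a sketch not taken from the slides; the routine below and the operation being counted are illustrative), a simple pair-comparison function performs a number of comparisons that grows with the square of the input size, so its time complexity is written O(n²):

def count_equal_pairs(items):
    # Compare every unordered pair once: the two nested loops perform
    # about n*(n-1)/2 comparisons for an input of size n, so the running
    # time grows on the order of n^2 and is described as O(n^2).
    n = len(items)
    pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                pairs += 1
    return pairs

print(count_equal_pairs([1, 2, 1, 3, 2]))  # -> 2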
Asymptotic Complexity
• Running time of an algorithm as a function of input size n, for large n.
• Expressed using only the highest-order term in the expression for the exact running time.
  ◦ Instead of the exact running time, say Θ(n²).
• Describes the behavior of the function in the limit.
• Written using asymptotic notation.
• The notations describe different rate-of-growth relations between the defining function and the defined set of functions.
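For instance (an illustrative polynomial, not taken from the slides): if the exact running time is T(n) = 3n² + 5n + 2, then 3n² ≤ T(n) ≤ 4n² for all n ≥ 6, so only the highest-order term matters and the running time is reported simply as Θ(n²); lower-order terms and constant factors are absorbed into the bound.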
Big O-notation
For a function g(n), we define O(g(n)), big-O of g(n), as the set:

O(g(n)) = { f(n) : there exist positive constants c and n0 such that
            f(n) ≤ c·g(n) for all n ≥ n0 }

g(n) is an asymptotic upper bound for f(n).

Example:
i) f(n) = 3n + 2 and g(n) = n. Prove f(n) ≤ c·g(n).
ii) f(n) = 100n + 5 and g(n) = n². Prove f(n) ≤ c·g(n).
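A minimal numerical check of these two examples (the constants c and n0 below are one valid choice, picked here for illustration; the algebraic argument is the actual proof):

# i) 3n + 2 <= 4n for all n >= 2, so 3n + 2 is O(n) with c = 4, n0 = 2.
assert all(3*n + 2 <= 4*n for n in range(2, 1000))

# ii) 100n + 5 <= 105n <= 105*n^2 for all n >= 1, so 100n + 5 is O(n^2)
#     with c = 105, n0 = 1.
assert all(100*n + 5 <= 105 * n**2 for n in range(1, 1000))

print("Big-O example inequalities hold on the tested range.")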
Big O Notation
[Graph comparing f(n) = n² with g(n) = 2^n.]
Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
            f(n) ≥ c·g(n) for all n ≥ n0 }

g(n) is an asymptotic lower bound for f(n).

Example:
i) f(n) = 3n + 2 and g(n) = n. Prove f(n) ≥ c·g(n).
ii) f(n) = n³ and g(n) = n². Prove f(n) ≥ c·g(n).
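As before, a quick numerical check with one valid (illustrative) choice of constants:

# i) 3n + 2 >= 1*n for all n >= 1, so 3n + 2 is Omega(n) with c = 1, n0 = 1.
assert all(3*n + 2 >= 1*n for n in range(1, 1000))

# ii) n^3 >= 1*n^2 for all n >= 1, so n^3 is Omega(n^2) with c = 1, n0 = 1.
assert all(n**3 >= 1 * n**2 for n in range(1, 1000))

print("Big-Omega example inequalities hold on the tested range.")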
Big Omega Notation
Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that
            c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

g(n) is an asymptotically tight bound for f(n).

Example:
i) f(n) = 3n + 2 and g(n) = n. Prove c1·g(n) ≤ f(n) ≤ c2·g(n).
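One valid (illustrative) choice of constants is c1 = 3, c2 = 4, n0 = 2, since 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2; a quick numerical check:

# 3n <= 3n + 2 <= 4n for all n >= 2, so 3n + 2 is Theta(n)
# with c1 = 3, c2 = 4, n0 = 2.
assert all(3*n <= 3*n + 2 <= 4*n for n in range(2, 1000))

print("Big-Theta example inequality holds on the tested range.")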
Relations Between Θ, O, Ω
For any two functions f(n) and g(n): f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
Basic Asymptotic Efficiency classes
Growth term     Class name
1 (constant c)  constant (written O(1))
log N           logarithmic
log² N          log-squared
N               linear
N log N         "n log n" (linearithmic)
N²              quadratic
N³              cubic
2^N             exponential
N!              factorial
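To see how quickly these classes pull apart, a short sketch (using only Python's standard math module) evaluates each growth term at a few input sizes:

import math

# Evaluate each basic efficiency class at a few input sizes.
classes = [
    ("1",       lambda n: 1),
    ("log N",   lambda n: math.log2(n)),
    ("log^2 N", lambda n: math.log2(n) ** 2),
    ("N",       lambda n: n),
    ("N log N", lambda n: n * math.log2(n)),
    ("N^2",     lambda n: n ** 2),
    ("N^3",     lambda n: n ** 3),
    ("2^N",     lambda n: 2 ** n),
    ("N!",      lambda n: math.factorial(n)),
]

for name, f in classes:
    print(f"{name:8}", [round(f(n)) for n in (2, 4, 8, 16)])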
Properties of Asymptotic Notation
1. Transitivity
If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n))
If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n))

2. Reflexivity
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))

3. Symmetry
f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))

4. Transpose Symmetry
f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
The End
