04_Algorithm_Analysis_Asymptotic Notation_Growth of Functions
Example triples from the input that sum to 0: (−40, 40, 0) and (−10, 0, 10)
3-SUM: brute-force algorithm
Run the program for various input sizes and measure the running time.

    N        time (seconds)
    250      0
    500      0
    1,000    0.1
    2,000    0.8
    4,000    6.4
    8,000    51.1
    16,000   ?
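Each time N doubles, the measured time grows by roughly a factor of 8 (0.1 → 0.8 → 6.4 → 51.1), consistent with a cubic-time algorithm. As a minimal sketch (the function name and sample input are illustrative, not from the slides), the brute-force triple loop looks like this in Python:

```python
def three_sum_count(a):
    """Count triples (i, j, k) with i < j < k and a[i] + a[j] + a[k] == 0."""
    n = len(a)
    count = 0
    for i in range(n):                      # ~n^3/6 triples examined in total,
        for j in range(i + 1, n):           # which explains the cubic growth
            for k in range(j + 1, n):       # seen in the timing table
                if a[i] + a[j] + a[k] == 0:
                    count += 1
    return count

print(three_sum_count([30, -40, -20, -10, 40, 0, 10, 5]))  # → 4
```

Timing this function for doubling values of N reproduces the roughly 8× growth per doubling shown in the table.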
Data analysis
[Table of typical costs of basic operations (operation, example, time in nanoseconds) omitted]
Programming Constructs:
• decision structures: if ... then ... [else ...]
• while-loops: while ... do ...
• repeat-loops: repeat ... until ...
• for-loops: for ... do ...
• array indexing: A[i]
Methods
• calls: object.method(args)
• returns: return value
Use comments: /* this is a comment */ or // this is also a comment
Instructions have to be basic enough and feasible!
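As a rough illustration of how these pseudocode constructs map onto a real language, here is a small hypothetical example in Python (the function and its input are made up for illustration):

```python
def first_negative_index(a):
    """Return the index of the first negative entry of a, or -1 if none."""
    i = 0
    while i < len(a):        # while-loop: while ... do ...
        if a[i] < 0:         # decision structure: if ... then ...
            return i         # method return: return value
        i += 1               # array indexing via a[i] above
    return -1                # ... [else ...] case: no negative entry found

print(first_negative_index([3, 7, -2, 5]))  # → 2
```

Every step here is basic and feasible in the sense above: each line is a single elementary operation a machine can carry out directly.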
Asymptotic Notations
• Categorize algorithms based on asymptotic growth rate, e.g.
linear, quadratic, polynomial, exponential
• Ignore small constant factors and small inputs
• Estimate upper and lower bounds on the growth rate of the time
complexity function
• Describe the running time of an algorithm as n grows to ∞.
• Describe the behavior of a function in the limit.
Limitations
• Not always useful for analysis on fixed-size inputs.
• All results are for sufficiently large inputs.
Asymptotic Notation
Goal: to simplify analysis by getting rid of unneeded information
• like “rounding” 1,000,001≈1,000,000
Intuitively:
Set of all functions whose rate of growth is the same as or lower
than that of g(n).
Big-Oh Notation (O)
For a given function g(n), O(g(n)) denotes the set of functions
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
Example 1: Prove that 2n² ∈ O(n³)
We have f(n) = 2n² and g(n) = n³, and need constants c and n0 such that
f(n) ≤ c·g(n) for all n ≥ n0
⇒ 2n² ≤ c·n³
⇒ 2 ≤ c·n
If we take c = 1 and n0 = 2, OR c = 2 and n0 = 1,
then
2n² ≤ c·n³ for all n ≥ n0
Hence f(n) ∈ O(g(n)), e.g. with c = 1 and n0 = 2
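The witnesses c = 1, n0 = 2 can be spot-checked numerically; a quick sketch:

```python
# Spot-check 2n^2 <= c * n^3 with the witnesses c = 1, n0 = 2.
c, n0 = 1, 2
assert all(2 * n**2 <= c * n**3 for n in range(n0, 1000))

# Below n0 the inequality fails (2*1^2 = 2 > 1 = 1^3), as expected:
assert not (2 * 1**2 <= c * 1**3)
print("2n^2 <= n^3 holds for all checked n >= 2")
```

Such a check of course only covers finitely many n; the algebraic argument above is what establishes the bound for all n ≥ n0.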
Big-Oh Notation (O) - Examples
Example 2: Prove that (1/3)n² − 3n ∈ O(n²)
Proof:
We have f(n) = (1/3)n² − 3n, and g(n) = n²
Is f(n) ∈ O(g(n))?
We have to find constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0:
(1/3)n² − 3n ≤ c·n²  if  c ≥ 1/3 − 3/n, which holds for c = 1/3 and n > 1
Hence f(n) ∈ O(g(n)).
Example 3: Prove that n³ ∉ O(n²)
Suppose, for contradiction, that there exist constants c and n0 with
0 ≤ n³ ≤ c·n² for all n ≥ n0
⇒ n ≤ c
Since c is a fixed constant while n grows without bound, n ≤ c cannot hold for all n ≥ n0.
Hence our assumption is wrong, and n³ ≤ c·n² for all n ≥ n0 is not true for any
combination of c and n0.
Therefore, n³ ∉ O(n²).
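The contradiction can be made concrete: for any fixed c, the inequality n³ ≤ c·n² already fails at n = c + 1. A small sketch:

```python
# For any fixed constant c, n^3 overtakes c * n^2 as soon as n > c,
# illustrating why n^3 cannot be in O(n^2).
for c in (1, 10, 1000):
    n = c + 1               # the first integer where n <= c fails
    assert n**3 > c * n**2  # so the claimed bound breaks here
print("for each fixed c, n^3 exceeds c*n^2 once n > c")
```

No matter how large a constant is proposed, the bound is eventually violated, which is exactly what membership in O(n²) forbids.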
Some More Examples
1. n² + n³ = O(n⁴)
2. n²/log(n) = O(n·log n)
3. 5n + log(n) = O(n)
4. n^(log n) = O(n^100)
5. 3^n = O(2^n · n^100)
6. n! = O(3^n)
7. n + 1 = O(n)
8. 2^(n+1) = O(2^n)
9. (n+1)! = O(n!)
10. 1 + c + c² + … + c^n = O(c^n) for c > 1
11. 1 + c + c² + … + c^n = O(1) for c < 1
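Taking item 8 as an example: since 2^(n+1) = 2·2^n, the constant c = 2 (with n0 = 1) witnesses the bound directly. A quick numeric check (a sketch, not part of the original slides):

```python
# 2^(n+1) = 2 * 2^n exactly, so c = 2 and n0 = 1 witness 2^(n+1) in O(2^n).
c = 2
assert all(2 ** (n + 1) <= c * 2 ** n for n in range(1, 200))
print("2^(n+1) <= 2 * 2^n for all checked n")
```

This is the same pattern as the earlier proofs: exhibit explicit constants c and n0, then verify the inequality.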
Big-Omega Notation (Ω) – Lower Bounds
For a given function g(n), Ω(g(n)) denotes the set of functions
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
Intuitively:
Set of all functions whose rate of growth is the same as or higher
than that of g(n).
Big-Omega Notation (Ω)
f(n) ∈ Ω(g(n)) means that f(n) grows at least as fast as g(n) for large n.
Theta Notation
f(n) ∈ Θ(g(n))
This means that f(n) and g(n) have essentially the same growth rate for large n.
f(n) = 8n² + 2n − 3
Our informal rule of keeping the largest term and ignoring the constants
suggests that f(n) ∈ Θ(n²). Let's see why this bears out formally.
We need to show two things for f(n) = 8n² + 2n − 3:
1) Lower bound: f(n) = 8n² + 2n − 3 grows asymptotically at least as
fast as n²,
2) Upper bound: f(n) grows asymptotically no faster than n²
Theta Notation
Example (…continued)
f(n) = 8n² + 2n − 3
1) Lower bound: f(n) grows asymptotically at least as fast as n².
For this, we need to show that there exist positive constants c1 and n0,
such that f(n) ≥ c1·n² for all n ≥ n0.
Consider the reasoning:
f(n) = 8n² + 2n − 3
≥ 8n² − 3 = 7n² + (n² − 3) ≥ 7n²
Thus c1 = 7. We implicitly assumed that 2n ≥ 0 and n² − 3 ≥ 0.
These are not true for all n, but both hold if n ≥ √3.
Therefore, selecting n0 ≥ √3, we have f(n) ≥ c1·n² for all n ≥ n0.
Theta Notation
Example (…continued)
f(n) = 8n² + 2n − 3
2) Upper bound: f(n) grows asymptotically no faster than n².
For this, we need to show that there exist positive constants c2 and n0, such
that f(n) ≤ c2·n² for all n ≥ n0.
Consider the reasoning:
f(n) = 8n² + 2n − 3
≤ 8n² + 2n
≤ 8n² + 2n²
= 10n²
Thus c2 = 10. We implicitly assumed that 2n ≤ 2n², which is not true for all
n but holds for n ≥ 1.
Therefore, selecting n0 ≥ 1, we have f(n) ≤ c2·n² for all n ≥ n0.
Theta Notation
Example (…continued)
f(n) = 8n² + 2n − 3
From the lower bound we have n0 ≥ √3.
From the upper bound we have n0 ≥ 1.
Combining the two, we let n0 be the larger of the two, i.e., n0 ≥ √3.
In conclusion, if we let c1 = 7, c2 = 10 and n0 = √3,
we have
7n² ≤ 8n² + 2n − 3 ≤ 10n² for all n ≥ √3
Thus, we have established that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
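The constants just derived (c1 = 7, c2 = 10, n0 = √3) can be sanity-checked numerically; a minimal sketch:

```python
import math

# Verify 7*n^2 <= 8*n^2 + 2*n - 3 <= 10*n^2 for integer n >= ceil(sqrt(3)) = 2.
c1, c2 = 7, 10
n0 = math.ceil(math.sqrt(3))  # smallest integer n satisfying n >= sqrt(3)
for n in range(n0, 10_000):
    f = 8 * n**2 + 2 * n - 3
    assert c1 * n**2 <= f <= c2 * n**2
print("7n^2 <= f(n) <= 10n^2 holds for all checked n >=", n0)
```

For instance, at n = 2 the sandwich reads 28 ≤ 33 ≤ 40, matching the proof's claim.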
Usefulness of Notation
It is not always possible to characterize the behaviour of an algorithm
using Θ-notation.
For example, given a problem with n inputs, we may have an
algorithm that solves it in c1·n² time when n is even and c2·n time
when n is odd. OR
We may prove that an algorithm never uses more than c1·n²
time and never less than c2·n time.
In either case we can claim neither Θ(n) nor Θ(n²) as the order
of the time usage of the algorithm.
Big-O and Ω notation will allow us to give at least partial
information.