o-Notation: for a given function g(n), we denote by o(g(n)) the set of functions

This document discusses big O notation and describes how it is used to analyze the time complexity of algorithms. It provides examples of common time complexities such as constant time O(1), logarithmic time O(log n), linear time O(n), quadratic time O(n^2), and exponential time O(2^n). It also discusses how complexity classes such as polynomial time and exponential time compare to each other. The document concludes with an example of using summation notation to calculate the time complexity of an algorithm to find the maximum subvector sum.

o-notation

• For a given function g(n), we denote by o(g(n)) the set of functions:
  o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant
  n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }

• f(n) becomes insignificant relative to g(n) as n approaches infinity:
  lim_{n→∞} [f(n) / g(n)] = 0

• We say g(n) is an upper bound for f(n) that is not asymptotically tight.

1
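As a quick check of the limit characterization, take f(n) = n and g(n) = n²:
lim_{n→∞} n/n² = lim_{n→∞} 1/n = 0, so n = o(n²). Concretely, for any c > 0,
choosing n₀ > 1/c gives n < c·n² for all n ≥ n₀.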
O(·) versus o(·)

O(g(n)) = { f(n) : there exist positive constants c and n₀ such that
            0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }.
o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant
            n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }.

Thus f(n) = o(g(n)) is a strictly stronger statement than f(n) = O(g(n)):
the bound must hold for every positive constant c, not just for some c.

For example:   n² = O(n²)      n² ∉ o(n²)
               n² = O(n³)      n² = o(n³)

2
o-notation
• n^1.9999 = o(n²)
• n² / lg n = o(n²)
• n² ∉ o(n²)   (just as 2 < 2 is false)
• n² / 1000 ∉ o(n²)

3
ω-notation
• For a given function g(n), we denote by ω(g(n)) the set of functions
  ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant
  n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }

• f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
  lim_{n→∞} [f(n) / g(n)] = ∞

• We say g(n) is a lower bound for f(n) that is not asymptotically tight.

4
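As a quick check of this limit characterization, take f(n) = n² lg n and g(n) = n²:
lim_{n→∞} (n² lg n)/n² = lim_{n→∞} lg n = ∞, so n² lg n = ω(n²).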
ω-notation
• n^2.0001 = ω(n²)
• n² lg n = ω(n²)
• n² ∉ ω(n²)

5
Comparison of Functions

fg  ab

f (n) = O(g(n))  a  b
f (n) = (g(n))  a  b
f (n) = (g(n))  a = b
f (n) = o(g(n))  a < b
f (n) = (g(n))  a > b

6
Properties
• Transitivity
  f(n) = Θ(g(n)) & g(n) = Θ(h(n))  ⇒  f(n) = Θ(h(n))
  f(n) = O(g(n)) & g(n) = O(h(n))  ⇒  f(n) = O(h(n))
  f(n) = Ω(g(n)) & g(n) = Ω(h(n))  ⇒  f(n) = Ω(h(n))

• Symmetry
  f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))

• Transpose Symmetry
  f(n) = O(g(n)) if and only if g(n) = Ω(f(n))

7
Practical Complexities
• Is O(n²) too much time?
• Is the algorithm practical?

  n        n         n log n    n²         n³
  1000     1 μs      10 μs      1 ms       1 s
  10⁴      10 μs     130 μs     100 ms     17 min
  10⁶      1 ms      20 ms      17 min     32 years

  (at a CPU speed of 10⁹ instructions per second)


8
Impractical Complexities
  n        n⁴              n¹⁰                2ⁿ
  1000     17 min          3.2 × 10¹³ years   3.2 × 10²⁸³ years
  10⁴      116 days        ???                ???
  10⁶      3 × 10⁷ years   ???                ???

  (at a CPU speed of 10⁹ instructions per second)
9
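The entries in the two tables above can be reproduced mechanically. Below is a
minimal Python sketch (names are illustrative; the 10⁹ instructions/second rate
is the slides' assumption) that converts an operation count into a readable
running time:

    import math

    CPU_SPEED = 1e9  # instructions per second, as assumed on the slides

    def pretty(seconds):
        """Express a duration in the largest convenient unit."""
        if seconds < 1e-3:
            return f"{seconds * 1e6:.0f} μs"
        if seconds < 1:
            return f"{seconds * 1e3:.0f} ms"
        if seconds < 60:
            return f"{seconds:.1f} s"
        if seconds < 3600:
            return f"{seconds / 60:.0f} min"
        if seconds < 86400:
            return f"{seconds / 3600:.0f} h"
        if seconds < 86400 * 365:
            return f"{seconds / 86400:.0f} days"
        return f"{seconds / (86400 * 365):.0f} years"

    for n in (1000, 10**4, 10**6):
        for name, ops in (("n", n), ("n log n", n * math.log2(n)),
                          ("n^2", n**2), ("n^3", n**3), ("n^4", n**4)):
            print(f"n = {n:>7}  {name:>7}: {pretty(ops / CPU_SPEED)}")

For example, the n³ row for n = 10⁶ comes out to roughly 32 years, matching the
table above.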
Some Common Names for Complexity

  Constant time      O(1)
  Logarithmic time   O(log n)
  Log-squared time   O(log² n)
  Linear time        O(n)
  Quadratic time     O(n²)
  Cubic time         O(n³)
  Polynomial time    O(nⁱ) for some constant i
  Exponential time   O(2ⁿ)

10
Growth Rates of Some Functions

• Polynomial functions:
  O(log n) ⊂ O(log² n) ⊂ O(√n) ⊂ O(n)
  ⊂ O(n log n) ⊂ O(n log² n) ⊂ O(n^1.5) ⊂ O(n²)
  ⊂ O(n³) ⊂ O(n⁴)
  O(nᶜ) = O(2^(c·log n)) for any constant c, e.g., O(n) = O(2^(log₂ n))

• Exponential functions:
  O(2ⁿ) ⊂ O(3ⁿ) ⊂ O(4ⁿ) ⊂ O(n!) ⊂ O(nⁿ)

11
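As a small numerical illustration of this ordering (a sketch; the sample points
are arbitrary), evaluating a few of the functions at increasing n shows each
class overtaking the previous one:

    import math

    functions = [
        ("log n",   lambda n: math.log2(n)),
        ("log^2 n", lambda n: math.log2(n) ** 2),
        ("sqrt n",  lambda n: math.sqrt(n)),
        ("n",       lambda n: float(n)),
        ("n log n", lambda n: n * math.log2(n)),
        ("n^1.5",   lambda n: n ** 1.5),
        ("n^2",     lambda n: float(n ** 2)),
        ("n^3",     lambda n: float(n ** 3)),
    ]

    for n in (2 ** 10, 2 ** 20):
        print(f"n = {n}")
        for name, f in functions:
            print(f"  {name:>8} = {f(n):.3e}")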
Effect of Multiplicative Constant
[Figure: run time versus n (0 to 25) for f(n) = n² and f(n) = 10·n. The
multiplicative constant matters only for small inputs: 10·n exceeds n² until
n = 10, after which n² dominates.]
12
Exponential Functions
• Exponential functions increase rapidly; e.g., 2ⁿ doubles whenever n is
  increased by 1.

  n     2ⁿ (≈)    1 μs × 2ⁿ
  10    10³       0.001 s
  20    10⁶       1 s
  30    10⁹       16.7 min
  40    10¹²      11.6 days
  50    10¹⁵      31.7 years
  60    10¹⁸      31710 years

13
Practical Complexity
[Figure: f(n) = log n, n, n log n, n², n³, and 2ⁿ plotted for n = 1 to 20, with
the y-axis limited to 250.]

14
Practical Complexity
[Figure: the same six functions with the y-axis extended to 500.]

15
Practical Complexity
1000

f(n) = n
f(n) = log(n)
f(n) = n log(n)
f(n) = n^2
f(n) = n^3
f(n) = 2^n

0
1 3 5 7 9 11 13 15 17 19

16
Practical Complexity
5000

4000
f(n) = n
f(n) = log(n)
3000
f(n) = n log(n)
f(n) = n^2
2000 f(n) = n^3
f(n) = 2^n

1000

0
1 3 5 7 9 11 13 15 17 19

17
Floors & Ceilings
• For any real number x, we denote the greatest integer less than or equal to x
  by ⌊x⌋, read "the floor of x".

• For any real number x, we denote the least integer greater than or equal to x
  by ⌈x⌉, read "the ceiling of x".

• For all real x (e.g., for x = 4.2: ⌊x⌋ = 4, ⌈x⌉ = 5):
  x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1

• For any integer n:
  ⌈n/2⌉ + ⌊n/2⌋ = n

18
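A tiny check of the last identity (a sketch using Python's math.floor and
math.ceil; the tested range is arbitrary):

    import math

    for n in range(-10, 11):
        # ceil(n/2) + floor(n/2) reassembles n for even and odd integers alike
        assert math.ceil(n / 2) + math.floor(n / 2) == n
    print("ceil(n/2) + floor(n/2) == n verified for n in [-10, 10]")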
Polynomials
• Given a positive integer d, a polynomial in n of degree d is a function P(n)
  of the form

    P(n) = Σ_{i=0}^{d} aᵢ nⁱ

  where the constants a₀, a₁, …, a_d are the coefficients of the polynomial and
  a_d ≠ 0.

• A polynomial is asymptotically positive if and only if a_d > 0, and in that
  case P(n) = Θ(n^d).
19
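To make the definition concrete, the sketch below evaluates P(n) = Σ aᵢnⁱ
directly from its coefficient list (the example polynomial is made up for
illustration):

    def eval_poly(coeffs, n):
        """Evaluate P(n) = sum(a_i * n**i), given coeffs = [a_0, a_1, ..., a_d]."""
        return sum(a * n ** i for i, a in enumerate(coeffs))

    # Hypothetical example: P(n) = 3 + 2n + 5n^2, so d = 2 and a_d = 5 > 0
    coeffs = [3, 2, 5]
    for n in (10, 100, 1000):
        # The leading term 5n^2 dominates as n grows, consistent with P(n) = Θ(n^2)
        print(n, eval_poly(coeffs, n), 5 * n ** 2)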
Exponents
• x⁰ = 1,  x¹ = x,  x⁻¹ = 1/x
• xᵃ · xᵇ = x^(a+b)
• xᵃ / xᵇ = x^(a−b)
• (xᵃ)ᵇ = (xᵇ)ᵃ = x^(a·b)
• xⁿ + xⁿ = 2xⁿ ≠ x^(2n)
• 2ⁿ + 2ⁿ = 2·2ⁿ = 2^(n+1)

20
Logarithms (1)
• In computer science, all logarithms are to base 2 unless specified otherwise.
• xᵃ = b  iff  log_x(b) = a
• lg(n) = log₂(n)
• ln(n) = log_e(n)
• lgᵏ(n) = (lg(n))ᵏ
• log_a(b) = log_c(b) / log_c(a)   for any base c > 0, c ≠ 1
• lg(a·b) = lg(a) + lg(b)
• lg(a/b) = lg(a) − lg(b)
• lg(aᵇ) = b · lg(a)
21
Logarithms (2)
• a = b^(log_b(a))
• a^(log_b(n)) = n^(log_b(a))
• lg(1/a) = −lg(a)
• log_b(a) = 1 / log_a(b)
• lg(n) < n for all n > 0
• log_a(a) = 1
• lg(1) = 0,  lg(2) = 1,  lg(1024) = lg(2¹⁰) = 10,  lg(1048576) = lg(2²⁰) = 20
22
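A few of these identities checked numerically (a sketch; the constants 3, 5, and
1024 are arbitrary):

    import math

    a, b, n = 3.0, 5.0, 1024.0

    # Change of base: log_a(b) = log_c(b) / log_c(a), here with c = 10
    assert math.isclose(math.log(b, a), math.log10(b) / math.log10(a))
    # a^(log_b n) = n^(log_b a)
    assert math.isclose(a ** math.log(n, b), n ** math.log(a, b))
    # log_b(a) = 1 / log_a(b)
    assert math.isclose(math.log(a, b), 1 / math.log(b, a))
    print("logarithm identities verified")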
Summation
• Why do we need to know this?
  Because we need it for computing the running time of a given algorithm.

• Example: Maximum Sub-vector
  Given an array a[1…n] of numeric values (which may be positive, zero, or
  negative), determine the sub-vector a[i…j] (1 ≤ i ≤ j ≤ n) whose sum of
  elements is maximum over all sub-vectors.

23
Example: Max Sub-vector

  MaxSubvector(a, n) {
      maxsum = 0;
      for i = 1 to n {
          for j = i to n {
              sum = 0;
              for k = i to j { sum += a[k] }
              maxsum = max(sum, maxsum);
          }
      }
      return maxsum;
  }

  T(n) = Σ_{i=1}^{n} Σ_{j=i}^{n} Σ_{k=i}^{j} 1
24
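A direct transcription of this pseudocode into Python (a sketch, using 0-based
indexing; the sample array is only for illustration). It performs exactly the
work counted by the triple summation, so it runs in Θ(n³) time:

    def max_subvector(a):
        """Maximum sum over all contiguous sub-vectors of a (0 for the empty one)."""
        n = len(a)
        maxsum = 0
        for i in range(n):
            for j in range(i, n):
                # Recompute the sum of a[i..j] from scratch, as in the pseudocode
                total = 0
                for k in range(i, j + 1):
                    total += a[k]
                maxsum = max(maxsum, total)
        return maxsum

    print(max_subvector([31, -41, 59, 26, -53, 58, 97, -93, -23, 84]))  # prints 187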
Summation

  Σ_{k=1}^{n} k = 1 + 2 + … + n = n(n+1)/2 = Θ(n²)

  Σ_{k=0}^{n} xᵏ = 1 + x + x² + … + xⁿ = (x^(n+1) − 1) / (x − 1),   x ≠ 1

  Σ_{k=1}^{n} (c·aₖ + bₖ) = c · Σ_{k=1}^{n} aₖ + Σ_{k=1}^{n} bₖ

  Σ_{k=1}^{n} (aₖ − aₖ₋₁) = aₙ − a₀    for any a₀, a₁, …, aₙ (telescoping)

  Σ_{k=0}^{n−1} (aₖ − aₖ₊₁) = a₀ − aₙ   for any a₀, a₁, …, aₙ (telescoping)

25
Summation
• Constant series: for integers a ≤ b,
    Σ_{i=a}^{b} 1 = b − a + 1

• Quadratic series: for n ≥ 0,
    Σ_{i=1}^{n} i² = 1² + 2² + … + n² = (2n³ + 3n² + n) / 6

• Linear-geometric series: for n ≥ 0 and c ≠ 1,
    Σ_{i=1}^{n} i·cⁱ = c + 2c² + … + n·cⁿ = [n·c^(n+2) − (n+1)·c^(n+1) + c] / (c − 1)²

26
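These closed forms are easy to sanity-check by comparing them against direct
summation (a sketch; n and c are arbitrary small values):

    def check(n, c=3):
        assert sum(range(1, n + 1)) == n * (n + 1) // 2
        assert sum(k * k for k in range(1, n + 1)) == (2 * n**3 + 3 * n**2 + n) // 6
        assert sum(c**k for k in range(n + 1)) == (c**(n + 1) - 1) // (c - 1)
        # Linear-geometric: sum of i*c^i for i = 1..n
        assert sum(i * c**i for i in range(1, n + 1)) == \
            (n * c**(n + 2) - (n + 1) * c**(n + 1) + c) // (c - 1) ** 2

    for n in (1, 2, 5, 10):
        check(n)
    print("summation formulas verified")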
Series

27
Proof of Geometric series
A geometric series is one in which each term is a constant multiple of the
previous one; when that ratio is less than 1 in absolute value, the sum
approaches a fixed number as n tends to infinity.
Proofs for geometric series are done by cancellation (telescoping), as sketched
below.

28
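Sketch of the cancellation argument for S = 1 + x + x² + … + xⁿ (x ≠ 1):

  x·S = x + x² + … + xⁿ + x^(n+1)
  x·S − S = x^(n+1) − 1            (every intermediate term cancels)
  S = (x^(n+1) − 1) / (x − 1)

  and when |x| < 1, S → 1/(1 − x) as n → ∞.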
Factorials
• n! ("n factorial") is defined for integers n ≥ 0 as:
    n! = 1                  if n = 0
    n! = n · (n − 1)!       if n > 0

• Equivalently, n! = 1 · 2 · 3 · … · n

• n! < nⁿ for n ≥ 2

29
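The recursive definition translates directly into code (a minimal sketch, also
checking the bound n! < nⁿ for small n):

    def factorial(n):
        """n! defined recursively: 1 if n == 0, else n * (n - 1)!."""
        return 1 if n == 0 else n * factorial(n - 1)

    for n in range(2, 12):
        assert factorial(n) < n ** n  # n! < n^n holds for n >= 2

    print(factorial(5))  # prints 120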
