
Algorithm Design and Analysis

Faculty of Information Technology - Computer Science Department 1




Chapter 2 – Part 2

Asymptotic Analysis and Recurrence



Asymptotic Analysis

• Algorithm analysis measures the amount of resources used by an algorithm:
- Space
- Running time (the number of primitive operations (steps) executed before termination)

• How do we compare algorithms?
- Express running time as a function of the input size n (i.e., f(n)).
- Compare the different functions corresponding to the running times.

• To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how fast each function grows.
• Compare functions in the limit, asymptotically (for large values of n).
• Use the rate of growth, which expresses the behavior of a function toward infinity.


Rate of Growth

• The low-order terms in a function are relatively insignificant for large n:

n^4 + 100n^2 + 10n + 50 ≈ n^4

i.e., we say that n^4 + 100n^2 + 10n + 50 and n^4 have the same rate of growth.
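This can be checked numerically (a quick sketch, not from the slides): the ratio of the full polynomial to its leading term approaches 1 as n grows.

```python
# Ratio of the full polynomial to its leading term: the low-order
# terms matter less and less as n grows, so the ratio tends to 1.
def f(n):
    return n**4 + 100 * n**2 + 10 * n + 50

for n in [10, 100, 1000, 10000]:
    print(n, f(n) / n**4)
```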



Common Functions

Typical growth rates, from slowest to fastest: 1 (constant), log n, n, n log n, n^2, n^3, 2^n, n!.


Practical Complexities

• Is O(n2) too much time?


• Is the algorithm practical?

At a CPU speed of 10^9 instructions/second
Hierarchy of Functions



Asymptotic Notations

• A way to describe the behavior of functions in the limit
• How we indicate the running times of algorithms
• Describe the running time of an algorithm as n grows to ∞

• O notation: asymptotic “less than”: f(n) “≤” g(n)
• Ω notation: asymptotic “greater than”: f(n) “≥” g(n)
• Θ notation: asymptotic “equality”: f(n) “=” g(n)
Asymptotic Notations – Big-O Notation (Upper Bound – Worst Case)

• Intuitively: O(g(n)) = the set of functions with a smaller or the same order of growth as g(n)
• f(n) = O(g(n)) iff:

0 ≤ lim_{n→∞} f(n)/g(n) < ∞
Examples

• 2n^2 = O(n^3) because
2n^2 ≤ cn^3 ⟺ 2 ≤ cn, which holds for c = 1 and n0 = 2

• n^2 = O(n^2) because
n^2 ≤ cn^2 holds for any c ≥ 1; take c = 1 and n0 = 1

• 1000n^2 + 1000n = O(n^2) because
1000n^2 + 1000n ≤ 1000n^2 + n^2 = 1001n^2 for n ≥ 1000, which holds for c = 1001 and n0 = 1000

• (1/3)n^2 − 3n = O(n^2) because
(1/3)n^2 − 3n ≤ (1/3)n^2, which holds for c = 1/3 and n0 = 1
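The witness constants (c, n0) above can be spot-checked numerically. A small sketch (the helper `holds` is not from the slides):

```python
# Check that f(n) <= c * g(n) for all sampled n >= n0 -- a numeric
# sanity check of a Big-O witness pair, not a proof.
def holds(f, g, c, n0, upto=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, upto))

assert holds(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2)      # 2n^2 = O(n^3)
assert holds(lambda n: n**2, lambda n: n**2, c=1, n0=1)          # n^2 = O(n^2)
assert holds(lambda n: 1000 * n**2 + 1000 * n, lambda n: n**2, c=1001, n0=1000)
assert holds(lambda n: n**2 / 3 - 3 * n, lambda n: n**2, c=1 / 3, n0=1)
```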
Asymptotic Notations – Ω-notation (Omega) (Lower Bound – Best Case)

• Intuitively: Ω(g(n)) = the set of functions with a larger or the same order of growth as g(n)
• f(n) = Ω(g(n)) iff:

0 < lim_{n→∞} f(n)/g(n) ≤ ∞
Example

• (1/3)n^2 − 3n = Ω(n^2) because
(1/3)n^2 − 3n ≥ cn^2 if c ≤ 1/3 − 3/n, which is true for c = 1/6 and n > 18.

• k1·n^2 + k2·n + k3 = Ω(n) (lower bound)

• Note: when we say “the running time is Ω(n^2)”, we mean that the best-case running time is Ω(n^2).
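The witnesses c = 1/6, n > 18 can be spot-checked numerically (a sketch; the helper name is hypothetical):

```python
# Does (1/3)n^2 - 3n >= (1/6)n^2 hold for a given n?
def omega_holds(n):
    return n**2 / 3 - 3 * n >= n**2 / 6

# The slide's witnesses: true for all sampled n > 18.
assert all(omega_holds(n) for n in range(19, 5000))
```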
Asymptotic Notations (cont.) – Θ-notation

• Intuitively: Θ(g(n)) = the set of functions with the same order of growth as g(n)
• f(n) = Θ(g(n)) iff:

0 < lim_{n→∞} f(n)/g(n) < ∞
Example

• 6n log n + n^(1/2) log n = Θ(n log n)

We need to find n0, c1, c2 such that
c1·n log n ≤ 6n log n + n^(1/2) log n ≤ c2·n log n for n > n0

Lower bound: c1·n log n ≤ 6n log n + n^(1/2) log n ⟺ c1 ≤ 6 + (log n / n^(1/2)); choosing c1 = 6 and n0 = 1 works.

Upper bound: 6n log n + n^(1/2) log n ≤ c2·n log n ⟺ 6 + (log n / n^(1/2)) ≤ c2; since log n / n^(1/2) ≤ 2 for all n ≥ 1, c2 = 8 works.

So c1 = 6, c2 = 8, n0 = 1 works.
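The sandwich can be verified numerically (a sketch, taking log to base 2):

```python
import math

# Check c1 * n log n <= 6n log n + sqrt(n) log n <= c2 * n log n
# with c1 = 6, c2 = 8, over a range of n.
def f(n):
    return 6 * n * math.log2(n) + math.sqrt(n) * math.log2(n)

for n in range(2, 10_000):
    assert 6 * n * math.log2(n) <= f(n) <= 8 * n * math.log2(n)
```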
Examples (Cont…)

• n, n+1, n+80, 40n, n + log n, log n, 100 are O(n)
• n^1.1 + 10000000000n, n+1 are O(n^1.1)
• n^2, n log n, 100, 2n + 2 are O(n^2)
• 3n^2 + 6n + log n + 24.5 is O(n^2)

• n, n+1, n+80, 40n, n^2, n log n are Ω(n)
• n^1.1 + 1000000n, 2n^2 are Ω(n^1.1)
• 3n^2 + 6n + log n, 24.5n^3 are Ω(n^2)

• 10n^2 + 4n + 2 is Θ(n^2)
• 6·2^n + n^2 is Θ(2^n)
Logarithms

• In algorithm analysis we often use the notation “log n” without specifying the base.

Binary logarithm: lg n = log_2 n
Natural logarithm: ln n = log_e n
log^k n = (log n)^k
log log n = log(log n)

log x^y = y log x
log(xy) = log x + log y
log(x/y) = log x − log y
log_b x = (log_a x)/(log_a b)
a^(log_b x) = x^(log_b a)
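The identities can be spot-checked with `math.log` (a sketch; floating point, so compare with a tolerance):

```python
import math

x, y, a, b = 5.0, 7.0, 2.0, 3.0
close = lambda p, q: abs(p - q) < 1e-9

assert close(math.log(x**y), y * math.log(x))                    # log x^y = y log x
assert close(math.log(x * y), math.log(x) + math.log(y))         # log(xy)
assert close(math.log(x / y), math.log(x) - math.log(y))         # log(x/y)
assert close(math.log(x, b), math.log(x, a) / math.log(b, a))    # base change
assert close(a ** math.log(x, b), x ** math.log(a, b))           # a^(log_b x) = x^(log_b a)
```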
Recurrence Relation
Forming Recurrence Relation



Forming Recurrence Relation (Cont..)



Recurrence Examples
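The slide's own examples are figures that did not survive extraction; as a hypothetical illustration of forming a recurrence, consider summing an array by splitting it in half. Each call does constant work plus two recursive calls on halves, giving T(n) = 2T(n/2) + c, which solves to Θ(n).

```python
# Divide-and-conquer sum over arr[lo:hi]. Indices are passed instead
# of slicing so each call really does only constant work, matching
# the recurrence T(n) = 2T(n/2) + c.
def rec_sum(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr)
    n = hi - lo
    if n == 0:          # base case: empty range
        return 0
    if n == 1:          # base case: T(1) = c0
        return arr[lo]
    mid = (lo + hi) // 2
    return rec_sum(arr, lo, mid) + rec_sum(arr, mid, hi)  # 2T(n/2) + c
```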
Solving Recurrences
 Methods for solving recurrences
 Iteration method
 Substitution method
 Recurrence tree method
 Master method

 We will discuss only the master method in this course.


The Master Theorem

 Given: a divide and conquer algorithm

 An algorithm that divides a problem of size n into a subproblems, each of size n/b

 Let the cost of each stage (i.e., the work to divide the problem + combine solved subproblems) be described by the function f(n)

 Then, the Master Theorem gives us a cookbook for the algorithm’s running
time:
The Master Theorem

 If T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1, then:

 Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a))
 Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n)
 Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and a·f(n/b) ≤ c·f(n) for some c < 1 and sufficiently large n (the regularity condition), then T(n) = Θ(f(n))
Example 1: T(n) = 4T(n/2) + n

Solution: a = 4, b = 2, f(n) = n, log_b a = 2, thus n^(log_b a) = n^2.
f(n) = n = O(n^(2−ε)) for any 0 < ε ≤ 1, so case 1 applies and T(n) is Θ(n^2).

ε in case 1 is simply the power that can be subtracted from the exponent of n^(log_b a) (here n^2) so that f(n) remains O(n^(log_b a − ε)).
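The theorem's answer can be sanity-checked numerically (a sketch; T(1) = 1 is an assumed base case): if T(n) = Θ(n^2), then T(2n)/T(n) should approach 4.

```python
from functools import lru_cache

# Evaluate T(n) = 4T(n/2) + n exactly via memoized recursion.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1        # assumed base case
    return 4 * T(n // 2) + n

# For Theta(n^2), doubling n should roughly quadruple T(n).
print([T(2 ** (k + 1)) / T(2 ** k) for k in range(10, 15)])
```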
Example 2: T(n) = T(n/2) + 1 (binary search)

Solution: a = 1, b = 2, f(n) = 1, log_b a = 0, thus n^(log_b a) = n^0 = 1.
f(n) = Θ(1) = Θ(n^(log_b a)), so case 2 says T(n) is Θ(log n).
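For reference, a sketch of binary search itself (iterative form): each step halves the search range with constant work, which is exactly what the recurrence T(n) = T(n/2) + 1 describes.

```python
# Return the index of target in sorted arr, or -1 if absent.
def binary_search(arr, target):
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # constant work per step
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1            # discard the lower half
        else:
            hi = mid - 1            # discard the upper half
    return -1
```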


Example 3: T(n) = T(n/3) + n^2 log n

Solution: a = 1, b = 3, f(n) = n^2 log n, log_b a = 0, thus n^(log_b a) = n^0 = 1.
f(n) = Ω(n^(0+ε)) for any 0 < ε ≤ 2, so check the regularity condition:

a·f(n/b) = f(n/3)
= (n/3)^2 log(n/3)
= (1/9) n^2 (log n − log 3)
< (1/9) n^2 log n

so c = 1/9 < 1 holds, case 3 applies, and T(n) is Θ(n^2 log n).

ε in case 3 is simply the power that can be added to the exponent of n^(log_b a) so that f(n) remains Ω(n^(log_b a + ε)).
Example 4: T(n) = 8T(n/2) + n^2.2

Solution: a = 8, b = 2, f(n) = n^2.2, log_2 8 = 3, thus n^(log_b a) = n^3.
f(n) = O(n^(3−ε)) for any 0 < ε ≤ 0.8, so case 1 applies and T(n) is Θ(n^3).
Example 5: T(n) = 9T(n/3) + n^3

Solution: a = 9, b = 3, f(n) = n^3, log_b a = 2, thus n^(log_b a) = n^2.
f(n) = Ω(n^(2+ε)) for any 0 < ε ≤ 1, so case 3 applies and T(n) is Θ(n^3).

(Note: you should verify the regularity condition. Home work.)
Example 6: T(n) = 4T(n/2) + n^3

Solution: a = 4, b = 2, f(n) = n^3, log_b a = 2, thus n^(log_b a) = n^2.
f(n) = Ω(n^(2+ε)) for ε = 1, so case 3 says T(n) is Θ(n^3).

(Note: you should verify the regularity condition. Home work.)
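As with Example 1, this answer can be checked numerically (a sketch; T(1) = 1 is an assumed base case): if T(n) = Θ(n^3), then T(2n)/T(n) should approach 8.

```python
from functools import lru_cache

# Evaluate T(n) = 4T(n/2) + n^3 exactly via memoized recursion.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1        # assumed base case
    return 4 * T(n // 2) + n**3

# For Theta(n^3), doubling n should roughly multiply T(n) by 8.
print([T(2 ** (k + 1)) / T(2 ** k) for k in range(10, 15)])
```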



Example 7: (heap construction)

Home Work
