1. Introduction to Asymptotic Analysis

The document provides an introduction to asymptotic analysis, focusing on the runtime of algorithms and how to classify the growth of functions. It discusses key concepts such as algorithms, programs, complexity analysis, and various notations like Big-O, Big-Omega, and Big-Theta. The goal is to understand how to analyze and compare algorithms based on their efficiency and resource requirements as input sizes increase.


Introduction

to
Asymptotic Analysis
CSE 2218

UMAMA RAHMAN
Today’s Goals
• Discuss the runtime of programs.
• Compute and classify the growth of functions.
• Analyze complexity classes for algorithms.

DSA
 Textbook: Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein

What is an Algorithm?
 An algorithm is a sequence of computational steps that solves a well-specified computational problem.

• An algorithm is said to be correct if, for every input instance, it halts with the correct output.

• An incorrect algorithm might not halt at all on some input instances, or it might halt with other than the desired output.

What is a Program?

 A program is the expression of an algorithm in a programming language
 A set of instructions which the computer will follow to solve a problem

Define a Problem, and Solve It
 Problem:
 Description of Input-Output relationship

 Algorithm:
 A sequence of computational steps that transform the input into
the output.
 Data Structure:
 An organized method of storing and retrieving data.

 Our Task:
 Given a problem, design a correct and good algorithm that solves
it.

Define a Problem, and Solve It
 Problem: Input is a sequence of integers stored in an array. Output the minimum.

INPUT instance: 25, 90, 53, 23, 11, 34

Algorithm:
  m := a[1];
  for i := 2 to size of input
    if m > a[i] then
      m := a[i];
  return m

OUTPUT: 11

Data structure: the array a and the variable m
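As a sketch, the pseudocode above translates directly to Python (the function name `find_min` is my own, and Python is 0-indexed where the slide is 1-indexed):

```python
def find_min(a):
    """Scan once, keeping the smallest element seen so far."""
    m = a[0]                    # m := a[1] on the slide
    for i in range(1, len(a)):  # for i := 2 to size of input
        if m > a[i]:
            m = a[i]
    return m

print(find_min([25, 90, 53, 23, 11, 34]))  # → 11
```

The loop inspects each element exactly once, which is why the running time grows linearly with the input size.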
What do we Analyze?
o Correctness
  o Does the input/output relation match the algorithm requirement?
o Amount of work done (complexity)
  o Basic operations to do the task
o Amount of space used
  o Memory used
o Simplicity, clarity
  o Verification and implementation
o Optimality
  o Is it impossible to do better?

Running Time
 Number of primitive steps that are executed
 Except for the time of executing a function call, most statements require roughly the same amount of time:
 y = m * x + b
 c = 5 / 9 * (t - 32)
 z = f(x) + g(y)

 We can be more exact if we need to be

Asymptotic Performance

 We care most about asymptotic performance

• How does the algorithm behave as the problem size gets very large?
  o Running time
  o Memory/storage requirements
  o Bandwidth/power requirements/logic gates/etc.

Asymptotic Analysis
 Worst case
o Provides an upper bound on running time
o An absolute guarantee of required resources

 Average case
o Provides the expected running time
o Very useful, but treat with care: what is “average”?
o Random (equally likely) inputs
o Real-life inputs

 Best case
o Provides a lower bound on running time

Lower Bound ≤ Running Time ≤ Upper Bound


Upper Bound Notation
 We say Insertion Sort's run time is O(n²)
 Properly we should say the run time is in O(n²)
 Read O as "Big-Oh" (you'll also hear it as "order")

 In general, a function f(n) is O(g(n)) if there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀
 Formally:
 O(g(n)) = { f(n) : ∃ positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n₀ }

Upper Bound Notation
[Figure: plot of time vs. n; the curve f(n) lies below c·g(n) for all n ≥ n₀]

We say g(n) is an asymptotic upper bound for f(n)

Lower Bound Notation
 We say InsertionSort's run time is Ω(n) (read "big-omega", or just "omega")
 In general, a function f(n) is Ω(g(n)) if ∃ positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n₀
 Proof that InsertionSort is Ω(n):
 Suppose the run time is an + b
 Assume a and b are positive
 Then an ≤ an + b, so the definition holds with c = a and n₀ = 1

Lower Bound Notation
[Figure: plot of time vs. n; the curve f(n) lies above c·g(n) for all n ≥ n₀]

We say g(n) is an asymptotic lower bound for f(n)

Asymptotic Tight Bound
 A function f(n) is Θ(g(n)) if ∃ positive constants c₁, c₂, and n₀ such that

0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n)  ∀ n ≥ n₀

 Theorem
 f(n) is Θ(g(n)) iff f(n) is both O(g(n)) and Ω(g(n))

Asymptotic Tight Bound
[Figure: plot of time vs. n; the curve f(n) lies between c₁·g(n) and c₂·g(n) for all n ≥ n₀]

We say g(n) is an asymptotic tight bound for f(n)


Asymptotic Notation
• O notation: asymptotic "less than":

f(n) = O(g(n)) implies: f(n) "≤" g(n)

• Ω notation: asymptotic "greater than":

f(n) = Ω(g(n)) implies: f(n) "≥" g(n)

• Θ notation: asymptotic "equality":

f(n) = Θ(g(n)) implies: f(n) "=" g(n)

Practical Complexity
Function   Descriptor    Big-Oh
c          Constant      O(1)
log n      Logarithmic   O(log n)
n          Linear        O(n)
n log n    n log n       O(n log n)
n²         Quadratic     O(n²)
n³         Cubic         O(n³)
nᵏ         Polynomial    O(nᵏ)
2ⁿ         Exponential   O(2ⁿ)
n!         Factorial     O(n!)
Practical Complexity
For large input sizes, constant factors are insignificant
Program A with running time TA(n) = 100n
Program B with running time TB(n) = 2n²

[Figure: TA(n) and TB(n) cross at n = 50, where both equal 5000; for n > 50, Program A is faster]
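The crossover point is easy to locate by brute force; a small sketch (my own, using the slide's TA and TB):

```python
def T_A(n):
    return 100 * n      # Program A

def T_B(n):
    return 2 * n * n    # Program B

# Find the first input size where A is no slower than B.
n = 1
while T_A(n) > T_B(n):
    n += 1
print(n, T_A(n), T_B(n))  # → 50 5000 5000
```

Below n = 50 the smaller constant factor wins; beyond it, the quadratic growth of TB dominates no matter the constants.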
Analysis of Algorithms
 What is the goal of analysis of algorithms?
 To compare algorithms mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.)
 What do we mean by running time analysis?
 Determine how running time increases as the size of the problem
increases.
Input Size
 Input size (number of elements in the input)

 size of an array

 # of elements in a matrix

 # of bits in the binary representation of the input

 vertices and edges in a graph


How do we compare algorithms?
(1) Compare execution times?
Not good: times are specific to a particular computer!

(2) Count the number of statements executed?
Not good: the number of statements varies with the programming language as well as the style of the individual programmer.
Ideal Solution

 Express running time as a function of the input size n (i.e., f(n)).
 Compare different functions corresponding to running times.
 Such an analysis is independent of machine time, programming style, etc.
Rate of Growth
 The low-order terms in a function are relatively insignificant for large n

   n⁴ + 100n² + 10n + 50 ~ n⁴

 i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth
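This can be checked numerically; the sketch below (my own, not from the slides) shows the ratio of the full polynomial to n⁴ approaching 1 as n grows:

```python
def f(n):
    """The slide's example polynomial."""
    return n**4 + 100 * n**2 + 10 * n + 50

# The ratio f(n) / n^4 tends to 1: the low-order terms fade out.
for n in (10, 100, 1000):
    print(n, f(n) / n**4)
```

At n = 10 the low-order terms still double the total, but by n = 1000 they contribute roughly 0.01%.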
Big-O Notation
 We say fA(n) = 30n + 8 is order n, or O(n)
   It is, at most, roughly proportional to n.
 fB(n) = n² + 1 is order n², or O(n²)
   It is, at most, roughly proportional to n².
 In general, any O(n²) function is faster-growing than any O(n) function.
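One way to make a claim like fA(n) = 30n + 8 = O(n) concrete is to exhibit witness constants. The choices c = 31 and n₀ = 8 are my own, picked because 30n + 8 ≤ 31n exactly when n ≥ 8; the sketch below spot-checks the inequality on a finite range (a sanity check, not a proof):

```python
c, n0 = 31, 8  # hypothetical witness constants for 30n + 8 <= c*n when n >= n0

# Check the defining inequality of Big-O for many values of n.
assert all(30 * n + 8 <= c * n for n in range(n0, 10_000))
print("30n + 8 <= 31n holds for all tested n >= 8")
```

Any c > 30 works with a suitable n₀; the definition only asks for one valid pair.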
Example
 Associate a "cost" with each statement.
 Find the "total cost" by finding the total number of times each statement is executed.

Algorithm 1                Cost
arr[0] = 0;                c1
arr[1] = 0;                c1
arr[2] = 0;                c1
...
arr[N-1] = 0;              c1
---------------------------------
c1 + c1 + ... + c1 = c1 x N

Algorithm 2                Cost
for(i=0; i<N; i++)         c2
    arr[i] = 0;            c1
---------------------------------
(N+1) x c2 + N x c1 = (c2 + c1) x N + c2

Both algorithms are of the same order: O(N)


Another Example
 Algorithm 3                   Cost
sum = 0;                       c1
for(i=0; i<N; i++)             c2
    for(j=0; j<N; j++)         c2
        sum += arr[i][j];      c3
--------------------------------------
c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N² = O(N²)
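Algorithm 3 runs as-is once wrapped in a function; the nested loops are what produce the N² term. A minimal runnable version (the wrapper and test matrix are my own additions):

```python
def sum_matrix(arr):
    """Sum all entries of an N x N matrix, mirroring Algorithm 3."""
    N = len(arr)
    total = 0                     # c1: executed once
    for i in range(N):            # c2: outer loop runs N times
        for j in range(N):        # c2: inner loop runs N times per outer pass
            total += arr[i][j]    # c3: body executes N^2 times in total
    return total

arr = [[i * 3 + j for j in range(3)] for i in range(3)]  # entries 0..8
print(sum_matrix(arr))  # → 36
```

The body count N² dominates the loop-test counts, which is why the total is O(N²).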
Big-O Visualization
O(g(n)) is the set of functions with smaller or same order of growth as g(n)

More Examples …
 n⁴ + 100n² + 10n + 50 is O(n⁴)
 10n³ + 2n² is O(n³)
 n³ - n² is O(n³)
 2n² = O(n³)
 n² = O(n²)
 1000n² + 1000n = O(n²)
 n = O(n²)
 Constants:
 10 is O(1)
 1273 is O(1)
Examples of big-omega
 5n² = Ω(n)
   ∃ c, n₀ such that 0 ≤ cn ≤ 5n² ∀ n ≥ n₀; this holds for c = 5 and n₀ = 1, since 5n ≤ 5n² for n ≥ 1
 100n + 5 ≠ Ω(n²)
 n = Ω(n), n³ = Ω(n²), n = Ω(log n)


Relations Between Different Sets

 Subset relations between order-of-growth sets (all are subsets of the set of functions R → R):

Θ(f) ⊆ O(f) and Θ(f) ⊆ Ω(f); in fact Θ(f) = O(f) ∩ Ω(f)
Examples of theta notation
 f(n) = n²/2 - n/2 = Θ(n²)

 We can express f(n) = O(n²) and f(n) = Ω(n²)

 f(n) = 6n³ ≠ Θ(n²)

 6n³ ≠ O(n²) even though 6n³ = Ω(n²)

 log n ≠ Θ(n)

 log n = O(n) but log n ≠ Ω(n)


Logarithms and properties
 In algorithm analysis we often use the notation "log n" without specifying the base

Notation:
  Binary logarithm:   lg n = log₂ n
  Natural logarithm:  ln n = logₑ n
  lgᵏ n = (lg n)ᵏ
  lg lg n = lg(lg n)

Properties:
  log xʸ = y log x
  log(xy) = log x + log y
  log(x/y) = log x - log y
  a^(log_b x) = x^(log_b a)
  log_b x = log_a x / log_a b

Growth ordering:
  1 < log n < √n < n < n log n < n² < n² log n < n³ < … < 2ⁿ < 3ⁿ < … < nⁿ
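The identities above are easy to sanity-check numerically; the sketch below (my own, with arbitrary positive test values) verifies each one with `math.isclose`:

```python
import math

x, y, a, b = 8.0, 32.0, 2.0, 10.0   # arbitrary positive test values
log = math.log                       # natural log; the identities hold for any fixed base

assert math.isclose(log(x**3), 3 * log(x))                      # log x^y = y log x
assert math.isclose(log(x * y), log(x) + log(y))                # log(xy) = log x + log y
assert math.isclose(log(x / y), log(x) - log(y))                # log(x/y) = log x - log y
assert math.isclose(math.log(x, b), log(x) / log(b))            # change of base
assert math.isclose(a ** math.log(x, b), x ** math.log(a, b))   # a^(log_b x) = x^(log_b a)
print("all log identities check out")
```

The change-of-base rule is the one used in practice: it lets us write "log n" in asymptotic bounds without naming a base, since different bases differ only by a constant factor.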
More Examples
 For each of the following pairs of functions, either f(n) is O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct.

 f(n) = log n²; g(n) = log n + 5        f(n) = Θ(g(n))
 f(n) = n; g(n) = log n²                f(n) = Ω(g(n))
 f(n) = log log n; g(n) = log n         f(n) = O(g(n))
 f(n) = n; g(n) = log² n                f(n) = Ω(g(n))
 f(n) = n log n + n; g(n) = log n       f(n) = Ω(g(n))
 f(n) = 10; g(n) = log 10               f(n) = Θ(g(n))
 f(n) = 2ⁿ; g(n) = 10n²                 f(n) = Ω(g(n))
 f(n) = 2ⁿ; g(n) = 3ⁿ                   f(n) = O(g(n))


Common Summations
 Arithmetic series:
   Σ_{k=1..n} k = 1 + 2 + ... + n = n(n+1)/2

 Geometric series (x ≠ 1):
   Σ_{k=0..n} xᵏ = 1 + x + x² + ... + xⁿ = (x^{n+1} - 1)/(x - 1)

 Special case, |x| < 1:
   Σ_{k=0..∞} xᵏ = 1/(1 - x)

 Harmonic series:
   Σ_{k=1..n} 1/k = 1 + 1/2 + ... + 1/n ≈ ln n

 Other important formulas:
   Σ_{k=1..n} lg k ≈ n lg n
   Σ_{k=1..n} kᵖ = 1ᵖ + 2ᵖ + ... + nᵖ ≈ n^{p+1}/(p+1)
Other Asymptotic Notations
 A function f(n) is o(g(n)) if for every positive constant c there exists n₀ such that
f(n) < c·g(n)  ∀ n ≥ n₀

 A function f(n) is ω(g(n)) if for every positive constant c there exists n₀ such that
c·g(n) < f(n)  ∀ n ≥ n₀

 Intuitively:

 ■ o( ) is like <   ■ ω( ) is like >   ■ Θ( ) is like =

 ■ O( ) is like ≤   ■ Ω( ) is like ≥

Thank You
