Lecture 4
Algorithm
INSTRUCTOR: DR. HUSNAIN ASHFAQ
Contents
Algorithm Efficiency
Measurement of Efficiency
Time & Space Complexity
Time / Space Tradeoff
Tips for Execution-Friendly Development
Execution Time Function
Execution Time Comparison for Different Time Functions
Efficiency
"Algorithm efficiency is the quality of an algorithm that describes the computational resources required to run it"
Computational resources include:
Execution Time
Space occupancy
Memory Requirements
[Chart: T(n) vs n for n = 1 to 7]
Algorithm Efficiency
The less the resources utilized by an algorithm, the more efficient it is.
Time and space requirements often move in opposite directions: saving one tends to cost the other.
[Chart: time vs space for n = 1 to 7]
Measuring Algorithm Efficiency
Two different ways:
1. Measure & Compare Execution Time
2. Measure & Compare Running Time (Asymptotic Analysis)
Note: Running time mostly depends upon the input size (n) and is expressed as a function of n, T(n).
Measuring Algorithm Efficiency … Measuring Execution Time
For the same algorithm, not all inputs take the same time to execute.
There are input values for which the execution time is least (best cases).
Examples:
In case of sorting, the data is already sorted
You are looking for a key (linear search) and the first element is your required key
You are looking for a key (binary search) and the middle value is your required key
Best, Average & Worst Cases
Asymptotic Analysis … Constant Growth O(1)
No growth at all
The runtime does not grow at all as a function of n (constant).
Basically, it is any operation that does not depend on the value of n to do its job.
Has the slowest growth pattern (none!)
Examples:
1. Accessing an element of an array
2. Accessing the maximum value from a MAX HEAP
3. Accessing the header node from a linked list
4. Accessing the root node from a tree
5. Hashing
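As a minimal sketch of example 1 above (the function name here is ours, not from the slides): indexing into an array takes the same time no matter how large the array is.

```cpp
#include <cstddef>
#include <vector>

// Indexing is O(1): the element's address is computed arithmetically
// from the base address and the index, so the cost is the same whether
// the container holds 5 elements or 5 million.
int elementAt(const std::vector<int>& a, std::size_t i) {
    return a[i];  // one address computation + one memory read
}
```
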
Asymptotic Analysis … Logarithmic Growth O(log n)
Logarithmic Growth
The runtime growth is proportional to the base-2 logarithm (log) of n.
Examples:
1. Binary Search
2. Max/Min value from a complete binary tree
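A minimal sketch of example 1 above: binary search halves the remaining range on every step, so a sorted array of n elements needs at most about log2(n) comparisons.

```cpp
#include <vector>

// Binary search over a sorted vector: each iteration discards half of
// the remaining range, giving O(log n) comparisons.
int binarySearch(const std::vector<int>& a, int key) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;        // avoids overflow of (lo + hi)
        if (a[mid] == key) return mid;       // found at the middle
        if (a[mid] < key)  lo = mid + 1;     // discard the left half
        else               hi = mid - 1;     // discard the right half
    }
    return -1;  // key not present
}
```

Note the best case from the earlier slide: if the middle value is the required key, the loop finishes on its very first comparison.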
Asymptotic Analysis … Linear Growth O(n)
Linear Growth
Runtime grows proportional to the value of n.
Examples:
1. Linear Search
2. Max/Min value from an array
3. Sum of values from an array
4. Linked list traversal
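Two of the examples above sketched in C++ (function names are ours): both touch each of the n elements at most once, so the work grows in direct proportion to n.

```cpp
#include <cstddef>
#include <vector>

// Linear search: in the worst case every element is compared once -> O(n).
int linearSearch(const std::vector<int>& a, int key) {
    for (std::size_t i = 0; i < a.size(); ++i)
        if (a[i] == key) return static_cast<int>(i);
    return -1;  // key not present
}

// Summing n values performs exactly one addition per element -> O(n).
int sumOf(const std::vector<int>& a) {
    int total = 0;
    for (int v : a) total += v;
    return total;
}
```

The best case from the earlier slide shows up here too: if the first element is the required key, linear search stops after one comparison.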
Asymptotic Analysis … O(n log n)
(n log n) Growth
Efficient comparison-based sorting algorithms, built on the divide-and-conquer approach, run in O(n log n).
Examples:
1. Merge Sort
2. Quick Sort
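A minimal merge sort sketch, illustrating where the n log n comes from: the array is halved about log2(n) times, and each level of recursion does O(n) merging work.

```cpp
#include <cstddef>
#include <vector>

// Merge sort: divide (split in half), conquer (sort each half
// recursively), combine (merge the two sorted halves) -> O(n log n).
std::vector<int> mergeSort(const std::vector<int>& a) {
    if (a.size() <= 1) return a;                        // base case
    std::size_t mid = a.size() / 2;
    std::vector<int> left(a.begin(), a.begin() + mid);  // divide
    std::vector<int> right(a.begin() + mid, a.end());
    left = mergeSort(left);                             // conquer
    right = mergeSort(right);
    std::vector<int> out;                               // combine (merge)
    std::size_t i = 0, j = 0;
    while (i < left.size() && j < right.size())
        out.push_back(left[i] <= right[j] ? left[i++] : right[j++]);
    while (i < left.size())  out.push_back(left[i++]);
    while (j < right.size()) out.push_back(right[j++]);
    return out;
}
```
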
Asymptotic Analysis … O(n^2)
(n^2) Growth
Running time grows rapidly.
Slow sorting algorithms
Examples:
1. Bubble Sort
2. Insertion Sort
3. Selection Sort
4. Quick Sort (Worst Case)
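A minimal bubble sort sketch showing why these sorts are quadratic: nested loops over the data mean roughly n^2/2 comparisons in the worst case.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Bubble sort: up to n-1 passes, each comparing up to n-1 adjacent
// pairs -> O(n^2) comparisons in the worst case.
std::vector<int> bubbleSort(std::vector<int> a) {
    for (std::size_t pass = 0; pass + 1 < a.size(); ++pass)
        for (std::size_t i = 0; i + 1 < a.size() - pass; ++i)
            if (a[i] > a[i + 1])
                std::swap(a[i], a[i + 1]);  // bubble the larger value up
    return a;
}
```
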
Asymptotic Analysis … Polynomial Growth O(n^k)
(n^k) Growth
Running time grows rapidly.
Suitable for small n
Examples:
1. Matrix multiplication
2. Maximum matching for a bipartite graph
3. Multiplying n-digit numbers by the simple algorithm
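A sketch of example 1 above: simple multiplication of two n x n matrices uses three nested loops over n, giving n^3 scalar multiplications, i.e. polynomial growth with k = 3.

```cpp
#include <cstddef>
#include <vector>

using Matrix = std::vector<std::vector<int>>;

// Naive matrix multiplication of two n x n matrices: three nested
// loops over n -> n^3 multiply-adds, i.e. O(n^3).
Matrix multiply(const Matrix& A, const Matrix& B) {
    std::size_t n = A.size();
    Matrix C(n, std::vector<int>(n, 0));
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = 0; j < n; ++j)
            for (std::size_t k = 0; k < n; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}
```
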
Asymptotic Analysis … Exponential Growth O(2^n)
(2^n) Growth
Running time grows extremely rapidly.
Suitable only for very small n
Examples:
1. Exact solution for the travelling salesman problem
2. Brute-force search problems
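As an illustration of brute-force search (our own example, not from the slides): deciding whether some subset of n numbers sums to a target by trying every subset examines all 2^n of them, so the runtime doubles each time n grows by one.

```cpp
#include <cstddef>
#include <vector>

// Brute-force subset sum: enumerates every one of the 2^n subsets via
// a bitmask, so the running time is O(2^n) -- feasible only for small n.
bool subsetSums(const std::vector<int>& a, int target) {
    std::size_t n = a.size();
    for (unsigned long mask = 0; mask < (1ul << n); ++mask) {  // 2^n subsets
        int sum = 0;
        for (std::size_t i = 0; i < n; ++i)
            if (mask & (1ul << i)) sum += a[i];  // i-th bit = include a[i]
        if (sum == target) return true;
    }
    return false;
}
```
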
Asymptotic Analysis … Graph for n (1-15)
[Chart: log(n) vs n for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n vs n log(n) for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n log(n) vs n^2 vs n^3 for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n^3 vs 2^n for n = 1 to 15]
Asymptotic Analysis … Graph for n (1-100)
[Chart: constant vs log n vs n vs n log n vs n^2 vs n^3 vs 2^n vs n^n]
When we have constant time, its complexity is Big O of 1, or order of 1: O(1)
x = 15 - (35 / 2); O(1)
cout << x; O(1)
y = 5 * 9; O(1)
cout << y; O(1)
Total = O(1) + O(1) + O(1) + O(1) = 4 * O(1)
Apply rule 2 (ignore constants): we are left with O(1), meaning the total time to execute all these statements is O(1).
Order of 1 is also termed Constant.
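The statements above collected into a runnable sketch (the function name is ours): none of the work depends on any input size n, so the whole thing is constant time.

```cpp
#include <iostream>

// Four O(1) statements in a row: the total is 4 * O(1), and after
// dropping the constant factor the overall complexity is O(1).
int constantTimeDemo() {
    int x = 15 - (35 / 2);   // O(1): integer division gives 17, so x = -2
    std::cout << x << '\n';  // O(1)
    int y = 5 * 9;           // O(1): y = 45
    std::cout << y << '\n';  // O(1)
    return x + y;            // still O(1)
}
```
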
for (i = 0; i < n; i++) {
cout << i; O(1)
}
We need to find how many times the cout statement will be executed.
Let's assume n = 5: the cout statement executes 5 times, once per iteration.
Total time = n * O(1) = n * k or k * n (here k is a constant)
Apply rule 2 (ignore constants): we are left with n, so our complexity is O(n).
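The counting argument above can be checked directly (the helper function is ours): the loop body runs exactly n times, so the total work is n * O(1) = O(n).

```cpp
// Count how many times the loop body executes: exactly n times,
// so the loop as a whole is O(n).
int countIterations(int n) {
    int count = 0;
    for (int i = 0; i < n; i++)
        count++;  // stands in for the O(1) cout statement in the body
    return count;
}
```
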
Practice Problems