Lec - 1 - Analysis of Algorithms (Part 1)

This document provides an introduction to analyzing algorithms. It defines algorithms and discusses the importance of analyzing them to compare different approaches and determine efficiency. The complexity of an algorithm is defined as a function of the input size. Common rates of growth for algorithm complexity are discussed, including constant, logarithmic, linear, log-linear, polynomial, and exponential. Big O, Big Omega, and Big Theta notations are introduced to classify algorithms by their asymptotic upper bound, lower bound, and tight bound growth rates. Examples are provided to demonstrate how to determine these notations. Properties of asymptotic notations are also summarized.


Lecture 1

Analysis of Algorithms
(Part 1)
Algorithms
• Definition:
– A well-defined list of steps for solving a particular problem.

• Objective:
– Develop efficient algorithms for processing our data.

• We need to learn the basics of analysing algorithms.

Analysis of Algorithms 3
Why the Analysis of Algorithms?
• To compare different algorithms for performing the same task.

• Based on what factor(s)?

Complexity of Algorithms
• The complexity of an algorithm is the function f(n) that gives the running time and/or space requirement of the algorithm in terms of the size n of the input data.

• Frequently, the space complexity is linear, i.e., Cn for some constant C.

• Unless otherwise stated or implied, the term “complexity” shall refer to the running time of the algorithm.

Rate of Growth
• Consider the example of buying a laptop and an earphone:
Cost = cost_of_laptop + cost_of_earphone
Cost ≈ cost_of_laptop (approximation: the earphone’s cost is negligible)

• Example: f(n) = n² + 4n + 6

Rate of Growth
  n      n²      4n    6    n² + 4n + 6
  1      1       4     6    11
  10     100     40    6    146
  20     400     80    6    486
  30     900     120   6    1026
  40     1600    160   6    1766
  50     2500    200   6    2706
  60     3600    240   6    3846
  70     4900    280   6    5186
  80     6400    320   6    6726
  90     8100    360   6    8466
  100    10000   400   6    10406

Rate of Growth
• The low-order terms in a function are relatively insignificant for large n.

• We say that n² + 4n + 6 and n² have almost the same rate of growth. Therefore, n² + 4n + 6 = O(n²).

Rate of Growth
― When we study algorithms, we are interested in characterizing them according to their efficiency.

― We are usually interested in the order of growth of the running time of an algorithm, not in the exact running time. This is also referred to as the asymptotic running time.

― Asymptotic notation gives us a method for classifying functions f(n) according to their rate of growth.

― The most widely used notation to express this function f(n) is the Big-O notation. It provides an upper bound for the complexity.

Big O Notation
• Big-O notation provides us with an asymptotic upper bound for the growth rate of the runtime of an algorithm.

• f(n) = O(g(n)), read “f of n is Big-O of g of n”, if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.

• It means that for large amounts of data, f(n) will grow by no more than a constant factor times g(n).

Big O Notation

f(n) = O(g(n))
• if f(n) ≤ c·g(n) for all n ≥ n0,
• where c and n0 are positive constants.

Big O Notation
• Example: Show that 30n + 8 is O(n).

• Here we have f(n) = 30n + 8, and g(n) = n.

• We need to show that 30n + 8 ≤ cn for some constant c.
• Let c = 31. Then we have
30n + 8 ≤ 31n for all n ≥ 8.
• Therefore,
30n + 8 = O(n) with c = 31 and n0 = 8. (Proved)

Big O Notation
• Note that 30n + 8 isn’t less than n anywhere (for n > 0).

• It isn’t even less than 31n everywhere.

• But it is less than 31n everywhere to the right of n = 8 (i.e., for n > n0 = 8).

[Figure: the value of each function plotted against increasing n; the line cn = 31n overtakes 30n + 8 at n0 = 8, illustrating 30n + 8 = O(n).]

Big O Notation
• Example: Is 3n + 2 = O(n)?

• Here we have f(n) = 3n + 2, and g(n) = n.

• We need to show that 3n + 2 ≤ cn for some constant c.

• We notice that when c = 4, we have 3n + 2 ≤ 4n for all n ≥ 2.

• Therefore,
f(n) = O(g(n))
or, 3n + 2 = O(n)
• with c = 4 and n0 = 2

Big O Notation
• Example: Is n² + n = O(n³)?

• Here we have f(n) = n² + n, and g(n) = n³.

• Notice that if n ≥ 1, we have n ≤ n³.
• Also notice that if n ≥ 1, we have n² ≤ n³.
• Therefore,
n² + n ≤ n³ + n³ = 2n³
• We have just shown that
n² + n ≤ 2n³ for all n ≥ 1
• Thus, we have shown that n² + n = O(n³) with c = 2 and n0 = 1.

Big O Notation
Big O visualization:
O(g(n)) is the set of functions with a smaller or the same order of growth as g(n).

Big Ω Notation
― The asymptotic lower bound.
― The function f(n) = Ω(g(n)) if and only if there exist positive constants c and n0 such that f(n) ≥ c·g(n) for all n ≥ n0.
― Read as “f of n is omega of g of n”.

Big Ω Notation
• Example: Is 3n + 2 = Ω(n)?

• Here we have f(n) = 3n + 2, and g(n) = n.

• Notice that for any n ≥ 1, we have 3n + 2 ≥ n.

• Therefore,
f(n) = Ω(g(n))
or, 3n + 2 = Ω(n)
• with c = 1 and n0 = 1

Big θ Notation
― The asymptotic tight bound.
― The function f(n) = θ(g(n)) if and only if there exist positive constants c1, c2 and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
― Read as “f of n is theta of g of n”.

Big θ Notation
• Example: Is n² + 5n + 7 = θ(n²)?

• Here we have f(n) = n² + 5n + 7, and g(n) = n².

• When n ≥ 1, we have
n² + 5n + 7 ≤ n² + 5n² + 7n² = 13n²
• Again, when n ≥ 1,
n² ≤ n² + 5n + 7
• Thus, f(n) = θ(g(n)), or n² + 5n + 7 = θ(n²), with n0 = 1, c1 = 1 and c2 = 13.

Big θ Notation
• Subset relations between order-of-growth sets.

[Figure: for a function f: R → R, the sets O(f) and Ω(f) overlap; their intersection is θ(f), which contains f itself.]
Other Notations (Self-Study)
• Little o Notation
• Little Omega Notation (ω)

Common Rates of Growth
According to the Big-O notation, we have six different categories
of algorithms. Their running-time complexities, listed from the
fastest to the slowest, are:

• Constant : O(1)

• Logarithmic : O(log n)

• Linear : O(n)

• Log-linear : O(n log n)

• Polynomial : O(nᵏ), where k > 1

• Exponential : O(kⁿ), where k > 1

Properties of Asymptotic Notations
General Properties:
• If f(n) is O(g(n)), then a*f(n) is also O(g(n)), where a is a constant.

• Example:
f(n) = 2n²+5 is O(n²)
then 7*f(n) = 7(2n²+5)
= 14n²+35 is also O(n²)

• Similarly, this property holds for both the Θ and Ω notations. We can say:

If f(n) is Θ(g(n)), then a*f(n) is also Θ(g(n)), where a is a constant.
If f(n) is Ω(g(n)), then a*f(n) is also Ω(g(n)), where a is a constant.

Properties of Asymptotic Notations
Reflexive Properties:
• If f(n) is given then f(n) is O(f(n)).

• Example: f(n) = n²
Then f(n) = O(n²), i.e., O(f(n))

• Similarly, this property holds for both the Θ and Ω notations. We can say:

If f(n) is given, then f(n) is Θ(f(n)).
If f(n) is given, then f(n) is Ω(f(n)).

Properties of Asymptotic Notations
Transitive Properties:
• If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) = O(h(n)).

• Example: if f(n) = n, g(n) = n², and h(n) = n³, then
n is O(n²) and n² is O(n³), so n is O(n³).

• Similarly, this property holds for both the Θ and Ω notations. We can say:
If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) = Θ(h(n)).
If f(n) is Ω(g(n)) and g(n) is Ω(h(n)), then f(n) = Ω(h(n)).

Properties of Asymptotic Notations
Symmetric Properties:
• If f(n) is Θ(g(n)), then g(n) is Θ(f(n)).

• Example: f(n) = n² and g(n) = n²; then f(n) = Θ(n²) and g(n) = Θ(n²).

• This property holds only for the Θ notation.

Transpose Symmetric Properties:
• If f(n) is O(g(n)), then g(n) is Ω(f(n)).

• Example: f(n) = n, g(n) = n²; then n is O(n²) and n² is Ω(n).

• This property holds only for the O and Ω notations.

Properties of Asymptotic Notations
Some More Properties:
1. If f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = Θ(g(n)).

2. If f(n) = O(g(n)) and d(n) = O(e(n)),
then f(n) + d(n) = O( max( g(n), e(n) ) ).

Example: f(n) = n, i.e., O(n)
d(n) = n², i.e., O(n²)
then f(n) + d(n) = n + n², i.e., O(n²)

3. If f(n) = O(g(n)) and d(n) = O(e(n)),
then f(n) * d(n) = O( g(n) * e(n) ).

Example: f(n) = n, i.e., O(n)
d(n) = n², i.e., O(n²)
then f(n) * d(n) = n * n² = n³, i.e., O(n³)

Worst-case, Average-case, Best-case Analysis
• Worst-case running time
✓ This denotes the behaviour of an algorithm on the worst possible input instance of a given size.

✓ The worst-case running time of an algorithm is an upper bound on the running time for any input.

✓ Therefore, knowing the worst-case running time gives us an assurance that the algorithm will never exceed this time limit.

Worst-case, Average-case, Best-case Analysis
• Average-case running time
✓ The average-case running time of an algorithm is an estimate of the running time for an ‘average’ input.

✓ It specifies the expected behaviour of the algorithm when the input is randomly drawn from a given distribution.

✓ In the simplest form of this analysis, all inputs of a given size are assumed to be equally likely.

Worst-case, Average-case, Best-case Analysis
• Best-case running time
✓ The term ‘best-case performance’ is used to analyse an algorithm under optimal conditions.

✓ For example, the best case for a simple linear search on an array occurs when the desired element is the first in the list.

✓ However, while developing and choosing an algorithm to solve a problem, we hardly base our decision on the best-case performance.

✓ It is always recommended to improve the average-case and worst-case performance of an algorithm instead.

General Rules for (Iterative) Algorithm Analysis
RULE 1- FOR LOOPS:
• The running time of a for loop is at most the running time of the
statements inside the for loop (including tests) times the number of
iterations.

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

General Rules for (Iterative) Algorithm Analysis
RULE 2- NESTED FOR LOOPS:
• Analyze these inside out. The total running time of a statement inside a
group of nested for loops is the running time of the statement
multiplied by the product of the sizes of all the for loops.
• As an example, the following program fragment is O(n²):

for( i = 0; i < n; i++ )
    for( j = 0; j < n; j++ )
        k++;

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

General Rules for (Iterative) Algorithm Analysis
RULE 3- CONSECUTIVE STATEMENTS:
• These just add (which means that the maximum is the one that counts). As an example, the following program fragment, which has O(n) work followed by O(n²) work, is also O(n²):

for( i = 0; i < n; i++ )
    a[i] = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < n; j++ )
        a[i] += a[j] + i + j;

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

General Rules for (Iterative) Algorithm Analysis
RULE 4- IF/ELSE:
• For the fragment

if( cond )
    S1
else
    S2

• the running time of an if/else statement is never more than the running time of the test plus the larger of the running times of S1 and S2.

(Taken from Chapter 2 of ‘Data Structures & Algorithm Analysis in C’ by Mark A. Weiss.)

Related Readings
➢ Introduction to Algorithms (CLRS)
• Chapter 3 (3.1)

➢ Fundamentals of Computer Algorithms (Horowitz, Sahni, Rajasekaran)
• Chapter 1 (1.3.4)

• https://yourbasic.org/algorithms/big-o-notation-explained/

• Asymptotic Notations:
https://drive.google.com/file/d/1PaF_NqJeqykBVT63RcSJRtm_2iUFpz
MB/view?usp=sharing
