
Algorithms

Lecture 1
Time Complexity & Asymptotic Analysis
Prantik Paul [PNP]
Lecturer
Department of Computer Science and
Engineering
1
BRAC University
Algorithm Definition
● A finite set of statements that guarantees an optimal
solution in a finite interval of time

● Algorithmic thinking and problem-solving skills are vital in
making efficient solutions.

● The English word "algorithm" derives from the Latinized form
of AL-KHWARIZMI's name. He developed the concept of an
algorithm in mathematics, and is thus sometimes called
the "Grandfather of Computer Science".
2
Glance of Algorithm

● An algorithm is a finite set of instructions or logic,
written in order, to accomplish a certain predefined
task.
● An algorithm is not the complete code or program.
● It can be expressed either as an informal high-level
description, as pseudocode, or as a flowchart.

3
Glance of Algorithm

4
Algorithm Specifications
● Input - Every algorithm must take zero or more input values
from the outside.
● Output - Every algorithm must produce an output as its result.
● Definiteness - Every statement/instruction in an algorithm
must be clear and unambiguous (only one interpretation).
● Finiteness - For all different cases, the algorithm must
produce its result within a finite number of steps.
● Effectiveness - Every instruction must be basic enough to be
carried out, and it must also be feasible.

5
Good Algorithms?
● Run in less time

● Consume less memory

Of the two computational resources, running time
(time complexity) is usually the more important one.

6
Analyzing Algorithms
● Predict the amount of resources required:
• memory: how much space is needed?
• computational time: how fast the algorithm runs?

● FACT: running time grows with the size of the input


● Input size (number of elements in the input)
○ Size of an array, polynomial degree, # of elements in a matrix, # of
bits in the binary representation of the input, vertices and edges in a
graph

Def: Running time = the number of primitive operations
(steps) executed before termination
7
Algorithm Analysis: Example
● Alg.: MIN (a[1], …, a[n])
m ← a[1];
for i ← 2 to n
if a[i] < m
then m ← a[i];
● Running time:
○ the number of primitive operations (steps) executed before termination
T(n) = 1 [first step] + n [for loop] + (n-1) [if condition] + (n-1) [the
assignment in then] = 3n - 1
● Order (rate) of growth:
○ The leading term of the formula
○ Expresses the asymptotic behavior of the algorithm
8
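The pseudocode above translates directly into C; the function name `min_element` and the 0-based indexing are editorial choices, not from the slides:

```c
#include <assert.h>

/* C version of the MIN pseudocode (0-indexed): 1 initial assignment,
   n-1 loop iterations, n-1 comparisons -> step count grows as 3n - 1. */
int min_element(const int a[], int n) {
    int m = a[0];                  /* m <- a[1] in the slide's notation */
    for (int i = 1; i < n; i++)    /* for i <- 2 to n */
        if (a[i] < m)
            m = a[i];
    return m;
}
```

The leading term of 3n - 1 is 3n, so the running time grows linearly in n.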
Typical Running Time Functions
● 1 (constant running time):
○ Instructions are executed once or a few times

● logN (logarithmic)
○ A big problem is solved by cutting the original problem in smaller sizes, by a constant
fraction at each step

● N (linear)
○ A small amount of processing is done on each input element

● N logN
○ A problem is solved by dividing it into smaller problems, solving them independently and
combining the solution

9
Typical Running Time Functions
● N² (quadratic)
○ Typical for algorithms that process all pairs of data items (double nested
loops)

● N³ (cubic)
○ Processing of triples of data (triple nested loops)

● Nᵏ (polynomial)

● 2ᴺ (exponential)
○ Few exponential algorithms are appropriate for practical use

10
Growth of Functions

11
Complexity Graphs

[Graph: growth of log(n)]

12
Complexity Graphs

[Graph: growth of n log(n) and log(n)]

13
Complexity Graphs

[Graph: growth of n¹⁰, n³, n², and n log(n)]
14
Complexity Graphs (log scale)

[Graph, log scale: growth of nⁿ, 3ⁿ, 2ⁿ, n²⁰, n¹⁰, and 1.1ⁿ]
15
How do we compare algorithms?
• We need to define a number of objective measures.

(1) Compare execution times?

Not good: times are specific to a particular
computer!!

(2) Count the number of statements executed?

Not good: the number of statements varies with
the programming language as well as the style of
the individual programmer.
16
Ideal Solution

• Express running time as a function of the


input size n (i.e., f(n)).
• Compare different functions corresponding to
running times.
• Such an analysis is independent of machine
time, programming style, etc.

17
Example
• Associate a "cost" with each statement.
• Find the "total cost“ by finding the total number
of times each statement is executed.
Algorithm 1              Cost   Algorithm 2              Cost
arr[0] = 0;              c1     for(i=0; i<N; i++)       c2
arr[1] = 0;              c1         arr[i] = 0;          c1
arr[2] = 0;              c1
...                             ...
arr[N-1] = 0;            c1
-----------                     -------------
18
Another Example

• Algorithm 3 Cost
sum = 0; c1
for(i=0; i<N; i++) c2
for(j=0; j<N; j++) c2
sum += arr[i][j]; c3
------------
c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N²

19
Best, Worst, and Average Case Complexity

[Graph: number of steps vs. N (input size), with three curves:
Worst Case Complexity on top, Average Case Complexity in the
middle, and Best Case Complexity at the bottom]
20
Algorithm Complexity
● Worst Case Complexity:
○ the function defined by the maximum number of steps
taken on any instance of size n
● Best Case Complexity:
○ the function defined by the minimum number of steps
taken on any instance of size n
● Average Case Complexity:
○ the function defined by the average number of steps
taken on any instance of size n
21
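To make the three cases concrete, here is a small C sketch (the `search_steps` helper is illustrative, not from the slides): linear search for a key takes 1 comparison in the best case, around n/2 on average, and n in the worst case:

```c
#include <assert.h>

/* Counts how many comparisons a linear search for `key` performs.
   Best case:  key at index 0 -> 1 comparison.
   Worst case: key absent     -> n comparisons. */
int search_steps(const int a[], int n, int key) {
    int steps = 0;
    for (int i = 0; i < n; i++) {
        steps++;
        if (a[i] == key)
            break;
    }
    return steps;
}
```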
Doing the Analysis
● It’s hard to estimate the running time exactly
○ Best case depends on the input
○ Average case is difficult to compute
○ So we usually focus on worst case analysis
■ Easier to compute
■ Usually close to the actual running time
● Strategy: find a function (an equation) that, for large n, is an upper bound
to the actual function (actual number of steps, memory usage, etc.)

[Diagram: an upper-bound curve above the actual function,
and a lower-bound curve below it]

22
Asymptotic Growth
When trying to decide whether an algorithm is efficient
we are only interested in the value of its time
complexity for large values of n, because for small
values of n the running time of an algorithm is very
small. (For example, in the previous table, for values of
n smaller than 100 the running times are much smaller
than 1 second. However, for n = 1 million, the running
time of the algorithm is 173 days.)

23
Motivation for Asymptotic Analysis
● An exact computation of worst-case running time
can be difficult
○ Function may have many terms:
■ 4n² - 3n log n + 17.5n - 43n^(2/3) + 75
● An exact computation of worst-case running time
is unnecessary

24
Asymptotic Analysis
• To compare two algorithms with running times f(n) and
g(n), we need a rough measure that characterizes how
fast each function grows.

• Hint: use rate of growth

• Compare functions in the limit, that is, asymptotically!


(i.e., for large values of n)

25
Rate of Growth
• Consider the example of buying a car and a
packet of biscuits:
Cost: cost_of_car + cost_of_biscuit
Cost ~ cost_of_car (approximation)

• The low-order terms in a function are relatively
insignificant for large n:
n⁴ + 100n² + 10n + 50 ~ n⁴

i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate
of growth
26
Classifying functions by their
Asymptotic Growth Rates

1. O(g(n)), Big-Oh of g of n, the Asymptotic Upper Bound


2. Θ(g(n)), Theta of g of n, the Asymptotic Tight Bound
3. Ω(g(n)), Omega of g of n, the Asymptotic Lower Bound

27
Asymptotic Notation

• O notation: asymptotic “less than”:


→ f(n)=O(g(n)) implies: f(n) “≤” g(n)

• Ω notation: asymptotic “greater than”:

→ f(n)= Ω (g(n)) implies: f(n) “≥” g(n)

• Θ notation: asymptotic “equality”:

→ f(n)= Θ (g(n)) implies: f(n) “=” g(n)

28
Big-O Notation

• We say fA(n) = 30n + 8 is order n, or O(n).
It is, at most, roughly proportional to n.
• fB(n) = n² + 1 is order n², or O(n²). It is, at most,
roughly proportional to n².
• In general, a function that actually grows as n²
(i.e., is Θ(n²)) grows faster than any O(n) function.

29
Visualizing Orders of Growth

• On a graph, as you go to the right, a faster
growing function eventually becomes larger...

[Graph: value of function vs. increasing n, with fB(n) = n² + 1
eventually overtaking fA(n) = 30n + 8]
30
More Examples …

• n⁴ + 100n² + 10n + 50 is O(n⁴)
• 10n³ + 2n² is O(n³)
• n³ - n² is O(n³)
• constants
– 10 is O(1)
– 1273 is O(1)

31
Back to Our Example
Algorithm 1 Algorithm 2
Cost Cost
arr[0] = 0; c1 for(i=0; i<N; i++) c2
arr[1] = 0; c1 arr[i] = 0; c1
arr[2] = 0; c1
...
arr[N-1] = 0; c1
----------- -------------
c1+c1+...+c1 = c1 x N (N+1) x c2 + N x c1 =
(c2 + c1) x N + c2

• Both algorithms are of the same order: O(N)

32
Example (cont’d)

Algorithm 3 Cost
sum = 0; c1
for(i=0; i<N; i++) c2
for(j=0; j<N; j++) c2
sum += arr[i][j]; c3
------------
c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N² = O(N²)
33
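The step count in the formula above can be checked by instrumenting the loops; this counting sketch (the helper name is editorial) tallies one unit for each c1, c2, and c3 statement:

```c
#include <assert.h>

/* Tallies the statements of Algorithm 3 with every cost taken as 1:
   1 (sum = 0) + (N+1) outer tests + N*(N+1) inner tests + N*N additions. */
long algorithm3_steps(long N) {
    long steps = 1;                       /* sum = 0;  (c1) */
    for (long i = 0; i <= N; i++) {
        steps++;                          /* outer loop test (c2), N+1 times */
        if (i == N) break;
        for (long j = 0; j <= N; j++) {
            steps++;                      /* inner loop test (c2), N+1 per i */
            if (j == N) break;
            steps++;                      /* sum += arr[i][j];  (c3) */
        }
    }
    return steps;
}
```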
Example #1: carry n books
from one bookshelf to another one

• How many operations?


• n pick-ups, n forward moves, n drops and n
reverse moves → 4n operations
• 4n operations = c·n = O(c·n) = O(n)
• Similarly, any program that reads n inputs from
the user will have minimum time complexity O(n).

34
Example #2: Locating Roll-Number record in
Attendance Sheet
What is the time complexity of search?
• Binary Search algorithm at work
– O(log n)
• Sequential search?
– O(n)

35
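A C sketch of the binary-search idea (an illustrative helper, not code from the slides); each iteration halves the remaining range, which is where O(log n) comes from:

```c
#include <assert.h>

/* Binary search on a sorted array: each iteration halves the range,
   so at most about log2(n) + 1 iterations run -> O(log n). */
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
        if (a[mid] == key) return mid;
        if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;                          /* not found */
}
```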
Example #3: Teacher of CSE 221
gives gifts to first 10 students
• There are n students in the queue.
• Teacher brings one gift at a time.
• Time complexity = O(c·10) = O(1)
• Teacher will take exactly the same time irrespective of the line length.

36
Loops with Break

for (j = 0; j < n; ++j)


{
// 3 atomics
if (condition) break;
}

• Upper bound = O(4n) = O(n)


• Lower bound = Ω(4) = Ω(1)
• Complexity = O(n)

37
Sequential Search

• Given an unsorted vector/list a[ ], find the location of


element X.

for (i = 0; i < n; i++) {


if (a[i] == X) return true;
}
return false;

• Input size: n = size of the array


• Complexity = O(n)

38
If-then-else Statement

if(condition)
i = 0;
else
for ( j = 0; j < n; j++)
a[j] = j;

• Complexity = ??
= O(1) + max ( O(1), O(N))
= O(1) + O(N)
= O(N) 39
Consecutive Statements

for (j = 0; j < n; ++j) {


// 3 atomics
}
for (j = 0; j < n; ++j) {
// 5 atomics
}

• Add the complexity of consecutive statements


• Complexity = O(3n + 5n) = O(n)

40
Nested Loop Statements
• Analyze such statements inside out

for (j = 0; j < n; ++j) {


// 2 atomics
for (k = 0; k < n; ++k) {
// 3 atomics
}
}
• Complexity = (2 + 3n)n = O(n^2)

41
Example

• Code:
a = b;

• Complexity:

O(1)

42
Example
Code:
sum = 0;
for (i=1; i <=n; i++)
sum += n;

• Complexity:

O(n)

43
Example
• Code:
sum1 = 0;
for (i=1; i<=n; i++)
for (j=1; j<=n; j++)
sum1++;
• Complexity:

O(n^2)

44
Example
• Code:
sum2 = 0;
for (i=1; i<=n; i++)
for (j=1; j<=i; j++)
sum2++;
• Complexity:

O(n^2)

45
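The inner loop above runs 1 + 2 + … + n = n(n+1)/2 times, which is why the complexity is still O(n²) even though the inner bound shrinks. A counting sketch (the helper name is editorial):

```c
#include <assert.h>

/* Counts iterations of the triangular nested loop: for i = 1..n the
   inner loop runs i times, so the total is 1 + 2 + ... + n = n(n+1)/2. */
long triangular_count(long n) {
    long count = 0;
    for (long i = 1; i <= n; i++)
        for (long j = 1; j <= i; j++)
            count++;
    return count;
}
```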
Example
• Code:
sum = 0;
for (j=1; j<=n; j++)
for (i=1; i<=j; i++)
sum++;
for (k=0; k<n; k++)
A[k] = k;
• Complexity:

O(n^2)
46
Example
• Code:
sum1 = 0;
for (k=1; k<=n; k*=2)
for (j=1; j<=n; j++)
sum1++;
• Complexity:

O(nlogn)

47
Example
• Code:
sum2 = 0;
for (k=1; k<=n; k*=2)
for (j=1; j<=k; j++)
sum2++;
• Complexity:

O(n)

48
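Why O(n) rather than O(n log n)? The inner loop runs k times for k = 1, 2, 4, …, and the geometric series 1 + 2 + 4 + … stays below 2n. A counting sketch (the helper name is editorial):

```c
#include <assert.h>

/* Counts iterations of the doubling loop pair above: the inner loop
   runs k times for k = 1, 2, 4, ..., so the total is a geometric
   series 1 + 2 + 4 + ... that never reaches 2n. */
long doubling_count(long n) {
    long count = 0;
    for (long k = 1; k <= n; k *= 2)
        for (long j = 1; j <= k; j++)
            count++;
    return count;
}
```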
Classifying functions by their
Asymptotic Growth Rates

1. O(g(n)), Big-Oh of g of n, the Asymptotic Upper


Bound
2. Θ(g(n)), Theta of g of n, the Asymptotic Tight
Bound
3. Ω(g(n)), Omega of g of n, the Asymptotic Lower
Bound

49
Asymptotic notations
• O-notation

O(g(n)) is the set of functions
with smaller or same order of
growth as g(n)

50
Big-O

● What does it mean?
○ If f(n) = O(n²), then:
■ f(n) can be larger than n² sometimes, but…
■ We can choose some constant c and some value n₀ such
that for every value of n larger than n₀: f(n) < cn²
■ That is, for values larger than n₀, f(n) is never more than
a constant multiplier greater than n²
■ Or, in other words, f(n) does not grow more than a
constant factor faster than n².

51
Examples

– 2n² = O(n³): 2n² ≤ cn³ ⇒ 2 ≤ cn ⇒ c = 1 and n₀ = 2

– n² = O(n²): n² ≤ cn² ⇒ c ≥ 1 ⇒ c = 1 and n₀ = 1

– 1000n² + 1000n = O(n²):
1000n² + 1000n ≤ 1000n² + n² = 1001n² (for n ≥ 1000)
⇒ c = 1001 and n₀ = 1000

– n = O(n²): n ≤ cn² ⇒ cn ≥ 1 ⇒ c = 1 and n₀ = 1
52
More Big-O

● Prove that: 20n² + 2n + 5 = O(n²)
● Let c = 21 and n₀ = 4
● 21n² > 20n² + 2n + 5 for all n > 4
n² > 2n + 5 for all n > 4
TRUE

53
Examples
• Show that

• Let c = 2 and n0 = 5

54
More Examples
• Show that 30n+8 is O(n).
– Show ∃c,n0: 30n+8 ≤ cn, ∀n>n0 .
• Let c=31, n0=8. Assume n>n0=8. Then
cn = 31n = 30n + n > 30n+8, so 30n+8 <
cn.

55
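The chosen witnesses c = 31, n₀ = 8 can also be spot-checked numerically; this brute-force sketch (illustrative only; a finite scan is a sanity check, not a proof) scans a range of n:

```c
#include <assert.h>

/* Checks 30n + 8 <= c*n for every n in (n0, limit].
   The algebra above is the actual proof; this only spot-checks it. */
int bound_holds(long n0, long c, long limit) {
    for (long n = n0 + 1; n <= limit; n++)
        if (30 * n + 8 > c * n)
            return 0;                  /* counterexample found */
    return 1;
}
```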
Big-O example, graphically

• Note 30n+8 isn't less than n
anywhere (n > 0).
• It isn't even less than 31n
everywhere.
• But it is less than 31n everywhere
to the right of n = 8.

[Graph: value of function vs. increasing n, with cn = 31n lying
above 30n+8 for all n > n₀ = 8; hence 30n+8 ∈ O(n)]
56

No Uniqueness

• There is no unique set of values for n₀ and c in proving the
asymptotic bounds
• Prove that 100n + 5 = O(n²)
– 100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5

n₀ = 5 and c = 101 is a solution

– 100n + 5 ≤ 100n + 5n = 105n ≤ 105n² for all n ≥ 1

n₀ = 1 and c = 105 is also a solution

57
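Both witness pairs can be checked the same way; this sketch (an illustrative helper) verifies 100n + 5 ≤ c·n² from a given n₀ up to a limit:

```c
#include <assert.h>

/* Checks 100n + 5 <= c*n^2 for every n in [n0, limit]: different
   (c, n0) pairs can witness the same bound 100n + 5 = O(n^2). */
int witness_ok(long long c, long long n0, long long limit) {
    for (long long n = n0; n <= limit; n++)
        if (100 * n + 5 > c * n * n)
            return 0;
    return 1;
}
```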
Big-O

58
Asymptotic notations
• Ω - notation

Ω(g(n)) is the set of functions


with larger or same order of
growth as g(n)

59
Examples
– 5n² = Ω(n)
∃ c, n₀ such that: 0 ≤ cn ≤ 5n² ⇒ c = 5 and n₀ = 1

– 100n + 5 ≠ Ω(n²)
∃ c, n₀ such that: 0 ≤ cn² ≤ 100n + 5
100n + 5 ≤ 100n + 5n = 105n (∀ n ≥ 1)
cn² ≤ 105n ⇒ n(cn – 105) ≤ 0
Since n is positive ⇒ cn – 105 ≤ 0 ⇒ n ≤ 105/c
⇒ contradiction: n cannot be bounded above by a constant

– 2n³ = Ω(n²) = Ω(n) = Ω(logn)
60
Asymptotic notations (cont.)
• Θ-notation

Θ(g(n)) is the set of functions


with the same order of growth
as g(n)

61
Examples

– n²/2 – n/2 = Θ(n²)

• ½n² – ½n ≤ ½n² ∀ n ≥ 0 ⇒ c₂ = ½

• ½n² – ½n ≥ ½n² – ½n · ½n = ¼n² (∀ n ≥ 2)

• ⇒ c₁ = ¼

– n ≠ Θ(n²):

c₁n² ≤ n ≤ c₂n²

⇒ only holds for: n ≤ 1/c₁
62


Examples
● Prove that: 20n³ + 7n + 1000 = Θ(n³)

● Let c₁ = 21 and n₀ = 10
● 21n³ > 20n³ + 7n + 1000 for all n > 10
n³ > 7n + 1000 for all n > 10
TRUE, but we also need…
● Let c₂ = 20 and n₀ = 1
● 20n³ < 20n³ + 7n + 1000 for all n ≥ 1
TRUE
63
Relations Between Different Sets
• Subset relations between order-of-growth sets.

[Venn diagram over functions R→R: Θ(f) is the intersection of
O(f) and Ω(f), and f itself lies inside Θ(f)]
64
Logarithms and properties

• In algorithm analysis we often use the notation
"log n" without specifying the base
Binary logarithm: lg n = log₂n
Natural logarithm: ln n = logₑn
65
Simplifying Assumptions

1. If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))

2. If f(n) = O(kg(n)) for any k > 0, then f(n) = O(g(n))

3. If f1(n) = O(g1(n)) and f2(n) = O(g2(n)),


then f1(n) + f2(n) = O(max (g1(n), g2(n)))

4. If f1(n) = O(g1(n)) and f2(n) = O(g2(n)),


then f1(n) * f2(n) = O(g1(n) * g2(n))
66
Some Simplified Rules
● O(1) = c, where c is a constant
● O(n) = c*n = cn, where c is a constant and n is a variable
● c1*O(1) = c1*c = c2 = O(1), where c, c1, c2 are constants
○ O(1) + O(1) + O(1) = 3*O(1) = O(1)
○ 5*O(1) = O(1)
● n*O(1) = n*c = cn = O(n), where c is a constant and n is a variable
● O(m) + O(n) = O(m+n) = O(max(m, n))
● O(m) * O(n) = c1m * c2n = (c1*c2)(mn) = c3(mn) = O(mn)
● O(m)*O(n)*O(p)*O(q) = O(mnpq)
○ Example: nested for loops
● O(an² + bn + c) = O(n²), where a, b, c are constants
67
Common Summations

• Arithmetic series: 1 + 2 + … + n = n(n+1)/2

• Geometric series: 1 + x + x² + … + xⁿ = (xⁿ⁺¹ – 1)/(x – 1), x ≠ 1

– Special case: |x| < 1: 1 + x + x² + … = 1/(1 – x)

• Harmonic series: 1 + 1/2 + … + 1/n ≈ ln n

• Other important formulas: 1² + 2² + … + n² = n(n+1)(2n+1)/6


68
More Examples
• For each of the following pairs of functions, either f(n) is
O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which
relationship is correct.

– f(n) = log n2; g(n) = log n + 5


f(n) = Θ (g(n))
– f(n) = n; g(n) = log n2 f(n) = Ω(g(n))
– f(n) = log log n; g(n) = log n
f(n) = O(g(n))
– f(n) = n; g(n) = log n2

f(n) = Ω(g(n))

69
More Examples
• For each of the following pairs of functions, either f(n) is
O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which
relationship is correct.

– f(n) = n log n + n; g(n) = log n    f(n) = Ω(g(n))
– f(n) = 10; g(n) = log 10            f(n) = Θ(g(n))
– f(n) = 2ⁿ; g(n) = 10n²              f(n) = Ω(g(n))
– f(n) = 2ⁿ; g(n) = 3ⁿ                f(n) = O(g(n))
70
Asymptotic Notations - Examples

• O notation
– 2n² vs. n³:      2n² = O(n³)
– n² vs. n²:       n² = O(n²)
– n³ vs. nlogn:    n³ ≠ O(nlogn)
71
Asymptotic Notations - Examples

• Θ notation
– n²/2 – n/2 = Θ(n²)
– (6n³ + 1)lgn/(n + 1) = Θ(n²lgn)
– n vs. n²:        n ≠ Θ(n²)
72
Asymptotic Notations - Examples

• Ω notation
– n³ vs. n²:       n³ = Ω(n²)
– n vs. logn:      n = Ω(logn)
– n vs. n²:        n ≠ Ω(n²)

73
Summary
● Time complexity is a measure of algorithm efficiency
● An efficient algorithm plays the major role in determining the
running time.
Q: Is it possible to determine the running time based on the
algorithm's time complexity alone?
● Minor tweaks in the code can cut down the running time by a
constant factor too.
● Other factors like CPU speed, memory speed, and device I/O
speed can help as well.
● For certain problems, it is possible to allocate additional space
& improve time complexity.
74
