Lecture 1
Time Complexity & Asymptotic Analysis
Prantik Paul [PNP]
Lecturer
Department of Computer Science and Engineering
BRAC University
Algorithm Definition
● A finite set of statements that guarantees a correct solution within a
finite interval of time
Algorithm Specifications
● Input - Every algorithm must take zero or more input values from the
outside world.
● Output - Every algorithm must produce at least one output as its result.
● Definiteness - Every statement/instruction in an algorithm must be clear
and unambiguous (admit only one interpretation).
● Finiteness - For all inputs, the algorithm must produce its result within
a finite number of steps.
● Effectiveness - Every instruction must be basic enough to be carried out,
and it must be feasible.
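To ground these criteria, here is a small C illustration of my own (not from the original slides): finding the maximum of an array, with comments mapping each property.

#include <stdio.h>

/* Input: an array of n values (n >= 1).  Output: the maximum value. */
int find_max(const int arr[], int n) {
    int max = arr[0];              /* definiteness: each step has exactly one meaning */
    for (int i = 1; i < n; i++)    /* finiteness: exactly n - 1 iterations */
        if (arr[i] > max)          /* effectiveness: a basic, feasible comparison */
            max = arr[i];
    return max;                    /* output: one result is produced */
}

int main(void) {
    int data[] = {3, 7, 2, 9, 4};  /* input: values supplied from outside */
    printf("%d\n", find_max(data, 5));   /* prints 9 */
    return 0;
}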
Good Algorithms?
● Run in less time
Analyzing Algorithms
● Predict the amount of resources required:
• memory: how much space is needed?
• computational time: how fast does the algorithm run?
Typical Running Time Functions
● log N (logarithmic)
○ A big problem is solved by cutting the original problem into pieces smaller
by a constant fraction at each step
● N (linear)
○ A small amount of processing is done on each input element
● N log N
○ A problem is solved by dividing it into smaller problems, solving them
independently, and combining the solutions
● N^2 (quadratic)
○ Typical for algorithms that process all pairs of data items (double nested
loops)
● N^3 (cubic)
○ Processing of triples of data items (triple nested loops)
● N^k (polynomial)
● 2^N (exponential)
○ Few exponential algorithms are appropriate for practical use
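To make the logarithmic and linear rows concrete, a short C sketch of my own (not from the slides): a linear scan does constant work per element (N steps), while a loop that halves its range each pass takes only about log2 N steps.

/* O(N): a small amount of processing on each input element */
long sum_all(const int a[], int n) {
    long s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* O(log N): the problem size is cut by a constant fraction per step */
int halving_steps(int n) {
    int steps = 0;
    while (n > 1) {
        n /= 2;        /* e.g., 1024 -> 512 -> ... -> 1 in 10 steps */
        steps++;
    }
    return steps;
}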
Growth of Functions
(table: running times of common growth functions for increasing input sizes n)
Complexity Graphs
(plots, in increasing order of growth: log(n), n log(n), n^2, n^3, n^10)

Complexity Graphs (log scale)
(plot on a log scale; curves shown: 1.1^n, 2^n, 3^n, n^10, n^20, n^n)
How do we compare algorithms?
• We need to define a number of objective measures.
Example
• Associate a "cost" with each statement.
• Find the "total cost" by counting the total number of times each statement
is executed.

Algorithm 1                    Cost
arr[0] = 0;                    c1
arr[1] = 0;                    c1
arr[2] = 0;                    c1
...
arr[N-1] = 0;                  c1

Algorithm 2                    Cost
for(i=0; i<N; i++)             c2
    arr[i] = 0;                c1
Another Example

Algorithm 3                    Cost
sum = 0;                       c1
for(i=0; i<N; i++)             c2
    for(j=0; j<N; j++)         c2
        sum += arr[i][j];      c3
------------
Total: c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N^2
Best, Worst, and Average Case Complexity
(plot: number of steps vs. N (input size), with Worst Case, Average Case, and
Best Case complexity curves)
Algorithm Complexity
● Worst Case Complexity:
○ the function defined by the maximum number of steps
taken on any instance of size n
● Best Case Complexity:
○ the function defined by the minimum number of steps
taken on any instance of size n
● Average Case Complexity:
○ the function defined by the average number of steps
taken on any instance of size n
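A classic way to see all three cases (a sketch of mine, not from the slide) is insertion sort: a sorted input never enters the inner loop (best case, O(n)); a reverse-sorted input shifts i elements at step i (worst case, O(n^2)); random inputs land in between (average case, also O(n^2)).

/* Insertion sort: best case O(n), worst and average case O(n^2). */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i];
        int j = i - 1;
        /* 0 shifts on sorted input; i shifts on reverse-sorted input */
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}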
Doing the Analysis
● It’s hard to estimate the running time exactly
○ Best case depends on the input
○ Average case is difficult to compute
○ So we usually focus on worst case analysis
■ Easier to compute
■ Usually close to the actual running time
● Strategy: find a function (an equation) that, for large n, is an upper bound
to the actual function (actual number of steps, memory usage, etc.)
(plot: the actual function (actual number of steps) bounded above by the
upper bound and below by the lower bound)
Asymptotic Growth
When trying to decide whether an algorithm is efficient
we are only interested in the value of its time
complexity for large values of n, because for small
values of n the running time of an algorithm is very
small. (For example, in the previous table, for values of
n smaller than 100 the running times are much smaller
than 1 second. However, for n = 1 million, the running
time of the algorithm is 173 days.)
Motivation for Asymptotic Analysis
● An exact computation of worst-case running time
can be difficult
○ Function may have many terms:
■ 4n^2 - 3n log n + 17.5n - 43n^(2/3) + 75
● An exact computation of worst-case running time
is unnecessary
Asymptotic Analysis
• To compare two algorithms with running times f(n) and
g(n), we need a rough measure that characterizes how
fast each function grows.
Rate of Growth
• Consider the example of buying a car and a
packet of biscuits:
Cost: cost_of_car + cost_of_biscuit
Cost ~ cost_of_car (approximation)
• The low-order terms of a function are relatively insignificant for large n;
i.e., we say that n^4 + 100n^2 + 10n + 50 and n^4 have the same rate of
growth (for n = 1000, n^4 = 10^12 while 100n^2 is only 10^8).
Classifying functions by their Asymptotic Growth Rates
Asymptotic Notation

Big-O Notation
● f(n) = O(g(n)) if there exist positive constants c and n0 such that
f(n) ≤ c · g(n) for all n ≥ n0
● Intuitively: for large n, f grows no faster than a constant multiple of g.
Visualizing Orders of Growth
• On a graph, as you go to the right, the faster growing function's value
eventually becomes larger.
(plot: value of function vs. increasing n; fB(n) = n^2 + 1 eventually
overtakes fA(n) = 30n + 8)
More Examples …
Back to Our Example

Algorithm 1                    Cost
arr[0] = 0;                    c1
arr[1] = 0;                    c1
arr[2] = 0;                    c1
...
arr[N-1] = 0;                  c1
-----------
c1 + c1 + ... + c1 = c1 x N

Algorithm 2                    Cost
for(i=0; i<N; i++)             c2
    arr[i] = 0;                c1
-------------
(N+1) x c2 + N x c1 = (c2 + c1) x N + c2

Both totals grow linearly in N, so both algorithms are O(N).
Example (cont'd)

Algorithm 3                    Cost
sum = 0;                       c1
for(i=0; i<N; i++)             c2
    for(j=0; j<N; j++)         c2
        sum += arr[i][j];      c3
------------
c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N^2 = O(N^2)
Example #1: Carry n books from one bookshelf to another one
• Moving one book at a time takes n trips: O(n).
Example #2: Locating a roll-number record in an attendance sheet
What is the time complexity of the search?
• Binary search algorithm at work: O(log n)
• Sequential search: O(n)
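A minimal C sketch of binary search (details assumed; the slide only names the algorithm): the search range is halved at each step, so a sorted array of n elements needs at most about log2 n comparisons.

/* Returns the index of key in sorted a[0..n-1], or -1 if absent. */
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  /* avoids overflow of (lo + hi) / 2 */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;              /* discard the left half */
        else
            hi = mid - 1;              /* discard the right half */
    }
    return -1;                         /* range halves each pass: O(log n) */
}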
Example #3: Teacher of CSE 221 gives gifts to the first 10 students
• There are n students in the queue.
• The teacher brings one gift at a time.
• Time complexity = O(c · 10) = O(1)
• The teacher takes exactly the same time irrespective of the line length.
Loops with Break
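The code for this slide did not survive extraction; a representative sketch of my own: a break can end a loop early, but worst-case analysis assumes it never fires.

/* Index of the first negative value, or -1 if there is none. */
int first_negative(const int a[], int n) {
    int pos = -1;
    for (int i = 0; i < n; i++) {
        if (a[i] < 0) {
            pos = i;
            break;     /* best case: fires immediately, O(1) */
        }
    }
    return pos;        /* worst case: break never fires, O(n) */
}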
Sequential Search
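This slide's code was likewise lost; a minimal sketch of sequential (linear) search: a successful search examines about n/2 elements on average, an unsuccessful one examines all n, so the complexity is O(n).

/* Returns the index of key in a[0..n-1], or -1 if it is absent. */
int sequential_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;  /* best case: key at a[0], O(1) */
    return -1;         /* worst case: n comparisons, O(n) */
}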
If-then-else Statement
if (condition)
    i = 0;
else
    for (j = 0; j < n; j++)
        a[j] = j;
• Complexity = O(1) + max(O(1), O(n))
             = O(1) + O(n)
             = O(n)
Consecutive Statements
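The example for this slide was also lost; a sketch of the usual rule (assumed details): the costs of consecutive statements add, and the largest term dominates.

/* An O(n) loop followed by an O(n^2) nested loop: O(n) + O(n^2) = O(n^2). */
int consecutive(int a[], int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)       /* O(n) */
        a[i] = i;
    for (int i = 0; i < n; i++)       /* O(n^2) */
        for (int j = 0; j < n; j++)
            sum += a[i] * j;
    return sum;
}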
Nested Loop Statements
• Analyze such statements inside out, as in the examples that follow.
Example
• Code:
a = b;
• Complexity:
O(1)
Example
• Code:
sum = 0;
for (i=1; i<=n; i++)
    sum += n;
• Complexity:
O(n)
Example
• Code:
sum1 = 0;
for (i=1; i<=n; i++)
    for (j=1; j<=n; j++)
        sum1++;
• Complexity:
O(n^2)
Example
• Code:
sum2 = 0;
for (i=1; i<=n; i++)
    for (j=1; j<=i; j++)
        sum2++;
• Complexity:
O(n^2) (the inner loop body runs 1 + 2 + ... + n = n(n+1)/2 times)
Example
• Code:
sum = 0;
for (j=1; j<=n; j++)
    for (i=1; i<=j; i++)
        sum++;
for (k=0; k<n; k++)
    A[k] = k;
• Complexity:
O(n^2) (O(n^2) for the nested loops plus O(n) for the last loop)
Example
• Code:
sum1 = 0;
for (k=1; k<=n; k*=2)
    for (j=1; j<=n; j++)
        sum1++;
• Complexity:
O(n log n) (the outer loop runs about log2(n) times, the inner loop n times each)
Example
• Code:
sum2 = 0;
for (k=1; k<=n; k*=2)
    for (j=1; j<=k; j++)
        sum2++;
• Complexity:
O(n) (the inner loop runs 1 + 2 + 4 + ... + n ≈ 2n times in total, a geometric series)
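A small harness of my own that counts the iterations of the last two examples empirically: sum1 grows like n · log2(n) while sum2 stays below 2n, matching the stated bounds.

#include <stdio.h>

int main(void) {
    for (int n = 1; n <= (1 << 20); n *= 16) {
        long sum1 = 0, sum2 = 0;
        for (int k = 1; k <= n; k *= 2)
            for (int j = 1; j <= n; j++)
                sum1++;               /* ~ n * log2(n) */
        for (int k = 1; k <= n; k *= 2)
            for (int j = 1; j <= k; j++)
                sum2++;               /* ~ 2n (geometric series) */
        printf("n=%8d  sum1=%12ld  sum2=%8ld\n", n, sum1, sum2);
    }
    return 0;
}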
Classifying functions by their Asymptotic Growth Rates
Asymptotic Notations
• O-notation (Big-O)
Examples
• 2n^2 = O(n^3): 2n^2 ≤ cn^3 ⇒ 2 ≤ cn ⇒ c = 1 and n0 = 2
• n^2 = O(n^2): n^2 ≤ cn^2 ⇒ c ≥ 1 ⇒ c = 1 and n0 = 1
• 1000n^2 + 1000n = O(n^2):
  1000n^2 + 1000n ≤ 1000n^2 + n^2 = 1001n^2 for n ≥ 1000
  ⇒ c = 1001 and n0 = 1000
• n = O(n^2): n ≤ cn^2 ⇒ cn ≥ 1 ⇒ c = 1 and n0 = 1
More Big-O
● Prove that: 20n^2 + 2n + 5 = O(n^2)
● Let c = 21 and n0 = 4
● 21n^2 > 20n^2 + 2n + 5 for all n > 4
  ⇔ n^2 > 2n + 5 for all n > 4
  TRUE
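A quick empirical sanity check of the chosen witnesses (my addition, not from the slides): verify 20n^2 + 2n + 5 <= 21n^2 over a range of n beyond n0 = 4.

#include <stdio.h>

int main(void) {
    /* Check the witness inequality for n = 5 .. 1000000 (all n > n0 = 4). */
    for (long long n = 5; n <= 1000000; n++) {
        if (20*n*n + 2*n + 5 > 21*n*n) {
            printf("counterexample at n = %lld\n", n);
            return 1;
        }
    }
    printf("20n^2 + 2n + 5 <= 21n^2 holds for all n in [5, 1000000]\n");
    return 0;
}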
More Examples
• Show that 30n + 8 is O(n).
  – Show ∃ c, n0: 30n + 8 ≤ cn, ∀ n > n0.
• Let c = 31, n0 = 8. Assume n > n0 = 8. Then
  cn = 31n = 30n + n > 30n + 8, so 30n + 8 < cn.
Big-O example, graphically
• Note that 30n+8 isn't less than n anywhere (n > 0).
• It isn't even less than 31n everywhere.
• But it is less than 31n everywhere to the right of n = 8, so
30n+8 ∈ O(n) with c = 31 and n0 = 8.
(plot: value of function vs. increasing n, with cn = 31n crossing above
30n+8 at n = 8)
No Uniqueness
• The constants in these proofs are not unique: for example, 30n + 8 ≤ 35n
also holds for all n ≥ 5, giving another valid pair c = 35, n0 = 5.
Asymptotic Notations
• Ω-notation
f(n) = Ω(g(n)) if there exist positive constants c and n0 such that
0 ≤ c · g(n) ≤ f(n) for all n ≥ n0
Examples
• 5n^2 = Ω(n):
  ∃ c, n0 such that: 0 ≤ cn ≤ 5n^2 ⇒ c = 5 and n0 = 1
• 100n + 5 ≠ Ω(n^2):
  suppose ∃ c, n0 such that: 0 ≤ cn^2 ≤ 100n + 5
  100n + 5 ≤ 100n + 5n = 105n (∀ n ≥ 1)
  cn^2 ≤ 105n ⇒ n(cn - 105) ≤ 0
  Since n is positive, cn - 105 ≤ 0 ⇒ n ≤ 105/c
  ⇒ contradiction: n cannot be smaller than a constant
• 2n = Ω(n), n^3 = Ω(n^2), n = Ω(log n)
Asymptotic Notations (cont.)
• Θ-notation
f(n) = Θ(g(n)) if there exist positive constants c1, c2, and n0 such that
0 ≤ c1 · g(n) ≤ f(n) ≤ c2 · g(n) for all n ≥ n0
Examples
• ½n^2 - ½n = Θ(n^2):
  ○ ½n^2 - ½n ≤ ½n^2 for all n ≥ 0 ⇒ c2 = ½
  ○ ½n^2 - ½n ≥ ½n^2 - ½n · ½n = ¼n^2 for all n ≥ 2 ⇒ c1 = ¼
• n ≠ Θ(n^2): no constants satisfy c1 · n^2 ≤ n ≤ c2 · n^2 for all large n
• 20n^3 + 7n + 1000 = Θ(n^3):
  ○ Let c2 = 21 and n0 = 10:
    21n^3 > 20n^3 + 7n + 1000 for all n > 10,
    since n^3 > 7n + 1000 for all n > 10. TRUE, but we also need…
  ○ Let c1 = 20 and n0 = 1:
    20n^3 < 20n^3 + 7n + 1000 for all n ≥ 1. TRUE
Relations Between Different Sets
• Subset relations between order-of-growth sets, for f: R → R:
(Venn diagram: Θ(f) is the intersection of O(f) and Ω(f); f itself lies in Θ(f))
Logarithms and properties
• log_b(xy) = log_b x + log_b y
• log_b(x/y) = log_b x - log_b y
• log_b(x^a) = a log_b x
• Change of base: log_b x = log_a x / log_a b
• Logs of different bases differ only by a constant factor, so the base does
not matter inside O(·).
Simplifying Assumptions
• Arithmetic series: 1 + 2 + ... + n = n(n+1)/2
• Geometric series: 1 + x + x^2 + ... + x^n = (x^(n+1) - 1)/(x - 1), for x ≠ 1
• Harmonic series: 1 + 1/2 + 1/3 + ... + 1/n = ln n + O(1)
More Examples
• For each of the following pairs of functions, either f(n) is O(g(n)),
f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct.
– f(n) = n log n + n; g(n) = log n → f(n) = Ω(g(n))
– f(n) = 10; g(n) = log 10 → f(n) = Θ(g(n))
– f(n) = 2^n; g(n) = 10n^2 → f(n) = Ω(g(n))
– f(n) = 2^n; g(n) = 3^n → f(n) = O(g(n))
Asymptotic Notations - Examples
• O notation
– 2n^2 vs. n^3: 2n^2 = O(n^3)
– n^2 vs. n^2: n^2 = O(n^2)
– n^3 vs. n log n: n^3 ≠ O(n log n)
Asymptotic Notations - Examples
• Θ notation
– n^2/2 - n/2 = Θ(n^2)
– (6n^3 + 1) lg n / (n + 1) = Θ(n^2 lg n)
– n vs. n^2: n ≠ Θ(n^2)
Asymptotic Notations - Examples
• Ω notation
– n^3 vs. n^2: n^3 = Ω(n^2)
– n vs. log n: n = Ω(log n)
– n vs. n^2: n ≠ Ω(n^2)
Summary
● Time complexity is a measure of algorithm efficiency.
● An efficient algorithm plays the major role in determining the running time.
Q: Is it possible to determine the running time from the algorithm's time
complexity alone?
● Minor tweaks in the code can cut down the running time by a constant
factor, too.
● Other factors, such as CPU speed, memory speed, and device I/O speed,
affect it as well.
● For certain problems, it is possible to allocate additional space and
improve time complexity (a space-time trade-off).