
Complexity Analysis

Analysis of Algorithms
• An algorithm is a finite sequence of steps that a computer follows to solve a problem.
• When you are faced with a problem, the first thing you need to do is to find an algorithm that solves it.
• Once you have determined that the algorithm is correct, the next step is to find out the resources (time and space) the algorithm will require. This is known as algorithm analysis.
• Thus, algorithm analysis measures the efficiency of the algorithm.
• If an algorithm requires more resources than your computer has, it is useless.

What is efficiency?
• Efficiency can be measured by the time or space used by the algorithm.
• An efficient algorithm is one that has
– Small running time (time complexity): The time taken by a program is the sum of its compile time and its run time. The time complexity of an algorithm is given by the number of steps the algorithm takes to compute the function it was written for.
– Small space usage (space complexity): The space complexity of an algorithm is the amount of memory it needs to run.
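As a quick illustration of the distinction (a sketch added here, not from the original slides): the two functions below compute the same sum of 1..n, take O(n) time either way, but differ in space complexity.

#include <vector>

// O(1) extra space: a single accumulator variable
long SumUpTo(int n) {
    long sum = 0;
    for (int i = 1; i <= n; i++)
        sum += i;
    return sum;
}

// O(n) extra space: materializes all n values before summing
long SumUpToWithArray(int n) {
    std::vector<long> values(n);      // n cells of memory
    for (int i = 0; i < n; i++)
        values[i] = i + 1;
    long sum = 0;
    for (long v : values)
        sum += v;
    return sum;
}

From here on, we will consider only time complexity.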

What is a good algorithm?
• A good algorithm is like a sharp knife: it does exactly what it is supposed to do with a minimum amount of applied effort.

How to Measure the Running Time of an Algorithm?
1. Experimental study (run programs)
2. Growth function: asymptotic algorithm analysis
Factors affecting running time
• Speed: speed of the machine running the program.
• Language: language in which the program was written. For example, programs written in assembly language generally run faster than those written in C or C++, which in turn tend to run faster than those written in Java.
• Compiler: efficiency of the compiler that created the program.
• Size of the input: processing 1000 records will take more time than processing 10 records.
• Organization of the input: if the item we are searching for is at the top of the list, it will take less time to find it than if it is at the bottom.
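To make option 1 (experimental study) concrete, here is a minimal timing sketch using the standard <chrono> clock. work() is a hypothetical workload invented for this example, and the measured numbers will vary with all of the factors listed above.

#include <chrono>
#include <iostream>

// Hypothetical workload whose running time we want to measure
void work(int n) {
    volatile long sum = 0;            // volatile keeps the loop from being optimized away
    for (int i = 0; i < n; i++)
        sum = sum + i;
}

int main() {
    for (int n = 1000; n <= 1000000; n *= 10) {
        auto start = std::chrono::steady_clock::now();
        work(n);
        auto stop = std::chrono::steady_clock::now();
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count();
        std::cout << "n = " << n << ": " << us << " microseconds\n";
    }
    return 0;
}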

Types of Algorithm Analysis

Worst-case: (usually)
• T(n) = maximum time of the algorithm on any input of size n.

Average-case: (sometimes)
• T(n) = expected time of the algorithm over all inputs of size n.
• Needs an assumption about the statistical distribution of inputs.

Best-case:
• Can "cheat": a slow algorithm may still run fast on some particular input, so the best case says little on its own (see the linear-search sketch below).
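A standard illustration of the three cases (a sketch, not from the original slides) is linear search:

// Linear search: the number of comparisons depends on where the key is.
int LinearSearch(const int array[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (array[i] == key)
            return i;                 // found: stop early
    return -1;                        // not found: all n elements examined
}
// Best case:    key == array[0]                      -> 1 comparison
// Worst case:   key is last or absent                -> n comparisons, T(n) = n
// Average case: key equally likely at any position   -> about n/2 comparisons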
General Rules for Estimation
• Simple statements: We assume the statement does not contain a function call. It takes a fixed amount of time to execute.
• Sequence of simple statements: It takes execution time equal to the sum of the execution times of the individual statements; since each individual statement takes constant time, so does the whole sequence.
• Decision: For estimating the performance, the then and else parts of the algorithm are considered independently. The performance estimate of the decision is taken to be the larger of the two individual estimates.

General Rules for Estimation
• Loops: The running time of a loop is at most the running time of the statements inside the loop times the number of iterations.
Ex: for (i = 0; i < N; i++)
This loop runs N times.
• Nested loops: The running time of a statement in the innermost loop is multiplied by the product of the sizes of all the loops.
Ex:
for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        sequence of simple statements
    }
}
This loop runs N*N times.
The Running Time of Algorithms
• Each operation in an algorithm (or a program) has a cost.
→ Each operation takes a certain amount of time.

count = count + 1;  → takes a certain amount of time, but it is constant

A sequence of operations:

count = count + 1;    Cost: c1
sum = sum + count;    Cost: c2

→ Total cost = c1 + c2
The Running Time of Algorithms (cont.)
Example: Simple If-Statement
                     Cost   Times
if (n < 0)           c1     1
    absval = -n;     c2     1
else
    absval = n;      c3     1

Total cost <= c1 + max(c2, c3)

The Running Time of Algorithms (cont.)
Example: Simple Loop
                      Cost   Times
i = 1;                c1     1
sum = 0;              c2     1
while (i <= n) {      c3     n+1
    i = i + 1;        c4     n
    sum = sum + i;    c5     n
}

Total cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5
→ The time required for this algorithm is proportional to n

The Running Time of Algorithms (cont.)
Example: Nested Loop
                          Cost   Times
i = 1;                    c1     1
sum = 0;                  c2     1
while (i <= n) {          c3     n+1
    j = 1;                c4     n
    while (j <= n) {      c5     n*(n+1)
        sum = sum + i;    c6     n*n
        j = j + 1;        c7     n*n
    }
    i = i + 1;            c8     n
}

Total cost = c1 + c2 + (n+1)*c3 + n*c4 + n*(n+1)*c5 + n*n*c6 + n*n*c7 + n*c8
→ The time required for this algorithm is proportional to n^2
Big-O Notation
• It is the most commonly used notation for specifying asymptotic complexity, i.e. the rate of growth of a function.
• It gives an upper bound on a function.
• Limit test: f(n) is O(g(n)) when

lim (n→∞) f(n) / g(n) = c, where c is a finite constant (c = 0 when f grows strictly slower than g)


Graph for O Notation
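For instance, applying the limit test above to f(n) = 3n^2 + 5n against g(n) = n^2 (an example chosen here for illustration):

\[
\lim_{n\to\infty}\frac{3n^2+5n}{n^2}
  = \lim_{n\to\infty}\left(3+\frac{5}{n}\right) = 3,
\]

which is a finite constant, so \(3n^2 + 5n = O(n^2)\).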
Basic rules for finding Big-O
1. Nested loops are multiplied together.
2. Sequential loops are added.
3. Only the largest term is kept, all others are
dropped.
4. Constants are dropped.
5. Conditional checks are constant (i.e. 1).
Example 1
//linear
for(int i = 0; i < n; i++) {
cout << i << endl;
}
• Ans: O(n)
Example 2
//quadratic
for(int i = 0; i < n; i++) {
for(int j = 0; j < n; j++){
//do swap stuff, constant time
}
}
• Ans: O(n^2)
Example 3
//quadratic
for(int i = 0; i < n; i++) {
for(int j = 0; j < i; j++){
//do swap stuff, constant time
}
}
• Ans: the inner body runs 0 + 1 + ... + (n-1) = n(n-1)/2 times. This is still within the bound of O(n^2).
Example 4
for(int i = 0; i < 2*n; i++) {
cout << i << endl;
}
• At first you might say that the upper bound is O(2*n); however, we drop constants, so it becomes O(n).
Example 5
//linear
for(int i = 0; i < n; i++) {
cout << i << endl;
}

//quadratic
for(int i = 0; i < n; i++) {
for(int j = 0; j < i; j++){
//do constant time stuff
}
}
• Ans: In this case we add each loop's Big-O, giving n + n^2. O(n^2 + n) is not an acceptable answer, since we must drop the lower-order term. The upper bound is O(n^2), because n^2 has the largest growth rate.
Example 6
for(int i = 0; i < n; i++) {
for(int j = 0; j < 2; j++){
//do stuff
}
}
• Ans: The outer loop runs n times and the inner loop 2 times, so we have 2n steps; dropping the constant gives O(n).
Example 7
for(int i = 1; i < n; i *= 2) {
cout << i << endl;
}
• The loop does not run n times: instead of simply incrementing, 'i' is doubled on each iteration, so it takes about log2(n) iterations for 'i' to reach n. Thus the loop is O(log n).
Example 8
for(int i = 0; i < n; i++) { //linear
for(int j = 1; j < n; j *= 2){ // log (n)
//do constant time stuff
}
}
• Ans: O(n*log(n))
Typical Complexities

Complexity    Notation    Description
constant      O(1)        Constant number of operations, not depending on the input data size, e.g. n = 1 000 000 → 1-2 operations
logarithmic   O(log n)    Number of operations proportional to log2(n) where n is the size of the input data, e.g. n = 1 000 000 000 → 30 operations
linear        O(n)        Number of operations proportional to the input data size, e.g. n = 10 000 → 5 000 operations
Typical Complexities

Complexity    Notation                Description
quadratic     O(n^2)                  Number of operations proportional to the square of the size of the input data, e.g. n = 500 → 250 000 operations
cubic         O(n^3)                  Number of operations proportional to the cube of the size of the input data, e.g. n = 200 → 8 000 000 operations
exponential   O(2^n), O(k^n), O(n!)   Exponential number of operations, fast growing, e.g. n = 20 → 1 048 576 operations
Time Complexity and Speed

Complexity     10       20     50        100    1 000   10 000   100 000
O(1)           <1s      <1s    <1s       <1s    <1s     <1s      <1s
O(log(n))      <1s      <1s    <1s       <1s    <1s     <1s      <1s
O(n)           <1s      <1s    <1s       <1s    <1s     <1s      <1s
O(n*log(n))    <1s      <1s    <1s       <1s    <1s     <1s      <1s
O(n^2)         <1s      <1s    <1s       <1s    <1s     2s       3-4 min
O(n^3)         <1s      <1s    <1s       <1s    20 s    5 hours  231 days
O(2^n)         <1s      <1s    260 days  hangs  hangs   hangs    hangs
O(n!)          <1s      hangs  hangs     hangs  hangs   hangs    hangs
O(n^n)         3-4 min  hangs  hangs     hangs  hangs   hangs    hangs
Practical Examples
• O(n): printing a list of n items to the screen, looking
at each item once.
• O(log n): taking a list of items, cutting it in half
repeatedly until there's only one item left.
• O(n^2): taking a list of n items, and comparing every
item to every other item.
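The O(log n) "cutting in half" pattern is exactly what binary search does on a sorted array; a minimal sketch (not part of the original slides):

// Binary search: each iteration halves the remaining range,
// so at most about log2(n) iterations are needed -> O(log n).
int BinarySearch(const int sorted[], int n, int key) {
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;     // midpoint without (low+high) overflow
        if (sorted[mid] == key)
            return mid;                       // found
        if (sorted[mid] < key)
            low = mid + 1;                    // discard the left half
        else
            high = mid - 1;                   // discard the right half
    }
    return -1;                                // not found
}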
How to determine Complexities
Example 1
• Sequence of statements
statement 1;
statement 2;
...
statement k;

• total time = time(statement 1) + time(statement 2) + ... + time(statement k)
• If each statement is "simple" (only involves basic operations) then the time for each statement is constant and the total time is also constant: O(1).
Example 2
• if-then-else statements
if (condition)
{
    sequence of statements 1
}
else
{
    sequence of statements 2
}

• Here, either sequence 1 will execute, or sequence 2 will execute.
• Therefore, the worst-case time is the slower of the two possibilities: max(time(sequence 1), time(sequence 2)).
• For example, if sequence 1 is O(N) and sequence 2 is O(1), the worst-case time for the whole if-then-else statement would be O(N).
Example 3
• for loops
for (i = 0; i < N; i++)
{
    sequence of statements
}

• The loop executes N times, so the sequence of statements also executes N times.
• Since we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall.
Example 4
for (i = 0; i < N; i++)
{
    for (j = 0; j < M; j++)
    {
        sequence of statements;
    }
}

• The outer loop executes N times. Every time the outer loop executes, the inner loop executes M times. As a result, the statements in the inner loop execute a total of N * M times. Thus, the complexity is O(N * M).
• In a common special case where the stopping condition of the inner loop is j < N instead of j < M (i.e., the inner loop also executes N times), the total complexity for the two loops is O(N^2).
Complexity Examples
int FindMaxElement(int array[], int n)   // n = size of the array
{
    int max = array[0];
    for (int i = 0; i < n; i++)
    {
        if (array[i] > max)
        {
            max = array[i];
        }
    }
    return max;
}

• Runs in O(n) where n is the size of the array
• The number of elementary steps is ~ n
Complexity Examples (2)
long FindInversions(int array[], int n)   // n = size of the array
{
    long inversions = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)   // note: j++, not i++
            if (array[i] > array[j])
                inversions++;
    return inversions;
}

• Runs in O(n^2) where n is the size of the array
• The number of elementary steps is ~ n*(n-1)/2 (one per pair i < j)
Complexity Examples (3)
decimal Sum3(int n)
{
    decimal sum = 0;
    for (int a = 0; a < n; a++)
        for (int b = 0; b < n; b++)
            for (int c = 0; c < n; c++)
                sum += a * b * c;
    return sum;
}

• Runs in cubic time O(n^3)
• The number of elementary steps is ~ n^3
Complexity Examples (4)
long SumMN(int n, int m)
{
    long sum = 0;
    for (int x = 0; x < n; x++)
        for (int y = 0; y < m; y++)
            sum += x * y;
    return sum;
}

• Runs in quadratic time O(n*m)
• The number of elementary steps is ~ n*m
Complexity Examples (5)
long SumMN(int n, int m)
{
    long sum = 0;
    for (int x = 0; x < n; x++)
        for (int y = 0; y < m; y++)
            if (x == y)
                for (int i = 0; i < n; i++)
                    sum += i * x * y;
    return sum;
}

• Runs in quadratic time O(n*m): the innermost loop fires only when x == y, which happens min(m,n) times
• The number of elementary steps is ~ n*m + min(m,n)*n
Big-Ω Notation
• It gives a lower bound on a function.
• Example:
– 5n^2 is Ω(n) because 5n^2 ≥ 5n for n ≥ 1.
• Limit test: f(n) is Ω(g(n)) when

lim (n→∞) f(n) / g(n) = c, where c > 0 (c = ∞ when f grows strictly faster than g)


Graph for Omega Notation
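Checking the example above with this limit test, with f(n) = 5n^2 and g(n) = n:

\[
\lim_{n\to\infty}\frac{5n^2}{n} = \lim_{n\to\infty} 5n = \infty > 0,
\]

so \(5n^2 = \Omega(n)\), in agreement with the inequality argument.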
Θ Notation
• It gives a tight bound on a function.
• Informally, if f(n) is Θ(g(n)) then both functions have the same rate of growth.
• Example:
f(n) = n + 5n^0.5 and g(n) = n have the same rate of growth, because
n ≤ n + 5n^0.5 ≤ 6n for n ≥ 1

Graph for Theta Notation
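The same limit test used for O and Ω confirms this example: with f(n) = n + 5n^0.5 and g(n) = n,

\[
\lim_{n\to\infty}\frac{n+5\sqrt{n}}{n}
  = \lim_{n\to\infty}\left(1+\frac{5}{\sqrt{n}}\right) = 1,
\]

a constant that is both finite and nonzero, so \(f(n) = \Theta(n)\), consistent with the sandwich \(n \le n + 5\sqrt{n} \le 6n\).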
