Algo Chapter One Part Two
Algorithm analysis refers to the process of
determining how much computing time and
storage an algorithm will require.
To solve a problem, there are usually many possible algorithms.
One has to be able to choose the best algorithm for the problem at hand
using some scientific method.
To classify data structures and algorithms as good,
we need precise ways of analyzing them in
terms of their resource requirements.
The main resources are:
• Running Time
• Memory Usage
• Communication Bandwidth
Note:
Running time is the most important since computational time is the most
precious resource in most problem domains.
There are two approaches to measure the efficiency of
algorithms:
1. Empirical
based on the total running time of the program.
Uses actual system clock time.
Example:
t1
for(int i=0; i<=10; i++)
    cout<<i;
t2
Running time taken by the above algorithm (TotalTime) = t2 - t1;
It is difficult to determine the efficiency of algorithms using
this approach, because clock time can vary based on many
factors.
t1
for(int i=0; i<=n; i++)
    cout<<i;
t2
T = t2 - t1;
For n=100,  T >= 0.5s
For n=1000, T > 0.5s
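A minimal runnable sketch of this empirical measurement in C++ (using <chrono>; the loop body and the value of n are only placeholders, not part of the slides) might look like this:

    #include <iostream>
    #include <chrono>

    int main() {
        int n = 1000;                                        // placeholder problem size
        auto t1 = std::chrono::steady_clock::now();          // t1: read the clock before the work
        for (int i = 0; i <= n; i++)
            std::cout << i << ' ';
        auto t2 = std::chrono::steady_clock::now();          // t2: read the clock after the work
        std::chrono::duration<double> totalTime = t2 - t1;   // TotalTime = t2 - t1, in seconds
        std::cout << "\nTotalTime: " << totalTime.count() << " s\n";
        return 0;
    }

Running this several times, or on different machines, typically gives different TotalTime values for the same n, which is exactly why the empirical approach is hard to use for comparing algorithms.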
Factors that affect clock time include:
Programming Language
C (fastest), C++ (faster), Java (fast)
C is relatively faster than Java because C is closer to machine
language, whereas Java spends a relatively larger amount of time being
interpreted/translated to machine code.
Operating System
Multitasking vs. single tasking
Internal structure
2. Theoretical
Determining the quantity of resources required using mathematical concepts.
We use the theoretical approach to determine the efficiency
of algorithms because:
• The number of operations will not vary under different conditions.
Complexity Analysis
Two important ways to characterize the effectiveness of
an algorithm are its Space Complexity and Time
Complexity.
Time Complexity: determines the approximate amount of time (number of
operations) required to solve a problem of size n.
Space Complexity: determines the approximate amount of memory required to
solve a problem of size n.
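As a small illustration (not from the slides; the function names are made up for this sketch), two pieces of code can take time of the same order while differing in how much extra memory they use:

    #include <vector>

    // Time grows with n (about n iterations), extra memory is constant:
    long long sumUpTo(int n) {
        long long s = 0;
        for (int i = 1; i <= n; i++)
            s += i;
        return s;
    }

    // Same order of time, but the extra memory now also grows with n,
    // because all n values are stored before they are added:
    long long sumWithArray(int n) {
        std::vector<long long> values;
        for (int i = 1; i <= n; i++)
            values.push_back(i);
        long long s = 0;
        for (long long v : values)
            s += v;
        return s;
    }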
Complexity analysis involves two distinct phases:
• Algorithm Analysis: analysis of the algorithm or data structure to produce a
function T(n) that describes the algorithm in terms of the operations
performed, in order to measure the complexity of the algorithm.
Example: Suppose we have hardware capable of executing 10⁶
instructions per second. How long would it take to execute an
algorithm whose complexity function is T(n) = 2n² on an input
size of n = 10⁸?
Solution: T(n) = 2n² = 2(10⁸)² = 2×10¹⁶ instructions
Running time = T(10⁸)/10⁶ = 2×10¹⁶/10⁶ = 2×10¹⁰ seconds (about 634 years).
• Order of Magnitude Analysis: analysis of the function T(n) to determine the
general complexity category to which it belongs.
There is no generally accepted set of rules for algorithm
analysis.
However, an exact count of operations is commonly
used.
To count the number of operations we can use the
following analysis rules.
Analysis Rules:
1. Assume an arbitrary time unit.
2. Execution of one of the following operations
takes 1 time unit:
Assignment Operation
Example: i=0;
Single Input/Output Operation
Example: cin>>a;
cout<<"hello";
Single Boolean Operations
Example: i>=10
Single Arithmetic Operations
Example: a+b;
Function Return
Example: return sum;
3. Selection statement (if/else, switch): the time for the condition
evaluation + the maximum of the running times of the individual clauses.
Example:
int sum=0;
if(a>b)
{
    sum = a+b;
    cout<<sum;
}
else
{
    cout<<b;
}
T(n) = 1 + 1 + max(3, 1)
     = 5
4. Loop statements:
• The running time of a loop = (running time of the statements inside the loop ×
number of iterations) + time for setup (1) + time for checking (number of
iterations + 1) + time for update (number of iterations).
• The total running time of statements inside a group of nested loops is the
running time of the statements × the product of the sizes of all the loops.
• For nested loops, analyze inside out.
• Always assume that the loop executes the maximum number of iterations
possible. (Why? Because we are interested in the worst case
complexity.)
Examples:
a) int k=0,n;
   cout<<"Enter an integer";
   cin>>n;
   for(int i=0; i<n; i++)
       k++;
   T(n) = 3+1+(n+1)+n+n = 3n+5
b) int sum=0;
   for(i=1; i<=n; i++)
       sum=sum+i;
   T(n) = 1+1+(n+1)+n+(1+1)n
        = 3+4n
c)
int i=0;            // 1
while(i<n)          // checked from i=0 to i=n: n+1 times
{
    cout<<i;        // n
    i++;            // n
}
int j=1;            // 1
while(j<=10)        // checked from j=1 to j=11: 11 times
{
    cout<<j;        // 10
    j++;            // 10
}
T(n) = 1+(n+1)+n+n+1+11+2(10)
     = 3n+34
d)
int k=0;                      // 1
for(int i=1; i<=n; i++)       // 1+(n+1)+n
    for(int j=1; j<=n; j++)   // 1+(n+1)+n per outer iteration
        k++;                  // n per outer iteration
T(n) = 1+1+(n+1)+n+n(1+(n+1)+n+n)
     = 2n+3+n(3n+2)
     = 2n+3+3n²+2n
     = 3n²+4n+3
e)
int sum=0;
for(i=0; i<n; i++)
    for(j=0; j<n; j++)
        sum++;
T(n) = 1+1+(n+1)+n+n(1+(n+1)+n+n)
     = 3+2n+n²+2n+2n²
     = 3n²+4n+3 = O(n²)
5. Function call:
• 1 for the function call + the time for any parameter calculations + the time
required for the execution of the function body.
Examples:
a) int counter()
{
    int a=0;
    cout<<"Enter a number";
    cin>>n;
    for(i=0; i<n; i++)
        a=a+1;
    return 0;
}
T(n) = 1+1+1+(1+(n+1)+n)+2n+1
     = 4n+6 = O(n)
b) void func()
{
    int x=0; int i=0; int j=1;
    cout<<"Enter a number";
    cin>>n;
    while(i<n)
    {
        i=i+1;
    }
    while(j<n)
    {
        j=j+1;
    }
}
T(n) = 1+1+1+1+1+(n+1)+2n+n+2(n-1)
     = 6+4n+2n-2
     = 4+6n = O(n)
c) int sum(int n)
{
    int s=0;
    for(int i=1; i<=n; i++)
        s=s+(i*i*i*i);
    return s;
}
T(n) = 1+(1+(n+1)+n+5n)+1
     = 7n+4 = O(n)
Formal Approach to Analysis
For Loops: Formally
In general, a for loop translates into a summation; the index and bounds of
the summation are the same as those of the loop.
for(int i=1; i<=N; i++){
    sum = sum + i;
}
⟹ Σ_{i=1}^{N} 1 = N
Nested Loops: Formally
Nested for loops translate into multiple summations, one for each for
loop.
for(int i=1; i<=N; i++){
    for(int j=1; j<=M; j++){
        sum = sum + i + j;
    }
}
⟹ Σ_{i=1}^{N} Σ_{j=1}^{M} 2 = Σ_{i=1}^{N} 2M = 2MN
Consecutive Statements: Formally
Add the running times of the separate blocks of your code.
for(int i=1; i<=N; i++){
    sum = sum + i;
}
for(int i=1; i<=N; i++){
    for(int j=1; j<=N; j++){
        sum = sum + i + j;
    }
}
⟹ Σ_{i=1}^{N} 1 + Σ_{i=1}^{N} Σ_{j=1}^{N} 2 = N + 2N²

Conditionals: Formally
For an if/else, take the maximum of the running times of the two branches.
Example:
if(test){
    for(int i=1; i<=N; i++){
        sum = sum + i;
    }
}
else{
    for(int i=1; i<=N; i++){
        for(int j=1; j<=N; j++){
            sum = sum + i + j;
        }
    }
}
⟹ max(Σ_{i=1}^{N} 1, Σ_{i=1}^{N} Σ_{j=1}^{N} 2) = max(N, 2N²) = 2N²
Recursive: Formally
A recursive call translates into a recurrence relation for T(n), which is then
solved to obtain the running time.
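As an illustrative sketch (factorial is used here only as an example; it is not taken from the slide), consider a recursive function and its recurrence:

    // Each call does a constant amount of work and makes one
    // recursive call on an input that is smaller by 1.
    long long fact(int n) {
        if (n <= 1)              // base case: constant work
            return 1;
        return n * fact(n - 1);  // one multiplication + one recursive call
    }

This gives the recurrence T(n) = T(n-1) + c with T(1) = c, which unrolls to T(n) = c*n, i.e. O(n).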
Best Case Analysis:
Assumes the input data are arranged in the most advantageous order for the algorithm.
Takes the smallest possible set of inputs.
Causes execution of the fewest number of statements.
Computes the lower bound of T(n), where T(n) is
the complexity function.
Examples:
For a sorting algorithm: if the list is already sorted (data are arranged in the required order).
For a searching algorithm: if the desired item is located at the first accessed position.
Worst Case Analysis:
Assumes the input data are arranged in the most
disadvantageous order for the algorithm.
Takes the worst possible set of inputs.
Causes execution of the largest number of statements.
Computes the upper bound of T(n) where T(n) is the
complexity function.
Example:
For a sorting algorithm: if the list is in reverse (opposite) order.
Average Case Analysis:
Determines the average running time over all possible arrangements of the input data.
Examples:
For sorting algorithms
While sorting, considering any arrangement
(order of input data).
For searching algorithms
While searching, if the desired item is located
at any location or is missing.
Summary:
Worst case analysis is the most common analysis
because:
• It provides the upper bound for all inputs (even bad ones).
• Average case analysis is often difficult to determine
and define.
• If the input is already in its best case, there is little to
analyze, because the data are already in the most
favorable arrangement.
• Best case analysis cannot be used to estimate
complexity.
• We are interested in the worst case time since it
provides a bound for all inputs; this is called the
"Big-Oh" estimate.
Order of Magnitude
Refers to the rate at which the storage or time
grows as a function of problem size.
1. Big-Oh Notation
Definition: We say f(n) = O(g(n)) if there are positive
constants n0 and c such that, to the right of n0, the
value of f(n) always lies on or below c·g(n).
As n increases, f(n) grows no faster than g(n).
It is only concerned with what happens for very
large values of n.
Describes the worst case analysis.
Gives an upper bound for a function to within a
constant factor.
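For example, f(n) = 3n + 5 is O(n): choosing c = 4 and n0 = 5 (one possible choice among many), 3n + 5 ≤ 4n for all n ≥ 5, so f(n) lies on or below c·g(n) to the right of n0.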
O-Notations are used to represent the amount of
time an algorithm takes on the worst possible set
of inputs, “Worst-Case”.
Big-O Theorems
Theorem 4 (Transitivity): If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
For example, 3n+5 is O(n) and n is O(n²), so 3n+5 is O(n²).
Rules to estimate the Big-Oh of a given function:
Pick the highest-order term and ignore its constant coefficient.
Examples:
1. T(n) = 3n + 5            ⟹ O(n)
2. T(n) = 3n² + 4n + 2      ⟹ O(n²)
3. T(n) = 7n! + 2ⁿ + n² + 1 ⟹ highest-order term is n!, so T(n) = O(n!)
4. T(n) = 8nⁿ + 2⁵⁴n + n² + 3 ⟹ highest-order term is nⁿ, so T(n) = O(nⁿ)
The order of the body statements of a given
algorithm is very important in determining the Big-Oh
of the algorithm.
T(n) = 2*n = 2n = O(n).
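For example, this count would arise from a single loop whose body performs two operations per iteration (an illustrative sketch, not the slide's own code):

    int sum = 0;
    for(int i=1; i<=n; i++)
        sum = sum + i;   // 2 operations per iteration: one addition, one assignment

With the loop control ignored as in the formal approach, the body contributes 2 operations on each of the n iterations, giving the count above.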
for(int i=1; i<=n; i++)
    for(int j=1; j<=n; j++)
        k++;
T(n) = 1*n*n = n² = O(n²).