Algo Chapter One Part Two
Uploaded by abenuzzion

Chapter Two

Algorithm Analysis Concept

• Algorithm analysis refers to the process of determining how much computing time and storage an algorithm will require.
• In other words, it is the process of predicting the resource requirements of an algorithm in a given environment.
• To solve a given problem, there are many possible algorithms.
• One has to be able to choose the best algorithm for the problem at hand using some scientific method.
• To classify some data structures and algorithms as good, we need precise ways of analyzing them in terms of resource requirements.
• The main resources are:
  • Running Time
  • Memory Usage
  • Communication Bandwidth

Note:
Running time is the most important, since computational time is the most precious resource in most problem domains.
• There are two approaches to measuring the efficiency of algorithms:

1. Empirical
• Based on the total running time of the program.
• Uses actual system clock time.

Example:
t1
for(int i=0; i<=10; i++)
    cout<<i;
t2

Running time of the above fragment: TotalTime = t2 - t1.
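The t1/t2 markers above can be realized with a real clock; here is a minimal sketch using C++ `<chrono>` (the function name and the summing loop body are our own placeholders):

```cpp
#include <chrono>

// Mirror the t1 ... t2 pattern above with a real clock.
// Returns the elapsed wall-clock time in seconds.
double elapsedSeconds(int n) {
    auto t1 = std::chrono::steady_clock::now();            // t1
    volatile long long sink = 0;   // keeps the loop from being optimized away
    for (int i = 0; i <= n; i++)
        sink = sink + i;           // stands in for cout<<i;
    auto t2 = std::chrono::steady_clock::now();            // t2
    return std::chrono::duration<double>(t2 - t1).count(); // TotalTime = t2 - t1
}
```

The value returned varies from run to run and machine to machine, which is exactly the weakness of the empirical approach discussed below.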
• It is difficult to determine the efficiency of algorithms using this approach, because clock time can vary based on many factors:

• Processor speed of the computer
  • 1.78 GHz: 10 s; 2.12 GHz: <10 s

• Current processor load
  • Only the program running: 10 s
  • With printing: 15 s
  • With printing & browsing the internet: >15 s
• Input size

t1
for(int i=0; i<=n; i++)
    cout<<i;
t2
T = t2 - t1;

For n = 100, T >= 0.5 s; for n = 1000, T > 0.5 s.
• Programming Language
  • C (fastest), C++ (faster), Java (fast)
  • C is relatively faster than Java because C is nearer to machine language, so Java takes a relatively larger amount of time for interpretation/translation to machine code.

• Operating System
  • Multitasking vs. single tasking
  • Internal structure
2. Theoretical
• Determines the quantity of resources required using mathematical concepts.
• Analyzes an algorithm according to the number of basic operations (time units) required, rather than according to an absolute amount of time.
• We use the theoretical approach to determine the efficiency of an algorithm because:
  • The number of operations will not vary under different conditions.
  • It gives a meaningful measure that permits comparison of algorithms independent of the operating platform.
  • It helps to determine the complexity of the algorithm.
Complexity Analysis

• Complexity analysis is the systematic study of the cost of computation, measured either in:
  • Time units
  • Operations performed, or
  • The amount of storage space required.
• Two important ways to characterize the effectiveness of an algorithm are its Space Complexity and Time Complexity.
• Time Complexity: the approximate amount of time (number of operations) required to solve a problem of size n.
• Space Complexity: the approximate memory required to solve a problem of size n.
• Complexity analysis involves two distinct phases:
• Algorithm Analysis: analysis of the algorithm or data structure to produce a function T(n) that describes the algorithm in terms of the operations performed, in order to measure the complexity of the algorithm.

Example: Suppose we have hardware capable of executing 10^6 instructions per second. How long would it take to execute an algorithm whose complexity function is T(n) = 2n^2 on an input of size n = 10^8?

Solution: T(n) = 2n^2 = 2(10^8)^2 = 2*10^16 instructions.
Running time = T(10^8)/10^6 = 2*10^16/10^6 = 2*10^10 seconds.
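The arithmetic in this example can be double-checked in code; a small sketch (the function name is ours):

```cpp
// Seconds needed to execute T(n) = 2n^2 instructions on hardware
// running `speed` instructions per second.
double runningTimeSeconds(double n, double speed) {
    double T = 2.0 * n * n;   // T(n) = 2n^2 instructions
    return T / speed;         // seconds = instructions / (instructions per second)
}
```

For n = 10^8 and speed = 10^6 this yields 2*10^10 seconds (over 600 years), which is why the growth rate of T(n) matters far more than the speed of the hardware.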
• Order of Magnitude Analysis: analysis of the function T(n) to determine the general complexity category to which it belongs.
• There is no generally accepted set of rules for algorithm analysis.
• However, an exact count of operations is commonly used.
• To count the number of operations we can use the following analysis rules.

Analysis Rules:
1. Assume an arbitrary time unit.
2. Execution of one of the following operations takes 1 time unit:
   • Assignment operation
     Example: i=0;
   • Single input/output operation
     Example: cin>>a;
              cout<<"hello";
   • Single Boolean operation
     Example: i>=10
   • Single arithmetic operation
     Example: a+b;
   • Function return
     Example: return sum;

3. The running time of a selection statement (if, switch) is the time for the condition evaluation plus the maximum of the running times of the individual clauses in the selection.
Example:

int sum=0;
if(a>b)
{
    sum= a+b;
    cout<<sum;
}
else
{
    cout<<b;
}

T(n) = 1 + 1 + max(3,1) = 5
4. Loop statements:
• The running time is the running time of the statements inside the loop * the number of iterations + time for setup (1) + time for checking (number of iterations + 1) + time for update (number of iterations).
• The total running time of statements inside a group of nested loops is the running time of the statements * the product of the sizes of all the loops.
• For nested loops, analyze inside out.
• Always assume that the loop executes the maximum number of iterations possible. (Why?)
  • Because we are interested in the worst-case complexity.
Examples:
a) int k=0,n;
   cout<<"Enter an integer";
   cin>>n;
   for(int i=0;i<n; i++)
       k++;

T(n) = 3+1+(n+1)+n+n = 3n+5

b) int sum=0;
   for(i=1;i<=n;i++)
       sum=sum+i;

T(n) = 1+1+(n+1)+n+(1+1)n = 3+4n
c)
int i=0;          1
while(i<n)        n+1
{
    cout<<i;      n
    i++;          n
}
int j=1;          1
while(j<=10)      11
{
    cout<<j;      10
    j++;          10
}

T(n) = 1+(n+1)+n+n+1+11+2(10) = 3n+34
d)
int k=0;                       1
for(int i=1; i<=n; i++)        1+(n+1)+n
    for(int j=1; j<=n; j++)    n*(1+(n+1)+n)
        k++;                   n*n

T(n) = 1+1+(n+1)+n+n(1+(n+1)+n+n)
     = 2n+3+n(3n+2)
     = 3n^2+4n+3
e)
int sum=0;
for(i=0;i<n;i++)
    for(j=0;j<n;j++)
        sum++;

T(n) = 1+1+(n+1)+n+n(1+(n+1)+n+n)
     = 3+2n+n(3n+2)
     = 3n^2+4n+3 = O(n^2)
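The dominant n^2 term in example (e) comes from the innermost statement; this can be confirmed by running the loop and counting its executions (a sketch; the function name is ours):

```cpp
// Count how many times the innermost statement of example (e) executes.
long long innerExecutions(int n) {
    long long sum = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            sum++;            // runs exactly n * n times
    return sum;
}
```

For n = 10 the count is 100, matching the n^2 term that dominates T(n) = 3n^2+4n+3.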
5. Function call:
• 1 for the function call + the time for any parameter calculations + the time required for the execution of the function body.

Example:
a) int counter()
{
    int a=0;
    cout<<"Enter a number";
    cin>>n;
    for(i=0;i<n;i++)
        a=a+1;
    return 0;
}
T(n) = 1+1+1+(1+n+1+n)+2n+1 = 4n+6 = O(n)
b) void func( )
{
    int x=0; int i=0; int j=1;
    cout<<"Enter a number";
    cin>>n;
    while(i<n)
    {
        i=i+1;
    }
    while(j<n)
    {
        j=j+1;
    }
}

T(n) = 1+1+1+1+1+(n+1)+2n+n+2(n-1)
     = 6+4n+2n-2
     = 4+6n = O(n)
c)
int sum(int n)
{
    int s=0;
    for(int i=1;i<=n;i++)
        s=s+(i*i*i*i);
    return s;
}

T(n) = 1+(1+n+1+n+5n)+1 = 7n+4 = O(n)
Formal Approach to Analysis

• In the above examples we have seen that analyzing loop statements is complex.
• It can be simplified by using a formal approach, in which case we can ignore initializations, loop controls, and updates.

Simple Loops: Formally
• A for loop can be translated to a summation.
• The index and bounds of the summation are the same as the index and bounds of the for loop.
• Suppose we count the number of additions that are done. There is 1 addition per iteration of the loop, hence N additions in total.

for (int i = 1; i <= N; i++) {
    sum = sum + i;
}

∑ (i = 1 to N) 1 = N
Nested Loops: Formally
• Nested for loops translate into multiple summations, one for each for loop.

for (int i = 1; i <= N; i++) {
    for (int j = 1; j <= M; j++) {
        sum = sum+i+j;
    }
}

∑ (i = 1 to N) ∑ (j = 1 to M) 2 = ∑ (i = 1 to N) 2M = 2MN
Consecutive Statements: Formally
• Add the running times of the separate blocks of your code.

for (int i = 1; i <= N; i++) {
    sum = sum+i;
}
for (int i = 1; i <= N; i++) {
    for (int j = 1; j <= N; j++) {
        sum = sum+i+j;
    }
}

[∑ (i = 1 to N) 1] + [∑ (i = 1 to N) ∑ (j = 1 to N) 2] = N + 2N^2
Conditionals: Formally
• Take the maximum of the running times of the branches.

Example:
if (test == 1) {
    for (int i = 1; i <= N; i++) {
        sum = sum+i;
    }
}
else {
    for (int i = 1; i <= N; i++) {
        for (int j = 1; j <= N; j++) {
            sum = sum+i+j;
        }
    }
}

max(∑ (i = 1 to N) 1, ∑ (i = 1 to N) ∑ (j = 1 to N) 2) = max(N, 2N^2) = 2N^2
Recursive: Formally
• Usually difficult to analyze.

Example: Factorial
long factorial(int n){
    if(n<=1)
        return 1;
    else
        return n*factorial(n-1);
}

T(n) = 1+T(n-1) = 2+T(n-2) = 3+T(n-3) = ... = (n-1)+T(1) = n-1
(counting the number of multiplications)
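The result T(n) = n-1 multiplications can be checked by threading a counter through the recursion (a sketch; the extra counter parameter is our addition):

```cpp
// Factorial with an explicit multiplication counter.
long long factorialCounted(int n, int& mults) {
    if (n <= 1)
        return 1;                            // base case: no multiplication
    mults++;                                 // one multiplication at this level
    return n * factorialCounted(n - 1, mults);
}
```

For n = 5 the result is 120 and the counter ends at 4, matching T(5) = 5 - 1.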
Categories of Algorithm Analysis
• Algorithms may be examined under different situations to correctly determine their efficiency for accurate comparison.

Best Case Analysis:
• Assumes the input data are arranged in the most advantageous order for the algorithm.
• Takes the smallest possible set of inputs.
• Causes execution of the fewest number of statements.
• Computes the lower bound of T(n), where T(n) is the complexity function.

Examples:
For a sorting algorithm
• If the list is already sorted (data are arranged in the required order).
For a searching algorithm
• If the desired item is located at the first accessed position.
Worst Case Analysis:
• Assumes the input data are arranged in the most disadvantageous order for the algorithm.
• Takes the worst possible set of inputs.
• Causes execution of the largest number of statements.
• Computes the upper bound of T(n), where T(n) is the complexity function.

Examples:
• For a sorting algorithm, if the list is in the opposite order.
• For a searching algorithm, if the desired item is located at the last position or is missing.
Average Case Analysis:
• Determines the average of the running time over all permutations of the input data.
• Takes an average set of inputs.
• It also assumes random input size.
• It causes an average number of executions.
• Computes the optimal bound of T(n), where T(n) is the complexity function.
• Sometimes average cases are as bad as worst cases and as good as best cases.
Examples:
For sorting algorithms
• While sorting, considering any arrangement (order of input data).
For searching algorithms
• While searching, if the desired item is located at any position or is missing.
Summary:
• Worst case analysis is the most common analysis because:
  • It provides the upper bound for all inputs (even for bad ones).
• Average case analysis is often difficult to determine and define.
• If situations are in their best case, there is no need to develop algorithms, because the data arrangement is already in the best situation.
• Best case analysis cannot be used to estimate complexity.
• We are interested in the worst-case time, since it provides a bound for all inputs; this is called the "Big-Oh" estimate.
Order of Magnitude
• Refers to the rate at which the storage or time grows as a function of problem size.
• It is expressed in terms of its relationship to some known functions.
• This type of analysis is called asymptotic analysis.
Asymptotic Notations
• Asymptotic analysis is concerned with how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound.
• Asymptotic analysis makes use of the O (Big-Oh), Ω (Big-Omega), Θ (Theta), o (little-o), and ω (little-omega) notations in performance analysis and in characterizing the complexity of an algorithm.
• Note: The complexity of an algorithm is a numerical function of the size of the problem (instance or input size).
Types of Asymptotic Notations

1. Big-Oh Notation
Definition: We say f(n) = O(g(n)) if there are positive constants n0 and c such that, to the right of n0, the value of f(n) always lies on or below c.g(n).
• As n increases, f(n) grows no faster than g(n).
• It is only concerned with what happens for very large values of n.
• Describes the worst case analysis.
• Gives an upper bound for a function to within a constant factor.
• O-notation is used to represent the amount of time an algorithm takes on the worst possible set of inputs, the "worst case".
Question-1
f(n) = 10n+5 and g(n) = n. Show that f(n) is O(g(n)).

• To show that f(n) is O(g(n)), we must show that there exist constants c and n0 such that f(n) <= c.g(n) for all n >= n0.
• 10n+5 <= c.n for all n >= n0
• Let c = 15; then show that 10n+5 <= 15n:
  5 <= 5n, i.e. 1 <= n
• So, f(n) = 10n+5 <= 15.g(n) for all n >= 1.
• (c = 15, n0 = 1): there exist two constants that satisfy the above constraints.
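The chosen witnesses (c = 15, n0 = 1) can be spot-checked numerically; a sketch (the function name is ours; spot-checking illustrates the bound, it does not replace the algebra above):

```cpp
// Check that f(n) = 10n + 5 <= c*g(n) = 15n for all n in [1, limit].
bool bigOWitnessHolds(int limit) {
    for (int n = 1; n <= limit; n++)
        if (10 * n + 5 > 15 * n)   // a single failure would falsify the bound
            return false;
    return true;
}
```

Note that at n = 0 the inequality fails (5 > 0), which is why n0 = 1 is needed.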
Question-2
f(n) = 3n^2+4n+1. Show that f(n) = O(n^2).

• 4n <= 4n^2 for all n >= 1, and 1 <= n^2 for all n >= 1
• 3n^2+4n+1 <= 3n^2+4n^2+n^2 for all n >= 1
• 3n^2+4n+1 <= 8n^2 for all n >= 1
• So, we have shown that f(n) <= 8n^2 for all n >= 1.
• Therefore, f(n) is O(n^2): (c = 8, n0 = 1), there exist two constants that satisfy the constraints.
Big-O Theorems
• Theorem 1: k is O(1).
• Theorem 2: A polynomial is O(the term containing the highest power of n).
  • A polynomial's growth rate is determined by its leading term.
  • If f(n) is a polynomial of degree d, then f(n) is O(n^d).
  • In general, f(n) is big-O of the dominant term of f(n).
• Theorem 3: k*f(n) is O(f(n)).
  • Constant factors may be ignored.
  • E.g. f(n) = 7n^4+3n^2+5n+1000 is O(n^4).
Big-O Theorems
• Theorem 4 (Transitivity): If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
• Theorem 5: For any base b, log_b(n) is O(log n).
  • All logarithms grow at the same rate.
  • log_b n is O(log_d n) for b, d > 1.
• Theorem 6: Each of the following functions is big-O of its successors:
  k
  log_b n
  n
  n log_b n
  n^2
  n to higher powers
  2^n
  3^n
  larger constants to the nth power
  n!
  n^n
• E.g. f(n) = 3n log_b n + 4 log_b n + 2 is O(n log_b n), and O(n^2), and O(2^n).
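The ladder in Theorem 6 can be illustrated by evaluating each function at a single moderate n (illustration only; big-O ordering is about the limit, and the function name here is ours):

```cpp
#include <cmath>

// Evaluate the Theorem 6 ladder at one point and check it is increasing.
// std::tgamma(n + 1) computes n!.
bool ladderIncreasingAt(double n) {
    double f[] = {
        1.0,                     // k (a constant)
        std::log2(n),            // log n
        n,                       // n
        n * std::log2(n),        // n log n
        n * n,                   // n^2
        std::pow(2.0, n),        // 2^n
        std::pow(3.0, n),        // 3^n
        std::tgamma(n + 1.0),    // n!
        std::pow(n, n)           // n^n
    };
    for (int i = 0; i + 1 < 9; i++)
        if (f[i] >= f[i + 1])
            return false;
    return true;
}
```

At n = 20, for instance, the values climb from 1 through 2^20 (about 10^6) up to 20! (about 2.4*10^18) and 20^20 (about 10^26).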
Reading assignments
• Big-Omega (Ω) notation (lower bound)
• Theta (Θ) notation (optimal bound)
• Little-omega (ω) notation
• Little-oh (small-oh) notation
Rules to estimate the Big-Oh of a given function
• Pick the highest-order term.
• Ignore the coefficient.

Examples:
1. T(n) = 3n+5 is O(n)
2. T(n) = 3n^2+4n+2 is O(n^2)

• Some known functions encountered when analyzing algorithms (complexity categories for Big-Oh) are listed in the next slide.
T(n)                   Complexity category f(n)   Big-O
c (c is constant)      1                          T(n)=O(1)
10logn + 5             logn                       T(n)=O(logn)
√n + 2                 √n                         T(n)=O(√n)
5n + 3                 n                          T(n)=O(n)
3nlogn + 5n + 2        nlogn                      T(n)=O(nlogn)
10n^2 + nlogn + 1      n^2                        T(n)=O(n^2)
5n^3 + 2n^2 + 5        n^3                        T(n)=O(n^3)
2^n + n^5 + n + 1      2^n                        T(n)=O(2^n)
7n! + 2^n + n^2 + 1    n!                         T(n)=O(n!)
8n^n + 2^n + n^2 + 3   n^n                        T(n)=O(n^n)
• The order of the body statements of a given algorithm is very important in determining the Big-Oh of the algorithm.

Example: Find the Big-Oh of the following algorithm.

for( int i=1;i<=n; i++)
    sum=sum + i;

T(n) = 2*n = 2n = O(n)
for(int i=1; i<=n; i++)
    for(int j=1; j<=n; j++)
        k++;

T(n) = 1*n*n = n^2 = O(n^2)