3 Examples
ALGORITHM
The efficiency of an algorithm can be specified using:
Time Complexity
The amount of time taken by an algorithm to run.
Space Complexity
The amount of space (memory) taken by an algorithm.
IMPORTANCE OF EFFICIENCY ANALYSIS
Performing efficiency analysis is important for the following two reasons:
1. By computing the time complexity, we come to know whether an algorithm is slow or fast.
2. By computing the space complexity, we can analyze whether an algorithm requires more or less space.
TIME COMPLEXITY
The amount of time taken by an algorithm to run, as a function of the length of the input.
CONCEPT OF FREQUENCY COUNT
The time complexity of an algorithm can be computed using the frequency count.
The frequency count denotes how many times a particular statement is executed.
Example:
A statement that is executed twice in a program has a frequency count of 2.
EXAMPLE 1
void fun(int n)
{
    int a, i;
    a = 0;                     /* executes 1 time    */
    for (i = 0; i < n; i++)    /* executes n+1 times */
    {
        a = a + 1;             /* executes n times   */
    }
}
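Adding the frequency counts annotated above gives the total number of operations:
f(n) = 1 + (n + 1) + n = 2n + 2
Ignoring the constants, the time complexity of this function is O(n).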
COMMON TIME COMPLEXITY CLASSES
1. Constant Time, O(1)
2. Linear Time, O(N)
3. Quadratic Time, O(N^2)
4. Logarithmic Time, O(log N)
5. “N log N” Time, O(N log N)
6. Exponential Time (e.g., O(2^N))
O(1) CONSTANT TIME
This means that the algorithm requires the same fixed number of steps regardless of the size of the task.
Examples:
1) A statement involving basic operations.
Some examples of basic operations:
one arithmetic operation (e.g., +, *)
one assignment
one test (e.g., x == 0)
one read (accessing an element of an array)
2) A sequence of statements involving basic operations.
statement 1;
statement 2;
..........
statement k;
The time for each statement is constant, so the total time is also constant: O(1).
EXAMPLE
ALGORITHM SWAP(a,b)
{
temp=a; -----------(1)
a=b; -----------(1)
b=temp; -----------(1)
}
f(n)=1+1+1=3
Thus the time complexity in Big-O notation is O(1).
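The same algorithm written as compilable C (a minimal sketch; the pointer parameters and the main driver are assumptions added for illustration):

#include <stdio.h>

/* Swaps two integers; always three constant-time operations,
   so the running time is O(1) regardless of the values. */
void swap(int *a, int *b)
{
    int temp = *a;   /* 1 */
    *a = *b;         /* 1 */
    *b = temp;       /* 1 */
}

int main(void)
{
    int x = 3, y = 7;
    swap(&x, &y);
    printf("x=%d y=%d\n", x, y);   /* prints x=7 y=3 */
    return 0;
}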
O(N) LINEAR TIME
This means that the algorithm requires a number of steps proportional to the size of the task.
Examples:
1. Traversing an array.
2. Sequential/linear search in an array.
3. Best case time complexity of Bubble sort (i.e., when the elements of the array are already in sorted order).
The basic structure is:
for (i = 0; i < n; i++)
{
statements;
}
The loop executes n times, so the total time is n * O(1), which is O(n).
EXAMPLE
ALGORITHM SUM (A,n)
{
s=0; --------------------1
for(i=0;i<n;i++)---------n+1
{
s=s+A[i];--------n
}
}
f(n)=2n+2
Thus the time complexity is O(n).
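A compilable C version of the same summation (a minimal sketch; the main driver is an assumption added for illustration):

#include <stdio.h>

/* Sums the n elements of array A. The loop body runs n times,
   so f(n) = 2n + 2 and the time complexity is O(n). */
int sum(const int A[], int n)
{
    int s = 0;
    for (int i = 0; i < n; i++)
    {
        s = s + A[i];
    }
    return s;
}

int main(void)
{
    int A[] = {1, 2, 3, 4, 5};
    printf("%d\n", sum(A, 5));   /* prints 15 */
    return 0;
}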
O(N^2) QUADRATIC TIME
The number of operations is proportional to the size of the task squared.
Examples:
1) Worst case time complexity of Bubble, Selection and Insertion sort.
Nested loops:
for (i = 0; i < N; i++) {
    for (j = 0; j < M; j++) {
        sequence of statements of O(1)
    }
}
The outer loop executes N times and the inner loop executes M times, so the time complexity is O(N*M).
EXAMPLE
for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        sequence of statements of O(1)
    }
}
Now the time complexity is O(N^2).
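As a concrete, compilable illustration of this structure (a minimal sketch, not taken from the slides), the function below performs one constant-time step for every (i, j) pair:

#include <stdio.h>

/* Counts every pair (i, j) with 0 <= i, j < n.
   Both loops run n times, so the total work is n*n, i.e. O(n^2). */
long count_pairs(int n)
{
    long count = 0;
    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < n; j++)
        {
            count++;   /* O(1) body */
        }
    }
    return count;
}

int main(void)
{
    printf("%ld\n", count_pairs(4));   /* prints 16 */
    return 0;
}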
EXAMPLE
Let's consider nested loops where the number of iterations of the inner loop depends on the value of the outer loop's index.
for (i = 0; i < N; i++) {
    for (j = i+1; j < N; j++) {
        sequence of statements of O(1)
    }
}
The total number of times the “sequence of statements” within the two loops executes is:
(N-1) + (N-2) + ... + 2 + 1 + 0, which is N*(N-1)/2 = (1/2)*N^2 - (1/2)*N,
and we can say that it is O(N^2).
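Selection sort, listed earlier as a worst-case O(N^2) algorithm, follows exactly this loop pattern. A minimal C sketch (the ascending sort order and the main driver are assumptions added for illustration):

#include <stdio.h>

/* Selection sort: the inner loop starts at i+1, so the number of
   comparisons is (n-1) + (n-2) + ... + 1 + 0 = n*(n-1)/2, i.e. O(n^2). */
void selection_sort(int A[], int n)
{
    for (int i = 0; i < n; i++)
    {
        int min = i;
        for (int j = i + 1; j < n; j++)
        {
            if (A[j] < A[min])
                min = j;   /* remember the smallest remaining element */
        }
        int temp = A[i];   /* swap it into position i */
        A[i] = A[min];
        A[min] = temp;
    }
}

int main(void)
{
    int A[] = {4, 1, 3, 2};
    selection_sort(A, 4);
    for (int i = 0; i < 4; i++)
        printf("%d ", A[i]);   /* prints 1 2 3 4 */
    printf("\n");
    return 0;
}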
O(LOG N) LOGARITHMIC TIME
Examples:
1. Binary search in a sorted array of n elements.
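Each comparison in binary search halves the portion of the array that can still contain the key, so roughly log2(n) steps suffice. A minimal C sketch (assuming the array is sorted in ascending order; the main driver is added for illustration):

#include <stdio.h>

/* Returns the index of key in the sorted array A of length n,
   or -1 if key is not present. Each iteration halves the search
   range, so the running time is O(log n). */
int binary_search(const int A[], int n, int key)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;
        if (A[mid] == key)
            return mid;
        else if (A[mid] < key)
            low = mid + 1;    /* key can only be in the right half */
        else
            high = mid - 1;   /* key can only be in the left half */
    }
    return -1;
}

int main(void)
{
    int A[] = {2, 4, 7, 10, 23, 51};
    printf("%d\n", binary_search(A, 6, 23));   /* prints 4 */
    return 0;
}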