Chapter One
Algorithm analysis refers to the process of determining how much computing time and
storage an algorithm will require. In other words, it is the process of predicting the
resource requirements of an algorithm in a given environment.
A problem can usually be solved by many possible algorithms, so one has to be able to
choose the best algorithm for the problem at hand using some scientific method. To
classify data structures and algorithms as good, we need precise ways of analyzing
them in terms of their resource requirements. The main resources are:
Running Time
Memory Usage
Communication Bandwidth
Running time is usually treated as the most important since computational time is the
most precious resource in most problem domains.
Complexity Analysis is the systematic study of the cost of computation, measured either
in time units or in operations performed, or in the amount of storage space required.
Complexity analysis is concerned with determining the efficiency of algorithms.
What does efficiency depend on?
Execution speed (most important)
There is no generally accepted set of rules for algorithm analysis. However, an exact
count of operations is commonly used.
Analysis Rules:
1. We assume an arbitrary time unit.
2. Execution of one of the following operations takes time 1:
Assignment operation E.g. Sum=0;
Single Input/output operation E.g. cin>>sum; cout<<sum;
Single Boolean operations E.g. !done
Single Arithmetic operations E.g. a+b, a-b
Function Return E.g. return(sum);
3. Running time of a selection statement (if, switch) is the time for the condition
evaluation + the maximum of the running times for the individual clauses in the
selection.
4. Running time of a loop statement is the sum of the running times of the body over
all iterations, plus the loop-control overhead:
Σ(running time of body) + 1 + (n+1) + n
(1 for initialization, n+1 for the condition checks, and n for the updates)
5. Running time of a function call is 1 for setup + the time for any parameter
calculations + the time required for the execution of the function body.
Examples:
1. int count() {
       int k = 0;
       int n, i;
       cout << "Enter an integer";
       cin >> n;
       for (i = 0; i < n; i++)
           k = k + 1;
       return 0;
   }
Time Units to Compute
-------------------------------------------------
1 for the assignment statement: int k=0
1 for the output statement.
1 for the input statement.
In the for loop:
1 assignment, n+1 tests, and n increments.
n loops of 2 units each: one assignment and one addition.
1 for the return statement.
-------------------------------------------------------------------
T(n) = 1 + 1 + 1 + (1 + (n+1) + n) + 2n + 1 = 4n + 6 = O(n)
2. int total(int n) {
       int sum = 0;
       for (int i = 1; i <= n; i++)
           sum = sum + 1;
       return sum;
   }
Time Units to Compute
-------------------------------------------------
1 for the assignment statement: int sum=0
In the for loop:
1 assignment, n+1 tests, and n increments.
n loops of 2 units each: one assignment and one addition.
1 for the return statement.
T(n) = 1 + (1 + (n+1) + n) + 2n + 1 = 4n + 4 = O(n)
1. for(i=1;i<=n;i++)
for(j=1;j<=n; j++)
k++;
T(n)=1+n+1+n+n(1+n+1+n+n)=3n2+4n+2
4. sum = 0;
   if (test == 1) {
       for (i = 1; i <= n; i++)
           sum = sum + i;
   }
   else {
       cout << sum;
   }
   T(n) = 1 + 1 + max(1 + (n+1) + n + 2n, 1) = 4n + 4 = O(n)
Asymptotic Analysis is concerned with how the running time of an algorithm increases
with the size of the input in the limit, as the size of the input increases without bound. It
uses different notations to characterize the complexity of an algorithm, which is
expressed as a numerical function of the size of the problem.
I. O-Notation (Upper bound)
We say f(n) = O(g(n)) if there are positive constants n₀ and c such that to the right
of n₀, the value of f(n) always lies on or below c·g(n).
II. Θ-Notation (Tight bound)
We say f(n) = Θ(g(n)) if there exist positive constants n₀, c₁ and c₂ such that to the
right of n₀, the value of f(n) always lies between c₁·g(n) and c₂·g(n) inclusive.
• Represents the amount of time the algorithm takes on an average set of inputs.
III. Ω-Notation (Lower bound)
We write f(n) = Ω(g(n)) if there are positive constants n₀ and c such that to the
right of n₀, the value of f(n) always lies on or above c·g(n).
• Represents the amount of time the algorithm takes on the smallest possible set of
inputs.
IV. o-Notation (little o) - an upper bound that is not asymptotically tight.
• Big O-Notation denotes an upper bound that may or may not be asymptotically
tight.
• f(n) = o(g(n)) if, for every positive constant c, there is a constant n₀ such that to
the right of n₀, the value of f(n) always lies strictly below c·g(n).
V. ω-Notation (little omega) - a lower bound that is not asymptotically tight.
• f(n) = ω(g(n)) if, for every positive constant c, there is a constant n₀ such that to
the right of n₀, the value of f(n) always lies strictly above c·g(n).
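A small worked example of these definitions: f(n) = 3n + 2 is O(n), since 3n + 2 ≤ 4n for all
n ≥ 2 (c = 4, n₀ = 2); it is Ω(n), since 3n + 2 ≥ 3n for all n ≥ 1 (c = 3, n₀ = 1); and it is
therefore Θ(n), with c₁ = 3, c₂ = 4 and n₀ = 2. It is not o(n), because for c = 3 the value of
3n + 2 never falls strictly below 3n; it is, however, o(n²), since for every constant c we
eventually have 3n + 2 < c·n².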
1.5.1. Heaps
A heap is a specialized binary tree-based data structure; it is essentially an almost
complete binary tree that satisfies one of the following heap properties:
1. Max-Heap: In a Max-Heap, the key present at the root node must be the greatest
among the keys present at all of its children. The same property must be
recursively true for all sub-trees in that binary tree.
2. Min-Heap: In a Min-Heap, the key present at the root node must be the
minimum among the keys present at all of its children. The same property must
be recursively true for all sub-trees in that binary tree.
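As a brief illustrative sketch (not part of the original notes), a min-heap can be stored in an
array so that the node at index i has its parent at index (i - 1) / 2 and its children at indices
2i + 1 and 2i + 2. The names MinHeap, push and top below are only examples:

   #include <utility>
   #include <vector>
   using namespace std;

   // Minimal array-based min-heap sketch (illustrative only).
   // Parent of index i is (i - 1) / 2; children are 2i + 1 and 2i + 2.
   class MinHeap {
       vector<int> data;
   public:
       void push(int key) {
           data.push_back(key);
           size_t i = data.size() - 1;
           // Sift the new key up until every parent key is <= its children's keys,
           // i.e. until the min-heap property is restored.
           while (i > 0 && data[(i - 1) / 2] > data[i]) {
               swap(data[(i - 1) / 2], data[i]);
               i = (i - 1) / 2;
           }
       }
       int top() const { return data[0]; }   // the minimum key sits at the root
   };

Removing the minimum would work symmetrically, by moving the last element to the root
and sifting it down.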
1.5.2. Hashing
Hashing is a technique or process of mapping keys and values into a hash table by using
a hash function. It is done for faster access to elements. The efficiency of the mapping
depends on the efficiency of the hash function used.
Index = hash(key)
For example, if the key value is 6 and the size of the hash table is 10, then applying a
division hash function to key 6 gives the index:
Index = 6 % 10 = 6
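A minimal C++ sketch of the division-method hash described above (the function name
hashIndex is only illustrative):

   #include <iostream>
   using namespace std;

   // Division-method hash: the index is the remainder of the key
   // divided by the table size.
   int hashIndex(int key, int tableSize) {
       return key % tableSize;
   }

   int main() {
       cout << hashIndex(6, 10);   // key 6, table size 10 -> index 6
       return 0;
   }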
1.5.3. Disjoint Sets
Disjoint sets can be defined as subsets in which no element is common between any two
of the sets.
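As an illustrative sketch (assumed here, not given in the notes), disjoint sets are commonly
represented with a union-find structure: each element stores a parent, and two elements
belong to the same subset exactly when they share the same root. The names DisjointSet,
find and unite are only examples:

   #include <vector>
   using namespace std;

   // Minimal disjoint-set (union-find) sketch (illustrative only).
   struct DisjointSet {
       vector<int> parent;
       DisjointSet(int n) : parent(n) {
           for (int i = 0; i < n; i++)
               parent[i] = i;               // each element starts in its own subset
       }
       int find(int x) {                    // representative (root) of x's subset
           if (parent[x] != x)
               parent[x] = find(parent[x]); // path compression
           return parent[x];
       }
       void unite(int a, int b) {           // merge the subsets containing a and b
           parent[find(a)] = find(b);
       }
   };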