Introduction:
Basic Concepts and Notations
Complexity analysis: time-space tradeoff
Algorithmic notations, Big-O notation
Introduction to omega, theta, and little-o notation
Basic Concepts and Notations
Algorithm: Outline, the essence of a computational
procedure, step-by-step instructions
Program: an implementation of an algorithm in some
programming language
Data Structure: Organization of data needed to
solve the problem
Classification of Data Structures
Data Structures are classified as:
• Primitive Data Structures: Integer, Real, Character, Boolean
• Non-Primitive Data Structures:
  • Linear Data Structures: Array, Stack, Queue, Linked List
  • Non-Linear Data Structures: Tree, Graph
Data Structure Operations
Data Structures are processed by using certain operations.
1. Traversing: Accessing each record exactly once so that certain
items in the record may be processed.
2. Searching: Finding the location of the record with a given key
value, or finding the locations of all the records that satisfy one or
more conditions.
3. Inserting: Adding a new record to the structure.
4. Deleting: Removing a record from the structure.
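A minimal sketch of these four operations in C++, using std::vector as the underlying structure (the values are illustrative):
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> records = {4, 8, 15, 16, 23};

    // Traversing: access each record exactly once.
    for (int r : records) std::cout << r << ' ';
    std::cout << std::endl;

    // Searching: find the location of the record with key 15.
    auto it = std::find(records.begin(), records.end(), 15);

    // Deleting: remove that record from the structure.
    if (it != records.end()) records.erase(it);

    // Inserting: add a new record to the structure.
    records.push_back(42);
}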
Special Data Structure Operations
• Sorting: Arranging the records in some logical order
(Alphabetical or numerical order).
• Merging: Combining the records in two different sorted
files into a single sorted file.
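A minimal sketch of merging in C++ (the function name mergeSorted is illustrative; the standard library's std::merge does the same job):
#include <iostream>
#include <vector>

// Merge two sorted vectors into a single sorted vector.
std::vector<int> mergeSorted(const std::vector<int>& a, const std::vector<int>& b) {
    std::vector<int> out;
    size_t i = 0, j = 0;
    // Take the smaller front element from a or b at each step.
    while (i < a.size() && j < b.size())
        out.push_back(a[i] <= b[j] ? a[i++] : b[j++]);
    // Copy whatever remains in either input.
    while (i < a.size()) out.push_back(a[i++]);
    while (j < b.size()) out.push_back(b[j++]);
    return out;
}

int main() {
    for (int x : mergeSorted({1, 4, 9}, {2, 3, 10})) std::cout << x << ' '; // 1 2 3 4 9 10
}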
Algorithmic Problem
Specification of input → ? → Specification of output as a function of input
There are infinitely many input instances satisfying the
specification. For example: a sorted, non-decreasing
sequence of natural numbers of non-zero, finite
length:
1, 20, 908, 909, 100000, 1000000000.
Algorithmic Solution
Specification of input → Algorithm → Specification of output as a function of input
The algorithm describes actions on the input instance.
There are infinitely many correct algorithms for the same
algorithmic problem.
What is a Good Algorithm?
Efficient:
Running time
Space used
Efficiency as a function of input size:
The number of bits in an input number
The number of data elements (numbers, points)
Complexity analysis
Why should we analyze algorithms?
Predict the resources that the algorithm requires
Computational time (CPU consumption)
Memory space (RAM consumption)
Communication bandwidth consumption
The running time of an algorithm is:
The total number of primitive operations executed (machine-independent
steps)
It is expressed as an order-of-magnitude estimate,
also known as the algorithm's complexity
Time Complexity
Worst-case
An upper bound on the running time for any input of
given size
Average-case
Assume all inputs of a given size are equally likely
Best-case
The lower bound on the running time
Time Complexity – Example
Sequential search in a list of size n
Worst-case:
n comparisons
Best-case:
1 comparison
Average-case:
n/2 comparisons
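A minimal sketch of sequential search in C++ (illustrative; it returns the index of the key, or -1 if the key is absent):
#include <iostream>

// Sequential search: scan the list until the key is found.
// Best case: 1 comparison. Worst case: n comparisons.
int sequentialSearch(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key) return i;
    }
    return -1;
}

int main() {
    int a[] = {7, 3, 9, 1, 5};
    std::cout << sequentialSearch(a, 5, 9) << std::endl; // prints 2
}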
Time-Space Tradeoff
A time-space tradeoff is a situation where the memory use
can be reduced at the cost of slower program execution
(and, conversely, the computation time can be reduced at
the cost of increased memory use).
A space-time or time-memory tradeoff is a way of
solving a problem or calculation in less time by using
more storage space (or memory), or by solving a problem
in very little space by spending a long time.
Time-Space Tradeoff
As the relative costs of CPU cycles, RAM space, and hard
drive space change (hard drive space has for some time
been getting cheaper at a much faster rate than other
components of computers), the appropriate choices for
time-space tradeoffs have changed radically.
Often, by exploiting a time space tradeoff, a program can
be made to run much faster.
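A minimal sketch of the idea in C++, using memoization (the names fib_slow and fib_fast are illustrative): caching previously computed results spends memory to save time.
#include <iostream>
#include <vector>

// Naive recursion: exponential time, minimal extra memory.
long long fib_slow(int n) {
    if (n < 2) return n;
    return fib_slow(n - 1) + fib_slow(n - 2);
}

// Memoized version: O(n) time at the cost of O(n) extra memory.
long long fib_fast(int n, std::vector<long long>& memo) {
    if (n < 2) return n;
    if (memo[n] != -1) return memo[n];   // reuse a stored result
    memo[n] = fib_fast(n - 1, memo) + fib_fast(n - 2, memo);
    return memo[n];
}

int main() {
    int n = 40;
    std::vector<long long> memo(n + 1, -1);
    std::cout << fib_fast(n, memo) << std::endl;  // fast, more memory
    std::cout << fib_slow(n) << std::endl;        // slow, less memory
}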
Asymptotic notations
Algorithm complexity is a rough estimate of
the number of steps performed by a given
computation, depending on the size of the
input data
Measured through asymptotic notation
O(g) where g is a function of the input data size
Examples:
Linear complexity O(n) – all elements are processed once (or
constant number of times)
Quadratic complexity O(n^2) – each of the elements is
processed n times
Big-O Notation (O)
Asymptotic Upper Bound
Omega Notation (Ω)
Asymptotic Lower Bound
Theta Notation (Θ)
g(n) is an asymptotically tight bound of f(n)
Little-o Notation (o)
An upper bound that is not asymptotically tight
Little-omega Notation (ω)
A lower bound that is not asymptotically tight
Big O notation
f(n) = O(g(n)) iff there exist a positive constant c and a
non-negative integer n0 such that
f(n) ≤ c·g(n) for all n ≥ n0.
g(n) is said to be an upper bound of f(n).
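For example, f(n) = 3n + 2 is O(n): choose c = 4 and n0 = 2; then 3n + 2 ≤ 4n for all n ≥ 2.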
Basic rules
1. Nested loops are multiplied together.
2. Sequential loops are added.
3. Only the largest term is kept, all others are
dropped.
4. Constants are dropped.
5. Conditional checks are constant (i.e. 1).
Example 1
//linear
for(int i = 0; i < n; i++) {
cout << i << endl;
}
Ans: O(n)
Example 2
//quadratic
for(int i = 0; i < n; i++) {
for(int j = 0; j < n; j++){
//do swap stuff, constant time
}
}
Ans: O(n^2)
Example 3
for(int i = 0; i < 2*n; i++) {
cout << i << endl;
}
At first you might say that the upper bound is O(2n);
however, we drop constants, so it becomes O(n).
Example 4
//linear
for(int i = 0; i < n; i++) {
cout << i << endl;
}
//quadratic
for(int i = 0; i < n; i++) {
for(int j = 0; j < n; j++){
//do constant time stuff
}
}
Ans: In this case we add each loop's Big-O: n + n^2.
O(n^2 + n) is not an acceptable answer, since we must
drop the lower-order term. The upper bound
is O(n^2). Why? Because it has the largest growth
rate.
Example 5
for(int i = 0; i < n; i++) {
for(int j = 0; j < 2; j++){
//do stuff
}
}
Ans: The outer loop runs n times and the inner loop 2 times,
so we have 2n operations; dropping the constant gives O(n).
Example 6
x=y+z;
for(i=1; i<= n; i++)
x=y+z;
for(i=1; i<=n/2; i++)
for(j=1; j<=n; j++)
x=y+z;
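Ans: O(n^2). The first statement is constant, the single loop
contributes n, and the nested loops contribute (n/2)·n;
keeping only the largest term gives O(n^2).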
Example 7
x=y+z;
for(i=1; i<= n; i++)
x=y+z;
for(i=1; i<=n; i++)
for(j=1; j<=n; j++)
x=y+z;
a=b+c;
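Ans: O(n^2). The constant statements and the single loop (n)
are dominated by the nested loops (n·n).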
Example 8
while(n>1)
{
n=n-1;
a=b+c;
}
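Ans: O(n). n decreases by 1 each iteration, so the loop
body runs about n times.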
Example 10
while(n>=1)
{
n=n-20;
n=n-5;
n=n-2;
}
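Ans: O(n). n decreases by 27 (20 + 5 + 2) each iteration, so
the loop runs about n/27 times; dropping the constant gives O(n).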
Example 11
while(n>=1)
{
n=n-20;
n=n+5;
n=n-30;
}
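Ans: O(n). The net change per iteration is -20 + 5 - 30 = -45,
so the loop runs about n/45 times, which is O(n).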
Example 12
while(n>=1)
{
n=n/2;
}
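Ans: O(log n). n is halved each iteration, so the loop runs
about log2(n) times.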
Example 13
while(n>=1)
{
n=n/2;
n=n/3;
}
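Ans: O(log n). n is divided by 6 each iteration (by 2, then
by 3), so the loop runs about log6(n) times, which is O(log n).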
Example 14
while(n>=1)
{
n=n-2;
n=n/2;
}
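Ans: O(log n). Each iteration subtracts 2 and then halves n,
so n at least halves every pass; the loop runs about log2(n) times.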
Example 15
for(int i = 1; i < n; i *= 2) {
cout << i << endl;
}
The loop runs until i reaches n, but instead of simply
incrementing, i is doubled each run. Thus there are about
log2(n) iterations, and the loop is O(log n).
Example 16
for(int i = 0; i < n; i++) { //linear
for(int j = 1; j < n; j *= 2){ // log (n)
//do constant time stuff
}
}
Ans: O(n log n)
Example 17
while(n>2)
{
n=sqrt(n);
}
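Ans: O(log log n). Taking the square root halves log(n) each
iteration, so the loop runs about log2(log2(n)) times.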
Example 18
while(n>2)
{
n=n*n;
n=sqrt(n);
n=n-2;
}
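Ans: O(n). Squaring and then taking the square root leaves n
unchanged, so the net effect of each iteration is n = n - 2;
the loop runs about n/2 times, which is O(n).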
Example 19
x=y+z;
for(i=1; i<= n; i++)
{
j=2;
while(j<=n)
{
j=j*j;
}
}
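Ans: O(n log log n). The inner while loop squares j each pass
(2, 4, 16, 256, ...), so it runs about log2(log2(n)) times;
the outer loop runs n times.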
Thank You