Analysis and Complexity of Algorithms
Analysis of Algorithms
Quantifying the resources required.
Measures of resource utilization (efficiency):
Execution time -> time complexity
Memory space -> space complexity
Observation:
The larger the input, the larger the resource
requirement: complexities are functions of the
amount of input data (the input size).
Refer to the first few chapters of the book “Introduction
to Algorithms” by Cormen, Leiserson, Rivest and Stein.
Yardstick?
The same algorithm will run at different speeds
and will require different amounts of space when
run on different computers, when written in
different programming languages, or when built with
different compilers.
Algorithms usually consume resources in some
fashion that depends on the size of the problem
they solve.
We need a machine-independent complexity
measure that is a function of the input size, n. (We
are also interested in asymptotic behaviour, that
is, the behaviour as n becomes large.)
Analyzing SelSort
int max_loc(int x[], int k, int size)
{   /* index of the largest element in x[k..size-1] */
    int j, pos;
    pos = k;
    for (j = k+1; j < size; j++)
        if (x[j] > x[pos])
            pos = j;
    return pos;
}

void selsort(int x[], int size)
{   /* selection sort: put the largest remaining element at position k */
    int k, m, temp;
    for (k = 0; k < size-1; k++) {
        m = max_loc(x, k, size);
        temp = x[k];
        x[k] = x[m];
        x[m] = temp;
    }
}

Running time as a function of n:
Computer 1: f1(n) = 0.0007772 n² + 0.00305 n + 0.001
Computer 2: f2(n) = 0.0001724 n² + 0.00040 n + 0.100
Note: Both are quadratic functions of n.
The shape of the curve that expresses the running time as a
function of the problem size stays the same.
We say that Selection Sort is of complexity Order n², or O(n²).
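The quadratic fits above come from timing the same program on two machines. Below is a minimal timing driver that could produce such measurements; it is a sketch added for illustration (the array sizes, the use of rand(), and timing with clock() are my assumptions, not part of the original slide), and it assumes it is compiled in the same file as the selsort/max_loc definitions above.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

void selsort(int x[], int size);          /* defined above */

int main(void)
{
    int n;
    for (n = 1000; n <= 8000; n *= 2) {
        int *a = malloc(n * sizeof *a);   /* array of size n */
        int i;
        for (i = 0; i < n; i++)
            a[i] = rand();                /* random input */
        clock_t start = clock();
        selsort(a, n);
        clock_t end = clock();
        printf("n = %5d  time = %.4f s\n",
               n, (double)(end - start) / CLOCKS_PER_SEC);
        free(a);
    }
    return 0;
}

Fitting a curve through the (n, time) pairs printed by such a driver gives quadratic functions like f1 and f2 above.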
Complexity classes
The running times of different algorithms
fall into different complexity classes.
Each complexity class is characterized by a
different family of curves.
All curves in a given complexity class share
the same basic shape (in the asymptotic
sense).
The O-notation is used for talking about
the complexity classes of algorithms.
Order of Complexity, O-notation
For the quadratic function
f(n) = an² + bn + c
we will say that f(n) is O(n²).
We focus on the dominant term, ignore the lesser
terms, and then throw away the coefficient. [Asymptotic
Analysis]
Since the constants are ultimately not important, we may
assume that each machine instruction takes one unit of
time when we analyze the complexity of an algorithm.
We may sometimes abstract this to unit-time operations
like add, sub, multiply, etc., and do an operation count
(a worked count follows below).
We can perform worst case or average case analysis.
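For example, counting only the comparisons x[j] > x[pos] in selsort above: the pass with k = 0 makes n-1 comparisons, the pass with k = 1 makes n-2, and so on, for a total of (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons, which is O(n²).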
How execution time is affected by various complexity measures:
Assume speed S is 10^7 instructions per second.

  size     10        20        30        50        100      1000        10000
  n        .001 ms   .002 ms   .003 ms   .005 ms   .01 ms   .1 ms       1 ms
  n log n  .003 ms   .008 ms   .015 ms   .03 ms    .07 ms   1 ms        13 ms
  n²       .01 ms    .04 ms    .09 ms    .25 ms    1 ms     100 ms      10 s
  n³       .1 ms     .8 ms     2.7 ms    12.5 ms   100 ms   100 s       28 h
  2^n      .1 ms     .1 s      100 s     3 y       3x10^13 c   inf      inf
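Each entry is simply (number of instructions) / S. For example, for an n² algorithm at n = 1000: 1000² = 10^6 instructions, and 10^6 / 10^7 instructions per second = 0.1 s = 100 ms, which is the corresponding table entry.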
Maximum size solvable within 1 hour
(speed S = 10^7 instructions per second, as on the previous slide)

           present machine    100x faster    1000x faster
  n        N1 = 3.6x10^10     100 N1         1000 N1
  n log n  N2 = 1.2x10^9      85 N2          750 N2
  n²       N3 = 2x10^5        10 N3          30 N3
  2^n      N4 = 35            N4+7           N4+10
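As a quick check of the n² row (my own arithmetic, not on the original slide): in one hour a machine 100 times faster executes 100 times as many instructions, so the largest solvable size N satisfies N² = 100 N3², i.e. N = 10 N3; for a machine 1000 times faster, N = sqrt(1000) N3 ≈ 31.6 N3, i.e. roughly 30 N3.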
Formal Definition
T(N) = O(f(N)) if there are positive constants c and n0 such
that T(N) ≤ c·f(N) when N ≥ n0.
Meaning: As N increases, T(N) grows no faster
than f(N).
The function T is eventually bounded by some multiple of
f(N); f(N) gives an upper bound on the behaviour of T(N).
log_e n = O(n)
n² = O(n²).
n² = O(n³).
n log n = O(n²).
3n² + 5n + 1 = O(n²)
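As a worked instance of the definition (added here for illustration): for the last example take c = 9 and n0 = 1; for every n ≥ 1, 3n² + 5n + 1 ≤ 3n² + 5n² + n² = 9n², so 3n² + 5n + 1 = O(n²).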
Some Worst Case Complexities:
Linear Search in an array: ?
Binary Search: ?
Selection Sort: ?
Mergesort: ?
Quicksort: ?
Stack ADT: Push and Pop are ?
Queue using a singly linked list: ?? (What about a
doubly linked list??)
Multiplying 2 square matrices: ?
The primes program that we wrote in class: ?
Some Worst Case Complexities:
Linear Search in an array: O(n)
Binary Search: O(log n) (a code sketch follows this list)
Selection Sort: O(n²)
Stack ADT: Push and Pop are O(1)
Queue using a singly linked list:
Insert (Enqueue): O(1)
Delete (Dequeue): O(n)
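As a concrete companion to the O(log n) entry for Binary Search above (a sketch added here, not part of the original slides): an iterative binary search halves the remaining range on every comparison, so a sorted array of size n needs at most about log2 n iterations.

/* Iterative binary search in a sorted (ascending) array.
   Returns the index of key, or -1 if it is not present. */
int binary_search(const int x[], int size, int key)
{
    int lo = 0, hi = size - 1;
    while (lo <= hi) {                    /* range x[lo..hi] still to examine */
        int mid = lo + (hi - lo) / 2;     /* avoids overflow of (lo + hi) */
        if (x[mid] == key)
            return mid;
        else if (x[mid] < key)
            lo = mid + 1;                 /* key can only be in the right half */
        else
            hi = mid - 1;                 /* key can only be in the left half */
    }
    return -1;                            /* range empty: key not found */
}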
More definitions
T(N) = Ω(g(N)) if there are positive constants c and n0 such that
T(N) ≥ c·g(N) when N ≥ n0.
Meaning: As N increases, T(N) grows no slower than g(N);
T(N) grows at least as fast as g(N).
T(N) = Θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)).
Meaning: As N increases, T(N) grows as fast as h(N).
T(N) = o(p(N))
Meaning: As N increases, T(N) grows strictly slower than p(N);
lim (N→∞) T(N)/p(N) = 0.
log_e n = O(n)
n^10 = o(2^n)
3n² + 5n + 1 = Θ(n²)
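To see the little-o example via the limit definition (a worked line added for illustration): N^10 / 2^N → 0 as N → ∞, because the exponential 2^N eventually outgrows any fixed power of N; hence N^10 = o(2^N). Likewise, 3n² + 5n + 1 is both O(n²) and Ω(n²) (for example 3n² + 5n + 1 ≥ 3n² for all n ≥ 1), which is exactly why it is Θ(n²).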
Space Complexity
Measures the work space, i.e. the space required in addition to
the storage for the input.
Selection Sort: O(1)
Binary Search: O(log n) {Why: the recursion stack
is of depth log n; see the sketch below}
Quicksort: O(n) {Why: recursion stack in the worst case}
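For contrast with an iterative binary search, which needs only O(1) extra space, here is a recursive version (a sketch added for illustration, not from the slides); each call halves the range, so the recursion stack holds at most about log2 n frames, which is where the O(log n) space bound above comes from.

/* Recursive binary search on x[lo..hi] (both ends inclusive).
   The recursion depth, and hence the extra stack space, is O(log n). */
int binary_search_rec(const int x[], int lo, int hi, int key)
{
    if (lo > hi)
        return -1;                        /* empty range: key not found */
    int mid = lo + (hi - lo) / 2;
    if (x[mid] == key)
        return mid;
    else if (x[mid] < key)
        return binary_search_rec(x, mid + 1, hi, key);
    else
        return binary_search_rec(x, lo, mid - 1, key);
}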