Complexity
Complexity of Algorithms
An algorithm must always produce the correct answer, and
should be efficient.
How can the efficiency of an algorithm be analyzed?
The algorithmic complexity of a computation is, most
generally, a measure of how difficult it is to perform the
computation.
That is, it measures some aspect of the cost of
computation (in a general sense of “cost”):
the amount of resources required to do the computation.
Example: Max algorithm
Problem:
Find the simplest form of the exact order
of growth (Θ) of the worst-case time complexity
of the max algorithm,
assuming that each line of code takes some
constant time every time it is executed
(with possibly different times for different lines
of code).
Complexity Analysis of max
procedure max(a1, a2, …, an: integers)
  v := a1                      t1
  for i := 2 to n              t2
    if ai > v then v := ai     t3
  return v                     t4
Worst case: the loop body runs n − 1 times at constant cost per iteration, so t(n) = Θ(n).
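A direct Python transcription of the procedure (a sketch; the function name is mine):

```python
def find_max(a):
    """Return the largest element of a non-empty sequence by one Θ(n) scan."""
    v = a[0]                # v := a1
    for x in a[1:]:         # for i := 2 to n
        if x > v:           # if ai > v then v := ai
            v = x
    return v                # return v

print(find_max([3, 9, 2, 7]))   # → 9
```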
Linear Search Analysis
Worst case time complexity (number of comparisons):
t(n) = (n + 1) + n + 1 = 2n + 2 = Θ(n)
(n + 1 tests of i ≤ n, up to n tests of x ≠ ai, and one final test of x = ai).
Best case: t(n) = 1 + 1 + 1 = Θ(1).
Average case, if the item is present: Θ(n).
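The analysis above corresponds to this straightforward Python sketch of linear search (function name is mine; it returns the 1-based location, or 0 if absent, matching the pseudocode's convention):

```python
def linear_search(x, a):
    """1-based location of x in a, or 0 if absent; Θ(n) worst case."""
    i = 1
    while i <= len(a) and x != a[i - 1]:   # two comparisons per loop test
        i += 1
    return i if i <= len(a) else 0

print(linear_search(7, [3, 7, 5]))   # → 2
```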
procedure binary_search(x: integer, a1, a2, …, an: increasing integers)
  i := 1
  j := n
  while i < j                                          t1
    m := ⌊(i + j)/2⌋
    if x > am then i := m + 1 else j := m              t2
  end
  if x = ai then location := i else location := 0      t3
  return location
Binary Search Analysis
Suppose that n is a power of 2, i.e., ∃k: n = 2^k.
Original range from i = 1 to j = n contains n items.
Each iteration: Size j - i + 1 of range is cut in half.
Size decreases as 2^k, 2^(k−1), 2^(k−2), …
Loop terminates when the size of the range is 1 = 2^0 (i = j).
Therefore, the number of iterations is: k = log2n
t(n): t1 is executed k + 1 times, t2 k times, and t3 once; counting comparisons:
(k + 1) + k + 1 = 2k + 2 = 2 log2 n + 2 = Θ(log2 n)
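A Python sketch of the procedure, instrumented to count loop iterations (names are mine); when n = 2^k, the while loop runs exactly k times:

```python
def binary_search(x, a):
    """1-based location of x in sorted a (or 0), plus the iteration count."""
    i, j = 1, len(a)
    iterations = 0
    while i < j:                 # t1: runs k + 1 times (k true, 1 false)
        iterations += 1
        m = (i + j) // 2         # m := floor((i + j)/2)
        if x > a[m - 1]:         # t2: range is cut in half each time
            i = m + 1
        else:
            j = m
    loc = i if a[i - 1] == x else 0   # t3: final comparison
    return loc, iterations

a = list(range(1, 1025))             # n = 1024 = 2^10
loc, its = binary_search(500, a)
print(loc, its)                      # → 500 10  (k = log2 n = 10 iterations)
```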
The total number of comparisons used by the bubble sort to order a list of n
elements is:
(n − 1) + (n − 2) + … + 2 + 1 = n(n − 1)/2.
The total number of comparisons used by the insertion sort to sort a list of n
elements is:
2 + 3 + … + n = n(n + 1)/2 − 1.
Both the insertion sort and the bubble sort have worst-case complexity Θ(n^2).
Bubble Sort Analysis
procedure bubble_sort(a1, a2, …, an: real numbers
with n ≥ 2)
  for i := 1 to n − 1
    for j := 1 to n − i
      if aj > aj+1 then interchange aj and aj+1
{a1, a2, …, an is in increasing order}
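A Python sketch of the procedure, instrumented to count comparisons; the nested loops always perform (n − 1) + (n − 2) + … + 1 = n(n − 1)/2 comparisons, regardless of the input order:

```python
def bubble_sort(a):
    """Sort ascending and count comparisons (always n(n-1)/2 of them)."""
    a = list(a)
    n = len(a)
    comparisons = 0
    for i in range(1, n):            # i = 1 .. n-1
        for j in range(n - i):       # j = 1 .. n-i (0-based here)
            comparisons += 1
            if a[j] > a[j + 1]:      # interchange aj and aj+1
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

result, count = bubble_sort([5, 1, 4, 2, 3])
print(result, count)   # → [1, 2, 3, 4, 5] 10   (n = 5: 5·4/2 = 10)
```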
#ops(n)     n = 10 (1.25 bytes)    n = 10^6 (125 kB)
log2 n      3.3 ns                 19.9 ns
n           10 ns                  1 ms
n log2 n    33 ns                  19.9 ms
n^2         100 ns                 16 m 40 s
2^n         1.024 µs               10^301,004.5 Gyr
n!          3.63 ms                Ouch!
(Times assume one basic operation per nanosecond; the sizes in parentheses are the storage for an n-bit input.)
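The sub-exponential rows of the table can be reproduced with a short script (a sketch, assuming as above that one basic operation costs 1 ns):

```python
import math

NS = 1e-9   # assumed cost of one basic operation: 1 nanosecond

def runtime(ops, n):
    """Estimated running time in seconds for ops(n) basic operations."""
    return ops(n) * NS

for n in (10, 10**6):
    print(f"n = {n}:",
          f"log2 n: {runtime(math.log2, n):.3g} s,",
          f"n: {runtime(lambda m: m, n):.3g} s,",
          f"n log2 n: {runtime(lambda m: m * math.log2(m), n):.3g} s,",
          f"n^2: {runtime(lambda m: m**2, n):.3g} s")
```

For example, n^2 at n = 10^6 gives 10^12 ns = 1000 s, i.e., 16 m 40 s, matching the table.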
Review: Complexity
Algorithmic complexity = cost of computation.
Focus on time complexity in this course
(although space & energy are also important).
Characterize complexity as a function of input size:
Worst-case, best-case, or average-case.
Use orders-of-growth notation to concisely summarize the
growth properties of complexity functions.
Need to know
Names of specific orders of growth of complexity.
How to analyze the order of growth of time
complexity for simple algorithms.
Tractable vs. Intractable
A problem that is solvable using an algorithm with at
most polynomial time complexity is called tractable
(or feasible).
P is the set of all tractable problems.
A problem that cannot be solved using an algorithm
with worst-case polynomial time complexity is called
intractable (or infeasible).
Note that n^1,000,000 time is technically tractable, but really
very hard in practice; n^(log log log n) time is technically intractable, but
easy in practice. Such cases are rare, though.
P vs. NP
NP is the set of problems for which there exists
a tractable algorithm for checking a proposed
solution to tell if it is correct.
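For example (an illustrative sketch, not from the slides): no polynomial-time algorithm is known that solves the subset-sum problem, but a proposed solution, a certificate, can be checked in polynomial time, which is what places the problem in NP:

```python
def check_subset_sum(numbers, target, certificate):
    """Polynomial-time check that `certificate` is a sub-multiset of
    `numbers` summing to `target` -- an efficient verifier of the kind
    whose existence defines membership in NP."""
    pool = list(numbers)
    for x in certificate:
        if x not in pool:      # certificate must draw from the given numbers
            return False
        pool.remove(x)
    return sum(certificate) == target

print(check_subset_sum([3, 34, 4, 12, 5, 2], 9, [4, 5]))   # → True
```

The check runs in time polynomial in the input size even though no known algorithm finds such a certificate in polynomial time.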