COMP2230 Introduction To Algorithmics: Lecture Overview

This document summarizes a lecture on algorithm analysis. It introduces recurrence relations and uses examples like the Fibonacci sequence and Towers of Hanoi to illustrate recurrence relations. It also covers asymptotic analysis notation like Big-O, defining various time complexities like constant, logarithmic, linear and exponential. Common growth functions are provided along with examples of analyzing time complexities for different algorithms.


COMP2230 Introduction to Algorithmics Lecture 2

A/Prof Ljiljana Brankovic

Lecture Overview

Maths Review - examples
Analysis of Algorithms - Text, Section 2.3
Recurrence Relations - Text, Section 2.4

Binomial Coefficients

C(n, k) = n! / ((n-k)! k!) = n(n-1)(n-2) ... (n-k+1) / k!

S = {1,2,3,4,5}. The k-subsets of the set S:

k = 1:  1 2 3 4 5
k = 2:  12 13 23 14 24 34 15 25 35 45
k = 3:  123 124 134 234 125 135 235 145 245 345
k = 4:  1234 1235 1245 1345 2345
k = 5:  12345
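The two closed forms above can be cross-checked against a direct enumeration. A minimal Python sketch (not part of the lecture, using the standard library's itertools.combinations and math.comb):

```python
from itertools import combinations
from math import comb, factorial

S = [1, 2, 3, 4, 5]
n = len(S)

for k in range(1, n + 1):
    # Enumerate all k-subsets of S, as in the table above.
    subsets = list(combinations(S, k))
    # Both formulas, n!/((n-k)!k!) and n(n-1)...(n-k+1)/k!, count them.
    by_factorials = factorial(n) // (factorial(n - k) * factorial(k))
    assert len(subsets) == by_factorials == comb(n, k)
```

For example, there are C(5, 2) = 10 two-element subsets, matching the second row of the listing.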

Königsberg Bridges
(solved by the Swiss mathematician Leonhard Euler (1707-1783) in 1736)

Can you cross all the bridges exactly once and return to the starting point?

Icosian Game (A Voyage Around the World)

(invented by the Irish mathematician Sir William Rowan Hamilton, 1805-1865) Find a Hamiltonian cycle.

Analysis of Algorithms
Analysis of algorithms refers to estimating the time and space required to execute the algorithm. We shall be mostly concerned with time complexity. Problems for which there exist polynomial time algorithms are considered to have an efficient solution.

Analysis of Algorithms
Problems can be:
Unsolvable
These problems are so hard that there does not exist an algorithm that can solve them. An example is the famous Halting Problem.

Intractable
These problems can be solved, but for some instances the time required is exponential, and thus these problems are not always solvable in practice, even for relatively small input sizes.

Analysis of Algorithms
Problems can be (continued):
Problems of unknown complexity
These are, for example, NP-complete problems; for these problems there is neither a known polynomial-time algorithm, nor have they been shown to be intractable.

Feasible (tractable) problems


For these problems there is a known polynomial time algorithm.

Analysis of Algorithms
We shall not try to calculate the exact time needed to execute an algorithm; this would be very difficult to do, and it would depend on the implementation, platform, etc. We shall only be estimating the time needed for algorithm execution, as a function of the size of the input. We shall estimate the running time by counting some dominant instructions. A barometer instruction is an instruction that is executed at least as often as any other instruction in the algorithm.

Analysis of Algorithms
Worst-case time is the maximum time needed to execute the algorithm, taken over all inputs of size n. Average-case time is the average time needed to execute the algorithm, taken over all inputs of size n. Best-case time is the minimum time needed to execute the algorithm, taken over all inputs of size n.

Analysis of Algorithms
It is usually harder to analyse the average-case behaviour of an algorithm than the best and worst cases. Which behaviour matters most depends on the application. For example, for an algorithm controlling a nuclear reactor, worst-case analysis is clearly very important. On the other hand, for an algorithm used in a noncritical application that runs on a variety of different inputs, the average case may be more appropriate.

Example 2.3.1 Finding the Maximum Value in an Array Using a While Loop
This algorithm finds the largest number in the array s[1], s[2], ..., s[n].

Input Parameter: s
Output Parameters: None

array_max_ver1(s) {
  large = s[1]
  i = 2
  while (i <= s.last) {
    if (s[i] > large)
      large = s[i] // larger value found
    i = i + 1
  }
  return large
}

The input is an array of size n. What can be used as a barometer? The number of iterations of the while loop appears to be a reasonable estimate of execution time - the loop is always executed n-1 times. We'll use the comparison i <= s.last as a barometer; it is executed exactly n times. The worst-case and average-case (as well as the best-case!) times are the same and equal to n.
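A runnable sketch of the same idea, in Python rather than the book's pseudocode (0-indexed, with an explicit counter for the barometer comparison):

```python
def array_max_ver1(s):
    """Return the largest value in s and the number of loop-condition tests."""
    large = s[0]           # Python lists are 0-indexed, unlike the slide's s[1]
    tests = 0              # barometer: evaluations of the while condition
    i = 1
    while True:
        tests += 1
        if not i <= len(s) - 1:   # the slide's `i <= s.last`
            break
        if s[i] > large:
            large = s[i]   # larger value found
        i = i + 1
    return large, tests

value, tests = array_max_ver1([3, 1, 4, 1, 5, 9, 2, 6])
# For an input of size n = 8 the condition is tested exactly n times.
```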

Suppose that the worst-case time of an algorithm is (for input of size n)

t(n) = 60n^2 + 5n + 1

For large n, t(n) grows as 60n^2:

n                      10       100        1,000         10,000
t(n) = 60n^2+5n+1      6,051    600,501    60,005,001    6,000,050,001
60n^2                  6,000    600,000    60,000,000    6,000,000,000

The constant 60 is actually not important, as it does not affect the growth of t(n) with n. We say that t(n) is of order n^2, and we write it as t(n) = Θ(n^2) (t(n) is theta of n^2).

Let f and g be nonnegative functions on the positive integers.

Asymptotic upper bound: f(n) = O(g(n)) if there exist constants C1 > 0 and N1 such that f(n) ≤ C1·g(n) for all n ≥ N1. We read f(n) = O(g(n)) as "f(n) is big oh of g(n)" or "f(n) is of order at most g(n)".

[Figure: f(n) lies below C1·g(n) for all n ≥ N1.]

Asymptotic lower bound: f(n) = Ω(g(n)) if there exist constants C2 > 0 and N2 such that f(n) ≥ C2·g(n) for all n ≥ N2. We read f(n) = Ω(g(n)) as "f(n) is of order at least g(n)" or "f(n) is omega of g(n)".

[Figure: f(n) lies above C2·g(n) for all n ≥ N2.]

Asymptotic tight bound: f(n) = Θ(g(n)) if there exist constants C1, C2 > 0 and N such that C2·g(n) ≤ f(n) ≤ C1·g(n) for all n ≥ N. Equivalently, f(n) = Θ(g(n)) if f(n) = O(g(n)) and f(n) = Ω(g(n)). We read f(n) = Θ(g(n)) as "f(n) is of order g(n)" or "f(n) is theta of g(n)". Note that n = O(2^n) but n ≠ Θ(2^n).

[Figure: f(n) lies between C2·g(n) and C1·g(n) for all n ≥ N.]

Example
Let us now formally show that if t(n) = 60n^2 + 5n + 1 then t(n) = Θ(n^2).

Since t(n) = 60n^2 + 5n + 1 ≤ 60n^2 + 5n^2 + n^2 = 66n^2 for all n ≥ 1, it follows that t(n) = O(n^2).

Since t(n) = 60n^2 + 5n + 1 ≥ 60n^2 for all n ≥ 1, it follows that t(n) = Ω(n^2).

Since t(n) = O(n^2) and t(n) = Ω(n^2), it follows that t(n) = Θ(n^2).
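The witness constants in this proof (C1 = 66, C2 = 60, N = 1) can be spot-checked numerically. A small sketch, not from the lecture:

```python
def t(n):
    return 60 * n**2 + 5 * n + 1

# C2 * n^2 <= t(n) <= C1 * n^2 for all n >= N, with C1 = 66, C2 = 60, N = 1.
for n in range(1, 10_000):
    assert 60 * n**2 <= t(n) <= 66 * n**2
```

A finite check is of course no proof, but it catches mistakes in the chosen constants quickly.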

Asymptotic Bounds for Some Common Functions

Let p(n) = a_k n^k + a_{k-1} n^{k-1} + ... + a_1 n + a_0 be a nonnegative polynomial (that is, p(n) ≥ 0 for all n) in n of degree k. Then p(n) = Θ(n^k).

log_b n = Θ(log_a n)   (because log_b n = log_a n / log_a b)

log n! = Θ(n log n)    (because log n! = log n + log(n-1) + ... + log 2 + log 1)

∑_{i=1}^{n} (1/i) = Θ(log n)

Common Asymptotic Growth Functions

Theta form            Name
Θ(1)                  Constant
Θ(log log n)          Log log
Θ(log n)              Log
Θ(n^c), 0 < c < 1     Sublinear
Θ(n)                  Linear
Θ(n log n)            n log n
Θ(n^2)                Quadratic
Θ(n^3)                Cubic
Θ(n^k), k ≥ 1         Polynomial
Θ(c^n), c > 1         Exponential
Θ(n!)                 Factorial

The running time of different algorithms on a processor performing one million high-level instructions per second (from Algorithm Design, by J. Kleinberg and É. Tardos).

n           n         n log n    n^2        n^3          1.5^n        2^n          n!
10          < 1 sec   < 1 sec    < 1 sec    < 1 sec      < 1 sec      < 1 sec      4 sec
30          < 1 sec   < 1 sec    < 1 sec    < 1 sec      < 1 sec      18 min       10^25 y.
50          < 1 sec   < 1 sec    < 1 sec    < 1 sec      11 min       36 years     > 10^25 y.
100         < 1 sec   < 1 sec    < 1 sec    1 sec        12,892 y.    10^17 y.     > 10^25 y.
1,000       < 1 sec   < 1 sec    1 sec      18 min       > 10^25 y.   > 10^25 y.   > 10^25 y.
10,000      < 1 sec   < 1 sec    2 min      12 days      > 10^25 y.   > 10^25 y.   > 10^25 y.
100,000     < 1 sec   2 sec      3 hours    32 years     > 10^25 y.   > 10^25 y.   > 10^25 y.
1,000,000   1 sec     20 sec     12 days    31,710 y.    > 10^25 y.   > 10^25 y.   > 10^25 y.

Properties of Big Oh

If f1(x) = O(g1(x)) and f2(x) = O(g2(x)) then
  f1(x) + f2(x) = O( max{ g1(x), g2(x) } )  and
  f1(x)·f2(x) = O( g1(x)·g2(x) )

If f(x) = O(g(x)) and g(x) = O(h(x)) then f(x) = O(h(x))

Examples
1. 5n^2 - 1000n + 8 = O(n^2) ?
2. 5n^2 - 1000n + 8 = O(n^3) ?
3. 5n^2 + 1000n + 8 = O(n) ?
4. n^n = O(2^n) ?
5. If f(n) = Θ(g(n)), then 2^f(n) = Θ(2^g(n)) ?
6. If f(n) = O(g(n)), then g(n) = Ω(f(n)) ?

Examples
1. 5n^2 - 1000n + 8 = O(n^2) ? True
2. 5n^2 - 1000n + 8 = O(n^3) ? True
3. 5n^2 + 1000n + 8 = O(n) ? False; n^2 grows faster than n
4. n^n = O(2^n) ? False; n^n grows faster than 2^n
5. If f(n) = Θ(g(n)), then 2^f(n) = Θ(2^g(n)) ? False; here constants matter, as they become exponents! E.g., f(n) = 3n, g(n) = n, 2^(3n) ≠ Θ(2^n)
6. If f(n) = O(g(n)), then g(n) = Ω(f(n)) ? True; if f(n) ≤ C·g(n) for n ≥ N, then g(n) ≥ f(n)/C for n ≥ N

Examples
7. If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) + g(n) = Θ(h(n)) ?
8. If f(n) = O(g(n)), then g(n) = O(f(n)) ?
9. If f(n) = Θ(g(n)), then g(n) = Θ(f(n)) ?
10. f(n) + g(n) = Θ(h(n)) where h(n) = max{f(n), g(n)} ?
11. f(n) + g(n) = Θ(h(n)) where h(n) = min{f(n), g(n)} ?

Examples
7. If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) + g(n) = Θ(h(n)) ? True
8. If f(n) = O(g(n)), then g(n) = O(f(n)) ? False; e.g., f(n) = n, g(n) = n^2
9. If f(n) = Θ(g(n)), then g(n) = Θ(f(n)) ? True
10. f(n) + g(n) = Θ(h(n)) where h(n) = max{f(n), g(n)} ? True
11. f(n) + g(n) = Θ(h(n)) where h(n) = min{f(n), g(n)} ? False; e.g., f(n) = n, g(n) = n^2, h(n) = n, but f(n) + g(n) = Θ(n^2)

Examples - find theta notation for the following

12. 6n + 1
13. 2 lg n + 4n + 3n lg n
14. (n^2 + lg n)(n + 1) / (n + n^2)
15. for i = 1 to 2n
      for j = 1 to n
        x = x + 1
16. i = n
    while (i ≥ 1) {
      x = x + 1
      i = i / 2
    }
17. j = n
    while (j ≥ 1) {
      for i = 1 to j
        x = x + 1
      j = j / 2
    }

Examples - find theta notation for the following

12. 6n + 1 = Θ(n)
13. 2 lg n + 4n + 3n lg n = Θ(n lg n)
14. (n^2 + lg n)(n + 1) / (n + n^2) = Θ(n)
15. the nested for loops: Θ(n^2)
16. the halving while loop: Θ(lg n)
17. the while loop with inner for loop: Θ(n)
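The loop counts behind answers 15-17 can be checked by instrumenting the loops directly. A sketch (assuming integer halving for i = i/2, as the Θ(lg n) answer implies):

```python
def count_15(n):
    # for i = 1 to 2n: for j = 1 to n: x = x + 1  ->  exactly 2n^2 increments
    x = 0
    for i in range(1, 2 * n + 1):
        for j in range(1, n + 1):
            x += 1
    return x

def count_16(n):
    # i = n; while i >= 1: x += 1; i = i // 2  ->  floor(lg n) + 1 increments
    x, i = 0, n
    while i >= 1:
        x += 1
        i //= 2
    return x

def count_17(n):
    # j = n; while j >= 1: inner loop of j increments; j = j // 2
    # total n + n/2 + n/4 + ... < 2n increments, hence Theta(n)
    x, j = 0, n
    while j >= 1:
        for i in range(1, j + 1):
            x += 1
        j //= 2
    return x
```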

Recurrence Relations
A recurrence relation for the sequence a_0, a_1, ... is an equation that relates a_n to certain of its predecessors a_0, a_1, ..., a_{n-1}. Initial conditions for the sequence a_0, a_1, ... are explicitly given values for a finite number of terms of the sequence.

Example: Fibonacci sequence
f_n = f_{n-1} + f_{n-2}, n ≥ 3
Initial conditions: f_1 = f_2 = 1.
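Computed directly from the recurrence and initial conditions, a minimal Python sketch (not part of the slides):

```python
def fib(n):
    """f_1 = f_2 = 1; f_n = f_{n-1} + f_{n-2} for n >= 3, computed iteratively."""
    if n <= 2:
        return 1
    prev, cur = 1, 1                     # f_1, f_2
    for _ in range(3, n + 1):
        prev, cur = cur, prev + cur      # slide the window to (f_{k-1}, f_k)
    return cur

# First terms: 1, 1, 2, 3, 5, 8, 13, 21, ...
```

Iterating bottom-up avoids the exponential blow-up of the naive recursive translation of the recurrence.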

Example: Fibonacci sequence


Suppose that at the beginning of the year there is one pair of rabbits and that every month each pair produces a new pair that becomes productive after one month. Suppose further that no deaths occur. Let a_n denote the number of pairs of rabbits at the end of the nth month. Show that a_n = f_{n+1}, n ≥ 1.

Month   Pairs                 Comment
1       1                     The pair did not multiply because it is not productive yet.
2       2                     The first pair became productive and multiplied.
3       3                     Only the first pair multiplied.
...
n-2     a_{n-2}
n-1     a_{n-1}
n       a_{n-1} + a_{n-2}     Each pair that was alive 2 months ago reproduced.

Example: Towers of Hanoi


After creating the world, God set on Earth 3 diamond rods and 64 golden rings, all of different sizes. All the rings were initially on the first rod, in order of size, the smallest at the top. God also created a monastery nearby, where the monks' task in life is to transfer all the rings onto the second rod; they are only allowed to move a single ring from one rod to another at a time, and a ring can never be placed on top of a smaller ring. According to the legend, when the monks have finished their task, the world will come to an end. If the monks move one ring per second and never stop, it will take them more than 500,000 million years to finish the job (more than 25 times the estimated age of the universe!).

Example: Towers of Hanoi


The following is a way to move 3 rings from rod 1 to rod 2.

Example: Towers of Hanoi


The following is an algorithm that moves m rings from rod i to rod j.

Hanoi(m, i, j) {
  // Moves the m smallest rings from rod i to rod j
  if m > 0 then {
    Hanoi(m-1, i, 6-i-j)
    write i j
    Hanoi(m-1, 6-i-j, j)
  }
}
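The same algorithm in runnable Python, recording moves in a list instead of writing them (a sketch, not the lecture's code):

```python
def hanoi(m, i, j, moves):
    """Move the m smallest rings from rod i to rod j (rods numbered 1, 2, 3)."""
    if m > 0:
        hanoi(m - 1, i, 6 - i - j, moves)  # park m-1 rings on the spare rod
        moves.append((i, j))               # move ring m directly: "write i j"
        hanoi(m - 1, 6 - i - j, j, moves)  # stack the m-1 rings back on top

moves = []
hanoi(3, 1, 2, moves)
# 3 rings take 2^3 - 1 = 7 moves.
```

The spare rod falls out of the arithmetic: since the rods are numbered 1, 2, 3, their labels sum to 6, so 6-i-j is always the rod that is neither i nor j.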

Example: Towers of Hanoi


Recurrence relation:

t(m) = 0               if m = 0
t(m) = 2t(m-1) + 1     otherwise

Or, equivalently,

t_0 = 0
t_n = 2t_{n-1} + 1, n > 0
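Iterating this recurrence gives t(m) = 2^m - 1, which is also where the legend's figure comes from. A quick check (assuming 365.25-day years):

```python
def t(m):
    # t(0) = 0; t(m) = 2*t(m-1) + 1
    total = 0
    for _ in range(m):
        total = 2 * total + 1
    return total

assert all(t(m) == 2**m - 1 for m in range(30))   # matches the closed form

seconds = t(64)                         # 64 rings, one move per second
years = seconds / (365.25 * 24 * 3600)
# roughly 5.8 * 10^11 years, i.e. more than 500,000 million years
```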

Example 2.4.3

example(n) {
  if (n == 1)
    return
  for i = 1 to n
    x = x + 1
  example(n/2)
}

Recurrence relation: c_1 = 0, c_n = n + c_{n/2}, n > 1

Solving Recurrence Relations

Iteration (substitution):

c_n = c_{n-1} + n, n ≥ 1
c_0 = 0

c_{n-1} = c_{n-2} + (n-1), c_{n-2} = c_{n-3} + (n-2), ...

c_n = c_{n-1} + n
    = c_{n-2} + (n-1) + n
    = c_{n-3} + (n-2) + (n-1) + n
    = ...
    = c_0 + 1 + 2 + 3 + ... + (n-2) + (n-1) + n
    = n(n+1)/2
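The closed form can be sanity-checked against the recurrence itself; a brief sketch:

```python
def c(n):
    # c_0 = 0; c_n = c_{n-1} + n, computed bottom-up
    total = 0
    for k in range(1, n + 1):
        total = total + k
    return total

# Matches the closed form n(n+1)/2 derived above.
assert all(c(n) == n * (n + 1) // 2 for n in range(500))
```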

Solving Recurrence Relations

Iteration (substitution):

c_n = n + c_{n/2}, n > 1
c_1 = 0

Solution for n = 2^k, for some k:

c(2^k) = 2^k + c(2^{k-1})
       = 2^k + 2^{k-1} + c(2^{k-2})
       = 2^k + 2^{k-1} + ... + 2^1 + c(2^0)
       = 2^{k+1} - 2
       = 2n - 2
       = Θ(n).
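Again this can be checked directly for powers of two; a brief sketch using integer division for n/2:

```python
def c(n):
    # c_1 = 0; c_n = n + c_{n//2} for n > 1
    if n == 1:
        return 0
    return n + c(n // 2)

# For n = 2^k the derivation above gives exactly 2n - 2.
for k in range(1, 20):
    assert c(2**k) == 2 * 2**k - 2
```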
