
Big-O Notation

Analysis of Algorithms
(how fast does an algorithm grow with respect to N)
(Note: Best recollection is that a good bit of this document comes from C++ For You++, by Litvin & Litvin)

The time efficiency of almost all of the algorithms we have discussed can be characterized by only a few growth rate functions:

I. O(1) - constant time


This means that the algorithm requires the same fixed number of steps regardless of the size of the task.

Examples (assuming a reasonable implementation of the task):

1. Push and Pop operations for a stack (containing n elements);


2. Insert and Remove operations for a queue.
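
For instance, a minimal C++ sketch of constant-time stack operations (the use of std::stack and the function name constantTimeOps are illustrative assumptions, not part of the original examples):

#include <stack>

// Push and pop only touch the top of the stack, so each call takes the same
// fixed number of steps no matter how many elements the stack holds: O(1).
void constantTimeOps(std::stack<int>& s)
{
    s.push(42);   // O(1): place one element on top
    s.pop();      // O(1): remove the element on top
}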

II. O(n) - linear time


This means that the algorithm requires a number of steps proportional to the size of the task.

Examples (assuming a reasonable implementation of the task):

1. Traversal of a list (a linked list or an array) with n elements;


2. Finding the maximum or minimum element in a list, or sequential search in an unsorted list of n elements;
3. Traversal of a tree with n nodes;
4. Calculating n-factorial iteratively; finding the nth Fibonacci number iteratively.
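
As an illustration, a short C++ sketch of finding the maximum element by sequential scan (findMax is an assumed name; the vector is assumed to be non-empty):

#include <cstddef>
#include <vector>

// Sequential scan: every one of the n elements is examined exactly once,
// so the number of steps grows in proportion to n: O(n).
int findMax(const std::vector<int>& a)
{
    int best = a[0];                              // assumes a is non-empty
    for (std::size_t i = 1; i < a.size(); ++i)
        if (a[i] > best)
            best = a[i];
    return best;
}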

III. O(n^2) - quadratic time


The number of operations is proportional to the size of the task squared.

Examples:

1. Some more simplistic sorting algorithms, for instance a selection sort of n elements;
2. Comparing two two-dimensional arrays of size n by n;
3. Finding duplicates in an unsorted list of n elements (implemented with two nested loops).
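
For example, a sketch of selection sort in C++ (the function name selectionSort is an illustrative assumption):

#include <cstddef>
#include <utility>
#include <vector>

// Selection sort: for each of the n positions we scan the remaining elements
// to find the minimum, giving roughly n*(n-1)/2 comparisons, which is O(n^2).
void selectionSort(std::vector<int>& a)
{
    for (std::size_t i = 0; i + 1 < a.size(); ++i)
    {
        std::size_t minIdx = i;
        for (std::size_t j = i + 1; j < a.size(); ++j)
            if (a[j] < a[minIdx])
                minIdx = j;
        std::swap(a[i], a[minIdx]);   // place the smallest remaining element at position i
    }
}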

IV. O(log n) - logarithmic time

Examples:

1. Binary search in a sorted list of n elements;


2. Insert and Find operations for a binary search tree with n nodes;
3. Insert and Remove operations for a heap with n nodes.
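
To illustrate, a sketch of binary search on a sorted vector in C++ (binarySearch is an assumed name; it returns -1 when the key is absent):

#include <vector>

// Binary search: each comparison halves the remaining search range,
// so at most about log2(n) steps are needed: O(log n).
int binarySearch(const std::vector<int>& sorted, int key)
{
    int lo = 0;
    int hi = static_cast<int>(sorted.size()) - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;           // midpoint of the current range
        if (sorted[mid] == key) return mid;
        if (sorted[mid] < key)  lo = mid + 1;   // key can only be in the upper half
        else                    hi = mid - 1;   // key can only be in the lower half
    }
    return -1;                                  // key not present
}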

V. O(n log n) - "n log n" time

Examples:

1. More advanced sorting algorithms - quicksort, mergesort
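
As a sketch, a compact merge sort in C++ (mergeSort is an assumed name; std::inplace_merge is used only to keep the merge step short):

#include <algorithm>
#include <cstddef>
#include <vector>

// Merge sort: the list is halved about log2(n) times, and each level of
// recursion does O(n) work merging, giving O(n log n) overall.
void mergeSort(std::vector<int>& a, std::size_t lo, std::size_t hi)
{
    if (hi - lo < 2) return;                 // 0 or 1 elements: already sorted
    std::size_t mid = lo + (hi - lo) / 2;
    mergeSort(a, lo, mid);                   // sort the left half
    mergeSort(a, mid, hi);                   // sort the right half
    std::inplace_merge(a.begin() + lo, a.begin() + mid, a.begin() + hi);
}

// Typical call: mergeSort(v, 0, v.size());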

VI. O(a^n) (a > 1) - exponential time


Examples:

1. Recursive Fibonacci implementation


2. Towers of Hanoi
3. Generating all permutations of n symbols
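
For instance, the naive recursive Fibonacci in C++ (fib is an assumed name):

// Naive recursive Fibonacci: each call spawns two further calls, so the call
// tree roughly doubles at every level, giving exponential O(a^n) growth
// (here a is the golden ratio, about 1.6).
long long fib(int n)
{
    if (n < 2) return n;                 // base cases: fib(0) = 0, fib(1) = 1
    return fib(n - 1) + fib(n - 2);      // two recursive calls per call
}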
The best time in the above list is obviously constant time, and the worst is exponential time, which, as we have seen,
quickly overwhelms even the fastest computers for relatively small n. Polynomial growth (linear, quadratic, cubic,
etc.) is considered manageable compared to exponential growth.

Order of asymptotic behavior of the functions from the above list:


Using the "<" sign informally, we can say that

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(a^n)

A word about O(log n) growth:

As we know from the Change of Base Theorem, for any a, b > 0 and a, b != 1,

log_b n = (log_a n) / (log_a b)

Therefore,

log_a n = C log_b n

where C is a constant equal to log_a b.

Since functions that differ only by a constant factor have the same order of growth, O(log_2 n) is the same as O(log n). Therefore, when we talk about logarithmic growth, the base of the logarithm is not important, and we can simply say O(log n).
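
A small numeric check of this relation in C++ (the constant 3.32, which is about log_2 10, and the chosen values of n are illustrative assumptions):

#include <cmath>
#include <cstdio>

// log_2(n) is always the same constant multiple of log_10(n),
// namely C = log_2(10), about 3.32, regardless of n.
int main()
{
    const double ns[] = { 10.0, 1000.0, 1e6 };
    for (double n : ns)
        std::printf("n=%g  log2(n)=%.3f  3.32*log10(n)=%.3f\n",
                    n, std::log2(n), 3.32 * std::log10(n));
    return 0;
}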

A word about Big-O when a function is the sum of several terms:

If a function (which describes the order of growth of an algorithm) is a sum of several terms, its order of growth is determined by the fastest growing term. In particular, if we have a polynomial

p(n) = a_k n^k + a_(k-1) n^(k-1) + ... + a_1 n + a_0

its growth is of the order n^k:

p(n) = O(n^k)

Example:

{perform any statement S1}                     O(1)
for (i = 0; i < n; i++)
{
    {perform any statement(s) S2}              O(n)
    {run through another loop n times}         O(n) * O(n) = O(n^2)
}

Total Execution Time:  O(1) + O(n) + O(n^2)

therefore, Total Execution Time = O(n^2)