1 Summary on Θ(n²) Sorting Algorithms (Section 7.2): Lecture Seven
Winter 2015
ITEC2620
2 Insertion Sort
2.1 Best-case Complexity
Let $t_j$ be the number of times the while-loop test in Line 4 is executed for that value of $j$, for
each $j = 2, 3, \ldots, n$, where $n = \text{length}[A]$. The total number of executions of Line 4 is
$$\sum_{j=2}^{n} t_j = t_2 + t_3 + \cdots + t_n.$$
If Line 4 is executed $t_j$ times, then each of Line 5 and Line 6 will be executed $(t_j - 1)$ times.
That is, the number of executions of Line 5 is $\sum_{j=2}^{n}(t_j - 1)$, and of Line 6 is also $\sum_{j=2}^{n}(t_j - 1)$. For the
best case, we know that the array is already sorted. For each j = 2, 3, . . . , n, we then find
that $A[i] \le key$ in Line 4 when $i$ has its initial value of $j - 1$. Thus, $t_j = 1$ for $j = 2, 3, \ldots, n$
and the best-case number of executions is
$$
\begin{aligned}
T(n) &= 3(n-1) + n + \sum_{j=2}^{n} t_j + 2\sum_{j=2}^{n}(t_j - 1) \\
&= 4n - 3 + \sum_{j=2}^{n} t_j \qquad \Big(\text{note that } \sum_{j=2}^{n}(t_j - 1) = 0\Big) \\
&= 4n - 3 + (n - 1) \\
&= 5n - 4.
\end{aligned}
$$
Thus $T(n)$ is a linear function, i.e., $T(n) \in O(n)$.
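The best-case count can be checked empirically. The Python sketch below is illustrative, not part of the lecture: the function name and counter are assumed. It instruments insertion sort to count evaluations of the Line-4 while-test; on an already sorted array each $t_j = 1$, so the count is $n - 1$.

```python
# Instrumented insertion sort (hypothetical helper, not from the lecture):
# counts how many times the Line-4 while-test is evaluated.
def insertion_sort_with_count(a):
    a = list(a)          # work on a copy
    tests = 0            # number of while-test evaluations
    for j in range(1, len(a)):       # 0-based j here = 1-based j = 2..n
        key = a[j]
        i = j - 1
        while True:
            tests += 1               # one evaluation of the Line-4 test
            if i >= 0 and a[i] > key:
                a[i + 1] = a[i]      # shift right (Lines 5-6)
                i -= 1
            else:
                break
        a[i + 1] = key
    return a, tests

# On an already sorted array, each t_j = 1, so tests = n - 1.
print(insertion_sort_with_count([1, 2, 3, 4, 5]))   # ([1, 2, 3, 4, 5], 4)
```

On the sorted input the counter returns $4 = n - 1$, matching the best-case analysis.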
2.2 Worst-case Complexity
For the worst case, the array is in reverse sorted (decreasing) order. We must compare
each element $A[j]$ with every element in the entire sorted subarray $A[1 \ldots j-1]$, and so
$t_j = j$ for $j = 2, 3, \ldots, n$. Hence, the worst-case running time is
$$
\begin{aligned}
T(n) &= 3(n-1) + n + \sum_{j=2}^{n} t_j + 2\sum_{j=2}^{n}(t_j - 1) \\
&= 4n - 3 + \sum_{j=2}^{n} j + 2\sum_{j=2}^{n}(j - 1) \\
&= 4n - 3 + \left(\frac{n(n+1)}{2} - 1\right) + 2\cdot\frac{(n-1)n}{2} \\
&= 4n - 3 + \frac{n^2}{2} + \frac{n}{2} - 1 + n^2 - n \\
&= \frac{3n^2}{2} + \frac{7n}{2} - 4.
\end{aligned}
$$
Thus $T(n)$ is a quadratic function, i.e., $T(n) \in \Theta(n^2)$.
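As a quick sanity check (assumed instrumentation, not from the lecture), we can count the Line-4 while-tests on a reverse-sorted input; with $t_j = j$ the total should be $\sum_{j=2}^{n} j = \frac{n(n+1)}{2} - 1$.

```python
# Sketch (assumed counter, not from the lecture): count the Line-4
# while-tests of insertion sort on a reverse-sorted array, where t_j = j.
def count_while_tests(a):
    a = list(a)
    tests = 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while True:
            tests += 1               # one Line-4 test
            if i >= 0 and a[i] > key:
                a[i + 1] = a[i]      # shift right (Lines 5-6)
                i -= 1
            else:
                break
        a[i + 1] = key
    return tests

n = 8
total = count_while_tests(list(range(n, 0, -1)))   # reverse sorted: worst case
# sum_{j=2}^{n} j = n(n+1)/2 - 1
print(total, n * (n + 1) // 2 - 1)   # 35 35
```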
We first review lower bounds and upper bounds. Remember that the worst-case running time is $T(n) = \max\{t(x) : x \text{ has size } n\}$. An upper bound on $T(n)$ is a function $g(n)$ s.t.
$T(n) \in O(g(n))$ (essentially, $T(n) \le g(n)$). A lower bound on $T(n)$ is a function $f(n)$
s.t. $T(n) \in \Omega(f(n))$ (essentially, $T(n) \ge f(n)$). Generally, to prove that $g(n)$ is an upper
bound on $T(n)$, we argue that for all $x$ of size $n$, $t(x) \le g(n)$. To prove that $f(n)$
is a lower bound on $T(n)$, we exhibit one specific input $x$ for which we can show $t(x) \ge f(n)$.
The average-case running time of an algorithm is always less than or equal to the worst
case, by definition. We have already seen an example (ListSearch in Lecture Two) where the
two are asymptotically the same. As we will see below, they can also be asymptotically
different, with the average case being much better.
3 QuickSort

QuickSort(S) (S is the list to sort)
1: if |S| <= 1, return S
2: select a pivot p in S
3: partition the elements of S into:
     L = elements of S less than p
     E = elements of S equal to p
     U = elements of S greater than p
4: return QuickSort(L), E, QuickSort(U)
Standard QuickSort: we select the pivot by simply letting p be the first element in S.
Example 1. S = [15, 25, 12, 2, 37].
p = 15: L = [12, 2], E = [15], U = [25, 37]
p = 12: L = [2], E = [12], U = []
p = 25: L = [], E = [25], U = [37]
Result: 2, 12, 15, 25, 37.
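The pseudocode above translates almost line-for-line into Python. This is a sketch of the same partition-based scheme, not the lecture's own implementation:

```python
# Standard QuickSort (pivot = first element), following the pseudocode:
# partition into L (< p), E (= p), U (> p), then recurse on L and U.
def quicksort(s):
    if len(s) <= 1:
        return s
    p = s[0]                          # standard pivot choice: first element
    l = [x for x in s if x < p]       # L: elements less than the pivot
    e = [x for x in s if x == p]      # E: elements equal to the pivot
    u = [x for x in s if x > p]       # U: elements greater than the pivot
    return quicksort(l) + e + quicksort(u)

print(quicksort([15, 25, 12, 2, 37]))   # [2, 12, 15, 25, 37]
```

Note that collecting the equal elements in E means duplicates of the pivot are never passed to a recursive call.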
3.1 Worst-case Complexity
For standard QuickSort, the worst case occurs when the pivot is always the smallest (or largest) element of $S$, e.g., when the input is already sorted: one of $L$ and $U$ is then empty at every level of recursion, and partitioning a list of size $m$ takes $m - 1$ comparisons. The total number of comparisons is therefore
$$(n-1) + (n-2) + \cdots + 1 = \frac{n(n-1)}{2},$$
so the worst-case running time is $\Theta(n^2)$.
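A small experiment confirms the $\frac{n(n-1)}{2}$ count on sorted input. The counter function below is hypothetical (not from the lecture), and it assumes the usual cost model of one pivot comparison per non-pivot element per partition:

```python
# Sketch (assumed counter, not from the lecture): count element-to-pivot
# comparisons of standard QuickSort (first element as pivot).
def quicksort_comparisons(s):
    if len(s) <= 1:
        return 0
    p = s[0]
    l = [x for x in s[1:] if x < p]    # |s| - 1 pivot comparisons in total,
    u = [x for x in s[1:] if x >= p]   # one per non-pivot element
    return len(s) - 1 + quicksort_comparisons(l) + quicksort_comparisons(u)

n = 10
# Already sorted input: pivot is always the minimum, so U has size |s| - 1.
print(quicksort_comparisons(list(range(1, n + 1))), n * (n - 1) // 2)  # 45 45
```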
3.2 Average-case Complexity

Let $T'(n)$ denote the average number of comparisons made by QuickSort on an input of size $n$, where each element is equally likely to be chosen as the pivot. Partitioning takes $n - 1$ comparisons, and averaging over the pivot's rank, each subproblem size $j \in \{0, 1, \ldots, n-1\}$ occurs with probability $1/n$ in each of the two recursive calls, so

$$T'(n) = n - 1 + \frac{2}{n}\sum_{j=0}^{n-1} T'(j). \qquad (*)$$

Replacing $n$ by $n - 1$,

$$T'(n-1) = n - 2 + \frac{2}{n-1}\sum_{j=0}^{n-2} T'(j). \qquad (**)$$
Multiplying $(*)$ by $n$ and $(**)$ by $n - 1$ gives

$$n\,T'(n) = n(n-1) + 2\sum_{j=0}^{n-1} T'(j) \qquad (\text{from } *)$$

$$(n-1)\,T'(n-1) = (n-1)(n-2) + 2\sum_{j=0}^{n-2} T'(j). \qquad (\text{from } **)$$
Subtracting the second equation from the first (the two sums differ only in the $j = n-1$ term, which contributes $2T'(n-1)$) yields $n\,T'(n) = (n+1)\,T'(n-1) + 2(n-1)$. Dividing both sides by $n(n+1)$ and defining $A(n) = \dfrac{T'(n)}{n+1}$, we have

$$A(n) = A(n-1) + \frac{2(n-1)}{n(n+1)}, \qquad \text{and} \qquad A(0) = \frac{T'(0)}{1} = 0.$$
Hence,

$$
\begin{aligned}
A(n) &= \sum_{i=1}^{n} \frac{2(i-1)}{i(i+1)} \\
&= 2\sum_{i=1}^{n}\left(\frac{2}{i+1} - \frac{1}{i}\right) \\
&= 2\sum_{i=1}^{n}\frac{1}{i+1} + 2\sum_{i=1}^{n}\left(\frac{1}{i+1} - \frac{1}{i}\right) \\
&= 2\,\Theta(\log n) - 2\left(1 - \frac{1}{n+1}\right) \\
&= \Theta(\log n).
\end{aligned}
$$
Hence,

$$T'(n) = (n+1)\,A(n) = \Theta(n \log n).$$
Note that the sum of reciprocals from $1$ to $n$, $H_n = \sum_{i=1}^{n} \frac{1}{i}$, is called the Harmonic Series and has value $\log_e n + O(1)$.
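The average-case recurrence $T'(n) = n - 1 + \frac{2}{n}\sum_{j=0}^{n-1} T'(j)$ can also be checked numerically. The sketch below (illustrative, not from the lecture) tabulates $T'(n)$ bottom-up and compares it with $n \log n$ growth:

```python
# Numerical check of the average-case recurrence
# T'(n) = n - 1 + (2/n) * sum_{j=0}^{n-1} T'(j), with T'(0) = 0.
import math

def avg_comparisons(n):
    """Tabulate T'(0..n) bottom-up in O(n) time; return T'(n)."""
    t = 0.0      # T'(m) for the current m (starts at T'(0) = 0)
    s = 0.0      # running sum T'(0) + ... + T'(m-1)
    for m in range(1, n + 1):
        s += t                    # s now holds sum_{j=0}^{m-1} T'(j)
        t = m - 1 + 2.0 * s / m   # apply the recurrence
    return t

print(avg_comparisons(3))   # 2.666... (= 8/3)
# The ratio T'(n) / (n ln n) approaches 2, though only very slowly:
n = 4096
print(avg_comparisons(n) / (n * math.log(n)))   # ≈ 1.66
```

For small $n$ the values ($T'(2) = 1$, $T'(3) = 8/3$) can be verified by hand against the recurrence.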
3.3 Randomized QuickSort