Lec04 Divide and Conquer (Part 1)

The document discusses the divide-and-conquer algorithm design technique, which involves dividing a problem into smaller subproblems, solving them recursively, and then merging their solutions. It specifically focuses on the Mergesort algorithm, detailing its steps of dividing, recursively sorting, and merging sequences, along with its implementation and performance characteristics. Mergesort is exemplified through a series of recursive calls and merges, illustrating how it sorts an initial sequence into a final sorted sequence.


LECTURE 04

DIVIDE-AND-CONQUER (PART 1)
A Technique of Algorithm Design: Divide and Conquer

[Diagram: a problem is divided into subproblems 1 … n; each subproblem may be divided further; the solutions to subproblems 1 … n are combined into the solution to the original problem.]
Divide and Conquer

 Divide-and-conquer algorithms have three major steps:
  Divide: when the input size is too large, the data is divided into two or more disjoint subsets
  Recurse: solve the subproblems associated with each subset (which may be divided further)
  Conquer: take the solutions to the subproblems and merge them into a solution for the original problem
 The divide-and-conquer method of algorithm design has produced such efficient algorithms as Merge Sort and Quick Sort.
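The three steps above can be illustrated with a minimal sketch (the function name `dc_max` is chosen for illustration), finding the maximum of a sequence by divide and conquer:

```python
def dc_max(a):
    # Base case: a subproblem of constant size is solved directly.
    if len(a) == 1:
        return a[0]
    # Divide: split the input into two disjoint halves.
    mid = len(a) // 2
    # Recurse: solve the subproblem for each half.
    left, right = dc_max(a[:mid]), dc_max(a[mid:])
    # Conquer: merge the sub-solutions into the overall solution.
    return left if left >= right else right

print(dc_max([3, 41, 52, 26, 38, 57, 9, 49]))  # 57
```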
Sorting

 Sorting algorithms can be classified (Bhasin, 2015):
  Based on number of comparisons
  Based on number of swaps
  Based on stability
  Based on adaptability
  Based on internal or external memory
  Recursive or non-recursive
 Recursion: the base case for recursion is a subproblem of constant size
MERGESORT
Mergesort

 Input: sequence S with n elements
 Three steps:
  Divide: partition S into two sequences S1 and S2 of about n/2 elements each
  Recur: recursively sort S1 and S2
  Conquer: merge S1 and S2 into a unique sorted sequence
 A performance-efficient mergesort implementation requires a temporary data structure to store the merged sequence in the conquer step, making it a not-in-place algorithm.
Mergesort Algorithm

Algorithm Mergesort (a, p, r) {
    Input Parameters: array a, start index p, end index r.
    Output Parameter: array a sorted.
    // continue if a has more than one element.
    if (p < r) {
        // Divide: divide a into two nearly equal parts.
        m = (p + r) / 2
        // Recur: sort each half.
        Mergesort (a, p, m)
        Mergesort (a, m + 1, r)
        // Conquer: merge the two sorted halves.
        Merge (a, p, m, r)
    }
}
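A runnable Python sketch of this pseudocode follows (the temporary lists inside `merge` are what make the algorithm not-in-place):

```python
def merge(a, p, m, r):
    # Conquer: merge the sorted halves a[p..m] and a[m+1..r]
    # using temporary lists for the two halves.
    left, right = a[p:m + 1], a[m + 1:r + 1]
    i = j = 0
    k = p
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            a[k] = left[i]; i += 1
        else:
            a[k] = right[j]; j += 1
        k += 1
    while i < len(left):            # drain the remaining left half
        a[k] = left[i]; i += 1; k += 1
    while j < len(right):           # drain the remaining right half
        a[k] = right[j]; j += 1; k += 1

def mergesort(a, p, r):
    if p < r:                       # continue if more than one element
        m = (p + r) // 2            # Divide
        mergesort(a, p, m)          # Recur: sort left half
        mergesort(a, m + 1, r)      # Recur: sort right half
        merge(a, p, m, r)           # Conquer

data = [18, 26, 32, 6, 43, 15, 9, 1]
mergesort(data, 0, len(data) - 1)
print(data)  # [1, 6, 9, 15, 18, 26, 32, 43]
```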
[Trace: Mergesort on the sequence 2 4 3 1 (indices 0–3). The initial call Mergesort(a, 0, 3) computes m = 1 and recurses on the halves 2 4 (p = 0, r = 1) and 3 1 (p = 2, r = 3); each of these recurses down to single-element base cases (p = r) at indices 0, 1, 2 and 3. The merges then combine the halves into 2 4 and 1 3, and the final Merge(a, 0, 1, 3) produces the sorted sequence 1 2 3 4.]
Mergesort – Example

Original Sequence: 18 26 32 6 43 15 9 1
Divide:  18 26 32 6 | 43 15 9 1
Divide:  18 26 | 32 6 | 43 15 | 9 1
Divide:  18 | 26 | 32 | 6 | 43 | 15 | 9 | 1
Merge:   18 26 | 6 32 | 15 43 | 1 9
Merge:   6 18 26 32 | 1 9 15 43
Sorted Sequence: 1 6 9 15 18 26 32 43
Merging Two Sorted Sequences

 The conquer step of mergesort consists of merging two sorted sequences A and B into a sorted sequence S containing the union of the elements of A and B
 Merging two sorted sequences, each with n/2 elements and implemented by means of a doubly linked list, takes O(n) time

Algorithm Merge (a, p, m, r)
    Input: sorted sequences A = a[p]…a[m] and B = a[m+1]…a[r]
    Output: sorted sequence a[p]…a[r]
    S ← empty sequence
    while ¬A.isEmpty() ∧ ¬B.isEmpty()
        if A.first().element() < B.first().element()
            S.insertLast(A.remove(A.first()))
        else
            S.insertLast(B.remove(B.first()))
    while ¬A.isEmpty()
        S.insertLast(A.remove(A.first()))
    while ¬B.isEmpty()
        S.insertLast(B.remove(B.first()))
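A Python sketch of this merge, using `collections.deque` to stand in for the doubly linked list of the pseudocode (O(1) removal from the front):

```python
from collections import deque

def merge_sorted(A, B):
    # A and B are sorted sequences; consume the smaller front
    # element of either sequence until one runs out.
    A, B = deque(A), deque(B)
    S = []
    while A and B:                   # while neither sequence is empty
        if A[0] < B[0]:
            S.append(A.popleft())
        else:
            S.append(B.popleft())
    S.extend(A)                      # drain whichever sequence remains
    S.extend(B)
    return S

print(merge_sorted([2, 7, 18, 26], [1, 9, 15, 43]))
# [1, 2, 7, 9, 15, 18, 26, 43]
```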
Merging Two Sorted Sequences

 Combining two sorted sublists into one sorted list is called merging
 Repeat until both sublists are empty
Mergesort Tree
 An execution of merge-sort is depicted by a binary tree
  each node represents a recursive call of merge-sort and stores
   the unsorted sequence before the execution and its partition
   the sorted sequence at the end of the execution
  the root is the initial call
  the leaves are calls on subsequences of size 0 or 1

[Tree: root 7 2 9 4 → 2 4 7 9; children 7 2 → 2 7 and 9 4 → 4 9; leaves 7, 2, 9, 4.]
Mergesort Tree Example

The execution on the sequence 7 2 9 4 3 8 6 1 proceeds as follows (each step is a snapshot of the mergesort tree):

 Partition: 7 2 9 4 | 3 8 6 1
 Recursive call, partition: 7 2 | 9 4
 Recursive call, partition: 7 | 2
 Recursive call, base case: 7
 Recursive call, base case: 2
 Merge: 7 and 2 → 2 7
 Recursive call, partition: 9 | 4
 Recursive call, base case: 9
 Recursive call, base case: 4
 Merge: 9 and 4 → 4 9
 Merge: 2 7 and 4 9 → 2 4 7 9
 Recursive call, partition: 3 8 | 6 1
 Recursive call, partition: 3 | 8
 Recursive call, base case: 3
 Recursive call, base case: 8
 Merge: 3 and 8 → 3 8
 Recursive call, partition: 6 | 1
 Recursive call, base case: 6
 Recursive call, base case: 1
 Merge: 6 and 1 → 1 6
 Merge: 3 8 and 1 6 → 1 3 6 8
 Merge: 2 4 7 9 and 1 3 6 8 → 1 2 3 4 6 7 8 9
Array Output

 The previous Mergesort tree and the following array of intermediate results are equivalent.

7 2 9 4 3 8 6 1 – Initial Sequence
1. 2 7 9 4 3 8 6 1
2. 2 7 4 9 3 8 6 1
3. 2 4 7 9 3 8 6 1
4. 2 4 7 9 3 8 6 1
5. 2 4 7 9 3 8 1 6
6. 2 4 7 9 1 3 6 8
7. 1 2 3 4 6 7 8 9 – Final Sorted Sequence
Exercise
 Show the sorting steps of the following sequence by
using mergesort
18 12 14 23 17 29 31 22
Analysis of Mergesort
 Def. T(n) = number of comparisons to mergesort an input of size n.
 Mergesort recurrence:

    T(n) = 0                                 if n ≤ 1
    T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + n           otherwise
           (solve left half + solve right half + merging)

 Solution. T(n) = O(n log₂ n).
 Assorted proofs. We describe several ways to prove this recurrence. Initially we assume n is a power of 2 and replace ⌈n/2⌉ and ⌊n/2⌋ with n/2.
(Kleinberg & Tardos, 2014)
Analysis of Mergesort
Since the sort divides the input into two pieces, solves the 2 subproblems recursively and combines the results, when n ≥ 2:

    T(n) = 2T(n/2) + n

Use the iterative (unrolling) method:

    T(n) = 2T(n/2) + n
         = 4T(n/4) + 2n
         = 8T(n/8) + 3n
         = …
         = 2^k T(n/2^k) + kn

The unrolling stops when n/2^k = 1, i.e. k = log₂ n.
Thus, T(n) = n·T(1) + n log₂ n = n log₂ n (since T(1) = 0), so we have T(n) = O(n log₂ n).
Analysis of Mergesort: Proof by Recursion Tree

    T(n) = 0                  if n ≤ 1
    T(n) = 2T(n/2) + n        otherwise
           (sorting both halves + merging)

[Recursion tree: the root T(n) costs n; level 1 has two nodes T(n/2) costing 2(n/2) = n; level 2 has four nodes T(n/4) costing 4(n/4) = n; …; level k has 2^k nodes T(n/2^k) costing 2^k (n/2^k) = n; the bottom level has n/2 nodes T(2) costing (n/2)(2) = n. The tree has log₂ n levels, each costing n, for a total of n log₂ n.]
(Kleinberg & Tardos, 2014)
Analysis of Mergesort: Proof by Induction
 Claim. If T(n) satisfies this recurrence, then T(n) = n log₂ n. (assumes n is a power of 2)

    T(n) = 0                  if n ≤ 1
    T(n) = 2T(n/2) + n        otherwise
           (sorting both halves + merging)

 Pf. (by induction on n)
  Base case: n = 1. T(1) = 0 = 1 · log₂ 1.
  Inductive hypothesis: T(n) = n log₂ n.
  Goal: show that T(2n) = 2n log₂ (2n).

    T(2n) = 2T(n) + 2n
          = 2n log₂ n + 2n
          = 2n (log₂ (2n) − 1) + 2n        Note that: log₂ n = log₂ (2n) − 1
          = 2n log₂ (2n)

(Kleinberg & Tardos, 2014)
Analysis of Mergesort (summary)

 The height h of the mergesort tree is O(log n)
  at each recursive call we divide the sequence in half
 The overall amount of work done at the nodes of depth i is O(n)
  we partition and merge 2^i sequences of size n/2^i
  we make 2^(i+1) recursive calls
 Thus, the total running time of merge-sort is O(n log n)

Size = n = 8
Height = h = 3
2^h = n (e.g. 2³ = 8)
h = log₂ n (e.g. 3 = log₂ 8)

[Tree: root 7 2 9 4 3 8 6 1 → 1 2 3 4 6 7 8 9, with the same levels as the Mergesort tree example above.]

After the analysis,

Mergesort:
 It has O(n log n) running time
 This is better than quicksort, which has O(n²) worst-case running time
 It accesses data in a sequential manner
 It is stable
 It is not-in-place
QUICK-SORT
Quick-Sort

 Divide: pick a random element x (called the pivot) and partition S into
  L: elements less than x
  E: elements equal to x
  G: elements greater than x
 Recur: sort L and G
 Conquer: join L, E and G
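The L, E, G formulation above can be sketched directly in Python (not-in-place, unlike the in-place version given later; the name `quicksort_leg` is illustrative):

```python
import random

def quicksort_leg(S):
    # Base case: sequences of size 0 or 1 are already sorted.
    if len(S) <= 1:
        return S
    x = random.choice(S)             # Divide: pick a random pivot
    L = [e for e in S if e < x]      # elements less than the pivot
    E = [e for e in S if e == x]     # elements equal to the pivot
    G = [e for e in S if e > x]      # elements greater than the pivot
    # Recur on L and G, then Conquer by joining L, E and G.
    return quicksort_leg(L) + E + quicksort_leg(G)

print(quicksort_leg([7, 2, 9, 4, 3, 7, 6, 1]))  # [1, 2, 3, 4, 6, 7, 7, 9]
```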
Algorithm QuickSort

quicksort( arr, p, r )
    if p < r
        pi = partition( arr, p, r )
        quicksort( arr, p, pi – 1 )
        quicksort( arr, pi + 1, r )
Partition
 The partition step of quick-sort takes O(n) time

Methods to select a pivot:
1. Always choose the first element
2. Always choose the last element
3. Choose the median
4. Randomly choose an element

Algorithm partition(S, p, r)
    Input: sequence S, start index p, end index r
    Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
    i = p – 1
    for all j from p to r – 1
        if S[j] <= S[r]
            i++
            swap (S[i], S[j])
    swap (S[i+1], S[r])
    return i + 1
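The partition pseudocode (Lomuto-style, last element as pivot) together with the quicksort driver can be sketched in Python as:

```python
def partition(S, p, r):
    # Lomuto partition: S[r] is the pivot; i tracks the boundary of
    # elements <= pivot. Returns the pivot's final index. O(n) time.
    i = p - 1
    for j in range(p, r):
        if S[j] <= S[r]:
            i += 1
            S[i], S[j] = S[j], S[i]
    S[i + 1], S[r] = S[r], S[i + 1]   # place the pivot between the parts
    return i + 1

def quicksort(S, p, r):
    if p < r:
        pi = partition(S, p, r)
        quicksort(S, p, pi - 1)       # sort elements left of the pivot
        quicksort(S, pi + 1, r)       # sort elements right of the pivot

data = [7, 2, 9, 4, 3, 7, 1, 6]
quicksort(data, 0, len(data) - 1)
print(data)  # [1, 2, 3, 4, 6, 7, 7, 9]
```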
Execution Example 1

 Select the last element as pivot

Index:    0 1 2 3 4 5 6 7
Sequence: 7 2 9 4 3 7 1 6

The pivot is S[r] = 6 (p = 0, r = 7). Index i tracks the boundary of elements smaller than or equal to S[r]. After the first partition the sequence is 2 4 3 1 6 | 7 7 9, with the pivot 6 in its final position.
Execution Example 2

The execution with random pivot selection on the sequence 7 2 9 4 3 7 6 1 proceeds as follows (each step is a snapshot of the quick-sort tree):

 Random pivot selection: 7 2 9 4 3 7 6 1, pivot 6
 Partition, recursive call, pivot selection: L = 2 4 3 1, E = 6, G = 7 9 7
 Partition, recursive call, base case: within 2 4 3 1, L = 1 (base case), G = 4 3
 Recursive call, pivot selection: 4 3
 Partition, recursive call, base case: 3 | 4 (base case 4)
 Join, join: 3 4, then 1 2 3 4
 Recursive call, pivot selection: 7 9 7, pivot 7
 Partition, recursive, base, recursive, base: E = 7 7, G = 9 (base case)
 Join, join: 7 7 9, then the final sequence 1 2 3 4 6 7 7 9
Worst-case Running Time
 The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element (already sorted)
 One of L and G has size n – 1 and the other has size 0
 The running time is proportional to the sum n + (n – 1) + (n – 2) + … + 2 + 1
 Thus, the worst-case running time of quick-sort is O(n²)

depth   time
0       n
1       n – 1
…       …
n – 1   1
Worst-case Running Time

 The running time is proportional to the sum
  S1 = n + (n – 1) + (n – 2) + … + 2 + 1
  S2 = 1 + 2 + 3 + … + (n – 2) + (n – 1) + n
 Add S1 + S2 term by term; it is an arithmetic sequence in which each pair sums to (n + 1), and there are n pairs:
  (n + 1) + (n + 1) + … + (n + 1)
 Thus, 2S = n(n + 1) and finally S = n(n + 1)/2
 Ignoring the lower-order terms, we get O(n²)
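A quick numeric check of the closed form S = n(n + 1)/2 behind the O(n²) bound:

```python
# Verify that n + (n - 1) + ... + 2 + 1 equals n(n + 1) / 2
# for a few input sizes.
def series_sum(n):
    return sum(range(1, n + 1))

for n in (1, 10, 100):
    assert series_sum(n) == n * (n + 1) // 2
print(series_sum(100))  # 5050
```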
Expected Running Time

 Quick-sort performs best when L and G are of equal size. In this case, a single quick-sort call involves bn work for the partition plus two recursive calls on lists of size n/2, hence the recurrence relation is (same as merge-sort)
    T(n) = 2T(n/2) + bn
 By the Master Theorem, the best-case running time of quick-sort is O(n log n).
 For the average case, selecting a random pivot reduces the probability of the worst case, and the expected running time is also O(n log n).
