Divide and Conquer

The document discusses the Divide-and-Conquer technique, outlining its three main steps: dividing a problem into smaller instances, solving those instances recursively, and combining their solutions. It provides examples such as Merge Sort, Quick Sort, and Strassen's algorithm for matrix multiplication, detailing their time complexities and recursive analyses. The document also introduces the Master Theorem for analyzing recurrences in Divide-and-Conquer algorithms.


Divide-and-Conquer

Divide-and-Conquer Technique

The most well-known algorithm design strategy:

1. Divide an instance of the problem into two or more smaller instances
2. Solve the smaller instances recursively
3. Obtain the solution to the original (larger) instance by combining these solutions
Divide-and-Conquer Technique
(cont.)
A problem of size n (an instance) is divided into subproblem 1 and subproblem 2, each of size n/2. Solving them gives a solution to subproblem 1 and a solution to subproblem 2, which are then combined into a solution to the original problem.

It generally leads to a recursive algorithm!
Divide-and-Conquer
Examples
• Merge sort
• Quicksort
• Matrix multiplication: Strassen’s algorithm
• Multiplication of large integers


Mergesort
Merge sort Example
8 3 2 9 7 1 5 4

8 3 2 9 7 1 5 4

8 3 2 9 7 1 5 4

8 3 2 9 7 1 5 4

3 8 2 9 1 7 4 5

2 3 8 9 1 4 5 7

1 2 3 4 5 7 8 9
Pseudocode of Mergesort
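A minimal Python sketch of mergesort and its merge step (function names are illustrative):

```python
def merge(b, c):
    """Merge two sorted lists b and c into one sorted list."""
    a = []
    i = j = 0
    while i < len(b) and j < len(c):
        if b[i] <= c[j]:
            a.append(b[i]); i += 1
        else:
            a.append(c[j]); j += 1
    # Copy the remaining elements of whichever list is not exhausted.
    a.extend(b[i:])
    a.extend(c[j:])
    return a

def merge_sort(a):
    """Top-down merge sort: divide, sort halves recursively, merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

On the example above, merge_sort([8, 3, 2, 9, 7, 1, 5, 4]) returns [1, 2, 3, 4, 5, 7, 8, 9].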

Time complexity of merging two sorted arrays of sizes p and q: Θ(p + q) = Θ(n) comparisons
Merge
• B = [2, 4, 15, 20], C = [1, 16, 100, 120]; merge B and C into A
• i = 0, j = 0, k = 0
• B[i]=2 > C[j]=1, therefore A[0]=1; j=1, k=1
• B[i]=2 < C[j]=16, therefore A[1]=2; i=1, k=2
• B[i]=4 < C[j]=16, therefore A[2]=4; i=2, k=3
• B[i]=15 < C[j]=16, therefore A[3]=15; i=3, k=4
• B[i]=20 > C[j]=16, therefore A[4]=16; j=2, k=5
• B[i]=20 < C[j]=100, therefore A[5]=20; i=4, k=6
• i > 3, therefore copy the remaining elements of C into A: A[6]=100, A[7]=120
• A = [1, 2, 4, 15, 16, 20, 100, 120]
Analysis of Mergesort

Let T(n) denote the time complexity of sorting n elements using merge sort. Then:

T(n) = 2T(n/2) + cn, T(1) = 0, for some constant c
Analysis of Merge sort

For simplification, assume n = 2^k for some positive integer k.
T(n) = 2T(n/2) + cn, where c is a constant
T(n) = 2[2T(n/2^2) + cn/2] + cn = 2^2 T(n/2^2) + 2cn
T(n) = 2^2[2T(n/2^3) + cn/2^2] + 2cn = 2^3 T(n/2^3) + 3cn
………..
T(n) = 2^k T(n/2^k) + kcn

Therefore, T(n) = 2^k T(1) + kcn = kcn (since T(1) = 0),

which implies T(n) = O(kn) = O(n log n) [since n = 2^k, k = log n].
Analysis of Merge sort (Using Master's Theorem)

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0

Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))

For merge sort: T(n) = 2T(n/2) + cn, T(1) = 0, for some constant c.

Comparing the recurrence relation with the Master Theorem we get a = 2, b = 2, d = 1.
Using the Master Theorem, a = b^d.

Therefore T(n) ∈ Θ(n^d log n)

=> T(n) ∈ Θ(n log n)


Quick Sort
Quick Sort
• Small instance has n <= 1. Every small instance is a sorted instance.
• To sort a large instance, select a pivot element from out of the n
elements.
• Partition the n elements into 3 groups left, middle and right.
• The middle group contains only the pivot element.
• All elements in the left group are <= pivot.
• All elements in the right group are >= pivot.
• Sort left and right groups recursively.
• Answer is sorted left group, followed by middle group followed by
sorted right group.
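The steps above can be sketched in Python (a minimal list-based version; ties with the pivot go to the left group here, and the in-place variant used in practice differs):

```python
def quick_sort(a):
    """Three-way quicksort: left (<= pivot), middle (pivot), right (> pivot)."""
    if len(a) <= 1:              # small instance: already sorted
        return a
    pivot = a[0]                 # leftmost element as pivot (illustrative choice)
    left = [x for x in a[1:] if x <= pivot]
    right = [x for x in a[1:] if x > pivot]
    # Answer: sorted left group, then middle group, then sorted right group.
    return quick_sort(left) + [pivot] + quick_sort(right)
```

quick_sort([6, 2, 8, 5, 11, 10, 4, 1, 9, 7, 3]) returns [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11].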
Example

6 2 8 5 11 10 4 1 9 7 3

Use 6 as the pivot. Partitioning gives the left group 2 5 4 1 3, the middle group 6, and the right group 8 11 10 9 7.

Sort left and right groups recursively.
Choice Of Pivot

• Pivot is the leftmost element of the list to be sorted.
 When sorting a[6:20], use a[6] as the pivot.
• Randomly select one of the elements to be sorted
as the pivot.
 When sorting a[6:20], generate a random number r in
the range [6, 20]. Use a[r] as the pivot.
Choice Of Pivot

• Median-of-Three rule. From the leftmost, middle, and rightmost elements of the list to be sorted, select the one with the median key as the pivot.
 When sorting a[6:20], examine a[6], a[13] ((6+20)/2), and
a[20]. Select the element with median (i.e., middle) key.
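The median-of-three rule can be sketched as a small helper (the function name is illustrative, not from the slides):

```python
def median_of_three(a, lo, hi):
    """Return whichever of the indices lo, (lo+hi)//2, hi holds the median key."""
    mid = (lo + hi) // 2
    # Sort the three candidate indices by their keys and take the middle one.
    return sorted([lo, mid, hi], key=lambda i: a[i])[1]
```

When sorting a[6:20], median_of_three(a, 6, 20) examines a[6], a[13], and a[20] and returns the index of the median key.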
Complexity
• O(n) time to partition an array of n elements.
• Let t(n) be the time needed to sort n elements.
• t(0) = t(1) = d, where d is a constant.
• When n > 1,
t(n) = t(|left|) + t(|right|) + cn,
where c is a constant.
• t(n) is maximum when either |left| = 0 or |right| =
0 following each partitioning.
Complexity
• This happens, for example, when the pivot is always the
smallest or largest element.
• For the worst-case time,
t(n) = t(n-1) + cn, n > 1
• Use repeated substitution to get t(n) = O(n^2).
• The best case arises when |left| and |right| are equal (or
differ by 1) following each partitioning.
• For the best case, the recurrence is the same as for merge sort: t(n) = 2t(n/2) + cn, where c is a constant.
Analysis of Quicksort

Best case: split in the middle (same as merge sort), O(n log n)
Worst case: already-sorted array! O(n^2)

T(n) = T(n-1) + Θ(n), T(1) = 0

T(n) = T(n-1) + cn = T(n-2) + c(n-1) + cn
………
T(n) = T(1) + c(n + (n-1) + … + 2) = O(n^2)

Average case: random arrays, Θ(n log n)


Analysis of Quick sort: Best Case (Using Master's Theorem)

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
T(n) = 2T(n/2) + cn

Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))

Comparing the recurrence relation with the Master Theorem we get a = 2, b = 2, d = 1.
Using the Master Theorem, a = b^d.

Therefore T(n) ∈ Θ(n^d log n)

=> T(n) ∈ Θ(n log n)


Average Time Complexity of
Quick Sort
• Let t(n) denote average time complexity of sorting
n elements using quick sort
• For n <=1, t(n) = d , for some constant d.
• t(n) <= cn + (1/n) Σ_{s=0}^{n-1} [t(s) + t(n-s-1)], where s and n-s-1 denote the number of elements in the left segment and the right segment, respectively.
Average Time Complexity of
Quick Sort

Proof by induction: We show that t(n) <= k n log n, where k = 2(c+d).
Base case: n = 2, t(2) <= 2(c+d).
Assume the claim holds for all n < m; now we prove it for n = m.
Matrix Multiplication
Conventional Matrix
Multiplication

for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
    {
        c[i][j] = 0;
        for (int k = 0; k < n; k++)
            c[i][j] += A[i][k] * B[k][j];
    }

O(n^3)
Matrix Multiplication: Divide and Conquer

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
T(n) = 8T(n/2) + cn^2 for n >= 2, and T(1) = d
Solve!

Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))
• T(n) = 8T(n/2) + cn^2 = 8[8T(n/2^2) + c(n/2)^2] + cn^2
• = 8^2 T(n/2^2) + cn^2 [1 + 2]
• = 8^2 [8T(n/2^3) + c(n/2^2)^2] + cn^2 [1 + 2]
• = 8^3 T(n/2^3) + cn^2 [1 + 2 + 2^2]
• ……
• = 8^k T(n/2^k) + cn^2 [1 + 2 + 2^2 + … + 2^(k-1)] = O(n^3) (since n = 2^k)
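The recurrence above comes from computing each quadrant of C via eight half-size products. A minimal Python sketch for n a power of 2 (helper names are illustrative):

```python
def add(X, Y):
    """Entrywise sum of two equal-size matrices (lists of lists)."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_mul_dc(A, B):
    """Divide-and-conquer product of n x n matrices, n a power of 2.
    Eight recursive half-size products: T(n) = 8T(n/2) + cn^2 = O(n^3)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2

    def quad(M):
        # Split M into its four h x h quadrants.
        return ([r[:h] for r in M[:h]], [r[h:] for r in M[:h]],
                [r[:h] for r in M[h:]], [r[h:] for r in M[h:]])

    A11, A12, A21, A22 = quad(A)
    B11, B12, B21, B22 = quad(B)
    C11 = add(mat_mul_dc(A11, B11), mat_mul_dc(A12, B21))
    C12 = add(mat_mul_dc(A11, B12), mat_mul_dc(A12, B22))
    C21 = add(mat_mul_dc(A21, B11), mat_mul_dc(A22, B21))
    C22 = add(mat_mul_dc(A21, B12), mat_mul_dc(A22, B22))
    # Reassemble the four quadrants into the full result.
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```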
Analysis of Matrix Multiplication (Using Master's Theorem)

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
T(n) = 8T(n/2) + cn^2

Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))

Comparing the recurrence relation with the Master Theorem we get a = 8, b = 2, d = 2.
Using the Master Theorem, a > b^d.

Therefore T(n) ∈ Θ(n^(log_b a))

=> T(n) ∈ Θ(n^(log_2 8)) = Θ(n^3)
Strassen’s Matrix Multiplication
Formulas for Strassen’s
Algorithm

T(n) = 7T(n/2) + cn^2, T(2) = d, for some constant d
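The seven products that make a = 7 possible are the standard Strassen formulas. A minimal Python sketch for the 2 x 2 scalar case (in the full algorithm the entries are themselves n/2 x n/2 blocks):

```python
def strassen_2x2(A, B):
    """Strassen's seven-multiplication scheme for 2x2 matrices."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # Combine the seven products into the four entries of C = A * B.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]
```

Seven multiplications replace the eight of the plain divide-and-conquer scheme, which is what lowers the recurrence from 8T(n/2) to 7T(n/2).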


Analysis of Strassen’s Algorithm

T(n) = 7T(n/2) + cn^2, T(1) = d, for some constant d

T(n) = 7T(n/2) + cn^2 = 7[7T(n/2^2) + c(n/2)^2] + cn^2
= 7^2 T(n/2^2) + cn^2 [1 + 7/4]
= 7^3 T(n/2^3) + cn^2 [1 + 7/4 + (7/4)^2]
…..
= 7^k T(n/2^k) + cn^2 [1 + 7/4 + (7/4)^2 + … + (7/4)^(k-1)]

= 7^k T(n/2^k) + cn^2 (7/4)^k / [7/4 - 1] = O(7^k)

= O(7^(log n)) = O(n^(log 7)) = O(n^2.81)
Analysis of Strassen’s Algorithm (Using Master's Theorem)

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0
T(n) = 7T(n/2) + cn^2

Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))

Comparing the recurrence relation with the Master Theorem we get a = 7, b = 2, d = 2.
Using the Master Theorem, a > b^d.

Therefore T(n) ∈ Θ(n^(log_b a))

=> T(n) ∈ Θ(n^(log_2 7))
General Divide-and-Conquer Recurrence

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^d), d ≥ 0

Master Theorem: If a < b^d, T(n) ∈ Θ(n^d)
If a = b^d, T(n) ∈ Θ(n^d log n)
If a > b^d, T(n) ∈ Θ(n^(log_b a))

Note: The same results hold with O instead of Θ.

Examples: T(n) = 4T(n/2) + n => T(n) ∈ ?
T(n) = 4T(n/2) + n^2 => T(n) ∈ ?
T(n) = 4T(n/2) + n^3 => T(n) ∈ ?
Example: Strassen’s Matrix Multiplication
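The three example recurrences can be classified with a small helper that implements the theorem's three cases (an illustrative sketch; `master` is not a standard library function, and it assumes f(n) ∈ Θ(n^d) exactly):

```python
import math

def master(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the Master Theorem."""
    if a < b ** d:
        return f"Theta(n^{d})"
    if a == b ** d:
        return f"Theta(n^{d} log n)"
    return f"Theta(n^{math.log(a, b):.2f})"   # a > b^d: exponent is log_b a
```

For the three examples: master(4, 2, 1) gives Theta(n^2.00), master(4, 2, 2) gives Theta(n^2 log n), and master(4, 2, 3) gives Theta(n^3).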
Exercise
Multiply the following two matrices using Strassen’s Matrix Multiplication Method. Multiply 2 x 2 matrices conventionally.

A = | 5 10 15 20 |      B = |  1  2  3  4 |
    | 5 10 15 20 |          |  5  6  7  8 |
    | 1  2  3  4 |          |  9 10 11 12 |
    | 5  6  7  8 |          | 13 14 15 16 |

A x B = | 450 500 550 600 |
        | 450 500 550 600 |
        |  90 100 110 120 |
        | 202 228 254 280 |


Exercise
Solve the following recurrence equations using the substitution method and the Master Theorem. In all the examples, take t(1) = 1:
Large Integer Multiplication using
Divide and Conquer Approach
Time Complexity
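Assuming the slides refer to Karatsuba's method, the standard divide-and-conquer scheme here replaces four half-size products with three, giving T(n) = 3T(n/2) + Θ(n), hence T(n) = Θ(n^(log_2 3)) ≈ Θ(n^1.585). A minimal sketch:

```python
def karatsuba(x, y):
    """Karatsuba multiplication of non-negative integers:
    three half-size products instead of four."""
    if x < 10 or y < 10:              # base case: a single-digit factor
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    x1, x0 = divmod(x, p)             # x = x1*10^m + x0
    y1, y0 = divmod(y, p)             # y = y1*10^m + y0
    z2 = karatsuba(x1, y1)
    z0 = karatsuba(x0, y0)
    # Middle term recovered from one extra product instead of two.
    z1 = karatsuba(x1 + x0, y1 + y0) - z2 - z0
    return z2 * p * p + z1 * p + z0
```

karatsuba(1234, 5678) returns 7006652, the same as 1234 * 5678.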
