DISCUSSION ABOUT MERGE SORT
AND
QUICK SORT
MERGE SORT
Merge sort is a sorting algorithm that follows the divide-and-conquer approach. It works by recursively dividing the input array into smaller subarrays, sorting those subarrays, and then merging them back together to obtain the sorted array.
In simple terms, the process of merge sort is to divide the array into two halves, sort each half, and then merge the sorted halves back together. This process is repeated until the entire array is sorted.
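As an illustration (the input array here is chosen for this example, not taken from the slides), merge sort on {38, 27, 43, 10} proceeds roughly as follows:

Divide: {38, 27, 43, 10} -> {38, 27} and {43, 10} -> {38}, {27}, {43}, {10}
Merge:  {38} + {27} -> {27, 38};  {43} + {10} -> {10, 43}
Merge:  {27, 38} + {10, 43} -> {10, 27, 38, 43}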
ALGORITHM
#include <stdio.h>
#include <stdlib.h>

// Merges two subarrays of arr[].
// First subarray is arr[l..m]
// Second subarray is arr[m+1..r]
void merge(int arr[], int l, int m, int r)
{
    int i, j, k;
    int n1 = m - l + 1;
    int n2 = r - m;

    // Create temp arrays
    int L[n1], R[n2];

    // Copy data to temp arrays L[] and R[]
    for (i = 0; i < n1; i++)
        L[i] = arr[l + i];
    for (j = 0; j < n2; j++)
        R[j] = arr[m + 1 + j];

    // Merge the temp arrays back into arr[l..r]
    i = 0;
    j = 0;
    k = l;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j]) {
            arr[k] = L[i];
            i++;
        }
        else {
            arr[k] = R[j];
            j++;
        }
        k++;
    }

    // Copy the remaining elements of L[],
    // if there are any
    while (i < n1) {
        arr[k] = L[i];
        i++;
        k++;
    }

    // Copy the remaining elements of R[],
    // if there are any
    while (j < n2) {
        arr[k] = R[j];
        j++;
        k++;
    }
}

// l is for left index and r is right index of the
// sub-array of arr to be sorted
void mergeSort(int arr[], int l, int r)
{
    if (l < r) {
        int m = l + (r - l) / 2;

        // Sort first and second halves
        mergeSort(arr, l, m);
        mergeSort(arr, m + 1, r);

        // Merge the two sorted halves
        merge(arr, l, m, r);
    }
}
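A minimal driver (not part of the original slides; the sample input is chosen here for illustration) showing how the mergeSort function above can be called:

int main() {
    int arr[] = {12, 11, 13, 5, 6, 7};   // sample input, chosen for illustration
    int n = sizeof(arr) / sizeof(arr[0]);

    mergeSort(arr, 0, n - 1);            // sort the whole array in place

    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);           // prints: 5 6 7 11 12 13
    printf("\n");
    return 0;
}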
Recurrence Relation of Merge Sort:
The recurrence relation of merge sort is:
T(n) = Θ(1), if n = 1
T(n) = 2T(n/2) + Θ(n), if n > 1

• T(n) represents the total time taken by the algorithm to sort an array of size n.
• 2T(n/2) represents the time taken by the algorithm to recursively sort the two halves of the array. Since each half has n/2 elements, we have two recursive calls with input size n/2.
• Θ(n) represents the time taken to merge the two sorted halves.
Complexity Analysis of Merge Sort:

• Time Complexity:
  • Best Case: O(n log n), when the array is already sorted or nearly sorted.
  • Average Case: O(n log n), when the array is randomly ordered.
  • Worst Case: O(n log n), when the array is sorted in reverse order.
• Space Complexity: O(n), additional space is required for the temporary arrays used during merging.
Advantages of Merge Sort:
• Stability: Merge sort is a stable sorting algorithm, which means it maintains the relative order of equal elements in the input array. For example, sorting the pairs (2,a), (1,b), (2,c) by key gives (1,b), (2,a), (2,c), with (2,a) still before (2,c).
• Guaranteed worst-case performance: Merge sort has a worst-case time complexity of O(n log n), which means it performs well even on large datasets.
• Simple to implement: The divide-and-conquer approach is straightforward.
• Naturally parallel: The subarrays are sorted and merged independently, which makes merge sort suitable for parallel processing.

Disadvantages of Merge Sort:

• Space complexity: Merge sort requires additional memory to store the merged subarrays during the sorting process.
• Not in-place: Merge sort is not an in-place sorting algorithm, which means it requires additional memory to store the sorted data. This can be a disadvantage in applications where memory usage is a concern.
• Slower than QuickSort in general: QuickSort is more cache friendly because it works in-place.
Applications of Merge Sort:
• Sorting large datasets.
• External sorting (when the dataset is too large to fit in memory).
• Inversion counting.
• Merge sort and its variations are used in the library sort methods of programming languages. For example, its variation TimSort is used in Python, Java, Android, and Swift. The main reason it is preferred for sorting non-primitive types is stability, which QuickSort lacks. For example, Arrays.sort in Java uses QuickSort (for primitives) while Collections.sort uses MergeSort.
• It is a preferred algorithm for sorting linked lists.
• It can be easily parallelized, as we can independently sort subarrays and then merge them.
• The merge step of merge sort can be used to efficiently solve problems like the union and intersection of two sorted arrays (see the sketch below).
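As an illustration of the last point, here is a minimal sketch (not from the original slides; the function name intersectSorted and the sample arrays are chosen here) of how the two-pointer merge idea finds the intersection of two sorted arrays:

#include <stdio.h>

// Prints the common elements of two sorted arrays a[0..n-1] and b[0..m-1]
// using the same two-pointer scan as the merge step.
void intersectSorted(int a[], int n, int b[], int m) {
    int i = 0, j = 0;
    while (i < n && j < m) {
        if (a[i] < b[j])
            i++;                     // a[i] cannot appear later in b, advance left pointer
        else if (b[j] < a[i])
            j++;                     // b[j] cannot appear later in a, advance right pointer
        else {
            printf("%d ", a[i]);     // equal elements belong to the intersection
            i++;
            j++;
        }
    }
    printf("\n");
}

int main() {
    int a[] = {1, 3, 4, 5, 7};
    int b[] = {2, 3, 5, 6};
    intersectSorted(a, 5, b, 4);     // prints: 3 5
    return 0;
}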
QUICK SORT
QuickSort is a sorting algorithm based on the divide-and-conquer approach. It picks an element as a pivot and partitions the given array around the picked pivot, placing the pivot in its correct position in the sorted array.
ALGORITHM
#include <stdio.h>

void swap(int* a, int* b);

// Partition function
int partition(int arr[], int low, int high) {

    // Choose the pivot
    int pivot = arr[high];

    // Index of smaller element and indicates
    // the right position of pivot found so far
    int i = low - 1;

    // Traverse arr[low..high] and move all smaller
    // elements to the left side. Elements from low to
    // i are smaller after every iteration
    for (int j = low; j <= high - 1; j++) {
        if (arr[j] < pivot) {
            i++;
            swap(&arr[i], &arr[j]);
        }
    }

    // Move pivot after smaller elements and
    // return its final position
    swap(&arr[i + 1], &arr[high]);
    return i + 1;
}

// The QuickSort function implementation
void quickSort(int arr[], int low, int high) {
    if (low < high) {

        // pi is the partition return index of pivot
        int pi = partition(arr, low, high);

        // Recursion calls for smaller elements
        // and greater or equal elements
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

void swap(int* a, int* b) {
    int t = *a;
    *a = *b;
    *b = t;
}

int main() {
    int arr[] = {10, 7, 8, 9, 1, 5};
    int n = sizeof(arr) / sizeof(arr[0]);

    quickSort(arr, 0, n - 1);

    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);   // prints: 1 5 7 8 9 10
    printf("\n");
    return 0;
}
Complexity Analysis of Quick Sort

Time Complexity:
• Best Case: Ω(n log n), occurs when the pivot element divides the array into two equal halves.
• Average Case: Θ(n log n), on average the pivot divides the array into two parts, though not necessarily equal ones.
• Worst Case: O(n²), occurs when the smallest or largest element is always chosen as the pivot, e.g., already sorted arrays with the last element as pivot (see the randomized-pivot sketch below).

Space Complexity: O(log n) on average for the recursion stack (O(n) in the worst case); no auxiliary array is needed.

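The worst case can be largely avoided by picking the pivot at random. A minimal sketch (not from the slides; randomizedPartition and randomizedQuickSort are names chosen here, and the code reuses the swap and partition functions from the QuickSort slide):

#include <stdlib.h>
#include <time.h>

// Picks a random index in [low, high], moves that element to the end,
// and then reuses the last-element partition shown earlier.
int randomizedPartition(int arr[], int low, int high) {
    int r = low + rand() % (high - low + 1);
    swap(&arr[r], &arr[high]);
    return partition(arr, low, high);
}

void randomizedQuickSort(int arr[], int low, int high) {
    if (low < high) {
        int pi = randomizedPartition(arr, low, high);
        randomizedQuickSort(arr, low, pi - 1);
        randomizedQuickSort(arr, pi + 1, high);
    }
}

// Call srand(time(NULL)); once in main() before sorting.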

Advantages of Quick Sort
• It is a divide-and-conquer algorithm, which makes it easier to solve problems.
• It is efficient on large data sets.
• It has a low overhead, as it only requires a small amount of memory to function.
• It is cache friendly, as we work on the same array to sort and do not copy data to any auxiliary array.
• It is the fastest general-purpose algorithm for large data when stability is not required.
• The second recursive call is in tail position, so tail call optimization can be applied (see the sketch below).

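As an illustration of the last point, a sketch (added here, not from the slides; quickSortTCO is a name chosen for this example) of the standard optimization: recurse only on the smaller partition and loop on the larger one, which keeps the recursion depth at O(log n). It reuses the partition function shown earlier.

void quickSortTCO(int arr[], int low, int high) {
    while (low < high) {
        int pi = partition(arr, low, high);

        // Recurse on the smaller side, loop on the larger side
        if (pi - low < high - pi) {
            quickSortTCO(arr, low, pi - 1);
            low = pi + 1;        // continue with the right part
        } else {
            quickSortTCO(arr, pi + 1, high);
            high = pi - 1;       // continue with the left part
        }
    }
}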
Disadvantages of Quick Sort

• It has a worst-case time complexity of O(n²), which occurs when the pivot is chosen poorly.
• It is not a good choice for small data sets.
• It is not a stable sort: if two elements have the same key, their relative order is not necessarily preserved in the sorted output, because elements are swapped according to the pivot's position without considering their original positions.
Applications of Quick Sort
• Efficient for sorting large datasets with O(n log n) average-case time complexity.
• Used in partitioning problems like finding the kth smallest element (see the QuickSelect sketch below) or dividing arrays by a pivot.
• Integral to randomized algorithms, offering better performance than deterministic approaches.
• Applied in cryptography for generating random permutations and unpredictable encryption keys.
• The partitioning step can be parallelized for improved performance in multi-core or distributed systems.
• Important in theoretical computer science for analyzing average-case complexity and developing new techniques.
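A minimal QuickSelect sketch (added here for illustration; the name kthSmallest is chosen for this example and the code reuses the partition function from the QuickSort slide) for finding the kth smallest element:

// Returns the kth smallest element (1-based k) of arr[low..high].
// Reuses partition() from the QuickSort slide; assumes 1 <= k <= high-low+1.
int kthSmallest(int arr[], int low, int high, int k) {
    while (low <= high) {
        int pi = partition(arr, low, high);   // pivot lands at its final position
        int rank = pi - low + 1;              // pivot's rank within arr[low..high]

        if (rank == k)
            return arr[pi];                   // pivot is the kth smallest
        else if (rank > k)
            high = pi - 1;                    // answer lies in the left part
        else {
            k -= rank;                        // skip pivot and left part
            low = pi + 1;
        }
    }
    return -1;                                // not reached for valid k
}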
References
• Data Structures and Algorithms Made Easy, Narasimha Karumanchi.
• www.geeksforgeeks.org
• www.stackoverflow.org
• Introduction to Algorithms, Cormen et al.
THANK YOU

Presented by -
SUDIP PATRA | NIRJAN MONDAL
