DAA Unit 1 Notes

The document provides an overview of various sorting algorithms, including Insertion Sort, Selection Sort, Bubble Sort, Merge Sort, Quick Sort, Heap Sort, Shell Sort, Counting Sort, Radix Sort, and Bucket Sort, detailing their algorithms, time complexities, and space complexities. Each sorting method is explained with its respective best, average, and worst-case scenarios. The document emphasizes the differences in performance and use cases for each sorting technique.


SORTING ALGORITHMS AND ANALYSIS OF TIME AND SPACE COMPLEXITY
Comparison of Sorting Algorithms
INSERTION SORT
• Insertion sort is a simple sorting algorithm that works similarly to the
way you sort playing cards in your hands.
• The array is virtually split into a sorted and an unsorted part.
• Values from the unsorted part are picked and placed at the correct
position in the sorted part.
ALGORITHM: INSERTION SORT (A)
1. for j = 2 to A.length
2.     key = A[j]            // Insert A[j] into the sorted sequence A[1 .. j - 1]
3.     i = j - 1
4.     while i > 0 and A[i] > key
5.         A[i + 1] = A[i]
6.         i = i - 1
7.     A[i + 1] = key
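The same procedure as a minimal, runnable C sketch (0-indexed array; the function name insertionSort is an illustrative choice, not from the slides):

void insertionSort(int a[], int n)
{
    for (int j = 1; j < n; j++) {
        int key = a[j];                  // element to insert into the sorted prefix a[0 .. j-1]
        int i = j - 1;
        while (i >= 0 && a[i] > key) {   // shift larger elements one position to the right
            a[i + 1] = a[i];
            i--;
        }
        a[i + 1] = key;
    }
}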
ALGORITHM: INSERTION SORT (A)
• Time Complexity of Insertion Sort
The worst-case time complexity of Insertion Sort is O(N²).
The average-case time complexity of Insertion Sort is O(N²).
The best-case time complexity is O(N).

• Space Complexity of Insertion Sort
The auxiliary space complexity of Insertion Sort is O(1).
SELECTION SORT
• Selection sort is a simple, in-place sorting algorithm that works by
repeatedly selecting the smallest (or largest) element from the
unsorted portion of the list and moving it to the end of the sorted
portion of the list.
ALGORITHM: SELECTION SORT (A)
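A minimal C sketch of selection sort for a 0-indexed array (the function name selectionSort is illustrative, not from the slides):

// Repeatedly select the minimum of the unsorted suffix and swap it into place.
void selectionSort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int minIdx = i;                  // index of the smallest element seen so far
        for (int j = i + 1; j < n; j++)
            if (a[j] < a[minIdx])
                minIdx = j;
        if (minIdx != i) {               // move it to the end of the sorted prefix
            int t = a[i]; a[i] = a[minIdx]; a[minIdx] = t;
        }
    }
}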
ALGORITHM: SELECTION SORT (A)
• Time Complexity of Selection Sort
The time complexity of Selection Sort is O(N²), as there
are two nested loops.
• Space Complexity of Selection Sort
The auxiliary space complexity of Selection Sort is O(1).
BUBBLE SORT
In the Bubble Sort algorithm,
• we traverse the array from the left, compare adjacent elements, and place
the larger one on the right.
• In this way, the largest element is moved to the rightmost end first.
• This process is then repeated to find and place the second largest element,
and so on, until the data is sorted.
ALGORITHM: BUBBLE SORT (A)
void bubbleSort(int arr[], int n)
{
    bool swapped;                            // requires <stdbool.h>
    for (int i = 0; i < n - 1; i++) {
        swapped = false;
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {       // swap adjacent elements that are out of order
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                swapped = true;
            }
        }
        if (!swapped)                        // if no two elements were swapped by the inner loop, the array is sorted
            break;
    }
}
ALGORITHM: BUBBLE SORT (A)
• Time Complexity of Bubble Sort
Best Case Complexity: O(n)
Average Case Complexity: O(n²)
Worst Case Complexity: O(n²)

• Space Complexity of Bubble Sort
The auxiliary space complexity of Bubble Sort is O(1).
DIVIDE AND CONQUER ALGORITHM
This technique can be divided into the following three parts:

• Divide: Divide the problem into smaller sub-problems.
• Conquer: Solve the sub-problems by calling them recursively until solved.
• Combine: Combine the solutions of the sub-problems to get the final
solution of the whole problem.

Divide and conquer algorithms are typically implemented using recursion.


MERGE SORT ALGORITHM
Merge sort is defined as a sorting algorithm that works by dividing an
array into smaller subarrays, sorting each subarray, and then merging
the sorted subarrays back together to form the final sorted array.
MERGE SORT ALGORITHM
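A minimal C sketch of merge sort as described above (the function names merge and mergeSort are illustrative assumptions):

#include <stdlib.h>
#include <string.h>

// Merge two sorted halves a[l..m] and a[m+1..r] using a temporary buffer.
static void merge(int a[], int l, int m, int r)
{
    int n = r - l + 1;
    int *tmp = malloc(n * sizeof(int));
    int i = l, j = m + 1, k = 0;

    while (i <= m && j <= r)                 // pick the smaller head element
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= m) tmp[k++] = a[i++];        // copy any leftovers from the left half
    while (j <= r) tmp[k++] = a[j++];        // copy any leftovers from the right half

    memcpy(a + l, tmp, n * sizeof(int));     // copy the merged run back
    free(tmp);
}

// Recursively split, sort, and merge.
void mergeSort(int a[], int l, int r)
{
    if (l >= r)
        return;
    int m = l + (r - l) / 2;
    mergeSort(a, l, m);
    mergeSort(a, m + 1, r);
    merge(a, l, m, r);
}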
TIME COMPLEXITY OF MERGE SORT

Case            Time Complexity
Best Case       O(n log n)
Average Case    O(n log n)
Worst Case      O(n log n)
QUICK SORT ALGORITHM
QuickSort is a sorting algorithm based on the Divide and
Conquer algorithm that picks an element as a pivot and
partitions the given array around the picked pivot by
placing the pivot in its correct position in the sorted array.
QUICK SORT ALGORITHM
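A minimal C sketch of quicksort using the Lomuto partition scheme (an assumed choice; the slides do not specify which partition scheme is used):

// Place the last element (pivot) in its final position; smaller elements go left.
static int partition(int a[], int low, int high)
{
    int pivot = a[high];
    int i = low - 1;
    for (int j = low; j < high; j++) {
        if (a[j] < pivot) {                  // grow the "less than pivot" region
            i++;
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
    }
    int t = a[i + 1]; a[i + 1] = a[high]; a[high] = t;   // put the pivot in place
    return i + 1;
}

void quickSort(int a[], int low, int high)
{
    if (low < high) {
        int p = partition(a, low, high);     // pivot index after partitioning
        quickSort(a, low, p - 1);            // sort elements left of the pivot
        quickSort(a, p + 1, high);           // sort elements right of the pivot
    }
}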
TIME COMPLEXITY OF QUICK SORT
• Best Case: Ω(N log N)
• Average Case: Θ(N log N)
• Worst Case: O(N²)
Heap Sort
• A Heap is a special Tree-based data structure in which the tree is a
complete binary tree.

• Because a complete binary tree has height O(log n), the basic heap
operations such as insertion and deletion take O(log n) time.


Heap Sort
Heap Tree Operations

• Heapify: the process of creating a heap from an array.

• Insertion: the process of inserting an element into an existing heap;
time complexity O(log N).
• Deletion: removing the top element of the heap (the highest-priority
element), then reorganizing the heap and returning the removed element;
time complexity O(log N).
Heap Sort
• The worst-case time complexity of building a heap depends on the
approach used. Bottom-up heapify (building the heap from the whole array
at once) is O(n) in the worst case, where n is the size of the heap.
• Top-down heapify (inserting the elements one by one) is O(n log n) in
the worst case, where n is the size of the heap.
Heap Sort
BUILD HEAP
44, 33, 77, 11, 55, 88, 66
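A minimal C sketch of heap sort with a bottom-up build of a max-heap, applied to the example array above (the function names are illustrative assumptions):

#include <stdio.h>

// Sift the element at index i down until the max-heap property holds
// for the subtree rooted at i (heap of size n, 0-indexed array).
static void heapify(int a[], int n, int i)
{
    int largest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;

    if (left < n && a[left] > a[largest])
        largest = left;
    if (right < n && a[right] > a[largest])
        largest = right;

    if (largest != i) {
        int t = a[i]; a[i] = a[largest]; a[largest] = t;
        heapify(a, n, largest);              // continue sifting down
    }
}

void heapSort(int a[], int n)
{
    // Bottom-up build: heapify every internal node, starting from the last one.
    for (int i = n / 2 - 1; i >= 0; i--)
        heapify(a, n, i);

    // Repeatedly move the maximum to the end and shrink the heap.
    for (int i = n - 1; i > 0; i--) {
        int t = a[0]; a[0] = a[i]; a[i] = t;
        heapify(a, i, 0);
    }
}

int main(void)
{
    int a[] = {44, 33, 77, 11, 55, 88, 66};  // example array from the slides
    int n = sizeof(a) / sizeof(a[0]);
    heapSort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);                 // prints 11 33 44 55 66 77 88
    printf("\n");
    return 0;
}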
Shell Sort
• Shell sort is mainly a variation of Insertion Sort. In insertion sort, we
move elements only one position ahead.
• When an element has to be moved far ahead, many movements are
involved. The idea of ShellSort is to allow the exchange of far items.
• In Shell sort, we make the array h-sorted for a large value of h. We
keep reducing the value of h until it becomes 1.
• An array is said to be h-sorted if all sublists of every h’th element are
sorted.
Shell Sort
void shellSort(int a[], int n)    // 'a' is the given array, 'n' is the size of the array
{
    // Start with a large gap, then halve it each pass until it reaches 1.
    for (int interval = n / 2; interval > 0; interval /= 2) {
        // Gapped insertion sort for this interval.
        for (int i = interval; i < n; i++) {
            int temp = a[i];
            int j;
            for (j = i; j >= interval && a[j - interval] > temp; j -= interval)
                a[j] = a[j - interval];
            a[j] = temp;
        }
    }
}
Shell Sort Time Complexity
Case            Time Complexity
Best Case       O(n log n)
Average Case    O(n (log n)²)
Worst Case      O(n²)

• Space Complexity: O(1)


COUNTING SORT
• Counting Sort is a non-comparison-based sorting algorithm that works
well when there is a limited range of input values.
• It is particularly efficient when the range of input values is small
compared to the number of elements to be sorted.
• The basic idea behind Counting Sort is to count the frequency of each
distinct element in the input array and use that information to place
the elements in their correct sorted positions.
COUNTING SORT
(The source slides show a worked example: the given array and its count array.)
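A minimal C sketch of counting sort for non-negative integers (maxVal is the largest value in the input; the function name is an illustrative assumption):

#include <stdlib.h>
#include <string.h>

// Sort an array of non-negative integers whose values lie in [0, maxVal].
void countingSort(int a[], int n, int maxVal)
{
    int *count = calloc(maxVal + 1, sizeof(int));
    int *output = malloc(n * sizeof(int));

    // 1. Count the frequency of each distinct value.
    for (int i = 0; i < n; i++)
        count[a[i]]++;

    // 2. Prefix sums: count[v] becomes the number of elements <= v.
    for (int v = 1; v <= maxVal; v++)
        count[v] += count[v - 1];

    // 3. Place elements into their sorted positions (scan from the right to keep it stable).
    for (int i = n - 1; i >= 0; i--)
        output[--count[a[i]]] = a[i];

    memcpy(a, output, n * sizeof(int));
    free(count);
    free(output);
}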
COUNTING SORT Time Complexity
• Best Case O(n + k)
• Average Case O(n + k)
• Worst Case O(n + k)

• Space Complexity: O(max), where max is the largest value in the input (the size of the count array)


RADIX SORT
• Radix Sort is a linear sorting algorithm that sorts elements by
processing them digit by digit. It is an efficient sorting algorithm for
integers or strings with fixed-size keys.
RADIX SORT
Algorithm: RadixSort(a[], n):

// Find the maximum element of the list
1. max = a[0]
2. for (i = 1 to n-1):
3.     if (a[i] > max):
4.         max = a[i]

// Apply counting sort to each digit position of the numbers in the input list
5. for (pos = 1; max / pos > 0; ):
6.     countSort(a, n, pos)
7.     pos = pos * 10
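A minimal C sketch of the same procedure, with the per-digit counting sort written out (countSortByDigit and radixSort are illustrative names; non-negative integers are assumed):

#include <stdlib.h>
#include <string.h>

// Stable counting sort on the decimal digit selected by 'pos' (1, 10, 100, ...).
static void countSortByDigit(int a[], int n, int pos)
{
    int count[10] = {0};
    int *output = malloc(n * sizeof(int));

    for (int i = 0; i < n; i++)              // count occurrences of each digit
        count[(a[i] / pos) % 10]++;
    for (int d = 1; d < 10; d++)             // prefix sums give the final positions
        count[d] += count[d - 1];
    for (int i = n - 1; i >= 0; i--)         // scan from the right to stay stable
        output[--count[(a[i] / pos) % 10]] = a[i];

    memcpy(a, output, n * sizeof(int));
    free(output);
}

// Radix sort for non-negative integers, least significant digit first.
void radixSort(int a[], int n)
{
    int max = a[0];
    for (int i = 1; i < n; i++)              // find the maximum element
        if (a[i] > max)
            max = a[i];

    for (int pos = 1; max / pos > 0; pos *= 10)
        countSortByDigit(a, n, pos);
}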
RADIX SORT Time Complexity
• For the radix sort implementation that uses counting sort as an
intermediate stable sort, the time complexity for the worst-, best-, and
average-case scenarios is O(d*(n+b)).
Here,
• d is the number of digits in the maximum number
• O(n+b) is the time complexity of counting sort, where b is the base of
the number system used.
Radix Sort Space Complexity
• Space complexity of radix sort is O(n+b) because we use a couple of
additional arrays — count array of size b and sorting array of size n.
BUCKET SORT
• Bucket sort, also known as bin sort, is a sorting algorithm that divides
an array's elements into several buckets. The buckets are then sorted
one at a time, either using a different sorting algorithm or by
recursively applying the bucket sorting algorithm.
BUCKET SORT
Input array: [0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21, 0.12, 0.23, 0.68]
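A minimal C sketch of bucket sort for this example input, assuming values in [0, 1); the fixed-size buckets and the helper names are simplifying assumptions for the sketch:

#include <stdio.h>

#define N 10
#define NBUCKETS 10

// Bucket sort for values in [0, 1): distribute into NBUCKETS buckets,
// sort each bucket with insertion sort, then concatenate.
void bucketSort(float a[], int n)
{
    float buckets[NBUCKETS][N];              // fixed-size buckets, enough for this example
    int counts[NBUCKETS] = {0};

    // 1. Scatter: element x goes to bucket floor(NBUCKETS * x).
    for (int i = 0; i < n; i++) {
        int b = (int)(NBUCKETS * a[i]);
        buckets[b][counts[b]++] = a[i];
    }

    // 2. Sort each bucket with insertion sort.
    for (int b = 0; b < NBUCKETS; b++) {
        for (int i = 1; i < counts[b]; i++) {
            float key = buckets[b][i];
            int j = i - 1;
            while (j >= 0 && buckets[b][j] > key) {
                buckets[b][j + 1] = buckets[b][j];
                j--;
            }
            buckets[b][j + 1] = key;
        }
    }

    // 3. Gather: concatenate the buckets back into the input array.
    int k = 0;
    for (int b = 0; b < NBUCKETS; b++)
        for (int i = 0; i < counts[b]; i++)
            a[k++] = buckets[b][i];
}

int main(void)
{
    float a[N] = {0.78f, 0.17f, 0.39f, 0.26f, 0.72f, 0.94f, 0.21f, 0.12f, 0.23f, 0.68f};
    bucketSort(a, N);
    for (int i = 0; i < N; i++)
        printf("%.2f ", a[i]);               // prints the values in ascending order
    printf("\n");
    return 0;
}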
BUCKET SORT TIME COMPLEXITY
Time Complexity:
• Best Case O(n + k)
• Average Case O(n + k)
• Worst Case O(n²)

Auxiliary Space: O(n+k)
