
DATA STRUCTURE &

ALGORITHM
(SORTING ALGORITHM)
GROUP 5 MEMBERS
1. Robiu Olalere 21/52HA104
2. Amodu Ayomide Matthew 21/52HA043
3. Amaogu Faith Ihedinma 21/52HA042
4. Awarun David Ayotomiwa 21/52HA047
5. Aliu Ibrahim Hammed 21/52HA041
6. Ayinla Fathiu Iyanuoluwa 21/52HA049
7. Ayeni Divine-gift Adewale 21/52HA048
8. Asuku Anavaami David 21/52HA046
9. Bolarinwa Mubarak Ajibola 21/25PJ018
10. Ayoola Israel Muyiwa 21/52HA050
11. Aribidesi Abodunrin Mariyam 21/25PJ016
12. Babalola Olawale Abdulqaadir 21/52HA051
13. Amoo Oluwafemi Abraham 21/52HA045
14. Artemas Hyelda Maina 21/25PJ017
SORTING ALGORITHM
Sorting is a technique for rearranging the elements of a list in ascending or descending order; the ordering can be numerical, lexicographical, or any user-defined order.
Why Study Sorting?
When an input is sorted, many problems become easy (e.g., searching, min, max, k-th smallest). Sorting also has a variety of interesting algorithmic solutions that embody many ideas:
 Iterative
 Recursive
 Divide and conquer
 Best/worst/average-case bounds
 Randomized algorithms
Application of Sorting
• Uniqueness testing
• Deleting duplicates
• Prioritizing events
• Frequency counting
• Reconstructing the original order
• Set intersection/union
CLASSIFICATION OF SORTING
 Internal Sorting: If all the data to be sorted can be held in main (primary) memory at once, the sort is internal. Internal sorts can only process relatively small lists because of this memory constraint. Examples: selection-based (Selection Sort and Heap Sort), insertion-based (Insertion Sort and Shell Sort), and exchange-based (Bubble Sort and Quick Sort).
 External Sorting: When the data is too large to fit in main memory, external memory (e.g., disk) is used to hold the portion that does not fit. All external sorts are based on merging: different parts of the data are sorted separately and then merged together. E.g., Merge Sort.
TERMINOLOGIES
 In-Place Sorting - A sorting algorithm is said to be in-place if it requires only a constant amount (i.e., O(1)) of extra space during the sorting process.
 Stable Sorting - A sorting algorithm is stable if the relative order of elements with the same key value is preserved by the algorithm.
 Non-Stable Sorting - A sorting algorithm is non-stable if the relative order of elements with the same key value is not preserved by the algorithm.
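A small illustration of stability (added here, not part of the original slides), using Python's built-in sorted(), which is a stable sort: records that compare equal on the sort key keep their original relative order.

# Hypothetical records: (name, key). Python's sorted() is stable (Timsort).
records = [("apple", 3), ("pear", 1), ("plum", 3), ("fig", 1)]

# Sort by the numeric key only; ties keep their original left-to-right order.
by_key = sorted(records, key=lambda r: r[1])
print(by_key)  # [('pear', 1), ('fig', 1), ('apple', 3), ('plum', 3)]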
BUBBLE SORT
 In the bubble sort method, the list is divided into two sublists: sorted and unsorted. The method is easy to understand but time-consuming. Two successive elements are compared and swapped if they are out of order; thus, step by step, the entire array is checked using two nested loops.
 Bubble sort is inefficient, with a time complexity of O(n^2).
 Worst case: input in descending order - O(n^2) time complexity.
 Best case: input already in ascending order - O(n) time complexity (when the optimized version that stops after a pass with no swaps is used).
ALGORITHM & EXAMPLES
1. Algorithm BubbleSort(list)
2. Pre: list ≠ ∅
3. Post: list has been sorted into values of ascending order
4. for i ← 0 to list.Count - 2
5. for j ← 0 to list.Count - 2 - i
6. if list[j] > list[j + 1] then
7. Swap(list[j], list[j + 1])
8. end if
9. end for
10. end for
11. return list
12. End BubbleSort
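The same procedure as a runnable Python sketch (added here, not from the original slides), including the swapped flag that gives the O(n) best case on already-sorted input.

def bubble_sort(items):
    """Sort items in place in ascending order using bubble sort."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in their final positions.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break        # this early exit gives the O(n) best case
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]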
MERGE SORT
 Merge sort is based on the divide and conquer paradigm that splits the data into two
halves, recursively sorts them and then merges them back together in sorted order.
 Merge sort has a time complexity of O(n log n), making it suitable for large datasets
 Best, average, and worst case: O(n log n)
 Merge sort is a stable sorting algorithm, meaning that equal elements maintain their
original order.
 Merge sort can be parallelized, making it a good choice for multi-core processors.
 Real world application in database sorting, file system sorting and data analysis
 Merge sort has a more complex implementation compared to other sorting algorithms.
 Merge sort requires additional memory to store the merged list.
ALGORITHM & EXAMPLES
// Split the list into list(0 to m-1) and list(m to list.Count-1),
// recursively sort each half, then merge the two sorted halves.
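The following is a minimal runnable Python sketch of that split-and-merge outline (added here; the original slide's worked example is not reproduced).

def merge_sort(items):
    """Return a new sorted list using divide-and-conquer merge sort."""
    if len(items) <= 1:              # a list of 0 or 1 elements is already sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])   # sort list(0 to m-1)
    right = merge_sort(items[mid:])  # sort list(m to list.Count-1)
    return merge(left, right)

def merge(left, right):
    """Merge two sorted lists into one sorted list (stable: ties keep the left element first)."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]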
SELECTION SORT
 Selection sort is a simple sorting algorithm that works by repeatedly finding the minimum
element from the unsorted part of the list and swapping it with the first unsorted element.
 It divides the list into two parts: the sorted portion and the unsorted portion. It iteratively
selects the smallest element from the unsorted portion and swaps it with the first element of
that portion.
 Selection Sort can disrupt the relative order of equal elements if swapping occurs.
 Selection Sort is simple but inefficient compared to algorithms like Quick Sort or Merge Sort,
especially for large datasets. However, it can be a good choice for small lists or situations
where memory usage is a constraint.
• Best Case: O(n^2)
• Average Case: O(n^2)
• Worst Case: O(n^2)
ALGORITHM & EXAMPLES
 1) algorithm SelectionSort(list)
 2) Pre: list ≠ ∅
 3) Post: list has been sorted into values of ascending order
 4) for i ← 0 to list.Count - 2
 5) minIndex ← i
 6) for j ← i + 1 to list.Count - 1
 7) if list[j] < list[minIndex]
 8) minIndex ← j
 9) end if
 10) end for
 11) if minIndex ≠ i
 12) Swap(list[i], list[minIndex])
 13) end if
 14) end for
 15) return list
 16) end SelectionSort
INSERTION SORT
 Insertion sort is a simple sorting algorithm that works by dividing the input into a sorted and an
unsorted region. Each subsequent element from the unsorted region is inserted into the sorted
region in its correct position. It can be thought of as being similar to sorting
a hand of playing cards.
 Insertion sort is one of the simplest sorting algorithms to understand and implement.
 Insertion sort is efficient for small arrays.
 Insertion sort is not suitable for real-time data, as it can be slow for large datasets.
 Best case: O(n)-if list is already sorted
 Worst case: O(n^2)-list is in reverse order
 Average case: O(n^2)- random order lists.
 It is a stable sorting Algorithm
ALGORITHM & EXAMPLES
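A minimal runnable Python sketch of the insertion procedure described above (added here; the original slide's example is not reproduced).

def insertion_sort(items):
    """Sort items in place, growing a sorted region on the left one element at a time."""
    for i in range(1, len(items)):
        key = items[i]              # next element taken from the unsorted region
        j = i - 1
        # Shift larger elements of the sorted region one slot to the right.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key          # insert the element into its correct position
    return items

print(insertion_sort([12, 11, 13, 5, 6]))  # [5, 6, 11, 12, 13]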
SHELL SORT
 Shell sort is a comparison-based sorting algorithm that generalizes the insertion sort algorithm
by allowing the comparison and exchange of far-apart elements. It can perform some
optimizations that insertion sort cannot.
 The Shell Sort algorithm is an optimization of the Insertion Sort algorithm that works by
comparing and swapping elements over larger intervals (gaps), progressively reducing the gap
size until it becomes 1, effectively performing a final insertion sort. This gap-based approach
improves performance by reducing the number of comparisons and swaps needed for distant
elements.
• Best Case: O(nlogn) — Achieved with optimal gap sequences.
• Average Case: Between O(n^1.25) and O(n^1.5), depending on the gap sequence.
• Worst Case: O(n^2) — For poorly chosen gaps (e.g., always halving the gap).
 Shell Sort is not stable, as swapping elements across gaps can change their relative order.
ALGORITHM & EXAMPLES
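A minimal runnable Python sketch (added here), assuming the simple gap-halving sequence; as noted above, better gap sequences give better bounds.

def shell_sort(items):
    """Sort items in place using gapped insertion sort with a halving gap sequence."""
    n = len(items)
    gap = n // 2
    while gap > 0:
        # Perform a gapped insertion sort for this gap size.
        for i in range(gap, n):
            key = items[i]
            j = i
            while j >= gap and items[j - gap] > key:
                items[j] = items[j - gap]
                j -= gap
            items[j] = key
        gap //= 2                   # reduce the gap until it becomes 1 (final insertion sort)
    return items

print(shell_sort([23, 12, 1, 8, 34, 54, 2, 3]))  # [1, 2, 3, 8, 12, 23, 34, 54]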
RADIX SORT
 Radix sort is a non-comparative sorting technique that sorts elements by processing individual digits or characters, starting from the least significant digit and moving to the most significant.
 Radix sort is a non-comparative integer sorting algorithm that sorts data with integer keys by grouping
keys by the individual digits (or by radix) which share the same significant position and value.
 Radix sort has a time complexity of O(nk), where n is the number of elements and k is the number of
digits, for worst, best and average cases.
 Radix sort is a stable sorting algorithm.
 Radix sort is only suitable for sorting integers or strings with a fixed length, Radix sort requires extra
memory to store the buckets.
 Real-World Applications
 Database sorting: Radix sort can be used to sort large datasets in databases.
 File system sorting: Radix sort can be used to sort files and directories in file systems.
 Data analysis: Radix sort can be used to sort and process large datasets in data analysis.
ALGORITHM & EXAMPLES
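A minimal runnable Python sketch of least-significant-digit radix sort for non-negative integers, using ten buckets per digit (added here, not from the original slides).

def radix_sort(items):
    """Sort non-negative integers by processing digits from least to most significant."""
    if not items:
        return items
    exp = 1
    while max(items) // exp > 0:
        # Stable bucket pass on the digit at position exp (extra memory for the buckets).
        buckets = [[] for _ in range(10)]
        for value in items:
            buckets[(value // exp) % 10].append(value)
        items = [value for bucket in buckets for value in bucket]
        exp *= 10
    return items

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))  # [2, 24, 45, 66, 75, 90, 170, 802]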
QUICK SORT
 This is another divide-and-conquer paradigm, based on partitioning, and is also called partition-exchange sorting. Its basic concept is to pick one element of the array as the pivot and rearrange the remaining elements around it. In practice, quick sort is considered the most efficient sorting algorithm for arrays of data stored in local memory.
 Quick sort is generally faster than other sorting algorithms like merge sort and heap sort.
 Quick sort can be implemented as an in-place sorting algorithm, meaning it doesn't require
additional storage.
 Best case: O(nlogn) - when the pivot divides the list into roughly equal halves.
 Average case: O(nlogn) - the typical case for randomized or well-distributed data.
 Worst case: O(n^2) - when the pivot consistently partitions poorly (e.g., it is always the smallest or largest element).
PICKING A PIVOT
 First element: Choose the first element of the list as the pivot.
 Last element: Choose the last element of the list as the pivot.
 Median of three: Choose the median of the first, middle, and last elements as the pivot.
 Random pivot: Choose a random element from the list as the pivot.
ALGORITHM & EXAMPLES
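A minimal runnable Python sketch (added here), using Lomuto partitioning with the last element as the pivot, which is one of the pivot choices listed above.

def quick_sort(items, low=0, high=None):
    """Sort items in place by partitioning around a pivot and recursing on each side."""
    if high is None:
        high = len(items) - 1
    if low < high:
        p = partition(items, low, high)
        quick_sort(items, low, p - 1)   # sort elements before the pivot
        quick_sort(items, p + 1, high)  # sort elements after the pivot
    return items

def partition(items, low, high):
    """Place the pivot in its final position; smaller elements end up on its left."""
    pivot = items[high]                 # last-element pivot
    i = low - 1
    for j in range(low, high):
        if items[j] <= pivot:
            i += 1
            items[i], items[j] = items[j], items[i]
    items[i + 1], items[high] = items[high], items[i + 1]
    return i + 1

print(quick_sort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]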
HEAP SORT
 Heap sort is a comparison-based algorithm that utilizes a binary heap data structure, it first builds a heap
from the input list, then repeatedly extracts the maximum or minimum element from the heap and places it
into the sorted portion of the list. The heap is reconstructed after each extraction.
 Heapify is the process of maintaining the heap property after the insertion or deletion of an element; it involves comparing the element with its children and swapping if necessary.
 Heap Sort is efficient for large datasets with guaranteed O(nlogn) performance, but it is not stable and
involves more complexity in implementation compared to simpler algorithms like Bubble Sort or Insertion
Sort.
• Best Case: O(nlogn)
• Average Case: O(nlogn)
• Worst Case: O(nlogn)
• A max-heap is a binary tree where the parent node is greater than or equal to its children.
ALGORITHM & EXAMPLES
1) algorithm HeapSort(list)
2) Pre: list ≠ ∅
3) Post: list has been sorted into values of ascending order
4) BuildMaxHeap(list)
5) for i ← list.Count - 1 to 1 step -1
6) Swap(list[0], list[i]) // Swap the root (max) with the last element
7) Heapify(list, 0, i) // Rebuild the heap for the reduced heap
8) end for
9) return list
10) end HeapSort
algorithm BuildMaxHeap(list)
1) for i ← list.Count / 2 - 1 down to 0
2) Heapify(list, i, list.Count)
3) end for
4) end BuildMaxHeap
algorithm Heapify(list, i, heapSize)
1) largest ← i
2) left ← 2 * i + 1
3) right ← 2 * i + 2
4) if left < heapSize and list[left] > list[largest]
5) largest ← left
6) end if
7) if right < heapSize and list[right] > list[largest]
8) largest ← right
9) end if
10) if largest ≠ i
11) Swap(list[i], list[largest])
12) Heapify(list, largest, heapSize) // Recursively ensure the heap property
13) end if
14) end Heapify
COMPARISON WITH OTHER SORTING ALGORITHM
REAL LIFE USE CASES BY INDUSTRIES
SUMMARY
 Sorting algorithms organize data into a specific order, with various methods suited to different
needs:
 Bubble Sort: Simple but slow; great for small datasets.
 Insertion Sort: Efficient for small or nearly sorted data.
 Selection Sort: Memory-efficient but slow; ideal for constrained environments.
 Merge Sort: Fast and stable, perfect for large datasets and external sorting.
 Quick Sort: Highly efficient in-memory sort; widely used but not stable.
 Shell Sort: A faster variation of Insertion Sort for medium-sized data.
 Heap Sort: Memory-efficient and reliable for large datasets.
 Radix Sort: Non-comparative, ideal for numbers or fixed-length data.
 Applications: E-commerce sorting (Quick Sort), task scheduling (Heap Sort), database
indexing (Merge Sort), and more. Choose based on dataset size, stability needs, and memory
constraints.
THANK YOU
ARIGATO GOZAIMASU