
Group 4 IT

Sorting Algorithms

Terminology
What is Sorting?

Sorting refers to the rearrangement of a given array or list of elements according to a comparison operator on the elements. The comparison operator is used to decide the new order of the elements in the respective data structure. In short, sorting means reordering all the elements in either ascending or descending order.
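
As a quick illustration in Python (using the built-in sorted function rather than any particular algorithm from this presentation):

values = [5, 3, 4, 2]

ascending = sorted(values)                  # [2, 3, 4, 5]
descending = sorted(values, reverse=True)   # [5, 4, 3, 2]

print(ascending, descending)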

Sorting Terminology

• In-place Sorting: An in-place sorting algorithm uses only a constant amount of extra space to produce the output; it modifies the given array itself rather than copying the elements to temporary storage. Examples: Selection Sort, Bubble Sort, Insertion Sort and Heap Sort.

• Internal Sorting: Internal sorting is used when all the data fits in main (internal) memory. In internal sorting, the input cannot be larger than the available memory. Examples: Heap Sort, Bubble Sort, Selection Sort, Quick Sort, Shell Sort, Insertion Sort.

• External Sorting: When all the data that needs to be sorted cannot be placed in memory at one time, the sorting is called external sorting. External sorting is used for massive amounts of data. Examples: Merge Sort, Tag Sort, Polyphase Sort, Four-tape Sort, External Radix Sort, etc.

• Stable Sorting: A sort is stable when items with equal keys appear in the sorted output in the same order as in the original array (see the sketch after this list). Examples: Merge Sort, Insertion Sort, Bubble Sort.
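
A minimal sketch of stability in Python, using the built-in sorted function (itself a stable sort); the records below are made up purely for illustration:

# Records of (name, score); Alice and Carol have equal scores.
records = [("Alice", 90), ("Bob", 85), ("Carol", 90), ("Dave", 85)]

# Sort by score only. A stable sort keeps Alice before Carol and
# Bob before Dave, because that was their order in the input.
by_score = sorted(records, key=lambda record: record[1])

print(by_score)
# [('Bob', 85), ('Dave', 85), ('Alice', 90), ('Carol', 90)]
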
CHARACTERISTICS OF SORTING ALGORITHMS

• Time Complexity: Time complexity, a measure of how long it takes to run an algorithm, is used to categorize sorting algorithms. The worst-case, average-case, and best-case performance of a sorting algorithm can be used to quantify its time complexity.

• Auxiliary Space: This is the amount of extra space (apart from the input array) needed to sort. For example, Merge Sort requires O(n) auxiliary space while Insertion Sort requires only O(1).

• Stability: A sorting algorithm is said to be stable if the relative order of equal elements is preserved after sorting. This is important in applications where the original order of equal elements must be maintained.

• In-Place Sorting: An in-place sorting algorithm is one that does not require additional memory to sort the data. This is important when the available memory is limited or when the data cannot be moved.

• Adaptivity: An adaptive sorting algorithm takes advantage of pre-existing order in the data to improve performance. For example, Insertion Sort takes time proportional to the number of inversions in the input array (see the sketch after this list).
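
A rough sketch of adaptivity, using an insertion sort instrumented to count element shifts (the shift count equals the number of inversions in the input):

def insertion_sort_with_shifts(arr):
    # Sorts arr in place and returns the number of element shifts,
    # which equals the number of inversions in the input.
    shifts = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]   # shift the larger element one step right
            shifts += 1
            j -= 1
        arr[j + 1] = key
    return shifts

print(insertion_sort_with_shifts([1, 2, 3, 4, 5]))   # 0  (already sorted: linear work)
print(insertion_sort_with_shifts([5, 4, 3, 2, 1]))   # 10 (fully reversed: maximal work)
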
Applications of sorting algorithms

• Searching Algorithms: Sorting is often a crucial preparatory step for search algorithms such as Binary Search and Ternary Search (see the sketch after this list). Many greedy algorithms also use sorting as a first step, for example Activity Selection, Fractional Knapsack and Weighted Job Scheduling.

• Data Management: Sorted data is easier to search, retrieve, and analyze. For example, the ORDER BY operation in SQL queries requires sorting.

• Database Optimization: Sorting data in databases improves query performance. We preprocess the data by sorting so that efficient searching can be applied.

• Machine Learning: Sorting is used to prepare data for training machine learning models.

• Data Analysis: Sorting helps in identifying patterns, trends, and outliers in datasets. It plays a vital role in statistical analysis, financial modeling, and other data-driven fields.

• Operating Systems: Sorting algorithms are used in operating systems for tasks such as scheduling and memory management.
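
A minimal sketch of the searching use case in Python, using the standard bisect module for binary search on a sorted copy of the data:

import bisect

data = [42, 7, 19, 3, 25]
data.sort()                       # binary search requires sorted input: [3, 7, 19, 25, 42]

def contains(sorted_list, target):
    # Binary search via bisect: O(log n) lookups after the O(n log n) sort.
    i = bisect.bisect_left(sorted_list, target)
    return i < len(sorted_list) and sorted_list[i] == target

print(contains(data, 19))   # True
print(contains(data, 20))   # False
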
Bubble Sort

Overview

• Bubble sort is a simple comparison-based sorting algorithm. It works by repeatedly stepping through the list to be sorted, comparing each pair of adjacent items and swapping them if they are in the wrong order. This process is repeated until the list is sorted.

Algorithm

• Start at the beginning of the list.
• Compare the first two elements. If the first is greater than the second, swap them.
• Move to the next pair of elements and repeat step 2.
• Continue doing this until you reach the end of the list. This is called a pass.
• Repeat the process for n-1 passes (where n is the length of the list) or until no swaps are needed in a pass.
Working of Bubble Sort

Consider that we want to sort a list in ascending order; here are the steps that the algorithm would follow:

1.Start with the first element.

2.Compare the current element with the next element.

3.If the current element is greater than the next element, then swap both the
elements. If not, move to the next element.

4.Repeat steps 1 – 3 until we get the sorted list.

To better understand bubble sort, consider a list that initially contains the elements 5, 3, 4, 2.
Example
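
A short runnable sketch that reproduces this example on the list 5, 3, 4, 2, printing the list after each pass:

def bubble_sort_trace(arr):
    # Bubble sort that prints the list after every pass.
    n = len(arr)
    for i in range(n - 1):
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
        print("after pass", i + 1, ":", arr)

bubble_sort_trace([5, 3, 4, 2])
# after pass 1 : [3, 4, 2, 5]
# after pass 2 : [3, 2, 4, 5]
# after pass 3 : [2, 3, 4, 5]
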
Pseudocode
function bubbleSort(arr):
    n = length(arr)
    for i = 0 to n-1:
        for j = 0 to n-i-2:        # the last i elements are already in their final positions
            if arr[j] > arr[j+1]:
                swap(arr[j], arr[j+1])
Time Complexity
• The time complexity of bubble sort in the best-case scenario is O(n) when the list is already sorted (this requires the optimized version with a swapped flag, shown later). In the average and worst-case scenarios it is O(n²) due to the nested loops.

Space Complexity
• Bubble sort is an in-place sorting algorithm, so the space complexity is
O(1).

Stability
• Bubble sort is a stable sorting algorithm since it does not change the
relative order of equal elements.
Optimized Bubble Sort Algorithm

Imagine the case where the list is already sorted. For example, our input list contains 2, 3, 4, 5 instead of 5, 3,
4, 2.
In the above algorithm, the loops would still run and compare all the elements, causing unnecessarily long execution times.

To tackle this, we can do the following:

> Create a flag variable, called swapped.

> The value of swapped is set to true if, during any pass, a swap was done.

> Otherwise, the value of swapped remains false.

> After a pass, if the value of swapped is found to be false, it means the array is sorted and no more comparisons are required.

> We then stop the execution.

This helps reduce the number of comparisons, and hence the execution time of the algorithm.
Pseudocode

begin bubbleSort(array)
    size = length of array
    for i = 0 to size-1:
        swapped = false
        for j = 0 to size-i-2:
            /* compare the adjacent elements */
            if array[j] > array[j+1] then
                /* swap them */
                swap(array[j], array[j+1])
                swapped = true
            end if
        end for
        /* if no elements were swapped, the array
           is already sorted, so break the loop */
        if (not swapped) then
            break
        end if
    end for
end bubbleSort
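
A runnable Python sketch of the optimized version above:

def bubble_sort_optimized(arr):
    # Bubble sort that stops early once a pass makes no swaps.
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:           # no swaps in this pass: already sorted
            break
    return arr

print(bubble_sort_optimized([2, 3, 4, 5]))  # already sorted: stops after one pass
print(bubble_sort_optimized([5, 3, 4, 2]))  # [2, 3, 4, 5]
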
MERGE SORT

Overview

Merge sort is a divide-and-conquer algorithm that divides the array or list into halves until each sub-list contains a single element, and then merges those sub-lists in a manner that results in a sorted list or array.

Algorithm

1. Divide the unsorted list or array into n sub-lists, each containing one element (a list of one element is considered sorted).
2. Repeatedly merge sub-lists to produce new sorted sub-lists until there is only one sub-list remaining. This will be the sorted list.
Example
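
A short runnable sketch of the divide-and-conquer process, printing each split and merge; the input values 38, 27, 43, 3 are arbitrary, and this variant returns a new list rather than sorting in place:

def merge_sort_verbose(arr, depth=0):
    # Merge sort that prints each split and merge step.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    print("  " * depth, "split:", arr[:mid], arr[mid:])
    left = merge_sort_verbose(arr[:mid], depth + 1)
    right = merge_sort_verbose(arr[mid:], depth + 1)
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= preserves stability
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    print("  " * depth, "merge:", left, "+", right, "->", merged)
    return merged

print(merge_sort_verbose([38, 27, 43, 3]))   # [3, 27, 38, 43]
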
Pseudocode
function mergeSort(arr):
    if length(arr) > 1:
        mid = length(arr) // 2
        leftHalf = arr[:mid]
        rightHalf = arr[mid:]

        mergeSort(leftHalf)
        mergeSort(rightHalf)

        # merge the two sorted halves back into arr
        i = j = k = 0
        while i < length(leftHalf) and j < length(rightHalf):
            if leftHalf[i] <= rightHalf[j]:   # <= keeps the sort stable
                arr[k] = leftHalf[i]
                i += 1
            else:
                arr[k] = rightHalf[j]
                j += 1
            k += 1

        # copy any remaining elements of either half
        while i < length(leftHalf):
            arr[k] = leftHalf[i]
            i += 1
            k += 1

        while j < length(rightHalf):
            arr[k] = rightHalf[j]
            j += 1
            k += 1
Time Complexity

• The time complexity of merge sort is O(n log n) in all cases (best,
average, and worst) since it divides the list into two halves and
merges them.

Space Complexity

• Merge sort requires O(n) additional space, as it needs temporary arrays to store the sub-lists.

Stability

• Merge sort is a stable sorting algorithm, as it maintains the relative order of equal elements.
Thank You!
