Sorting
Bubble Sort
Algorithm Steps:
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        # The last i elements are already in place after i passes
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # Early exit: no swaps means the list is already sorted
        if not swapped:
            break
    return arr

# Example Usage
arr = [64, 34, 25, 12, 22, 11, 90]
sorted_arr = bubble_sort(arr)
print("Sorted array is:", sorted_arr)
Explanation:
o The outer loop runs n times, where n is the length of the list.
o The inner loop runs n-i-1 times for each outer loop iteration.
o Elements are compared and swapped if needed.
o The swapped flag is used to optimize the algorithm by stopping early if no
swaps are made in an inner loop iteration.
Complexity Analysis
Time Complexity:
o Worst Case: O(n^2)
Occurs when the list is in reverse order.
Each element needs to be compared with every other element.
o Average Case: O(n^2)
Comparisons and swaps are performed on average, leading to quadratic
time complexity.
o Best Case: O(n)
Occurs when the list is already sorted.
Only one pass is needed to confirm the list is sorted.
Advantages:
o Simple to understand and implement.
o Stable sort: does not change the relative order of equal elements.
o In-place sort: requires only a small amount of additional memory.
Disadvantages:
o O(n^2) comparisons and swaps in the average and worst cases, making it impractical for large lists.
o Generally slower than insertion sort even on small inputs.
Insertion Sort
Algorithm Steps:
Explanation:
o Start from the second element (index 1) and assume the first element is sorted.
o For each element, compare it with elements in the sorted part.
o Shift elements in the sorted part to the right if they are greater than the current
element.
o Insert the current element into its correct position.
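The steps above can be sketched in Python (a minimal version, written in the same style as the bubble_sort example earlier; the function name insertion_sort is chosen here for illustration):

```python
def insertion_sort(arr):
    # Start from the second element; arr[:1] is trivially sorted
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements in the sorted part one slot to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        # Insert the current element into its correct position
        arr[j + 1] = key
    return arr

arr = [64, 34, 25, 12, 22, 11, 90]
print("Sorted array is:", insertion_sort(arr))
```

Because elements are only shifted while they are strictly greater than the key, equal elements keep their relative order, which is why insertion sort is stable.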
Complexity Analysis
Time Complexity:
o Worst Case: O(n^2)
Occurs when the list is in reverse order.
Each element needs to be compared with every other element in the
sorted part.
o Average Case: O(n^2)
Comparisons and shifts are performed on average, leading to quadratic
time complexity.
o Best Case: O(n)
Occurs when the list is already sorted.
Only one pass is needed to confirm the list is sorted, with no shifting
required.
Advantages:
o Simple and easy to implement.
o Efficient for small datasets and nearly sorted lists.
o Stable sort: does not change the relative order of equal elements.
o In-place sort: requires only a small amount of additional memory.
Disadvantages:
o O(n^2) comparisons and shifts in the average and worst cases.
o Inefficient for large datasets compared to O(n log n) sorts such as merge sort or heapsort.
Selection Sort
Algorithm Steps
1. Outer Loop: Iterate over the array from the first to the second-last element (index i
from 0 to n-2).
2. Find Minimum: For each i, find the minimum element in the subarray from A[i] to
A[n-1].
3. Swap: Swap the found minimum element with A[i].
Example
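The steps above can be worked through in Python (a minimal sketch, reusing the sample array from the bubble sort section; the function name selection_sort is chosen here for illustration):

```python
def selection_sort(arr):
    n = len(arr)
    # Outer loop: place the correct element at index i
    for i in range(n - 1):
        # Find the index of the minimum element in arr[i:]
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap the found minimum with arr[i]
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

arr = [64, 34, 25, 12, 22, 11, 90]
print("Sorted array is:", selection_sort(arr))
```

Note that at most one swap is performed per outer-loop iteration, so selection sort does at most n-1 swaps in total.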
Complexity Analysis
1. Time Complexity:
o Best Case: O(n^2)
o Average Case: O(n^2)
o Worst Case: O(n^2)
o The time complexity is dominated by the nested loops: the outer loop runs n−1
times, and the inner loop runs an average of n/2 times.
Advantages:
o Simple to understand and implement.
o Performs at most n-1 swaps, which is useful when writes are expensive.
o In-place sort: requires only a small amount of additional memory.
Disadvantages:
o O(n^2) comparisons even when the list is already sorted.
o Not stable in its basic form: the swaps can reorder equal elements.
Heapsort
1. Build a Max-Heap: Transform the array into a max-heap, a complete binary tree
where the value of each node is greater than or equal to the values of its children.
2. Extract Maximum Element: Swap the root of the max-heap with the last element of
the heap and reduce the heap size by one. Restore the max-heap property by
heapifying the root.
3. Repeat: Repeat the extraction process until the heap size is reduced to one.
Algorithm Steps
1. Build Max-Heap:
o Start from the last non-leaf node and heapify each node up to the root.
2. Heapsort:
o Swap the root (maximum value) with the last element of the heap.
o Reduce the heap size by one.
o Heapify the root to restore the max-heap property.
o Repeat the process for the remaining elements.
Pseudocode
function heapsort(A: array of n items)
    buildMaxHeap(A)
    heapSize = n
    for i = n-1 down to 1 do
        swap A[0] with A[i]
        heapSize = heapSize - 1
        maxHeapify(A, 0)
end function
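A runnable Python version of the pseudocode (a minimal sketch; max_heapify is written here as a helper that restores the max-heap property within the first heap_size elements):

```python
def max_heapify(arr, heap_size, i):
    # Sift arr[i] down until the subtree rooted at i is a max-heap
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < heap_size and arr[left] > arr[largest]:
        largest = left
    if right < heap_size and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        max_heapify(arr, heap_size, largest)

def heapsort(arr):
    n = len(arr)
    # Build max-heap: heapify from the last non-leaf node up to the root
    for i in range(n // 2 - 1, -1, -1):
        max_heapify(arr, n, i)
    # Repeatedly move the maximum (root) to the end and shrink the heap
    for i in range(n - 1, 0, -1):
        arr[0], arr[i] = arr[i], arr[0]
        max_heapify(arr, i, 0)
    return arr

arr = [64, 34, 25, 12, 22, 11, 90]
print("Sorted array is:", heapsort(arr))
```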
Complexity Analysis
1. Time Complexity:
o Building the Max-Heap: O(n)
Each node is heapified once; although a single heapify call can take
O(log n) time, a tighter level-by-level analysis shows the total cost is O(n).
o Heapsort Process: O(n log n)
Each of the n elements is extracted from the heap, and each extraction
involves O(log n) time for heapifying the root.
Merge Sort
The merge sort working rule involves the following steps:
1. Divide the unsorted array into subarrays, each containing a single element.
2. Take adjacent pairs of single-element subarrays and merge them to form sorted arrays of 2 elements.
3. Repeat the process until a single sorted array is obtained.
ALGORITHM MERGE-SORT (A, p, r)
    if p < r
        then q ← ⌊(p + r)/2⌋
             MERGE-SORT (A, p, q)
             MERGE-SORT (A, q+1, r)
             MERGE (A, p, q, r)
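The recursive scheme above can be sketched in Python (a minimal top-down version; merge is written here as a helper that combines two already sorted lists, standing in for the MERGE procedure):

```python
def merge(left, right):
    # Combine two already sorted lists into one sorted list
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One of the halves may still have leftover elements
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

def merge_sort(arr):
    # Base case: a list of 0 or 1 elements is already sorted
    if len(arr) <= 1:
        return arr
    q = len(arr) // 2
    # Sort each half recursively, then merge the results
    return merge(merge_sort(arr[:q]), merge_sort(arr[q:]))

arr = [64, 34, 25, 12, 22, 11, 90]
print("Sorted array is:", merge_sort(arr))
```

Using <= in the merge step keeps elements from the left half ahead of equal elements from the right half, which makes this merge sort stable.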
Time complexity can be calculated from the number of split operations and the number of merge operations:
O((n−1) + n·log₂ n) = O(n·log₂ n)
The (n−1) splitting operations can be dropped from the Big O expression above because n·log₂ n dominates for large n.