Data Structures

Quicksort is a divide-and-conquer algorithm that works by selecting a pivot element and partitioning the array into two subarrays of elements less than and greater than the pivot; it has an average time complexity of O(n log n) but a worst case of O(n^2). Merge sort also uses divide and conquer, recursively sorting subarrays until single elements remain and then merging the sorted subarrays; it has a predictable O(n log n) time complexity and is stable. Heap sort builds a max heap from the array and repeatedly extracts the maximum element; its time complexity is O(n log n) in all cases, and it is in-place but not stable.


Kivron Shem Uy BSIT 2 - A

FINAL EXAMINATION

Quick Sort
- QuickSort is a well-known sorting algorithm used in computer science and information
technology to arrange the items of a data structure, usually an array or list, in a certain
order. It is an efficient, comparison-based divide-and-conquer sorting algorithm.

How Quick Sort works:


1. Divide: Choose a "pivot" element from the array. Partition the other elements into two
sub-arrays: elements less than the pivot and elements greater than the pivot.
2. Conquer: Recursively sort the sub-arrays.
3. Combine: Combine the sorted sub-arrays and the pivot back into a single sorted
array.
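
A minimal Python sketch of these three steps (illustrative only, not part of the examination answer; it assumes the last element is chosen as the pivot, though random or median-of-three pivots are also common):

def partition(arr, low, high):
    pivot = arr[high]                     # assumed pivot: the last element
    i = low - 1                           # boundary of the "less than pivot" region
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]   # place the pivot in its final position
    return i + 1

def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)     # divide around the pivot
        quick_sort(arr, low, p - 1)       # conquer: sort the left sub-array
        quick_sort(arr, p + 1, high)      # conquer: sort the right sub-array

# Example: quick_sort(data) sorts data in place, e.g. [9, 3, 7, 1] becomes [1, 3, 7, 9].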

Advantages:
 Efficiency
- Among the quickest sorting algorithms for big datasets, QuickSort has an average time
complexity of O(n log n).
- In practice, it often performs better than many other sorting algorithms, such as bubble
sort and insertion sort.
 In-Place Sorting
- QuickSort is an in-place sorting algorithm: apart from its recursion stack, it needs no
extra memory for auxiliary data structures. It improves memory efficiency by sorting the
array in place.
 Cache Efficiency
- QuickSort often exhibits good cache performance because of its sequential and
localized access patterns during the partitioning phase.

Disadvantages:
 Worst-Case Time Complexity
- In the worst-case scenario, where pivot selection continually results in imbalanced
partitions, QuickSort's time complexity is O(n^2). This can occur when the input array is
already sorted or almost sorted and a poor pivot, such as the first or last element, is chosen.
 Non-Stable Sort
- QuickSort is not a stable sorting algorithm: equal elements may not appear in the sorted
result in their original relative order.
 Pivot Sensitivity
- The pivot that is selected can have a large impact on QuickSort's efficiency. Poor pivot
choices can result in less than ideal performance, particularly when dealing with already
sorted or nearly sorted datasets.

Merge Sort
- Merge Sort is another popular and dependable sorting algorithm with a solid track
record. It uses the divide-and-conquer strategy to sort a list or array.

How Merge Sort works:


1. Divide: The unsorted array is divided into two halves.
2. Conquer: Each of the two halves is recursively sorted. This process continues until
the base case is reached, where a sub-array has zero or one element (as such arrays
are inherently sorted).
3. Merge: The sorted sub-arrays are then merged to produce a new sorted array. This
involves comparing elements from the two sub-arrays and merging them in ascending
order.
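
A minimal Python sketch of these steps (illustrative only; this version returns a new list, which reflects the extra memory discussed under the disadvantages below):

def merge_sort(arr):
    if len(arr) <= 1:                     # base case: zero or one element is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])          # recursively sort the left half
    right = merge_sort(arr[mid:])         # recursively sort the right half
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:           # <= keeps equal elements in order (stability)
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])               # append whatever remains of either half
    result.extend(right[j:])
    return result

# Example: merge_sort([5, 2, 4, 2, 1]) returns [1, 2, 2, 4, 5].
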
Advantages:
 Stable Sorting
- Merge Sort is a stable sorting algorithm: the sorted output retains the original relative
order of equal elements. This is crucial in situations where maintaining the original order
of equal items is required.
 Predictable Performance
- Merge Sort ensures that, even in the worst-case situation, the time complexity will
always be O(n log n). It is a dependable option for sorting big datasets because of its
predictability.
 Well-Suited for Linked Lists
- Merge Sort's divide-and-conquer strategy, which is effective with linked data structures,
makes it a good choice for linked lists.

Disadvantages:
 Additional Memory Requirement
- Merge Sort needs additional memory, typically an auxiliary array of size O(n), for the
merging process. This can be problematic in settings where memory is a scarce resource.
More intricate in-place Merge Sort variants exist.
 Slower on Small Datasets
- For small datasets, Merge Sort may be a little slower than other algorithms due to its
possible higher constant factors in time complexity. Simpler methods, such as insertion
sort, might be more effective for small arrays.
 Not Adaptive
- Merge Sort does not take the properties of the input data into account. It performs the
same amount of work regardless of the original order of the elements, which may be
important to consider in some situations.

Heap Sort
- Heap Sort is a comparison-based sorting method that uses a binary heap data structure:
it first builds a partially ordered binary tree (a max heap) from the array and then uses
that heap to sort the elements efficiently. Its O(n log n) time complexity in all cases
makes it well suited to large datasets.

How Heap Sort works:


1. Build a Max Heap:
- Convert the unsorted array into a max heap, where the value of each node is greater
than or equal to the values of its children.
- This involves starting from the last non-leaf node and repeatedly heapifying the
subtrees until the entire array is a valid max heap.
2. Extract the Maximum (Heapify Down):
- After building the max heap, repeatedly extract the maximum element from the heap
(which is the root of the heap).
- Swap the root with the last element in the heap and reduce the size of the heap by one.
Heapify down the root to maintain the max heap property.
3. Repeat:
- Repeat the extraction and heapify process until the heap is empty.
- The extracted elements, when placed in the array in reverse order, form a sorted
sequence.
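
A minimal Python sketch of these two phases (illustrative only, not part of the examination answer):

def heapify(arr, n, i):
    # Sift the element at index i down until the subtree rooted at i is a max heap.
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):   # build the max heap from the last non-leaf node
        heapify(arr, n, i)
    for end in range(n - 1, 0, -1):       # repeatedly move the current maximum to the end
        arr[0], arr[end] = arr[end], arr[0]
        heapify(arr, end, 0)              # restore the heap property on the reduced heap

# Example: heap_sort(data) sorts data in place, e.g. [4, 10, 3, 5, 1] becomes [1, 3, 4, 5, 10].
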
Advantages:
 Time Complexity
- Heap Sort is effective for handling big datasets since its time complexity is always O(n
log n). This makes it appropriate for situations requiring a consistent and dependable
sorting method.
 In-Place Sorting
- Heap Sort is an in-place sorting algorithm: it doesn't need extra memory for auxiliary
data structures. It improves memory efficiency by sorting the array in place.
 Not Sensitive to Initial Order
- In contrast to certain sorting algorithms (like bubble sort), Heap Sort is insensitive to
the initial order of the array's elements. Its performance is consistent irrespective of the
distribution of inputs.

Disadvantages:
 Not Stable
- Heap Sort is not a stable sorting algorithm. In the sorted output, equal elements might
not stay in their original relative order.
 Less Cache-Friendly
- Heap Sort's memory access patterns are less cache-friendly than those of several
other algorithms, such as Merge Sort. Performance may suffer a little as a result,
particularly on systems with little cache.
 Not Adaptive
- Heap Sort does not take the properties of the input data into account. It performs the
same amount of work regardless of the elements' original arrangement, so it cannot take
advantage of inputs that are already partially sorted.

Out of the 3, which will you use and why?


- I will use Heap Sort among the three because it is a dependable and easy-to-apply
sorting algorithm for a range of situations. Its consistent O(n log n) time complexity, in-
place sorting, and insensitivity to the initial order of the elements are among its salient
characteristics. Because of its deterministic behavior and its applicability to a wide range
of inputs, Heap Sort is a good option for situations where memory efficiency and
consistency of performance are important considerations.
