CHAPTER 4 - ALGORITHMS IN C -SORTING

Chapter 4 discusses various sorting algorithms used in C/C++, explaining their mechanisms and complexities. Key algorithms include Bubble Sort, Selection Sort, Insertion Sort, Merge Sort, and Quick Sort, each with distinct advantages and time complexities. The chapter emphasizes the importance of choosing the right sorting algorithm based on data type and problem size.


CHAPTER 4: ALGORITHMS IN C/C++

DATA STRUCTURE and ALGORITHMS

M.E. LE THANH TUNG


4.3 SORTING ALGORITHMS:

⚬ Sorting?
⚬ Sorting is a classic problem of reordering the items (that can be
compared, e.g., integers, floating-point numbers, strings, etc.) of an
array (or a list) into a certain order, such as increasing or
non-decreasing (increasing or flat).
⚬ Sorting algorithms are a set of techniques used to rearrange
the elements of a list or array into a particular order.
⚬ There are numerous sorting algorithms, each with its own
advantages and disadvantages in terms of efficiency, ease of
implementation, and suitability for different types of data and
problem sizes.
4.3 SORTING ALGORITHMS:

⚬ Sorting algorithms:
⚬ Bubble Sort (Time complexity: O(n^2)): This brute-force
algorithm repeatedly steps through the list, compares adjacent
elements, and swaps them if they are in the wrong order.
⚬ Selection Sort (Time complexity: O(n^2)): Another brute-force
algorithm: it divides the list into two parts: the sub-list of items
already sorted and the sub-list of items remaining to be sorted.
⚬ Insertion Sort (Time complexity: O(n^2)): Insertion sort builds
the sorted list one element at a time by repeatedly taking the next
element from the unsorted part of the array and inserting it into its
correct position in the sorted part.
4.3 SORTING ALGORITHMS:

⚬ Merge Sort (Time complexity: O(nlogn)): a divide-and-


conquer algorithm. It divides the input array into two halves,
sorts each half recursively, and then merges the sorted halves to
produce the final sorted array.
⚬ Quick Sort (Time complexity: O(nlogn)): also a divide-and-
conquer algorithm. It selects a 'pivot' element from the array and
partitions the other elements into two sub-arrays. It then recursively
sorts the sub-arrays.
⚬ Heap Sort (Time complexity: O(nlogn)): Heap sort is based on
the binary heap data structure. It first builds a max-heap from the
input array and then repeatedly extracts the maximum element,
placing it at the end of the array.
4.3 SORTING ALGORITHMS:

• 4.3.1 Bubble sort:


• How bubble sort works:
⚬ It starts at the beginning of the list; compares adjacent elements,
and swaps them if they are in the wrong order.
⚬ It continues this process, and after completing one pass through
the list, the largest element will have "bubbled up" to the end of
the list.
⚬ The algorithm then repeats this process for the remaining elements,
each time stopping one element sooner than the previous pass
because the largest element is already in its correct position.
4.3 SORTING ALGORITHMS:

• 4.3.2 Selection sort:


⚬ This sorting algorithm divides the array into two parts: the
sorted part and the unsorted part.
⚬ The algorithm repeatedly selects the smallest (or largest,
depending on the sorting order) element from the unsorted part and
swaps it with the element at the end of the sorted part, extending
the sorted part by one element.
4.3 SORTING ALGORITHMS:

• 4.3.3 Insertion sort:


⚬ This is a simple sorting algorithm that builds the final sorted
array one item at a time.
⚬ It has some advantages in simplicity and is often used in
practice on small or nearly sorted datasets.
4.3 SORTING ALGORITHMS:

• 4.3.4 Quick sort:


⚬ Quick sort is a popular and efficient sorting algorithm that uses a
divide-and-conquer strategy to sort an array.
⚬ It works by selecting a "pivot" element from the array and
partitioning the other elements into two sub-arrays according to
whether they are less than or greater than the pivot.
⚬ The sub-arrays are then sorted recursively.
4.3 SORTING ALGORITHMS:

• 4.3.4 Quick sort:


• How quick sort works:
⚬ Pivot selection: Select a pivot element
from the list (first, last, middle, or
randomly chosen).
⚬ Partitioning: Rearrange the elements so
that all elements less than the pivot come
before it, and all elements greater than the
pivot come after it.
⚬ Recursion: Recursively apply the above
steps to the sub-lists formed by the
partitioning process.
4.3 SORTING ALGORITHMS:

• 4.3.4 Quick sort:


• How quick sort works:
⚬ Recursion: Recursively apply the
above steps to the sub-lists formed by
the partitioning process. This process
continues until the entire list is sorted.
⚬ Combining Sub-lists: As the
recursion unwinds, the sorted sub-lists
are combined to form the final sorted
list.
4.3 SORTING ALGORITHMS:

• 4.3.4 Quick sort:


⚬ Time Complexity: Average case - O(n log n), worst case -
O(n^2), but with good pivot selection techniques, the worst-case
performance can be mitigated.
⚬ Space Complexity: O(log n) stack space for recursion in the
average case.
⚬ Characteristics: Quick sort is usually faster than Merge Sort in
practice due to its in-place partitioning. However, its performance
can degrade to O(n^2) in the worst case.
4.3 SORTING ALGORITHMS:

• 4.3.5 Merge sort:


⚬ The Merge Sort algorithm is a divide-and-conquer algorithm
that sorts an array by first breaking it down into smaller arrays, and
then merging those pieces back together in the correct order so that
the result is sorted.
⚬ Divide: The algorithm starts with breaking up the array into
smaller and smaller pieces until one such sub-array only consists
of one element.
⚬ Conquer: The algorithm merges the small pieces of the array back
together by putting the lowest values first, resulting in a sorted
array.
4.3 SORTING ALGORITHMS:

• 4.3.5 Merge sort:


• How merge sort works:
⚬ Divide: divide the unsorted
array into two sub-arrays, half
the size of the original.
⚬ Continue to divide the sub-
arrays as long as the current
piece of the array has more than
one element.
4.3 SORTING ALGORITHMS:

• 4.3.5 Merge sort:


• How merge sort works:
⚬ Merge (Conquer): Merge two
sub-arrays together by always
putting the lowest value first.
⚬ Keep merging until there are no
sub-arrays left.
4.3 SORTING ALGORITHMS:

• 4.3.5 Merge sort:


⚬ Time Complexity: O(n log n) in all cases (worst, average, and
best), which means it performs well even on large datasets.
⚬ Space Complexity: O(n) auxiliary space is required for merging.
Merge sort requires additional memory to store the merged sub-
arrays during the sorting process.
⚬ Characteristics: Merge sort performs well on linked lists and is
stable. It is not an in-place sort, meaning it requires additional
memory. On small datasets, merge sort carries more overhead than
simpler algorithms such as insertion sort.
DATA STRUCTURE & ALGORITHMS
THANK YOU
