Some Features
Time Complexity
Typical classes are O(n^2), O(nlogn), and O(n); for more, check the Big-O Cheat Sheet.
In-place
Strictly speaking, an in-place sort needs only O(1) memory beyond the items being sorted, so an in-place algorithm requires no significant extra memory.
Recursive
Some sorting algorithms are implemented in a recursive manner, especially the divide-and-conquer ones, such as merge sort and quicksort.
Stable
Stable sorting algorithms maintain the relative order of records with equal values.
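A quick way to see stability in action is Python's built-in `sorted`, which is guaranteed stable (the record names and keys below are made up for illustration):

```python
# Python's built-in sort is stable: records with equal keys keep their
# original relative order.
records = [("apple", 2), ("banana", 1), ("cherry", 2), ("date", 1)]

# Sort only by the numeric key; ties keep their original relative order.
by_key = sorted(records, key=lambda r: r[1])
# ("banana", 1) stays before ("date", 1), and
# ("apple", 2) stays before ("cherry", 2).
```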
Bubble Sort
- Repeatedly steps through the list to be sorted, compares each pair of adjacent items and swaps them if they are in the wrong order
- It is too slow and impractical for most problems even when compared to insertion sort
- Bubble sort has worst-case and average complexity both O(n^2)
- Bubble sort is not a practical sorting algorithm
- It will not be efficient in the case of a reverse-ordered collection
- In computer graphics it is popular for its capability to detect a very small error (like swapping of only two elements) in almost-sorted arrays and fix it with just linear complexity
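A minimal sketch of the swapping pass described above (function name is my own); the early-exit flag gives the linear-time behavior on almost-sorted arrays mentioned in the last point:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs; O(n^2) worst/average case."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i items are already in place.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps means already sorted: linear best case
            break
    return items
```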
Selection Sort
- O(n^2) running time complexity
- Selection sort is noted for its simplicity, and it has performance advantages over more complicated algorithms
- Particularly useful where auxiliary memory is limited
- The algorithm divides the input list into two parts: the subarray of items already sorted and the subarray of items remaining to be sorted that occupy the rest of the array
- It is an in-place algorithm, no need for extra memory
- Selection sort almost always outperforms bubble sort
- Not a stable sort, does not preserve the order of keys with equal values
- Quite counter-intuitively, selection sort and insertion sort are both typically faster than O(nlogn) algorithms for small arrays
- Makes fewer writes than insertion sort; this can be important if writes are significantly more expensive than reads, for example with EEPROM or flash memory, where every write lessens the lifespan of the memory
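A minimal sketch of the sorted-prefix / unsorted-rest scheme described above (function name is my own); note the at-most-one-swap-per-pass property behind the "fewer writes" point:

```python
def selection_sort(items):
    """Grow a sorted prefix by repeatedly selecting the minimum of the rest."""
    n = len(items)
    for i in range(n - 1):
        # Find the smallest item in the unsorted suffix items[i:].
        min_index = i
        for j in range(i + 1, n):
            if items[j] < items[min_index]:
                min_index = j
        if min_index != i:  # at most one swap (two writes) per pass
            items[i], items[min_index] = items[min_index], items[i]
    return items
```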
Selection Sort - GeeksforGeeks
Insertion Sort
- O(n^2) time complexity. It is a simple sorting algorithm that builds the final array one item at a time
- On large datasets it is very inefficient, but it performs well on small arrays of roughly 10 to 20 items
- Simple implementation, it is more efficient than other quadratic running time sorting procedures such as bubble sort and selection sort
- Adaptive algorithm: it speeds up when the array is already substantially sorted
- Stable sort: it preserves the order of items with equal keys
- In-place algorithm, does not need any extra memory
- It is an online algorithm: it can sort items as it receives them, for example while downloading data from the web
- Hybrid algorithms use insertion sort when the subarray is small enough
- Shell sort is a variant of insertion sort
- Insertion sort requires more writes because the inner loop can require shifting large sections of the sorted portion of the array
- In general, insertion sort will write to the array O(n^2) times while selection sort will write only O(n) times. For this reason selection sort may be preferable in cases where writing to memory is significantly more expensive than reading, such as flash memory
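A minimal sketch of building the sorted prefix one item at a time (function name is my own); the shifting in the inner loop is the source of the extra writes noted above:

```python
def insertion_sort(items):
    """Insert each item into its place within the sorted prefix; O(n^2) worst case."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Shift larger items of the sorted prefix one slot to the right.
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items
```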
Insertion Sort - GeeksforGeeks
QuickSort
- It is an efficient sorting algorithm
- A well implemented quickSort can outperform heap sort and merge sort -> the main competitors of quickSort
- A comparison-based algorithm: no comparison sort can do better than O(nlogn) comparisons on average
- The efficient implementation of quickSort is not stable, does not keep the relative order of items with equal value
- It is in-place: apart from the recursion stack (O(logn) on average), it does not need any additional memory
- On average case it has O(nlogn) running time
- But the worst case running time is quadratic, O(n^2)
- It is widely used in programming languages
- Primitive types -> usually quickSort is used
- Reference types / objects -> usually merge sort is used
- It is divide and conquer algorithm
- pick an element from the array as pivot item
- partition phase: rearrange the array so that all elements with values less than the pivot come before it, while all elements with values greater than the pivot come after it // equal values can go either way
- recursively apply these steps on the subarrays
- base case for recursion: arrays of size zero or one never need to be sorted
- Choosing the pivot item
- It is very important, if we keep choosing bad pivots, the running time will be quadratic.
- Option 1: we can choose a pivot at random // usually it works
- Option 2: choosing the middle index of the array as the pivot
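A minimal in-place sketch of the steps above, using option 1 (random pivot) and a Lomuto-style partition; function names are my own:

```python
import random

def quick_sort(items, low=0, high=None):
    """In-place quickSort with a random pivot (Lomuto partition)."""
    if high is None:
        high = len(items) - 1
    if low >= high:  # base case: zero or one item is sorted by definition
        return items
    # Option 1 from the notes: a random pivot makes the O(n^2) worst case unlikely.
    pivot_index = random.randint(low, high)
    items[pivot_index], items[high] = items[high], items[pivot_index]
    pivot = items[high]
    # Partition: everything smaller than the pivot moves left of `boundary`.
    boundary = low
    for i in range(low, high):
        if items[i] < pivot:
            items[i], items[boundary] = items[boundary], items[i]
            boundary += 1
    items[boundary], items[high] = items[high], items[boundary]
    # Recursively sort both subarrays around the pivot's final position.
    quick_sort(items, low, boundary - 1)
    quick_sort(items, boundary + 1, high)
    return items
```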
Merge Sort
- Merge sort is a divide and conquer, stable, and comparison based algorithm
- Merge sort is not an in-place algorithm
- Efficient quickSort implementations generally outperform merge sort
- Merge sort is often the best choice for sorting a linked list: in this situation it is relatively easy to implement a merge sort in such a way that it requires only O(1) extra space
- Process:
- divide the array into two subarrays recursively
- sort these subarrays recursively with merge sort again
- if there is only a single item left in the subarray -> we consider it to be sorted by definition
- merge the subarrays to get the final sorted array
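A minimal sketch of the process above (function name is my own); note the `<=` in the merge step, which is what makes the sort stable, and the O(n) auxiliary lists, which are why it is not in-place:

```python
def merge_sort(items):
    """Divide-and-conquer merge sort; stable, but needs O(n) extra memory."""
    if len(items) <= 1:  # base case: sorted by definition
        return items
    # Divide the array into two subarrays and sort each recursively.
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves into the final sorted array.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal items in order (stability)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```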
Hybrid Sort
IntroSort
- Also known as introspective sort
- It is a hybrid sorting algorithm that provides both fast average performance and optimal worst-case performance
- It begins with quickSort and switches to heapSort when the recursion depth exceeds a threshold, a sign that quickSort is degrading toward its quadratic worst case
- IntroSort = quickSort + heapSort
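A rough sketch of the idea, not a faithful IntroSort (which partitions in place); here `heapq` stands in for heapSort, the depth limit is the usual 2*log2(n), and all names are my own:

```python
import heapq
import math

def intro_sort(items):
    """Sketch: quickSort until the recursion gets too deep, then a heap-based sort."""
    max_depth = 2 * max(1, int(math.log2(len(items) + 1)))
    return _intro(items, max_depth)

def _intro(items, depth):
    if len(items) <= 1:
        return items
    if depth == 0:
        # Depth budget exhausted: fall back to a heap-based O(nlogn) sort.
        heapq.heapify(items)
        return [heapq.heappop(items) for _ in range(len(items))]
    # Otherwise proceed as quickSort (middle-element pivot, option 2 above).
    pivot = items[len(items) // 2]
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return _intro(less, depth - 1) + equal + _intro(greater, depth - 1)
```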
Timsort
- Timsort = merge sort + insertion sort
- It is a stable sorting algorithm
- It was implemented by Tim Peters in 2002 for use in the Python programming language
- Best case running time: O(n)
- Worst case running time: O(nlogn)
- Worst case space complexity: O(n)
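A greatly simplified sketch of the merge sort + insertion sort idea (real Timsort, as used by Python's built-in `sorted`, detects natural runs and uses galloping merges; the fixed run size and all names here are my own simplifications):

```python
MIN_RUN = 32  # real Timsort picks a run length between 32 and 64

def _insertion_sort_range(items, lo, hi):
    """Insertion-sort the slice items[lo:hi] in place."""
    for i in range(lo + 1, hi):
        current = items[i]
        j = i - 1
        while j >= lo and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current

def _merge(left, right):
    """Stable merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def tim_sort_sketch(items):
    """Insertion-sort fixed-size runs, then merge the runs pairwise."""
    n = len(items)
    for lo in range(0, n, MIN_RUN):
        _insertion_sort_range(items, lo, min(lo + MIN_RUN, n))
    runs = [items[lo:lo + MIN_RUN] for lo in range(0, n, MIN_RUN)]
    while len(runs) > 1:
        runs = [_merge(runs[i], runs[i + 1]) if i + 1 < len(runs) else runs[i]
                for i in range(0, len(runs), 2)]
    return runs[0] if runs else []
```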