Sorting
Interview Essentials
Sorting algorithms put the elements of an array in a certain order, generally using a comparison function to determine that order (a generalization of "less than"). Some comparison functions allow ties between different objects. A stable sorting algorithm guarantees that if two elements "tie", then whichever element occurred first in the original array will also occur first in the sorted array. An unstable sorting algorithm only guarantees that the final elements are in order, and makes no guarantee about the order of "tied" elements.
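For example, Python's built-in sorted is a stable sort, which makes the definition easy to see (the tuple data below is just an illustrative assumption):

```python
# Sort (name, score) pairs by score only; tied scores keep their original relative order.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]
by_score = sorted(records, key=lambda r: r[1])

# Because the sort is stable, ("bob", 1) still precedes ("dave", 1)
# and ("alice", 2) still precedes ("carol", 2):
print(by_score)  # [('bob', 1), ('dave', 1), ('alice', 2), ('carol', 2)]
```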
Being asked to implement a sorting algorithm from scratch used to be a frequently asked interview question, but this is becoming less popular. Just in case, though, it's worth knowing how to implement two of the more useful sorting algorithms:
Quick sort
Merge sort
For other sorting algorithms, you might be asked about their time complexity, their space complexity, and whether or not the sort is stable. In decreasing order of probability of coming up in an interview, the other common sorts are:
Insertion sort
Heap sort
Bubble sort (it's usually sufficient to explain why bubble sort is terrible)
Radix sort (the only non-comparison based sort on this list)
Quick Sort
Important Concepts
Quick sort is a comparison sort, which means it can only be used on arrays whose elements have a notion of "less than" and "greater than".
If it's implemented correctly, quick sort can be up to 3x as fast as merge sort and heap sort.
Time Complexity
On average, quick sort can sort n items in O(n log n) comparisons. (In the worst case it can take O(n²), but this is fairly rare. However, if you need to guarantee an O(n log n) solution in an interview, you should mention other sorting options if you choose to use quick sort.)
Space Complexity
Quick sort works in place, so it needs only O(1) additional space for partitioning; the recursion stack adds O(log n) on average (and O(n) in the worst case).
Generalized Algorithm
Pick a pivot element from the array.
Reorder the array so that all the elements with values less than the pivot now come before the pivot, while all elements with values greater than the pivot now come after it. Equal values can go either before or after. After this partitioning operation, the pivot element will be in its final position.
Recursively apply these steps to the sub-array of elements with smaller values.
Recursively apply these steps to the sub-array of elements with greater values.
The base case of the recursive algorithm is arrays of size 0 or 1, which don't need to be sorted.
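To make these steps concrete, here is a minimal Python sketch (our own illustration, not a prescribed implementation; it assumes a Lomuto-style partition with the last element as the pivot, whereas a production version would typically pick the pivot randomly to avoid the O(n²) worst case on already-sorted input):

```python
def quick_sort(arr, lo=0, hi=None):
    """Sort arr[lo..hi] in place using a Lomuto-style partition."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:                           # base case: sub-array of size 0 or 1
        return
    pivot = arr[hi]                        # assumption: last element as pivot
    i = lo                                 # boundary of the "< pivot" region
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]      # pivot lands in its final position
    quick_sort(arr, lo, i - 1)             # recurse on the smaller-value side
    quick_sort(arr, i + 1, hi)             # recurse on the greater-value side

nums = [6, 5, 3, 1, 8, 7, 2, 4]
quick_sort(nums)
print(nums)                                # [1, 2, 3, 4, 5, 6, 7, 8]
```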
Merge Sort
Important Concepts
Merge sort is a comparison sort, which means it can only be used on arrays whose elements have a notion of "less than" and "greater than".
Time Complexity
The average and worst-case time complexity of merge sort is O(n log n): the input is halved at each level of recursion, giving O(log n) levels, and each level does O(n) work to merge.
Space Complexity
Merge sort doesn't sort in place. We make a new array to copy into when merging, and at the last step we merge two arrays of size n/2, so the additional space needed is O(n).
Generalized Algorithm
Merge sort is a recursive algorithm. Suppose the function is called merge(L), which returns the sorted array.
Given an array, if its length is bigger than 1, split it into two pieces, left and right. (The pieces should be as close to evenly sized as possible.) If the array has length 1 or 0, then it is automatically sorted, so just return that trivial array.
Call and store mL = merge(left) and mR = merge(right). This gives you sorted copies of the left and right halves of the array.
Start with a reference to the beginning of mL and mR, called ref_L and ref_R respectively. Build up a new array ans that merges the contents of mL and mR by repeatedly taking the smaller of the elements at ref_L and ref_R and advancing whichever reference was taken to the next element. This merge guarantees that ans is in order from smallest to greatest.
Return the merged array ans.
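A minimal Python sketch of these steps (an illustration of the description above; the function is named merge_sort here rather than merge, and the variable names mL, mR, ref_L, and ref_R follow the text):

```python
def merge_sort(arr):
    """Return a sorted copy of arr (called merge(L) in the description above)."""
    if len(arr) <= 1:                        # length 0 or 1 is already sorted
        return arr
    mid = len(arr) // 2
    mL = merge_sort(arr[:mid])               # sorted copy of the left half
    mR = merge_sort(arr[mid:])               # sorted copy of the right half

    ans = []
    ref_L = ref_R = 0                        # references into mL and mR
    while ref_L < len(mL) and ref_R < len(mR):
        if mL[ref_L] <= mR[ref_R]:           # <= keeps equal elements stable
            ans.append(mL[ref_L])
            ref_L += 1
        else:
            ans.append(mR[ref_R])
            ref_R += 1
    ans.extend(mL[ref_L:])                   # at most one of these is non-empty
    ans.extend(mR[ref_R:])
    return ans

print(merge_sort([6, 5, 3, 1, 8, 7, 2, 4]))  # [1, 2, 3, 4, 5, 6, 7, 8]
```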
Overview of the Rarer Sorting Algorithms
Insertion Sort
An in-place sorting algorithm.
1. Start with the index current at 0. (Insertion sort ensures that all elements up to and including the element at index current are in sorted order, which is trivially true for the first element.)
2. All elements before current are sorted, and elements after current are unsorted. Get the first element after current.
3. Find where that element fits in the sorted portion of the array, and insert it there.
4. Now the sorted portion of the array is one index larger. Move the current index one place further toward the end.
5. Repeat steps 2 - 4 until the current index is at the end. The resulting array is completely sorted (all elements occur before current).
Insertion sort does well in cases where the array is already partially sorted, as it can "move past" elements that are already in order.
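A minimal Python sketch of these steps (our illustration; it shifts elements rather than performing explicit pairwise swaps, which is the usual way to implement the "find where it fits" step):

```python
def insertion_sort(arr):
    """Sort arr in place by growing a sorted prefix one element at a time."""
    for current in range(1, len(arr)):        # arr[:1] is trivially sorted
        value = arr[current]                  # first element after the sorted prefix
        i = current
        # Shift larger elements of the sorted prefix one slot to the right
        # until we find where value fits.
        while i > 0 and arr[i - 1] > value:
            arr[i] = arr[i - 1]
            i -= 1
        arr[i] = value                        # arr[:current + 1] is now sorted
    return arr

print(insertion_sort([6, 5, 3, 1, 8, 7, 2, 4]))   # [1, 2, 3, 4, 5, 6, 7, 8]
```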
Heap Sort
An in-place sorting algorithm.
Pass through the array one element at a time, converting the subarray before the current index into a min-heap.
Once the entire array has been converted into a min-heap, repeatedly call the heap's extract-min method and place each retrieved element at the end of an output array. Since each element is the minimum of the heap at the time of extraction, this process ends with a sorted array. (The fully in-place variant uses a max-heap instead, swapping each extracted maximum into the last slot of the shrinking heap region of the same array.)
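A short Python sketch of the min-heap formulation, leaning on the standard-library heapq module rather than a hand-rolled heap (so this is the simple version with an output list, not the in-place variant):

```python
import heapq

def heap_sort(arr):
    """Heapify, then repeatedly extract the minimum into an output list.
    The fully in-place variant would use a max-heap and swap each extracted
    maximum into the end of the same array instead."""
    heap = list(arr)
    heapq.heapify(heap)                                    # O(n) heap construction
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort([6, 5, 3, 1, 8, 7, 2, 4]))                 # [1, 2, 3, 4, 5, 6, 7, 8]
```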
Selection Sort
An in-place sorting algorithm.
1. Start with the index current at 0. (Selection sort maintains the invariant that the first current smallest elements appear in sorted order at the beginning of the array. This invariant is trivial when current is zero.)
2. Scan the subarray of elements from current to the end of the list, and select the smallest element in that subarray.
3. Swap the element at current with the smallest element found in step 2.
4. After the swap, the first current + 1 smallest elements are at the beginning of the array in sorted order. Move the index current forward one place.
5. Repeat steps 2 - 4 until the current index is at the end of the array. The resulting array is sorted.
A lot of the time in selection sort is spent finding the minimum element in the unsorted portion of the array. Heap sort (discussed above) is essentially a selection sort in which a heap is used to speed up the repeated selection of the minimum with extract-min.
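A minimal Python sketch of these steps (our illustration; note that because it always swaps, this is one of the implementations in which selection sort is not stable):

```python
def selection_sort(arr):
    """Sort arr in place; after each iteration, arr[:current + 1] holds the
    smallest elements in sorted order."""
    n = len(arr)
    for current in range(n):
        smallest = current
        for j in range(current + 1, n):       # scan the unsorted suffix
            if arr[j] < arr[smallest]:
                smallest = j
        # Swap the minimum of the suffix into position current.
        arr[current], arr[smallest] = arr[smallest], arr[current]
    return arr

print(selection_sort([6, 5, 3, 1, 8, 7, 2, 4]))   # [1, 2, 3, 4, 5, 6, 7, 8]
```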
Bubble Sort
An in-place sorting algorithm. Bubble sort is a very poor sorting algorithm; its use in interviews is limited to talking about why it's a poor choice. Here is how bubble sort would operate on an array a:
1. Start by looking at the first two elements, a[0] and a[1]. If they are out of order, swap them.
2. Then compare the next two elements, a[1] and a[2]. If they are out of order, swap them.
3. Iterate this procedure, putting items a[i] and a[i+1] in the correct order, until you reach the end of the array. This process of going through the array with up to n-1 swaps is called a pass.
4. Repeat this process until the array is fully sorted, which can take at most n passes.
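For completeness, a minimal Python sketch (our illustration; the early exit when a pass makes no swaps is a common optimization, but the worst case remains O(n²)):

```python
def bubble_sort(arr):
    """Sort arr in place with repeated passes of adjacent compare-and-swap."""
    n = len(arr)
    for _ in range(n):                    # at most n passes are needed
        swapped = False
        for i in range(n - 1):
            if arr[i] > arr[i + 1]:       # out of order: swap the adjacent pair
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
        if not swapped:                   # a pass with no swaps means we're done
            break
    return arr

print(bubble_sort([6, 5, 3, 1, 8, 7, 2, 4]))   # [1, 2, 3, 4, 5, 6, 7, 8]
```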
Head-to-Head Table of Comparison-Based Sorting Algorithms
If only one entry is shown for complexity, it means that average and worst-case complexity scale the same way.
Sorting algorithm | Time Complexity (Avg / Worst) | Additional Space Complexity (Avg / Worst) | Deterministic? | Stable? |
--- | --- | --- | --- | --- |
Quick Sort | O(n log n) / O(n²) | O(log n) / O(n) | No | No |
Merge Sort | O(n log n) | O(n) | Yes | Yes |
Insertion Sort | O(n²) (*) | In-place (O(1)) | Yes | Yes |
Heap Sort | O(n log n) | In-place (O(1)) | Yes | No |
Bubble Sort | O(n²) | In-place (O(1)) | Yes | Yes |
Selection Sort | O(n²) | In-place (O(1)) | Yes | Implem. dependent |
(*) Insertion sort is fast if the input is already partially sorted. If no item is more than k positions from its final position, the runtime is O(nk).
Example of Different Sorts
We show the sequence of swaps the different sorting algorithms make to put the array [6, 5, 3, 1, 8, 7, 2, 4] in order.
For insertion and bubble sort, we only included steps where the list changed, to reduce the number of steps shown. Merge sort was excluded because it isn't based on simple swaps, and quick sort was excluded because it isn't deterministic.
Swap # | Selection Sort | Insertion Sort | Bubble Sort |
--- | --- | --- | --- |
0 | [6, 5, 3, 1, 8, 7, 2, 4] | [6, 5, 3, 1, 8, 7, 2, 4] | [6, 5, 3, 1, 8, 7, 2, 4] |
1 | [1, 5, 3, 6, 8, 7, 2, 4] | [5, 6, 3, 1, 8, 7, 2, 4] | [5, 6, 3, 1, 8, 7, 2, 4] |
2 | [1, 2, 3, 6, 8, 7, 5, 4] | [5, 3, 6, 1, 8, 7, 2, 4] | [5, 3, 6, 1, 8, 7, 2, 4] |
3 | [1, 2, 3, 6, 8, 7, 5, 4] | [3, 5, 6, 1, 8, 7, 2, 4] | [5, 3, 1, 6, 8, 7, 2, 4] |
4 | [1, 2, 3, 4, 8, 7, 5, 6] | [3, 5, 1, 6, 8, 7, 2, 4] | [5, 3, 1, 6, 7, 8, 2, 4] |
5 | [1, 2, 3, 4, 5, 7, 8, 6] | [3, 1, 5, 6, 8, 7, 2, 4] | [5, 3, 1, 6, 7, 2, 8, 4] |
6 | [1, 2, 3, 4, 5, 6, 8, 7] | [1, 3, 5, 6, 8, 7, 2, 4] | [5, 3, 1, 6, 7, 2, 4, 8] |
7 | [1, 2, 3, 4, 5, 6, 7, 8] | [1, 3, 5, 6, 7, 8, 2, 4] | [3, 5, 1, 6, 7, 2, 4, 8] |
8 | | [1, 3, 5, 6, 7, 2, 8, 4] | [3, 1, 5, 6, 7, 2, 4, 8] |
9 | | [1, 3, 5, 6, 2, 7, 8, 4] | [3, 1, 5, 6, 2, 7, 4, 8] |
10 | | [1, 3, 5, 2, 6, 7, 8, 4] | [3, 1, 5, 6, 2, 4, 7, 8] |
11 | | [1, 3, 2, 5, 6, 7, 8, 4] | [1, 3, 5, 6, 2, 4, 7, 8] |
12 | | [1, 2, 3, 5, 6, 7, 8, 4] | [1, 3, 5, 2, 6, 4, 7, 8] |
13 | | [1, 2, 3, 5, 6, 7, 4, 8] | [1, 3, 5, 2, 4, 6, 7, 8] |
14 | | [1, 2, 3, 5, 6, 4, 7, 8] | [1, 3, 2, 5, 4, 6, 7, 8] |
15 | | [1, 2, 3, 5, 4, 6, 7, 8] | [1, 3, 2, 4, 5, 6, 7, 8] |
16 | | [1, 2, 3, 4, 5, 6, 7, 8] | [1, 2, 3, 4, 5, 6, 7, 8] |