Partial sorting

In computer science, partial sorting is a relaxed variant of the sorting problem. Total sorting returns a list in which all elements appear in order, while partial sorting returns a list of only the k smallest (or k largest) elements, in order. The other elements (above the k smallest ones) may also be retained, as in an in-place partial sort, or may be discarded, which is common in streaming partial sorts. A common practical example of partial sorting is computing the "Top 100" of some list.

In terms of indices, in a partially sorted list, for every index $i$ from 1 to $k$, the $i$-th element is in the same place as it would be in the fully sorted list: element $i$ of the partially sorted list contains order statistic $i$ of the input list.

Heap-based solution
Heaps admit a simple single-pass partial sort when $k$ is fixed: insert the first $k$ elements of the input into a max-heap. Then make one pass over the remaining elements: add each to the heap in turn, then remove the largest element of the heap. Each insertion operation takes $O(\log k)$ time, resulting in $O(n \log k)$ time overall; this "partial heapsort" algorithm is practical for small values of $k$ and in online settings. The "online heapselect" algorithm described below, based on a min-heap, takes $O(n + k \log n)$ time.
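The single-pass scheme can be sketched in Python with the standard `heapq` module (a min-heap, so keys are negated to simulate a max-heap). One equivalent shortcut is assumed: instead of always inserting and then removing the maximum, an element is only inserted when it would survive the removal.

```python
import heapq
import itertools

def k_smallest_sorted(items, k):
    """Single-pass partial heapsort sketch: keep the k smallest elements
    seen so far in a max-heap (heapq is a min-heap, so keys are negated)."""
    it = iter(items)
    heap = [-x for x in itertools.islice(it, k)]
    heapq.heapify(heap)                  # max-heap of the first k elements
    for x in it:
        if x < -heap[0]:
            # x beats the current k-th smallest: replace the heap maximum.
            heapq.heapreplace(heap, -x)  # O(log k) per element
    return sorted(-x for x in heap)
```

Because only a size-$k$ heap is kept, the same sketch works in streaming settings where the input does not fit in memory.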

Solution by partitioning selection
A further relaxation requiring only a list of the $k$ smallest elements, but without requiring that these be ordered, makes the problem equivalent to partition-based selection. The original partial sorting problem can then be solved by such a selection algorithm, which yields an array whose first $k$ elements are the $k$ smallest, followed by sorting these $k$ elements, at a total cost of $O(n + k \log k)$ operations. A popular choice to implement this algorithm scheme is to combine quickselect and quicksort; the result is sometimes called "quickselsort".
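A minimal "quickselsort" sketch in Python, assuming Lomuto partitioning with random pivots (any quickselect variant would do): quickselect moves the $k$ smallest elements into the first $k$ slots, and an ordinary sort then orders them.

```python
import random

def quickselsort(items, k):
    """Sketch of quickselsort: partition-based selection followed by a
    sort of the selected prefix. Works on a copy of the input."""
    a = list(items)

    def partition(lo, hi, p):
        # Lomuto partition of a[lo..hi] around the value at index p.
        a[p], a[hi] = a[hi], a[p]
        store = lo
        for i in range(lo, hi):
            if a[i] < a[hi]:
                a[i], a[store] = a[store], a[i]
                store += 1
        a[store], a[hi] = a[hi], a[store]
        return store

    def quickselect(lo, hi, k):
        # Place the k-th smallest element (0-based index k-1) at its
        # final position; everything before it is no larger.
        while lo < hi:
            p = partition(lo, hi, random.randrange(lo, hi + 1))
            if p == k - 1:
                return
            if p < k - 1:
                lo = p + 1
            else:
                hi = p - 1

    quickselect(0, len(a) - 1, k)
    return sorted(a[:k])       # O(k log k) final sort
```

The selection step is expected $O(n)$ and the final sort $O(k \log k)$, matching the stated total.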

Common in current (as of 2022) C++ STL implementations of std::partial_sort is a pass of heapselect to collect the k smallest elements, followed by a heapsort to put them in order.

Specialised sorting algorithms
More efficient than the aforementioned are specialized partial sorting algorithms based on mergesort and quicksort. In the quicksort variant, there is no need to recursively sort partitions which only contain elements that would fall after the $k$'th place in the final sorted array (starting from the "left" boundary). Thus, if the pivot falls in position $k$ or later, we recurse only on the left partition:

function partial_quicksort(A, i, j, k) is
    if i < j then
        p ← pivot(A, i, j)
        p ← partition(A, i, j, p)
        partial_quicksort(A, i, p-1, k)
        if p < k-1 then
            partial_quicksort(A, p+1, j, k)

The resulting algorithm is called partial quicksort and requires an expected time of only $O(n + k \log k)$, and is quite efficient in practice, especially if a selection sort is used as a base case when $k$ becomes small relative to $n$. However, the worst-case time complexity is still quadratic, in the case of consistently bad pivot selections. Pivot selection along the lines of the worst-case linear-time selection algorithm (median of medians) could be used to get better worst-case performance. Partial quicksort, quickselect (including the multiple variant), and quicksort can all be generalized into what is known as a chunksort.
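The pseudocode above translates directly into runnable Python; Lomuto partitioning with a random pivot is assumed here for `pivot`/`partition`, and indices are 0-based.

```python
import random

def partial_quicksort(a, i, j, k):
    """In-place partial quicksort of a[i..j] (inclusive bounds), after the
    pseudocode above: on return, a[0:k] holds the k smallest elements of
    the original array in sorted order."""
    if i < j:
        p = random.randrange(i, j + 1)   # random pivot (an assumption)
        a[p], a[j] = a[j], a[p]          # move pivot to the end
        store = i
        for t in range(i, j):            # Lomuto partition
            if a[t] < a[j]:
                a[t], a[store] = a[store], a[t]
                store += 1
        a[store], a[j] = a[j], a[store]  # pivot to its final position
        p = store
        partial_quicksort(a, i, p - 1, k)
        if p < k - 1:                    # right partition only if needed
            partial_quicksort(a, p + 1, j, k)
```

Note the only change from full quicksort is the guard `if p < k - 1`: partitions that lie entirely beyond position $k$ are never recursed into.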

Incremental sorting
Incremental sorting is a version of the partial sorting problem where the input is given up front but $k$ is unknown: given a $k$-sorted array, it should be possible to extend the partially sorted part so that the array becomes $(k+1)$-sorted.

Heaps lead to an $O(n + k \log n)$ "online heapselect" solution to incremental partial sorting: first "heapify", in linear time, the complete input array to produce a min-heap. Then extract the minimum of the heap $k$ times.
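In Python this is a few lines with `heapq`, whose `heapify` is the linear-time bottom-up heap construction:

```python
import heapq

def online_heapselect(items, k):
    """Online heapselect sketch: heapify the whole input in O(n), then
    pop the minimum k times at O(log n) each."""
    heap = list(items)
    heapq.heapify(heap)                                # O(n)
    return [heapq.heappop(heap) for _ in range(k)]     # O(k log n)
```

Because $k$ need not be known in advance, extraction can simply continue on the same heap whenever the next order statistic is requested.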

A different incremental sort can be obtained by modifying quickselect. The version due to Paredes and Navarro maintains a stack of pivots across calls, so that incremental sorting can be accomplished by repeatedly requesting the smallest item of an array $A$ from the following algorithm:

Algorithm $IQS(A : array, i : integer, S : stack)$ returns the $i$'th smallest element in $A$
 * If $i = top(S)$:
   * Pop $S$
   * Return $A[i]$
 * Let $pivot ← random [i, top(S))$
 * Update $pivot ← partition(A[i : top(S)), A[pivot])$
 * Push $pivot$ onto $S$
 * Return $IQS(A, i, S)$

The stack $S$ is initialized to contain only the length $n$ of $A$. $k$-sorting the array is done by calling $IQS(A, i, S)$ for $i = 0, 1, 2, ...$; this sequence of calls has average-case complexity $O(n + k \log k)$, which is asymptotically equivalent to $O(n + k \log n)$. The worst-case time is quadratic, but this can be fixed by replacing the random pivot selection by the median of medians algorithm.
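A Python sketch of IQS follows, with the tail recursion of the pseudocode unrolled into a loop; Lomuto partitioning with a random pivot is assumed. The function and variable names are illustrative, not from the original paper.

```python
import random

def iqs(a, i, stack):
    """Incremental quickselect (IQS) sketch after Paredes and Navarro:
    returns the i-th smallest element of a, reusing the pivot stack
    across calls. `stack` must be initialized to [len(a)]."""
    while i != stack[-1]:
        hi = stack[-1]
        # Pick a random pivot in a[i:hi) and partition around it (Lomuto).
        p = random.randrange(i, hi)
        a[p], a[hi - 1] = a[hi - 1], a[p]
        store = i
        for t in range(i, hi - 1):
            if a[t] < a[hi - 1]:
                a[t], a[store] = a[store], a[t]
                store += 1
        a[store], a[hi - 1] = a[hi - 1], a[store]
        stack.append(store)        # remember the pivot's final position
    stack.pop()
    return a[i]
```

Incremental sorting is then a matter of calling `iqs(a, i, s)` for `i = 0, 1, 2, ...` with the same stack `s`, each call benefiting from the pivots left behind by earlier calls.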

Language/library support

 * The C++ standard specifies a library function called std::partial_sort.
 * The Python standard library includes functions nsmallest and nlargest in its heapq module.
 * The Julia standard library includes a PartialQuickSort algorithm used in partialsort! and variants.
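For illustration, the Python functions listed above can be used directly; both return their results already in sorted order:

```python
import heapq

data = [9, 4, 6, 1, 8, 3]
# heapq.nsmallest/nlargest return the k extreme elements, sorted.
print(heapq.nsmallest(3, data))   # [1, 3, 4]
print(heapq.nlargest(2, data))    # [9, 8]
```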