Range minimum query

In computer science, a range minimum query (RMQ) solves the problem of finding the minimal value in a sub-array of an array of comparable objects. Range minimum queries have several use cases in computer science, such as the lowest common ancestor problem and the longest common prefix problem (LCP).

Definition
Given an array $A[1 … n]$ of $n$ objects taken from a totally ordered set, such as integers, the range minimum query $\mathrm{RMQ}_A(l,r) = \arg\min_{l \le k \le r} A[k]$ (with $1 ≤ l ≤ r ≤ n$) returns the position of the minimal element in the specified sub-array $A[l … r]$.

For example, when $A = [0,5,2,5,4,3,1,6,3]$, then the answer to the range minimum query for the sub-array $A[3 … 8] = [2,5,4,3,1,6]$ is $7$, as $A[7] = 1$.
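Translated directly into code, the definition amounts to a linear scan; a minimal Python sketch using the example array and the 1-based indices of the text:

```python
# A direct translation of the definition: scan A[l..r] and return the
# position of a minimal element (1-based indices, as in the text above).
def rmq(A, l, r):
    best = l
    for k in range(l, r + 1):
        if A[k - 1] < A[best - 1]:
            best = k
    return best

A = [0, 5, 2, 5, 4, 3, 1, 6, 3]
# The query from the example: RMQ_A(3, 8) = 7, since A[7] = 1.
```

A single query costs linear time this way; the sections below trade pre-computation for faster queries.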

Naive solution
In a typical setting, the array $A$ is static, i.e., elements are not inserted or deleted during a series of queries, and the queries are to be answered on-line (i.e., the whole set of queries is not known in advance to the algorithm). In this case a suitable preprocessing of the array into a data structure ensures faster query answering. A naive solution is to precompute all possible queries, i.e. the minimum of all sub-arrays of $A$, and store these in an array $B$ such that $B[i, j] = min(A[i…j])$; then a range min query can be solved in constant time by array lookup in $B$. There are $Θ(n²)$ possible queries for a length-$n$ array, and the answers to these can be computed in $Θ(n²)$ time by dynamic programming.
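A sketch of this naive precomputation in Python (0-based indices, so the example query $RMQ_A(3,8) = 7$ from above becomes a lookup at $B[2][7] = 6$):

```python
# Naive Theta(n^2) precomputation: B[i][j] holds the position of the
# minimum of A[i..j], filled by the dynamic program
# B[i][j] = argmin(B[i][j-1], j).
def precompute_all(A):
    n = len(A)
    B = [[0] * n for _ in range(n)]
    for i in range(n):
        B[i][i] = i
        for j in range(i + 1, n):
            prev = B[i][j - 1]
            B[i][j] = j if A[j] < A[prev] else prev
    return B

A = [0, 5, 2, 5, 4, 3, 1, 6, 3]
B = precompute_all(A)
# A query is now a single table lookup, e.g. B[2][7] == 6
# (the 0-based version of RMQ_A(3, 8) = 7).
```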

Solution using constant time after linearithmic space and time pre-computation
As in the solution above, answering queries in constant time will be achieved by pre-computing results. However, the array will store pre-computed range minimum queries not for every range $[i, j]$, but only for ranges whose size is a power of two. There are $O(\log n)$ such queries for each start position $i$, so the size of the dynamic programming table $B$ is $O(n \log n)$. The value of $B[i, j]$ is the index of the minimum of the range $A[i…i+2^{j}-1]$. Filling the table takes $O(n \log n)$ time; the indices of minima are computed with the following recurrence:


 * If $A[B[i, j-1]] ≤ A[B[i+2^{j-1}, j-1]]$, then $B[i, j] = B[i, j-1]$;
 * else, $B[i, j] = B[i+2^{j-1}, j-1]$.

After this pre-computing step, a query $\mathrm{RMQ}_A(l,r)$ can now be answered in constant time by splitting it into two separate queries: one is the pre-computed query whose range starts at $l$ and has the largest power-of-two length that does not extend past $r$. The other is the query of an interval of the same length that has $r$ as its right boundary. These intervals may overlap, but since we are trying to compute the minimum rather than, for example, the sum of the numbers in the array, this does not matter. The overall result can thus be obtained, after the linearithmic time pre-computing, in constant time: the two queries can be answered in constant time and the only thing left to do is to choose the smaller of the two results.
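The table construction and the two-query lookup can be sketched as follows (Python, 0-based indices; the table is stored as one row per power-of-two length):

```python
# Sparse-table method: B[j][i] stores the index of the minimum of
# A[i .. i + 2^j - 1], filled with the recurrence above.
def build_sparse_table(A):
    n = len(A)
    B = [list(range(n))]  # j = 0: ranges of length 1
    j = 1
    while (1 << j) <= n:
        prev = B[j - 1]
        row = []
        for i in range(n - (1 << j) + 1):
            left, right = prev[i], prev[i + (1 << (j - 1))]
            row.append(left if A[left] <= A[right] else right)
        B.append(row)
        j += 1
    return B

def query(A, B, l, r):
    # Cover [l, r] with two (possibly overlapping) power-of-two ranges
    # and take the smaller of the two pre-computed minima.
    j = (r - l + 1).bit_length() - 1
    a, b = B[j][l], B[j][r - (1 << j) + 1]
    return a if A[a] <= A[b] else b

A = [0, 5, 2, 5, 4, 3, 1, 6, 3]
B = build_sparse_table(A)
# query(A, B, 2, 7) == 6, matching the earlier example.
```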

Solution using logarithmic query time after linear time and space pre-computation
This solution does pre-computation in $\mathcal{O}(n)$ time. Its data structures use $\mathcal{O}(n)$ space and can be used to answer queries in logarithmic time. The array is first conceptually divided into blocks of size $s = \frac{\log n}{4}$. Then the minimum for each block can be computed in $\mathcal{O}(n)$ time overall and the minima are stored in a new array.

RMQs can now be answered in logarithmic time by looking at the blocks containing the left query boundary, the right query boundary and all the blocks in between: For example, using the array $A = [0,5,2,5,4,3,1,6,3]$ and a block size of $3$ (for illustrative purposes only) yields the minimum array $A' = [0,3,1]$.
 * The two blocks containing the boundaries can be searched naïvely. Elements outside the boundary need not even be looked at. This can be done in logarithmic time.
 * The minima of all blocks that are fully contained in the range, and the two minima mentioned above, need to be compared to answer the query.
 * Because the array was divided into blocks of size $s = \frac{\log n}{4}$, there are at most $\frac{4n}{\log n}$ blocks that are fully contained in the query.
 * By using the linearithmic solution one can find the overall minimum among these blocks. This data structure has size $\mathcal{O}(\frac{n}{\log n} \log \frac{n}{\log n}) = \mathcal{O}(n)$.
 * Now, only three minima need to be compared.
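The block decomposition above can be sketched as follows (Python, 0-based indices; the scan over the inner block minima stands in for the linearithmic structure, which would bring the query down to comparing three minima):

```python
# Block decomposition: A2 stores, for each block, the position of its
# minimum. A query scans the two boundary blocks element by element and
# the minima of the blocks fully contained in between.
def build_blocks(A, s):
    return [min(range(i, min(i + s, len(A))), key=lambda k: A[k])
            for i in range(0, len(A), s)]

def block_query(A, A2, s, l, r):
    bl, br = l // s, r // s
    if bl == br:                       # query lies inside a single block
        return min(range(l, r + 1), key=lambda k: A[k])
    candidates = []
    candidates += range(l, (bl + 1) * s)               # left boundary block
    candidates += range(br * s, r + 1)                 # right boundary block
    candidates += (A2[b] for b in range(bl + 1, br))   # inner block minima
    return min(candidates, key=lambda k: A[k])

A = [0, 5, 2, 5, 4, 3, 1, 6, 3]
A2 = build_blocks(A, 3)   # block size 3 for illustration, as in the text
# A2 == [0, 5, 6]: the positions of the block minima 0, 3, 1.
```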

Solution using constant time and linear space
Using the solution above, the sub-queries inside the blocks that are not fully contained in the query still need to be answered in constant time. There are at most two of those blocks: the block containing $l$ and the block containing $r$. Constant time is achieved by storing the Cartesian trees for all the blocks in the array. A few observations:
 * Blocks with isomorphic Cartesian trees give the same result for all queries in that block
 * The number of different Cartesian trees of $s$ nodes is $C_{s}$, the $s$'th Catalan number
 * Therefore, the number of different Cartesian trees for the blocks is in the range of $\mathcal{O}(\frac{4^{s}}{s^{3/2}})$

For every such tree, the possible results for all queries need to be stored. This comes down to $\mathcal{O}(s^{2})$ or $\mathcal{O}(\log^{2} n)$ entries per tree. This means the overall size of the table is $\mathcal{O}(n)$.

To look up results efficiently, the Cartesian tree (row) corresponding to a specific block must be addressable in constant time. The solution is to store the results for all trees in an array and find a unique projection from binary trees to integers to address the entries. This can be achieved by doing a breadth-first search through the tree and adding leaf nodes so that every existing node in the Cartesian tree has exactly two children. The integer is then generated by representing every inner node as a 0-bit and every leaf as a 1-bit in a bit-word (by traversing the tree in level-order again). This leads to a size of $2s+1$ bits for every tree. To enable random access in constant time to any tree, the trees not contained in the original array need to be included as well. An array with indices of $2s+1 = \frac{\log n}{2}+1$ bits length has size $2^{2s+1} = \mathcal{O}(\sqrt{n})$.
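A common alternative way to compute such a canonical integer per block, equivalent in the sense that blocks with isomorphic Cartesian trees receive equal numbers, simulates the stack-based left-to-right Cartesian tree construction; a sketch:

```python
# Canonical signature of a block's Cartesian tree: simulate the
# left-to-right Cartesian tree construction with a stack, emitting a
# 1-bit for every push and a 0-bit for every pop. Blocks with the same
# signature have isomorphic Cartesian trees and can share one answer
# table row.
def block_signature(block):
    sig = 0
    stack = []
    for x in block:
        while stack and stack[-1] > x:
            stack.pop()
            sig = sig << 1          # 0-bit per pop
        stack.append(x)
        sig = (sig << 1) | 1        # 1-bit per push
    return sig

# Two blocks with the same relative order share a signature:
# block_signature([1, 6, 3]) == block_signature([10, 60, 30])
```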



Applications
RMQs are used as a tool for many tasks in exact and approximate string matching. Several applications can be found in Fischer and Heun (2007).

Computing the lowest common ancestor in a tree
RMQs can be used to solve the lowest common ancestor problem. The LCA query $\mathrm{LCA}_{T}(v, w)$ of a rooted tree $T = (V, E)$ and two nodes $v, w ∈ V$ returns the deepest node $u$ (which may be $v$ or $w$) on paths from the root to both $v$ and $w$. Gabow, Bentley, and Tarjan (1984) showed that the LCA problem can be reduced in linear time to the RMQ problem. It follows that, like the RMQ problem, the LCA problem can be solved in constant time and linear space.
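A sketch of the reduction via an Euler tour (the tree format, a dict of children lists, is an assumption for illustration; the inner RMQ is a plain scan standing in for the constant-time structure above):

```python
# Reduce LCA to RMQ: walk an Euler tour of the tree, record the depth of
# every visited node, and answer LCA(v, w) as the shallowest node between
# the first occurrences of v and w in the tour.
def euler_tour(tree, root):
    # tree: dict mapping node -> list of children (assumed toy format)
    tour, depth, first = [], [], {}
    def walk(v, d):
        first.setdefault(v, len(tour))
        tour.append(v); depth.append(d)
        for c in tree.get(v, []):
            walk(c, d + 1)
            tour.append(v); depth.append(d)
    walk(root, 0)
    return tour, depth, first

def lca(tour, depth, first, v, w):
    l, r = sorted((first[v], first[w]))
    k = min(range(l, r + 1), key=lambda i: depth[i])  # RMQ on depths
    return tour[k]

tree = {"r": ["a", "b"], "a": ["c", "d"]}
tour, depth, first = euler_tour(tree, "r")
# lca(tour, depth, first, "c", "d") == "a"
# lca(tour, depth, first, "c", "b") == "r"
```

The tour and depth arrays have length $2n-1$, so a linear-space constant-time RMQ structure over the depths gives constant-time LCA.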

Computing the longest common prefix in a string
In the context of text indexing, RMQs can be used to find the LCP (longest common prefix), where $\mathrm{LCP}_{T}(i, j)$ computes the LCP of the suffixes that start at indexes $i$ and $j$ in $T$. To do this we first compute the suffix array $A$, and the inverse suffix array $A^{-1}$. We then compute the LCP array $H$ giving the LCP of adjacent suffixes in $A$. Once these data structures are computed, and RMQ preprocessing is complete, the length of the general LCP can be computed in constant time by the formula: $\mathrm{LCP}_{T}(i, j) = \mathrm{RMQ}_{H}(A^{-1}[i] + 1, A^{-1}[j])$, where we assume for simplicity that $A^{-1}[i] < A^{-1}[j]$ (otherwise swap).
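A runnable sketch of this pipeline (naive, quadratic suffix-array and LCP-array construction for brevity; real text indexes build these arrays in linear time, and the final scan stands in for a constant-time RMQ structure):

```python
# Build the suffix array A, inverse suffix array inv, and LCP array H of
# a text T. H[k] holds the LCP of the adjacent suffixes A[k-1] and A[k].
def lcp_structures(T):
    n = len(T)
    A = sorted(range(n), key=lambda i: T[i:])        # suffix array
    inv = [0] * n                                    # inverse suffix array
    for k, i in enumerate(A):
        inv[i] = k
    def common(i, j):
        c = 0
        while i + c < n and j + c < n and T[i + c] == T[j + c]:
            c += 1
        return c
    H = [0] + [common(A[k - 1], A[k]) for k in range(1, n)]  # LCP array
    return A, inv, H

def lcp(T, inv, H, i, j):
    # LCP of suffixes starting at i and j (i != j): a range minimum on H
    # between their (sorted) ranks, written as a scan for clarity.
    a, b = sorted((inv[i], inv[j]))
    return min(H[a + 1:b + 1])

T = "banana"
A, inv, H = lcp_structures(T)
# lcp(T, inv, H, 1, 3) == 3: "anana" and "ana" share the prefix "ana".
```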