Complete orthogonal decomposition

In linear algebra, the complete orthogonal decomposition is a matrix decomposition. It is similar to the singular value decomposition, but typically somewhat cheaper to compute and in particular much cheaper and easier to update when the original matrix is slightly altered.

Specifically, the complete orthogonal decomposition factorizes an arbitrary complex matrix $$A$$ into a product of three matrices, $$A = U T V^*$$, where $$U$$ and $$V^*$$ are unitary matrices and $$T$$ is a triangular matrix. For a matrix $$A$$ of rank $$r$$, the triangular matrix $$T$$ can be chosen such that only its top-left $$r\times r$$ block is nonzero, making the decomposition rank-revealing.

For a matrix of size $$m\times n$$, assuming $$m \ge n$$, the complete orthogonal decomposition requires $$O(mn^2)$$ floating point operations and $$O(m^2)$$ auxiliary memory to compute, similar to other rank-revealing decompositions. Crucially, however, if a row or column is added or removed, or if the matrix is perturbed by a rank-one matrix, the decomposition can be updated in $$O(mn)$$ operations.

Because of its form, $$A = U T V^*$$, the decomposition is also known as UTV decomposition. Depending on whether a left-triangular or right-triangular matrix is used in place of $$T$$, it is also referred to as ULV decomposition or URV decomposition, respectively.

Construction
The UTV decomposition is usually computed by means of a pair of QR decompositions: one QR decomposition is applied to the matrix from the left, yielding $$U$$; another is applied from the right, yielding $$V^*$$. The triangular matrix $$T$$ is "sandwiched" between the two.

Let $$A$$ be an $$m\times n$$ matrix of rank $$r$$. One first performs a QR decomposition with column pivoting:


 * $$A\Pi = U \begin{bmatrix} R_{11} & R_{12} \\ 0 & 0 \end{bmatrix}$$,

where $$\Pi$$ is an $$n\times n$$ permutation matrix, $$U$$ is an $$m\times m$$ unitary matrix, $$R_{11}$$ is an $$r\times r$$ upper triangular matrix and $$R_{12}$$ is an $$r\times(n-r)$$ matrix. One then performs another QR decomposition, this time on the adjoint of the nonzero block $$\begin{bmatrix} R_{11} & R_{12} \end{bmatrix}$$:


 * $$\begin{bmatrix} R^*_{11} \\ R^*_{12}\end{bmatrix} = V' \begin{bmatrix} T^* \\ 0\end{bmatrix}$$,

where $$V'$$ is an $$n\times n$$ unitary matrix and $$T$$ is an $$r\times r$$ lower (left) triangular matrix. Setting $$V = \Pi V'$$ yields the complete orthogonal (UTV) decomposition:


 * $$A = U \begin{bmatrix} T & 0 \\ 0 & 0 \end{bmatrix} V^*$$.
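The two-step construction above can be sketched in Python with NumPy and SciPy. This is an illustrative implementation, not a standard library routine: the function name `utv` and the rank-detection tolerance `tol` are assumptions made for the sketch.

```python
import numpy as np
from scipy.linalg import qr


def utv(A, tol=1e-12):
    """Complete orthogonal (ULV) decomposition via two QR factorizations.

    Returns U (m x m), T (m x n, nonzero only in its r x r lower-triangular
    top-left block), and V (n x n) such that A = U @ T @ V.conj().T.
    """
    m, n = A.shape

    # Step 1: QR with column pivoting, A @ Pi = U @ R.
    U, R, piv = qr(A, pivoting=True)

    # Estimate the numerical rank r from the diagonal of the pivoted R.
    diag = np.abs(np.diag(R))
    r = int(np.sum(diag > tol * max(diag.max(), 1.0)))

    # Step 2: QR of the adjoint of the nonzero block [R11 R12]:
    # [R11 R12]^* = V' @ [T^* ; 0], so T is r x r lower triangular.
    Vp, Tstar = qr(R[:r, :].conj().T)

    # Assemble the m x n middle factor with T in the top-left corner.
    T = np.zeros((m, n), dtype=A.dtype)
    T[:r, :r] = Tstar[:r, :].conj().T

    # Undo the column pivoting: V = Pi @ V'.
    Pi = np.eye(n)[:, piv]
    V = Pi @ Vp
    return U, T, V
```

For a rank-deficient test matrix, `U @ T @ V.conj().T` reconstructs `A`, both outer factors are unitary, and `T` is nonzero only in its leading $$r\times r$$ lower-triangular block.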

Since any diagonal matrix is triangular, the singular value decomposition, $$A = USV^*$$, where $$S_{11} \ge S_{22} \ge \ldots \ge 0$$, is a special case of the UTV decomposition. Computing the SVD is somewhat more expensive than computing the UTV decomposition, but it has a stronger rank-revealing property.
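This relationship can be checked numerically with NumPy's `svd` (the test matrix here is an arbitrary random one, used only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

# NumPy returns A = U @ S @ Vh with singular values sorted descending.
U, s, Vh = np.linalg.svd(A)
S = np.zeros((5, 3))
np.fill_diagonal(S, s)

# S is diagonal, hence simultaneously upper and lower triangular:
# the SVD is a UTV decomposition whose middle factor happens to be
# diagonal with ordered, nonnegative entries.
```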