Row and column vectors

In linear algebra, a column vector with $m$ elements is an $$m \times 1$$ matrix consisting of a single column of $m$ entries, for example, $$\boldsymbol{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}.$$

Similarly, a row vector is a $$1 \times n$$ matrix for some $n$, consisting of a single row of $n$ entries, $$\boldsymbol a = \begin{bmatrix} a_1 & a_2 & \dots & a_n \end{bmatrix}. $$ (Throughout this article, boldface is used for both row and column vectors.) The transpose (indicated by $T$) of any row vector is a column vector, and the transpose of any column vector is a row vector: $$\begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}$$ and $$\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{bmatrix}^{\rm T} = \begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}.$$
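The relationship between a column vector and its transpose can be sketched with NumPy (an illustrative assumption of this article's conventions; the array values are arbitrary):

```python
import numpy as np

# A column vector stored as an m-by-1 matrix (here m = 3; values arbitrary).
x = np.array([[1], [2], [3]])   # shape (3, 1)

# Its transpose is a 1-by-m row vector.
row = x.T                       # shape (1, 3)

# Transposing twice recovers the original column vector.
assert (row.T == x).all()
```

Note that a two-dimensional `(m, 1)` array is used rather than a flat array, so that the row/column distinction is visible in the shapes.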

The set of all row vectors with $n$ entries in a given field (such as the real numbers) forms an $n$-dimensional vector space; similarly, the set of all column vectors with $m$ entries forms an $m$-dimensional vector space.

The space of row vectors with $n$ entries can be regarded as the dual space of the space of column vectors with $n$ entries, since any linear functional on the space of column vectors can be represented as the left-multiplication of a unique row vector.
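A minimal sketch of this duality, with a linear functional represented by left-multiplication by a row vector (values chosen arbitrarily):

```python
import numpy as np

# A linear functional on column vectors in R^3, represented by
# left-multiplication with a fixed row vector a.
a = np.array([[2, -1, 5]])      # 1-by-3 row vector
x = np.array([[1], [0], [4]])   # 3-by-1 column vector

f_x = a @ x                     # a 1-by-1 matrix: the functional's value at x

# The same value computed entrywise: 2*1 + (-1)*0 + 5*4
assert f_x[0, 0] == 2*1 + (-1)*0 + 5*4
```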

Notation
To simplify writing column vectors in-line with other text, they are sometimes written as row vectors with the transpose operation applied to them:

$$\boldsymbol{x} = \begin{bmatrix} x_1 \; x_2 \; \dots \; x_m \end{bmatrix}^{\rm T}$$

or

$$\boldsymbol{x} = \begin{bmatrix} x_1, x_2, \dots, x_m \end{bmatrix}^{\rm T}$$

Some authors also use the convention of writing both column vectors and row vectors as rows, but separating row vector elements with commas and column vector elements with semicolons (see alternative notation 2 in the table below).

Operations
Matrix multiplication consists of multiplying each row vector of the first matrix by each column vector of the second: the $(i, j)$ entry of the product is the dot product of row $i$ of the first factor with column $j$ of the second.
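The row-by-column structure of matrix multiplication can be checked directly in NumPy (a small sketch with arbitrary values):

```python
import numpy as np

# (AB)[i, j] is the dot product of row i of A with column j of B.
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
C = A @ B

i, j = 1, 0
assert C[i, j] == A[i, :] @ B[:, j]   # 3*5 + 4*7 = 43
```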

The dot product of two column vectors $\mathbf a, \mathbf b$ with the same number of entries, considered as elements of a coordinate space, is equal to the matrix product of the transpose of $\mathbf a$ with $\mathbf b$,

$$\mathbf{a} \cdot \mathbf{b} = \mathbf{a}^\intercal \mathbf{b} = \begin{bmatrix} a_1 & \cdots  & a_n \end{bmatrix} \begin{bmatrix} b_1 \\ \vdots \\ b_n \end{bmatrix} = a_1 b_1 + \cdots + a_n b_n \,. $$

By the symmetry of the dot product, this is also equal to the matrix product of the transpose of $\mathbf b$ with $\mathbf a$,

$$\mathbf{b} \cdot \mathbf{a} = \mathbf{b}^\intercal \mathbf{a} = \begin{bmatrix} b_1 & \cdots  & b_n \end{bmatrix}\begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} = a_1 b_1 + \cdots + a_n b_n\,. $$
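Both forms of the dot product can be verified with a short NumPy sketch (arbitrary values):

```python
import numpy as np

a = np.array([[1], [2], [3]])   # column vector
b = np.array([[4], [5], [6]])   # column vector

# a . b as the matrix product of a's transpose with b (a 1-by-1 matrix).
atb = a.T @ b
bta = b.T @ a

assert atb[0, 0] == 1*4 + 2*5 + 3*6   # 32
assert atb[0, 0] == bta[0, 0]          # symmetry of the dot product
```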

The matrix product of a column vector and a row vector gives the outer product of two vectors $\mathbf a, \mathbf b$, an example of the more general tensor product. The matrix product of the column vector representation of $\mathbf a$ and the row vector representation of $\mathbf b$ gives the components of their dyadic product,

$$\mathbf{a} \otimes \mathbf{b} = \mathbf{a} \mathbf{b}^\intercal = \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix}\begin{bmatrix} b_1 & b_2 & b_3 \end{bmatrix} = \begin{bmatrix} a_1 b_1 & a_1 b_2 & a_1 b_3 \\ a_2 b_1 & a_2 b_2 & a_2 b_3 \\ a_3 b_1 & a_3 b_2 & a_3 b_3 \\ \end{bmatrix} \,, $$

which is the transpose of the matrix product of the column vector representation of $\mathbf b$ and the row vector representation of $\mathbf a$,

$$\mathbf{b} \otimes \mathbf{a} = \mathbf{b} \mathbf{a}^\intercal = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix}\begin{bmatrix} a_1 & a_2 & a_3 \end{bmatrix} = \begin{bmatrix} b_1 a_1 & b_1 a_2 & b_1 a_3 \\ b_2 a_1 & b_2 a_2 & b_2 a_3 \\ b_3 a_1 & b_3 a_2 & b_3 a_3 \\ \end{bmatrix} \,. $$
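The outer product and its transpose relationship can be sketched as follows (a NumPy illustration with arbitrary values):

```python
import numpy as np

a = np.array([[1], [2], [3]])   # column vector
b = np.array([[4], [5], [6]])   # column vector

# Outer product a (x) b = a b^T, a 3-by-3 matrix with entries a_i * b_j.
ab = a @ b.T
ba = b @ a.T

assert ab[0, 2] == 1 * 6       # entry (1, 3) is a_1 * b_3
assert (ba == ab.T).all()      # b (x) a is the transpose of a (x) b
```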

Matrix transformations
An $n \times n$ matrix $M$ can represent a linear map and act on row and column vectors as the linear map's transformation matrix. For a row vector $\mathbf v$, the product $\mathbf v M$ is another row vector $\mathbf p$:

$$\mathbf{v} M = \mathbf{p} \,.$$

Another $n \times n$ matrix $Q$ can act on $\mathbf p$,

$$ \mathbf{p} Q = \mathbf{t} \,. $$

Then one can write $\mathbf t = \mathbf p Q = \mathbf v M Q$, so the matrix product $M Q$ maps $\mathbf v$ directly to $\mathbf t$. Continuing with row vectors, each further transformation of $n$-space is applied on the right of the previous output.
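The row-vector convention can be sketched in NumPy (the particular maps, a coordinate swap and an axis scaling, are chosen arbitrarily for illustration):

```python
import numpy as np

# Row-vector convention: transformations apply on the right.
v = np.array([[1, 2]])           # 1-by-2 row vector
M = np.array([[0, 1], [1, 0]])   # swap coordinates
Q = np.array([[2, 0], [0, 3]])   # scale the axes

p = v @ M                        # first transformation
t = p @ Q                        # second transformation, on the right

# The single product MQ maps v directly to t.
assert (t == v @ (M @ Q)).all()
```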

When a column vector is transformed to another column vector under the action of an $n \times n$ matrix, the matrix multiplies on the left,

$$ \mathbf{p}^\mathrm{T} = M \mathbf{v}^\mathrm{T} \,,\quad \mathbf{t}^\mathrm{T} = Q \mathbf{p}^\mathrm{T},$$

leading to the algebraic expression $Q M \mathbf v^\mathrm{T}$ for the composed output from the input $\mathbf v^\mathrm{T}$. In this column-vector convention, the matrix transformations stack up on the left.
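The column-vector convention can be sketched the same way (same illustrative maps as above, now acting on the left):

```python
import numpy as np

# Column-vector convention: transformations compose on the left.
vT = np.array([[1], [2]])        # 2-by-1 column vector
M = np.array([[0, 1], [1, 0]])   # swap coordinates
Q = np.array([[2, 0], [0, 3]])   # scale the axes

pT = M @ vT                      # first transformation
tT = Q @ pT                      # second stacks on the left: t^T = Q M v^T

assert (tT == (Q @ M) @ vT).all()
```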