Two-vector

A two-vector or bivector is a tensor of type $$\scriptstyle\binom{2}{0}$$. It is the dual of a two-form, meaning that it is a linear functional which maps two-forms to the real numbers (or, more generally, to scalars).

The tensor product of a pair of vectors is a two-vector. Any two-vector can therefore be expressed as a linear combination of tensor products of pairs of vectors, in particular of pairs of basis vectors. If f is a two-vector, then
 * $$ \mathbf{f} = f^{\alpha \beta} \, \vec e_\alpha \otimes \vec e_\beta $$

where the $$f^{\alpha \beta}$$ are the components of the two-vector. Notice that both indices of the components are contravariant; this is always the case for two-vectors, by definition. A bivector may operate on a one-form, yielding a vector:
 * $$ f^{\alpha \beta} u_{\beta} = v^{\alpha}$$,

although a question arises as to which of the bivector's two upper indices should be contracted with the one-form. (This problem does not arise with mixed tensors, since only one of such a tensor's indices is upper.) If the bivector is symmetric, however, the choice of index to contract with makes no difference.
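The two possible contractions, and their agreement in the symmetric case, can be sketched numerically. This is an illustrative NumPy example, not part of the original text; the component values are arbitrary.

```python
import numpy as np

# A generic (non-symmetric) bivector f^{ab} and a one-form u_b in 4 dimensions.
f = np.arange(16.0).reshape(4, 4)   # components f^{alpha beta}, arbitrary values
u = np.array([1.0, 2.0, 3.0, 4.0])  # components u_beta

# Contracting the second index: v^a = f^{ab} u_b
v = np.einsum('ab,b->a', f, u)
# Contracting the first index:  w^b = f^{ab} u_a
w = np.einsum('ab,a->b', f, u)

# For a non-symmetric bivector the two contractions differ:
print(np.allclose(v, w))  # False

# Symmetrising f removes the ambiguity:
fs = 0.5 * (f + f.T)
print(np.allclose(np.einsum('ab,b->a', fs, u),
                  np.einsum('ab,a->b', fs, u)))  # True
```

The `einsum` index strings mirror the abstract-index notation directly: `'ab,b->a'` is $$f^{\alpha\beta} u_\beta = v^\alpha$$.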

An example of a bivector is the stress–energy tensor. Another is the inverse of the metric tensor, $$g^{\alpha \beta}$$.
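As a concrete instance of the second example, the inverse Minkowski metric (taken here with signature $$(-,+,+,+)$$, an assumption for illustration) is a symmetric bivector, and contracting it with a one-form raises the index:

```python
import numpy as np

# Inverse Minkowski metric g^{ab}, signature (-,+,+,+): a symmetric bivector.
g_inv = np.diag([-1.0, 1.0, 1.0, 1.0])

# Contracting with a one-form u_b raises the index: u^a = g^{ab} u_b.
u_lower = np.array([2.0, 3.0, 0.0, 1.0])
u_upper = np.einsum('ab,b->a', g_inv, u_lower)
print(u_upper)  # [-2.  3.  0.  1.]

# Since g^{ab} is symmetric, either index may be contracted:
print(np.allclose(u_upper, np.einsum('ab,a->b', g_inv, u_lower)))  # True
```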

Matrix notation
If one assumes that vectors may only be represented as column matrices and covectors only as row matrices, then, since a square matrix acting on a column vector must yield a column vector, square matrices can only represent mixed tensors. However, nothing in the abstract algebraic definition of a matrix forces these assumptions. Dropping them, matrices can be used to represent bivectors as well as two-forms. Example:

$$\begin{pmatrix}f^{00} & f^{01} & f^{02} & f^{03} \\ f^{10} & f^{11} & f^{12} & f^{13} \\ f^{20} & f^{21} & f^{22} & f^{23} \\ f^{30} & f^{31} & f^{32} & f^{33} \end{pmatrix} \begin{pmatrix}u_0 \\ u_1 \\ u_2 \\ u_3\end{pmatrix} = \begin{pmatrix}f^{00} u_0 + f^{01} u_1 + f^{02} u_2 + f^{03} u_3\\ f^{10} u_0 + f^{11} u_1 + f^{12} u_2 + f^{13} u_3\\ f^{20} u_0 + f^{21} u_1 + f^{22} u_2 + f^{23} u_3\\ f^{30} u_0 + f^{31} u_1 + f^{32} u_2 + f^{33} u_3\end{pmatrix} = \begin{pmatrix}v^0 \\ v^1 \\ v^2 \\ v^3\end{pmatrix} \iff f^{\alpha \beta} u_\beta = v^\alpha$$

$$\begin{pmatrix}u_0 & u_1 & u_2 & u_3\end{pmatrix} \begin{pmatrix}f^{00} & f^{01} & f^{02} & f^{03} \\ f^{10} & f^{11} & f^{12} & f^{13} \\ f^{20} & f^{21} & f^{22} & f^{23} \\ f^{30} & f^{31} & f^{32} & f^{33} \end{pmatrix}$$

$$ = \begin{pmatrix}u_0 f^{00} + u_1 f^{10} + u_2 f^{20} + u_3 f^{30} & u_0 f^{01} + u_1 f^{11} + u_2 f^{21} + u_3 f^{31} & u_0 f^{02} + u_1 f^{12} + u_2 f^{22} + u_3 f^{32} & u_0 f^{03} + u_1 f^{13} + u_2 f^{23} + u_3 f^{33}\end{pmatrix}$$

$$ = \begin{pmatrix} w^0 & w^1 & w^2 & w^3\end{pmatrix} \iff u_\alpha f^{\alpha \beta} = f^{\alpha \beta} u_\alpha = w^\beta$$ or $$f^{\beta \alpha} u_\beta = w^\alpha$$.

If f is symmetric, i.e., $$f^{\alpha \beta} = f^{\beta \alpha}$$, then $$v^\alpha = w^\alpha$$.
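The row-versus-column distinction above maps directly onto matrix products, and the symmetric case can be checked numerically. This is an illustrative sketch with arbitrary component values, not part of the original text:

```python
import numpy as np

# A symmetric bivector as a square matrix (f^{ab} = f^{ba}), values arbitrary.
f = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 5.0, 6.0, 7.0],
              [3.0, 6.0, 8.0, 9.0],
              [4.0, 7.0, 9.0, 0.0]])
u = np.array([1.0, 0.0, 2.0, -1.0])  # a one-form u_alpha

v = f @ u  # column form: v^alpha = f^{alpha beta} u_beta
w = u @ f  # row form:    w^beta  = u_alpha f^{alpha beta}

# Because f is symmetric, both contractions give the same components:
print(np.allclose(v, w))  # True
```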