User:Aravind V R/Formulas/Linear Algebra

Basic
A square matrix $A$ with entries $$a_{ij}$$ is called
 * Hermitian or self-adjoint if $A = A^*$, i.e., $$a_{ij}=\overline{a_{ji}}$$.
 * skew-Hermitian or antihermitian if $A = -A^*$, i.e., $$a_{ij}=-\overline{a_{ji}}$$.
 * normal if $A^*A = AA^*$.
 * unitary if $A^* = A^{-1}$.
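
A minimal numerical sketch of these definitions, assuming small hand-picked example matrices; `np.allclose` absorbs floating-point round-off:

```python
import numpy as np

def adjoint(M):
    """Conjugate transpose M* = conj(M)^T."""
    return M.conj().T

A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])            # Hermitian: A = A*
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)   # unitary: U* = U^{-1}

print(np.allclose(A, adjoint(A)))                   # Hermitian -> True
print(np.allclose(A, -adjoint(A)))                  # skew-Hermitian -> False here
print(np.allclose(adjoint(A) @ A, A @ adjoint(A)))  # normal -> True (Hermitian implies normal)
print(np.allclose(adjoint(U) @ U, np.eye(2)))       # unitary -> True
```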

More
A matrix is called
 * Toeplitz matrix or diagonal-constant matrix, if each descending diagonal from left to right is constant. It is not necessarily square; e.g.:

$$\begin{bmatrix} a & b & c & d & e \\ f & a & b & c & d \\ g & f & a & b & c \\ h & g & f & a & b \\ i & h & g & f & a \end{bmatrix}.$$
 * Hankel matrix (or catalecticant matrix), if it is a square matrix in which each ascending skew-diagonal from left to right is constant, e.g.:

$$\begin{bmatrix} a & b & c & d & e \\ b & c & d & e & f \\ c & d & e & f & g \\ d & e & f & g & h \\ e & f & g & h & i \end{bmatrix}.$$
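
Both structured matrices can be built directly in SciPy; a short sketch with assumed numeric example data (`toeplitz` takes the first column and first row, `hankel` the first column and last row):

```python
import numpy as np
from scipy.linalg import hankel, toeplitz

T = toeplitz(c=[1, 6, 7, 8, 9], r=[1, 2, 3, 4, 5])
H = hankel(c=[1, 2, 3, 4, 5], r=[5, 6, 7, 8, 9])

# Every descending diagonal of T is constant ...
assert all(len(set(np.diag(T, k))) == 1 for k in range(-4, 5))
# ... and every ascending skew-diagonal of H is constant
# (flipping left-right turns skew-diagonals into ordinary diagonals).
assert all(len(set(np.diag(np.fliplr(H), k))) == 1 for k in range(-4, 5))
```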

Eigenvalue relationships

For an $n \times n$ matrix $A$ with eigenvalues $\lambda_1, \dotsc, \lambda_n$ (counted with multiplicity):
 * $$\operatorname{tr}(A) = \sum_i \lambda_i$$.
 * $$\operatorname{det}(A) = \prod_i \lambda_i$$.
 * $$\operatorname{tr}(A^k) = \sum_i \lambda_i^k$$.

The spectral radius of a square matrix or a bounded linear operator is the largest absolute value of its eigenvalues.
 * $$\rho(A) = \max \left \{ |\lambda_1|, \dotsc, |\lambda_n| \right \}.$$
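
A quick numerical check of these identities and of the spectral radius, assuming a random $4 \times 4$ test matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
lam = np.linalg.eigvals(A)            # eigenvalues, with multiplicity

print(np.isclose(np.trace(A), lam.sum().real))        # tr(A) = sum of eigenvalues
print(np.isclose(np.linalg.det(A), lam.prod().real))  # det(A) = product of eigenvalues
k = 3
print(np.isclose(np.trace(np.linalg.matrix_power(A, k)),
                 (lam**k).sum().real))                # tr(A^k) = sum of k-th powers
rho = np.abs(lam).max()                               # spectral radius rho(A)
```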

Pseudoinverse
For $$ A \in \mathrm{M}(m,n;K) $$, a Moore–Penrose pseudoinverse of $$ A $$ is defined as a matrix $$ A^+ \in \mathrm{M}(n, m; K)$$ satisfying all of the following four criteria, known as the Moore–Penrose conditions:
 * 1) $$A A^+A = A\,\!$$       ($AA^{+}$ need not be the identity matrix, but it maps each column vector of $A$ to itself);
 * 2) $$A^+A A^+ = A^+\,\!$$       ($A^{+}$ is a weak inverse for the multiplicative semigroup);
 * 3) $$(AA^+)^* = AA^+\,\!$$       ($AA^{+}$ is Hermitian); and
 * 4) $$(A^+A)^* = A^+A\,\!$$       ($A^{+}A$ is also Hermitian).
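
The four conditions can be spot-checked numerically against `np.linalg.pinv` (which computes $A^+$ via the SVD); a sketch with an assumed random test matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
Ap = np.linalg.pinv(A)                          # A^+ has shape (3, 5)

print(np.allclose(A @ Ap @ A, A))               # 1) A A^+ A = A
print(np.allclose(Ap @ A @ Ap, Ap))             # 2) A^+ A A^+ = A^+
print(np.allclose((A @ Ap).conj().T, A @ Ap))   # 3) A A^+ is Hermitian
print(np.allclose((Ap @ A).conj().T, Ap @ A))   # 4) A^+ A is Hermitian
```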

When $$ A $$ has full rank, $$ A^+ $$ has a simple closed form. When $$A$$ has linearly independent columns (full column rank), $$A^+$$ is a left inverse:
 * $$ A^+ = (A^* A)^{-1} A^* \,.$$

When $$A$$ has linearly independent rows (full row rank), $$A^+$$ is a right inverse:
 * $$ A^+ = A^* (A A^*)^{-1} \,.$$
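
Both closed forms agree with the SVD-based `np.linalg.pinv`; a sketch with assumed random real matrices (so $A^* = A^T$), noting that forming $A^*A$ explicitly can be ill-conditioned in practice:

```python
import numpy as np

rng = np.random.default_rng(2)

A = rng.standard_normal((6, 3))          # linearly independent columns (almost surely)
left = np.linalg.inv(A.T @ A) @ A.T      # A^+ = (A* A)^{-1} A*
print(np.allclose(left, np.linalg.pinv(A)))
print(np.allclose(left @ A, np.eye(3)))  # left inverse: A^+ A = I

B = rng.standard_normal((3, 6))          # linearly independent rows (almost surely)
right = B.T @ np.linalg.inv(B @ B.T)     # A^+ = A* (A A*)^{-1}
print(np.allclose(B @ right, np.eye(3))) # right inverse: A A^+ = I
```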

Operator theory

 * A bilinear map is a function combining elements of two vector spaces to yield an element of a third vector space, and is linear in each of its arguments.
 * A bilinear form on a vector space V is a bilinear map V × V → K, where K is the field of scalars. That is, for all u, v, w ∈ V and all scalars λ ∈ K:
 * B(u + v, w) = B(u, w) + B(v, w)    and     B(λu, v) = λB(u, v)
 * B(u, v + w) = B(u, v) + B(u, w)    and     B(u, λv) = λB(u, v)
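
On $V = \mathbb{R}^n$, every matrix $M$ induces a bilinear form $B(u,v) = u^{T} M v$; a sketch spot-checking the axioms above, assuming random test vectors and an arbitrary scalar:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3))
B = lambda u, v: u @ M @ v        # B(u, v) = u^T M v

u, v, w = rng.standard_normal((3, 3))   # three random vectors in R^3
lam = 2.5
print(np.isclose(B(u + v, w), B(u, w) + B(v, w)))  # additive in the first slot
print(np.isclose(B(lam * u, v), lam * B(u, v)))    # homogeneous in the first slot
print(np.isclose(B(u, v + w), B(u, v) + B(u, w)))  # additive in the second slot
print(np.isclose(B(u, lam * v), lam * B(u, v)))    # homogeneous in the second slot
```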


 * Symmetric bilinear form: A bilinear form is symmetric if $$B(u,v)=B(v,u) \quad \forall u,v \in V$$
 * Skew-symmetric bilinear form: A bilinear form is skew-symmetric if $$B(u,v)=-B(v,u) \quad \forall u,v \in V$$
 * A bilinear form B is called non-degenerate if for every $$v \neq 0$$ in V there exists $$w \in V$$ such that $$B(w,v) \neq 0$$
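
For $B(u,v) = u^{T} M v$ on $\mathbb{R}^n$ these properties reduce to conditions on $M$: symmetric iff $M = M^{T}$, skew-symmetric iff $M = -M^{T}$, and non-degenerate iff $M$ is invertible (since $B(w,v)=0$ for all $w$ forces $Mv = 0$). A sketch, assuming a random $M$:

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3))

sym = (M + M.T) / 2          # symmetric part:  B_sym(u, v) =  B_sym(v, u)
skew = (M - M.T) / 2         # skew part:       B_skew(u, v) = -B_skew(v, u)
print(np.allclose(sym, sym.T), np.allclose(skew, -skew.T))

# Non-degenerate iff det(M) != 0 (true almost surely for a random M).
print(not np.isclose(np.linalg.det(M), 0))
```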