Determinants
Determinants are closely linked to both the vector cross product and the scalar triple product. The following formulas are commonly used for calculations in the Cartesian coordinate system:
 * $$\mathbf{a} \times \mathbf{b} = \begin{vmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} \\ a_x & a_y & a_z \\ b_x & b_y & b_z \end{vmatrix}, \qquad \langle \mathbf{a}, \mathbf{b}, \mathbf{c} \rangle = \begin{vmatrix} a_x & a_y & a_z \\ b_x & b_y & b_z \\ c_x & c_y & c_z \end{vmatrix}$$
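The two determinant formulas can be sketched in code. This is an illustrative snippet, not library code; the function names `cross` and `triple` are our own:

```python
# Cross product and scalar triple product via cofactor expansion
# of the 3x3 determinants above.

def cross(a, b):
    """Cross product a x b, from the cofactors of the first row
    (i, j, k) of the determinant formula."""
    return (a[1] * b[2] - a[2] * b[1],   # i-component
            a[2] * b[0] - a[0] * b[2],   # j-component
            a[0] * b[1] - a[1] * b[0])   # k-component

def triple(a, b, c):
    """Scalar triple product <a, b, c> = a . (b x c),
    i.e. the 3x3 determinant with rows a, b, c."""
    bc = cross(b, c)
    return a[0] * bc[0] + a[1] * bc[1] + a[2] * bc[2]
```

For the standard basis, `cross((1, 0, 0), (0, 1, 0))` gives `(0, 0, 1)` and the triple product of the three basis vectors is 1, as expected for a right-handed system.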

Another example is using a matrix determinant to distinguish between the clockwise and counter-clockwise orientations of three vertices $$\mathbf{a}, \mathbf{b}, \mathbf{c}$$ in the plane:
 * $$\begin{vmatrix} 1 & a_x & a_y \\ 1 & b_x & b_y \\ 1 & c_x & c_y \end{vmatrix}$$

This corresponds to embedding the plane in $$\mathbb{R}^3$$ as the plane z = 1 and then taking the triple product of the vertices' radius vectors. Indeed, this quantity's sign is preserved under rotations of the plane, but it changes under reflection through the origin of $$\mathbb{R}^3$$ (and thus under all improper rotations).
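The orientation test above can be sketched directly from the determinant. A minimal example, with an illustrative function name of our own choosing:

```python
# Classify the orientation of three plane points via the sign of the
# 3x3 determinant with rows (1, a_x, a_y), (1, b_x, b_y), (1, c_x, c_y).

def orientation(a, b, c):
    """Expand the determinant along its first column. The result is
    positive for one orientation of (a, b, c), negative for the
    other, and zero when the points are collinear."""
    return ((b[0] * c[1] - b[1] * c[0])
            - (a[0] * c[1] - a[1] * c[0])
            + (a[0] * b[1] - a[1] * b[0]))
```

With the usual axis conventions, `orientation((0, 0), (1, 0), (0, 1))` is positive (counter-clockwise), and swapping the last two vertices flips the sign.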

However, the last example has an artificial feel to it, because the construction is not straightforward and involves several arbitrary choices (there is no preferred way of embedding the plane for this purpose, as long as the embedded plane does not pass through the origin: if it does, the radius vectors become coplanar and their triple product vanishes). The wedge product provides a much more direct solution that does not involve embedding the plane in a higher-dimensional space. Consider the volume form, or pseudoscalar,
 * $$(\mathbf{b} - \mathbf{a}) \wedge (\mathbf{c} - \mathbf{a}) = \alpha \, \mathbf{e}_1 \wedge \mathbf{e}_2$$.

The behavior of sign(α) is exactly the same as that of the determinant above, so it can be used to distinguish between the two orientations as well.
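A sketch of the wedge-product version, using the edge-vector ordering $$(\mathbf{b} - \mathbf{a}) \wedge (\mathbf{c} - \mathbf{a})$$; reversing the order of the two factors flips the sign of α. The helper names are our own:

```python
# For 2D vectors u, v, the coefficient of e1 ^ e2 in u ^ v is
# u_x * v_y - u_y * v_x (the 2x2 determinant with rows u, v).

def wedge2(u, v):
    """Coefficient of e1 ^ e2 in u ^ v."""
    return u[0] * v[1] - u[1] * v[0]

def alpha(a, b, c):
    """alpha such that (b - a) ^ (c - a) = alpha * e1 ^ e2,
    computed entirely inside the plane, with no embedding."""
    u = (b[0] - a[0], b[1] - a[1])
    v = (c[0] - a[0], c[1] - a[1])
    return wedge2(u, v)
```

Swapping any two of the three vertices flips the sign of α, just as swapping two rows flips the sign of the determinant.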

The connection between the exterior product and matrix determinants can be illustrated by the following example. Let V be an n-dimensional vector space. Select a basis $$\mathbf{e}_1, \ldots, \mathbf{e}_n$$ of V and the dual basis $$\mathbf{e}^1, \ldots, \mathbf{e}^n$$ of V*. Then the following equality holds:
 * $$\mathbf{e}^{i_1} \wedge \ldots \wedge \mathbf{e}^{i_k}(\mathbf{v}_1, \ldots, \mathbf{v}_k) = \begin{vmatrix} v_1^{i_1} & v_1^{i_2} & \cdots & v_1^{i_k} \\ v_2^{i_1} & v_2^{i_2} & \cdots & v_2^{i_k} \\ \vdots & \vdots & \ddots & \vdots \\ v_k^{i_1} & v_k^{i_2} & \cdots & v_k^{i_k} \end{vmatrix}$$, where k ≤ n, $$\mathbf{v}_1, \ldots, \mathbf{v}_k$$ are vectors in V, and $$\mathbf{v}_i = v_i^j \mathbf{e}_j$$.
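The evaluation of such a k-form can be sketched as code: pick out the components $$v_r^{i_s}$$ and take the k × k determinant. A minimal illustration with 0-based indices and our own helper names:

```python
# Evaluate e^{i_1} ^ ... ^ e^{i_k} on vectors given by their
# component lists in the basis e_1, ..., e_n.

def det(m):
    """Determinant by cofactor expansion along the first row
    (adequate for the small k used here)."""
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j in range(len(m)):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def eval_form(indices, vectors):
    """(e^{i_1} ^ ... ^ e^{i_k})(v_1, ..., v_k): the determinant of
    the k x k matrix whose (r, s) entry is the i_s-th component of
    v_r. Indices are 0-based here."""
    return det([[v[i] for i in indices] for v in vectors])
```

For example, $$\mathbf{e}^1 \wedge \mathbf{e}^2$$ evaluated on the first two standard basis vectors of $$\mathbb{R}^3$$ gives 1, and swapping the two vectors gives −1, exhibiting the antisymmetry of the form.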