User:PatrickFoucault/sandbox

Vectors represent objects or quantities that are often required to be invariant under some group of transformations, such as a change of basis. When the basis changes, the vector components must change so as to preserve this invariance. The components of a contravariant vector transform with the inverse of the basis transformation, while the components of a covariant vector transform with the same transformation as the basis.

Simple Example
A point in space has no meaning unless accompanied by some method of describing it; the description involves the relationship of the point to other points, one of which may be the origin of some initial coordinate system. When the coordinate system is changed by a change of basis, relationships between points, such as distance, may be required to be invariant.

The Einstein summation convention is used; to make it clear when this convention is employed, only $$k$$ is used as the dummy summation index, and $$k$$ is not used in any other context.

An arbitrary point in $$\mathbb{R}^n$$ has coordinates $$\textbf{p}_O=(p^1,\cdots,p^n)$$. Introduce any basis vectors $$X=\{\textbf{x}_1,\cdots,\textbf{x}_n\}$$, with $$\textbf{x}_j  =(x_j^1,\cdots,x_j^n)$$, which generate the coordinate vector $$\textbf{p}_{X}=(v^1,\cdots,v^n)$$ such that $$\textbf{p}_O= v^k\textbf{x}_k$$ and hence $$p^i=x^i_kv^k$$. In matrix notation $$\textbf{p}_O= \textbf{A}_{X} \textbf{p}_X$$, with $$x^i_j$$ the $$(i,j)$$ element of matrix $$\textbf{A}_{X}$$ and $$\textbf{p}_O$$, $$\textbf{p}_X$$ treated as column vectors.
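As a concrete check, the expansion $$\textbf{p}_O= \textbf{A}_{X} \textbf{p}_X$$ can be sketched numerically for $$n=2$$; the basis matrix and components below are arbitrary illustrative choices, not values from the text:

```python
import numpy as np

# Columns of A_X are the basis vectors x_1, x_2 (illustrative values).
A_X = np.array([[2.0, 1.0],
                [0.0, 1.0]])

# Components v^1, v^2 of the point relative to the basis X.
p_X = np.array([3.0, 4.0])

# p_O = A_X p_X, i.e. p^i = x^i_k v^k (summation over k).
p_O = A_X @ p_X
print(p_O)  # -> [10.  4.]
```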

Alternative basis vectors $$E=\{\textbf{e}_1,\cdots,\textbf{e}_n\}$$ with $$\textbf{e}_j  =(e_j^1,\cdots,e_j^n)$$ generating $$\textbf{p}_{E}=(r^1,\cdots,r^n)$$ such that $$\textbf{p}_O= r^k\textbf{e}_k$$, $$p^i =e^i_kr^k$$, $$\textbf{p}_O=  \textbf{A}_{E} \textbf{p}_E$$.

Write $$\textbf{e}_j= b^k_j\textbf{x}_k$$ then $$e^i_j= x^i_kb^k_j$$ and $$\textbf{A}_E = \textbf{A}_X\textbf{B}_E$$ with $$e^i_j$$, $$b^i_j$$ the $$(i,j)$$ element of matrices $$\textbf{A}_E$$, $$\textbf{B}_E$$.

From $$\textbf{A}_X \textbf{p}_X = \textbf{A}_E \textbf{p}_E =  \textbf{A}_X\textbf{B}_E \textbf{p}_E$$, the component transformation $$\textbf{p}_X\mapsto  \textbf{p}_E$$ is given by $$\textbf{p}_E = \textbf{B}^{-1}_E\textbf{p}_X$$: the components transform with the inverse of the basis transformation $$X\mapsto E$$, and are therefore contravariant.
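The whole chain, basis change $$\textbf{A}_E = \textbf{A}_X\textbf{B}_E$$ followed by the contravariant component transformation $$\textbf{p}_E = \textbf{B}^{-1}_E\textbf{p}_X$$, can be verified numerically; all matrix values here are illustrative assumptions:

```python
import numpy as np

# Illustrative bases for two dimensions: columns are the basis vectors.
A_X = np.array([[2.0, 1.0],
                [0.0, 1.0]])
B_E = np.array([[1.0, 1.0],
                [1.0, 2.0]])   # change of basis, e_j = b^k_j x_k
A_E = A_X @ B_E                # so e^i_j = x^i_k b^k_j

# A point's components relative to X, and the point itself.
p_X = np.array([3.0, 4.0])
p_O = A_X @ p_X

# Components transform with the INVERSE of the basis change (contravariant).
p_E = np.linalg.inv(B_E) @ p_X

# Both expansions recover the same point p_O.
assert np.allclose(A_E @ p_E, p_O)
```

The final assertion is the invariance requirement: the point $$\textbf{p}_O$$ is unchanged even though its components differ between the two bases.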

This is simplistic and requires only an elementary understanding of a vector, though it still leaves elementary manipulations to the reader. It ties in with the idea of a matrix as a transformation and leads into the more difficult concepts of a dual space, covariant transformations and tensors. The notation is consistent with elementary tensor notation.

I advocate the article be organised along the following lines. The demonstrations should be clear about the exact transformations.
 * A very brief heuristic introduction about components transforming contra or co to the basis
 * Brief history and etymology sections
 * The above demonstration that a position vector always transforms contra. This also establishes the agreed notation.
 * A similar demonstration regarding a vector that we would like to transform co
 * Statement and link regarding the dual space and linear functionals
 * Demonstration of a co transformation using a specific linear functional which should be the dot product.
 * Discussion and links regarding tangents and normals and their use as bases.
 * Mention of how this develops into tensors with links.