Dimension theorem for vector spaces

In mathematics, the dimension theorem for vector spaces states that all bases of a vector space have equally many elements. This number of elements may be finite or infinite (in the latter case, it is a cardinal number), and defines the dimension of the vector space.

Formally, the dimension theorem for vector spaces states that:

Given a vector space $V$, any two bases of $V$ have the same cardinality.
As a basis is a generating set that is linearly independent, the dimension theorem is a consequence of the following theorem, which is also useful:

Let $V$ be a vector space. If $I$ is a linearly independent subset of $V$ and $G$ is a generating subset of $V$, then the cardinality of $I$ is not larger than the cardinality of $G$.
In particular, if $V$ is finitely generated, then all its bases are finite and have the same number of elements.
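As a concrete finite-dimensional illustration (a minimal numerical sketch assuming NumPy; the two bases are hypothetical choices), any two bases of $\mathbb{R}^3$ have exactly three elements. A set of $n$ vectors in $\mathbb{R}^n$ is a basis exactly when the matrix they form has rank $n$:

```python
import numpy as np

# Two different bases of R^3 (illustrative choices).
standard = np.eye(3)                       # e1, e2, e3
other = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])

# Each set of 3 vectors has full rank 3, so each spans R^3 and is
# linearly independent, i.e. each is a basis.
assert np.linalg.matrix_rank(standard) == 3
assert np.linalg.matrix_rank(other) == 3

# Same cardinality, as the dimension theorem predicts.
print(len(standard), len(other))   # prints: 3 3
```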

While the proof of the existence of a basis for any vector space in the general case requires Zorn's lemma and is in fact equivalent to the axiom of choice, the uniqueness of the cardinality of the basis requires only the ultrafilter lemma, which is strictly weaker (the proof given below, however, assumes trichotomy, i.e., that all cardinal numbers are comparable, a statement which is also equivalent to the axiom of choice). The theorem can be generalized to arbitrary $R$-modules for rings $R$ having invariant basis number.

In the finitely generated case, the proof uses only elementary arguments of algebra, and requires neither the axiom of choice nor its weaker variants.

Proof
Let $V$ be a vector space, $\{a_i : i \in I\}$ be a linearly independent set of elements of $V$, and $\{b_j : j \in J\}$ be a generating set. One has to prove that the cardinality of $I$ is not larger than that of $J$.

If $J$ is finite, this results from the Steinitz exchange lemma. (Indeed, the Steinitz exchange lemma implies that every finite subset of $I$ has cardinality not larger than that of $J$, hence $I$ is finite with cardinality not larger than that of $J$.) If $J$ is finite, a proof based on matrix theory is also possible.
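The matrix-theory argument in the finite case can be illustrated numerically: vectors lying in the span of $n$ generators form a matrix of rank at most $n$, so any collection of more than $n$ of them is linearly dependent. A sketch assuming NumPy; the dimensions and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three generators in R^5 (they span a subspace of dimension at most 3).
G = rng.standard_normal((3, 5))

# Four vectors, each a linear combination of the three generators.
coeffs = rng.standard_normal((4, 3))
A = coeffs @ G

# rank(A) <= rank(G) <= 3 < 4, so the four vectors cannot be linearly
# independent: an independent set is never larger than a generating set.
assert np.linalg.matrix_rank(A) <= 3
```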

Assume that $J$ is infinite. If $I$ is finite, there is nothing to prove. Thus, we may assume that $I$ is also infinite. Let us suppose that the cardinality of $I$ is larger than that of $J$. We have to prove that this leads to a contradiction.

By Zorn's lemma, every linearly independent set is contained in a maximal linearly independent set $K$. This maximality implies that $K$ spans $V$ and is therefore a basis (the maximality implies that every element of $V$ is linearly dependent on the elements of $K$, and therefore is a linear combination of elements of $K$). As the cardinality of $K$ is greater than or equal to the cardinality of $I$, one may replace $\{a_i : i \in I\}$ with $K$; that is, one may suppose, without loss of generality, that $\{a_i : i \in I\}$ is a basis.

Thus, every $b_j$ can be written as a finite sum $$b_j = \sum_{i\in E_j} \lambda_{i,j} a_i,$$ where $E_j$ is a finite subset of $I$. As $J$ is infinite, $\bigcup_{j \in J} E_j$ has cardinality not larger than that of $J$. Therefore $\bigcup_{j \in J} E_j$ has cardinality smaller than that of $I$. So there is some $i_0 \in I$ which does not appear in any $E_j$. The corresponding $a_{i_0}$ can be expressed as a finite linear combination of the $b_j$s, which in turn can be expressed as a finite linear combination of the $a_i$s, not involving $a_{i_0}$. Hence $a_{i_0}$ is linearly dependent on the other $a_i$s, which provides the desired contradiction.
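The cardinality comparison in this step can be spelled out. Assuming the usual arithmetic of infinite cardinals (which, as noted above, relies on the axiom of choice):

```latex
\left|\bigcup_{j \in J} E_j\right|
  \;\le\; \sum_{j \in J} |E_j|
  \;\le\; |J| \cdot \aleph_0
  \;=\; |J|
  \;<\; |I|,
```

since each $E_j$ is finite, $J$ is infinite, and the strict inequality is the assumption $|I| > |J|$ made above.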

Kernel extension theorem for vector spaces
This application of the dimension theorem is sometimes itself called the dimension theorem. Let

$$T : U \to V$$

be a linear transformation. Then

$$\dim(U) = \dim(\operatorname{range}(T)) + \dim(\ker(T)),$$

that is, the dimension of $U$ is equal to the dimension of the transformation's range plus the dimension of the kernel. See rank–nullity theorem for a fuller discussion.
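A numerical check of this identity, using the SVD to extract a basis of the kernel (a sketch assuming NumPy; the matrix $T$ is an arbitrary illustrative choice):

```python
import numpy as np

# A linear map T : R^4 -> R^3 given by a 3x4 matrix. The third row is the
# sum of the first two, so the rank is 2 and the kernel is 2-dimensional.
T = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(T)            # dimension of the range of T

# The rows of V^T beyond the rank correspond to zero singular values and
# span the null space of T.
_, _, vt = np.linalg.svd(T)
kernel_basis = vt[rank:]
assert np.allclose(T @ kernel_basis.T, 0)  # these vectors lie in ker(T)

# Rank-nullity: dim(domain) = dim(range) + dim(kernel).
assert rank + kernel_basis.shape[0] == T.shape[1]   # 2 + 2 == 4
```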