Kolmogorov's two-series theorem

In probability theory, Kolmogorov's two-series theorem is a criterion for the almost sure convergence of a series of independent random variables in terms of the convergence of two deterministic series: the series of expected values and the series of variances. It follows from Kolmogorov's inequality and is used in one proof of the strong law of large numbers.

Statement of the theorem
Let $$\left( X_n \right)_{n=1}^{\infty}$$ be independent random variables with expected values $$\mathbf{E} \left[ X_n \right] = \mu_n$$ and variances $$\mathbf{Var} \left( X_n \right) = \sigma_n^2$$, such that $$\sum_{n=1}^{\infty} \mu_n$$ converges in $$\mathbb{R}$$ and $$\sum_{n=1}^{\infty} \sigma_n^2$$ converges in $$\mathbb{R}$$. Then $$\sum_{n=1}^{\infty} X_n$$ converges in $$\mathbb{R}$$ almost surely.
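
A classical illustration is the random harmonic series $$\sum_{n=1}^{\infty} \varepsilon_n / n$$, where the $$\varepsilon_n$$ are independent random signs with $$\mathbb{P} \left( \varepsilon_n = 1 \right) = \mathbb{P} \left( \varepsilon_n = -1 \right) = \tfrac{1}{2}$$: here $$\mu_n = 0$$ and $$\sigma_n^2 = 1/n^2$$, so both series converge and the theorem gives almost sure convergence. The following minimal simulation sketch (the seed, term count, and checkpoints are arbitrary choices, not part of the theorem) shows the partial sums stabilising along one sample path:

```python
import numpy as np

# Random harmonic series: X_n = eps_n / n with independent signs
# eps_n in {-1, +1}. Then E[X_n] = 0 and Var(X_n) = 1/n^2, so both
# series in the two-series theorem converge.
rng = np.random.default_rng(0)

n_terms = 100_000
signs = rng.choice([-1.0, 1.0], size=n_terms)   # eps_1, ..., eps_n
terms = signs / np.arange(1, n_terms + 1)       # X_n = eps_n / n
partial_sums = np.cumsum(terms)                 # S_N for N = 1, ..., n_terms

# The late partial sums barely move, consistent with a.s. convergence.
print(partial_sums[[999, 9_999, 99_999]])
```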

Proof
Assume without loss of generality that $$\mu_n = 0$$ for every $$n$$: otherwise replace $$X_n$$ by $$X_n - \mu_n$$, and note that since $$\sum_{n=1}^{\infty} \mu_n$$ converges, $$\sum_{n=1}^{\infty} X_n$$ converges if and only if $$\sum_{n=1}^{\infty} \left( X_n - \mu_n \right)$$ does. Set $$S_N = \sum_{n=1}^N X_n$$; we will show that $$\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N = 0$$ with probability 1.

For every $$m \in \mathbb{N}$$, $$\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N = \limsup_{N \to \infty} \left( S_N - S_m \right) - \liminf_{N \to \infty} \left( S_N - S_m \right) \leq 2 \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{m+i} \right|,$$ since subtracting the fixed random variable $$S_m$$ changes neither the limit superior nor the limit inferior.
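
To see the last inequality, write $$T_k = \sum_{i=1}^{k} X_{m+i} = S_{m+k} - S_m$$. Every term of the sequence $$\left( T_k \right)_{k \geq 1}$$ lies in the interval $$\left[ - \sup_{k} \left| T_k \right|, \sup_{k} \left| T_k \right| \right]$$, of length $$2 \sup_{k} \left| T_k \right|$$, and the limit superior and limit inferior of a sequence differ by at most the length of any interval containing all of its terms.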

Thus, for every $$m \in \mathbb{N}$$ and $$\epsilon > 0$$, $$\begin{align} \mathbb{P} \left( \limsup_{N \to \infty} \left( S_N - S_m \right) - \liminf_{N \to \infty} \left( S_N - S_m \right) \geq \epsilon \right) &\leq \mathbb{P} \left( 2 \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{m+i} \right| \geq \epsilon \right) \\ &= \mathbb{P} \left( \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{m+i} \right| \geq \frac{\epsilon}{2} \right) \\ &\leq \limsup_{N \to \infty} 4\epsilon^{-2} \sum_{i=m+1}^{m+N} \sigma_i^2 \\ &= 4\epsilon^{-2} \lim_{N \to \infty} \sum_{i=m+1}^{m+N} \sigma_i^2 = 4\epsilon^{-2} \sum_{i=m+1}^{\infty} \sigma_i^2 \end{align}$$

Here the second inequality follows from Kolmogorov's inequality, applied for each fixed $$N$$ to the independent variables $$X_{m+1}, \ldots, X_{m+N}$$.
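
Explicitly, Kolmogorov's inequality gives, for each fixed $$N$$, $$\mathbb{P} \left( \max_{1 \leq k \leq N} \left| \sum_{i=1}^{k} X_{m+i} \right| \geq \frac{\epsilon}{2} \right) \leq \left( \frac{\epsilon}{2} \right)^{-2} \mathbf{Var} \left( \sum_{i=1}^{N} X_{m+i} \right) = 4\epsilon^{-2} \sum_{i=m+1}^{m+N} \sigma_i^2,$$ and the bound for the supremum over all $$k$$ follows by letting $$N \to \infty$$.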

By the assumption that $$\sum_{n=1}^{\infty} \sigma_n^2$$ converges, the last term is the tail of a convergent series, so it tends to 0 as $$m \to \infty$$ for every $$\epsilon > 0$$. Since the left-hand side equals $$\mathbb{P} \left( \limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \geq \epsilon \right)$$, which does not depend on $$m$$, this probability is 0 for every $$\epsilon > 0$$. Letting $$\epsilon \to 0$$ along a countable sequence yields $$\limsup_{N \to \infty} S_N = \liminf_{N \to \infty} S_N$$ almost surely, so $$\left( S_N \right)$$ converges almost surely, which proves the theorem.
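
For the random-sign example above, the tail bound in the proof can also be checked numerically. The sketch below (the horizon, sample size, and value of $$\epsilon$$ are arbitrary choices) estimates $$\mathbb{P} \left( \max_{1 \leq k \leq N} \left| \sum_{i=1}^{k} X_{m+i} \right| \geq \epsilon / 2 \right)$$ for growing $$m$$, with the supremum truncated at a finite horizon $$N$$, alongside the bound $$4\epsilon^{-2} \sum_{i=m+1}^{m+N} \sigma_i^2$$:

```python
import numpy as np

# Monte Carlo estimate of the tail probability in the proof for
# X_n = eps_n / n, compared with the Kolmogorov-inequality bound.
rng = np.random.default_rng(1)
eps, horizon, trials = 0.5, 5_000, 1_000

for m in (0, 10, 100, 1_000):
    idx = np.arange(m + 1, m + horizon + 1)          # indices m+1, ..., m+N
    signs = rng.choice([-1.0, 1.0], size=(trials, horizon))
    tails = np.cumsum(signs / idx, axis=1)           # partial sums of the tail
    freq = (np.abs(tails).max(axis=1) >= eps / 2).mean()
    bound = 4 / eps**2 * np.sum(1.0 / idx**2)        # 4 eps^-2 * sum of variances
    print(f"m={m:5d}  empirical={freq:.3f}  bound={bound:.3f}")
```

Both columns shrink towards 0 as $$m$$ grows, matching the final step of the proof.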