Cramér's theorem (large deviations)

Cramér's theorem is a fundamental result in the theory of large deviations, a subdiscipline of probability theory. It determines the rate function governing the large deviations of the empirical means of a sequence of iid random variables. A weak version of this result was first proved by Harald Cramér in 1938.

Statement
The logarithmic moment generating function (which is the cumulant-generating function) of a random variable $$ X_1 $$ is defined as:
 * $$ \Lambda(t)=\log \operatorname E [\exp(tX_1)]. $$

Let $$ X_1, X_2, \dots $$ be a sequence of iid real random variables with finite logarithmic moment generating function, i.e. $$ \Lambda(t) < \infty $$ for all $$ t \in \mathbb R $$.
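
For example, if $$ X_1 $$ is standard normal, then $$ \Lambda(t) = \log \operatorname E[\exp(tX_1)] = \tfrac{t^2}{2} $$, which is finite for all $$ t \in \mathbb R $$, so the theorem below applies.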

Then the Legendre transform of $$ \Lambda $$:
 * $$ \Lambda^*(x):= \sup_{t \in \mathbb R} \left(tx-\Lambda(t) \right) $$

satisfies
 * $$ \lim_{n \to \infty} \frac 1n \log \left(P\left(\sum_{i=1}^n X_i \geq nx \right)\right) = -\Lambda^*(x) $$

for all $$ x > \operatorname E[X_1]. $$
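
Continuing the standard normal example, $$ \Lambda(t) = \tfrac{t^2}{2} $$ and the supremum defining $$ \Lambda^* $$ is attained at $$ t = x $$, so
 * $$ \Lambda^*(x)= \sup_{t \in \mathbb R} \left(tx- \tfrac{t^2}{2} \right) = \frac{x^2}{2}, $$

and the theorem says that $$ P\left(\sum_{i=1}^n X_i \geq nx \right) $$ decays on the exponential scale like $$ \exp(-nx^2/2) $$ for $$ x > 0 $$.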

In the terminology of the theory of large deviations, the result can be reformulated as follows:

If $$ X_1, X_2, \dots $$ is a sequence of iid random variables, then the distributions $$ \left(\mathcal L ( \tfrac 1n \sum_{i=1}^n X_i) \right)_{n \in \mathbb N}$$ satisfy a large deviation principle with rate function $$ \Lambda^* $$.
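
As a numerical illustration, the following sketch (assuming Python 3.8+ with only the standard library; the helper names `rate_function` and `tail_probability` are illustrative choices, not standard API) checks the theorem for fair coin flips, i.e. $$ X_i \sim \operatorname{Bernoulli}(1/2) $$, where a direct computation of the Legendre transform gives $$ \Lambda^*(x) = \log 2 + x \log x + (1-x)\log(1-x) $$ for $$ 0 < x < 1 $$:

```python
import math

# Numerical check of Cramér's theorem for X_i ~ Bernoulli(1/2):
# -(1/n) log P(sum X_i >= n x) should approach Lambda*(x) as n grows.

def rate_function(x: float) -> float:
    """Lambda*(x) for fair coin flips, valid for 0 < x < 1."""
    return math.log(2) + x * math.log(x) + (1 - x) * math.log(1 - x)

def tail_probability(n: int, x: float) -> float:
    """Exact P(sum of n fair coin flips >= n*x), from the binomial distribution."""
    k_min = math.ceil(n * x)
    return sum(math.comb(n, k) for k in range(k_min, n + 1)) * 0.5 ** n

x = 0.6  # a threshold above the mean E[X_1] = 1/2
print(f"Lambda*({x}) = {rate_function(x):.4f}")
for n in (10, 100, 1000):
    empirical = -math.log(tail_probability(n, x)) / n
    print(f"n = {n:4d}: -(1/n) log P = {empirical:.4f}")
```

The empirical values approach $$ \Lambda^*(0.6) \approx 0.0201 $$ only slowly in $$ n $$, reflecting the polynomial prefactor that the exponential scale of the theorem does not capture.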