Information source (mathematics)

In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution.
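For example, a sequence of independent tosses of a fair coin, taking values in the alphabet Γ = {0, 1}, is an information source; since the tosses are independent and identically distributed, the distribution is trivially stationary.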

The uncertainty, or entropy rate, of an information source is defined as


 * $$H\{\mathbf{X}\} = \lim_{n\to\infty} H(X_n | X_0, X_1, \dots, X_{n-1})$$

where


 * $$X_0, X_1, \dots, X_n, \dots$$

is the sequence of random variables defining the information source, and


 * $$H(X_n | X_0, X_1, \dots, X_{n-1})$$

is the conditional information entropy of X_n given the preceding random variables. Equivalently, one has


 * $$H\{\mathbf{X}\} = \lim_{n\to\infty} \frac{H(X_0, X_1, \dots, X_{n-1}, X_n)}{n+1}.$$
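For the fair-coin source mentioned above, independence means that conditioning on the past does not change the distribution of the next symbol, so every term in the first limit equals H(X_0):


 * $$H\{\mathbf{X}\} = \lim_{n\to\infty} H(X_n | X_0, X_1, \dots, X_{n-1}) = H(X_0) = \log 2,$$

that is, one bit per symbol when the logarithm is taken to base 2; the second limit gives the same value, since the joint entropy of n + 1 independent tosses is (n + 1) log 2.

For a source with memory the limits are no longer trivial, but they can be checked numerically. The following sketch is illustrative only: it assumes a small two-state Markov source whose transition matrix P is chosen arbitrarily, computes the per-symbol block entropy H(X_0, ..., X_n)/(n + 1) exactly for increasing n, and compares it with the closed-form entropy rate of a stationary Markov chain.

```python
import itertools
import math

import numpy as np

# Illustrative two-state Markov source over the alphabet {0, 1}.
# P[i, j] is the probability of emitting symbol j when the previous symbol was i.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution pi: the left eigenvector of P for eigenvalue 1,
# normalised so that its entries sum to one.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

# Closed-form entropy rate of a stationary Markov chain, in bits:
#   H = -sum_i pi_i sum_j P_ij log2 P_ij
rate = -sum(pi[i] * P[i, j] * math.log2(P[i, j])
            for i in range(2) for j in range(2))

def block_entropy(n):
    """Exact joint entropy H(X_0, ..., X_n) in bits, summed over all 2**(n+1) blocks."""
    h = 0.0
    for block in itertools.product(range(2), repeat=n + 1):
        p = pi[block[0]]
        for a, b in zip(block, block[1:]):
            p *= P[a, b]
        h -= p * math.log2(p)
    return h

# H(X_0, ..., X_n) / (n + 1) approaches the entropy rate as n grows.
for n in (1, 4, 8, 12):
    print(f"n = {n:2d}: block entropy per symbol = {block_entropy(n) / (n + 1):.4f}, "
          f"closed form = {rate:.4f}")
```

The per-symbol block entropy decreases toward the closed-form rate (about 0.57 bits for this particular matrix), illustrating that the two definitions above agree for a stationary source.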