
In probability theory, the probability distribution of the sum of two independent random variables is the convolution of their individual distributions. Many common distributions have well-known closed-form convolutions. The following is a list of such convolutions. Each statement is of the form
 * $$\sum_{i=1}^n X_i \sim Y$$

where $$X_1, X_2, \ldots, X_n\,$$ are mutually independent random variables (and identically distributed when the parameters do not vary with $$i$$).

Discrete Distributions

 * $$\sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p)$$
 * $$\sum_{i=1}^n \mathrm{Binomial}(n_i,p) \sim \mathrm{Binomial}(\sum_{i=1}^n n_i,p)$$
 * $$\sum_{i=1}^n \mathrm{NegativeBinomial}(n_i,p) \sim \mathrm{NegativeBinomial}(\sum_{i=1}^n n_i,p)$$
 * $$\sum_{i=1}^n \mathrm{Geometric}(p) \sim \mathrm{NegativeBinomial}(n,p)$$
 * $$\sum_{i=1}^n \mathrm{Poisson}(\lambda_i) \sim \mathrm{Poisson}(\sum_{i=1}^n \lambda_i)$$
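As an illustrative sanity check (not part of the original list), the first identity, that a sum of $$n$$ independent Bernoulli($$p$$) variables is Binomial($$n,p$$), can be verified by Monte Carlo simulation. The sample size, seed, and parameter values below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 10, 0.3, 200_000

# Each row is n independent Bernoulli(p) draws; summing a row gives
# one realization of Y = X_1 + ... + X_n.
sums = rng.binomial(1, p, size=(trials, n)).sum(axis=1)

# Binomial(n, p) has mean n*p and variance n*p*(1-p); the empirical
# moments of the simulated sums should match these closely.
print(sums.mean())  # close to n*p = 3.0
print(sums.var())   # close to n*p*(1-p) = 2.1
```

A full distributional comparison (e.g. a chi-squared goodness-of-fit test against the Binomial pmf) would be stronger, but matching the first two moments already catches most mistakes.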

Continuous Distributions

 * $$\sum_{i=1}^n \mathrm{Normal}(\mu_i,\sigma_i^2) \sim \mathrm{Normal}(\sum_{i=1}^n \mu_i, \sum_{i=1}^n \sigma_i^2)$$
 * $$\sum_{i=1}^n \mathrm{Gamma}(\alpha_i,\beta) \sim \mathrm{Gamma}(\sum_{i=1}^n \alpha_i,\beta)$$
 * $$\sum_{i=1}^n \mathrm{Exponential}(\theta) \sim \mathrm{Gamma}(n,\theta)$$
 * $$\sum_{i=1}^n \chi^2(r_i) \sim \chi^2(\sum_{i=1}^n r_i)$$
 * $$\sum_{i=1}^r X_i^2 \sim \chi^2_r \qquad \mathrm{where} \quad X_i \sim N(0,1)$$
 * $$\sum_{i=1}^n(X_i - \bar X)^2 \sim \sigma^2 \chi^2_{n-1} \qquad \mathrm{where} \quad X_i \sim N(\mu,\sigma^2) \quad \mathrm{and} \quad \bar X = \frac{1}{n} \sum_{i=1}^n X_i$$
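As with the discrete case, these continuous identities lend themselves to simulation checks. A minimal sketch for the Exponential-to-Gamma relation, using the shape/scale parameterization in which Exponential($$\theta$$) has mean $$\theta$$ and Gamma($$n,\theta$$) has mean $$n\theta$$ and variance $$n\theta^2$$ (parameter values and sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta, trials = 5, 2.0, 200_000   # theta is the scale (mean) parameter

# Sum n independent Exponential(theta) draws per trial; by the identity
# above each sum should be distributed as Gamma(n, theta).
sums = rng.exponential(scale=theta, size=(trials, n)).sum(axis=1)

# Gamma(n, theta) has mean n*theta = 10 and variance n*theta**2 = 20.
print(sums.mean())
print(sums.var())
```

Note that some texts parameterize the Exponential and Gamma families by the rate $$1/\theta$$ instead; the identity holds either way as long as both sides use the same convention.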

Example Proof
There are various ways to prove the above relations. A straightforward technique is to use the moment generating function, which (when it exists in a neighborhood of zero) uniquely determines the distribution.

Proof that $$\sum_{i=1}^n \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n,p)$$

 * $$X_i \sim \mathrm{Bernoulli}(p) \quad 0<p<1 \quad 1 \le i \le n$$
 * $$Y=\sum_{i=1}^n X_i$$
 * $$Z \sim \mathrm{Binomial}(n,p) \,\!$$

The moment generating functions of each $$X_i$$ and of $$Z$$ are
 * $$M_{X_i}(t)=1-p+pe^t \qquad M_Z(t)=(1-p+pe^t)^n$$

where $$t$$ is within some neighborhood of zero.


 * $$M_Y(t)=E\left(e^{t\sum_{i=1}^n X_i}\right)=E\left(\prod_{i=1}^n e^{tX_i}\right)=\prod_{i=1}^n E\left(e^{tX_i}\right)=\prod_{i=1}^n (1-p+pe^t)=(1-p+pe^t)^n=M_Z(t)$$

The expectation of the product is the product of the expectations because the $$X_i$$ are independent. Since $$Y$$ and $$Z$$ have the same moment generating function, they must have the same distribution.
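The final equality of moment generating functions can also be checked numerically (a supplementary check, not part of the original proof): the product form $$(1-p+pe^t)^n$$ should agree with the MGF computed directly from the Binomial pmf, $$\sum_{k=0}^n \binom{n}{k} p^k (1-p)^{n-k} e^{tk}$$. The parameter values below are arbitrary:

```python
import math

n, p = 7, 0.4

for t in (-1.0, 0.0, 0.5, 1.0):
    # Product of the n identical Bernoulli MGFs.
    product_form = (1 - p + p * math.exp(t)) ** n
    # Binomial(n, p) MGF computed term by term from its pmf.
    pmf_form = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
                   for k in range(n + 1))
    assert math.isclose(product_form, pmf_form, rel_tol=1e-12)
```

This is just the binomial theorem in disguise: expanding $$(q + pe^t)^n$$ with $$q = 1-p$$ reproduces the pmf sum exactly.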