Blackwell-Girshick equation

The Blackwell-Girshick equation is an equation in probability theory for computing the variance of a random sum of random variables. It is the analogue, for the variance, of Wald's equation, which gives the expectation of such random sums.

It is named after David Blackwell and Meyer Abraham Girshick.

Statement
Let $$ N $$ be a random variable taking values in $$ \mathbb{Z}_{\ge 0} $$, let $$X_1, X_2, X_3, \dots$$ be independent and identically distributed random variables that are also independent of $$N$$, and assume that $$N$$ and the $$X_i$$ have finite second moments. Then the random variable defined by
 * $$Y:=\sum_{i=1}^NX_i$$

has the variance
 * $$\operatorname{Var}(Y)=\operatorname{Var}(N)\operatorname{E}(X_1)^2+\operatorname{E}(N)\operatorname{Var}(X_1)$$.
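The identity can be verified exactly for small discrete distributions by enumerating the distribution of $$Y$$ directly. The following sketch (the distribution choices are illustrative, not from the text) takes $$N$$ uniform on $$\{0,1,2,3\}$$ and the $$X_i$$ Bernoulli with parameter $$3/10$$, computes $$\operatorname{Var}(Y)$$ by exact enumeration with rational arithmetic, and compares it with the right-hand side of the equation.

```python
from fractions import Fraction
from math import comb

# Illustrative choice: N uniform on {0, 1, 2, 3}, X_i ~ Bernoulli(p).
p = Fraction(3, 10)
n_values = [0, 1, 2, 3]
pN = {n: Fraction(1, 4) for n in n_values}

# Exact distribution of Y = X_1 + ... + X_N: given N = n, Y ~ Binomial(n, p).
pY = {}
for n in n_values:
    for k in range(n + 1):
        pY[k] = pY.get(k, Fraction(0)) + pN[n] * comb(n, k) * p**k * (1 - p)**(n - k)

EY = sum(k * q for k, q in pY.items())
VarY_direct = sum(k**2 * q for k, q in pY.items()) - EY**2

# Right-hand side of the Blackwell-Girshick equation.
EN = sum(n * q for n, q in pN.items())
VarN = sum(n**2 * q for n, q in pN.items()) - EN**2
EX, VarX = p, p * (1 - p)
VarY_formula = VarN * EX**2 + EN * VarX

assert VarY_direct == VarY_formula  # both equal 171/400
```

Because the computation uses `fractions.Fraction` throughout, the two sides agree exactly rather than up to floating-point error.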

The Blackwell-Girshick equation can be derived using conditional variance and variance decomposition. If the $$ X_i $$ are natural number-valued random variables, the derivation can be done elementarily using the chain rule and the probability-generating function.
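Concretely, the derivation via the law of total variance reads
 * $$\begin{align} \operatorname{Var}(Y) &= \operatorname{E}(\operatorname{Var}(Y\mid N)) + \operatorname{Var}(\operatorname{E}(Y\mid N))\\ &= \operatorname{E}(N\operatorname{Var}(X_1)) + \operatorname{Var}(N\operatorname{E}(X_1))\\ &= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{E}(X_1)^2\operatorname{Var}(N), \end{align}$$

using that, conditionally on $$N=n$$, $$Y$$ is a sum of $$n$$ i.i.d. copies of $$X_1$$.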

Proof
For each $$n\ge 0$$, let $$\chi_n$$ be the random variable which is 1 if $$N$$ equals $$n$$ and 0 otherwise, and let $$Y_n:=X_1+\cdots+X_n$$. Then
 * $$\begin{align} \operatorname{E}(Y^2) & = \sum_{n=0}^\infty \operatorname{E}(\chi_n Y_n^2)\\ &= \sum_{n=0}^\infty \operatorname{P}(N=n) \operatorname{E}(Y_n^2)\\ &= \sum_{n=0}^\infty\operatorname{P}(N=n) (\operatorname{Var}(Y_n)+\operatorname{E}(Y_n)^2)\\ &= \sum_{n=0}^\infty\operatorname{P}(N=n) (n \operatorname{Var}(X_1)+n^2\operatorname{E}(X_1)^2)\\ &= \operatorname{E}(N) \operatorname{Var}(X_1) + \operatorname{E}(N^2) \operatorname{E}(X_1)^2. \end{align}$$

By Wald's equation, under the given hypotheses, $$\operatorname{E}(Y)=\operatorname{E}(N) \operatorname{E}(X_1)$$. Therefore,
 * $$\begin{align} \operatorname{Var}(Y)&=\operatorname{E}(Y^2)-\operatorname{E}(Y)^2\\ &= \operatorname{E}(N) \operatorname{Var}(X_1) + \operatorname{E}(N^2) \operatorname{E}(X_1)^2 - \operatorname{E}(N)^2 \operatorname{E}(X_1)^2 \\ &= \operatorname{E}(N) \operatorname{Var}(X_1) + \operatorname{Var}(N) \operatorname{E}(X_1)^2, \end{align}$$

as desired.

Example
Let $$ N $$ have a Poisson distribution with expectation $$ \lambda $$, and let $$X_1, X_2, \dots$$ follow a Bernoulli distribution with parameter $$ p $$. By the thinning property of the Poisson distribution, $$Y$$ is again Poisson distributed, with expectation $$\lambda p$$, so its variance must also be $$\lambda p$$. We can check this with the Blackwell-Girshick equation: $$N$$ has variance $$\lambda$$, while each $$X_i$$ has mean $$p$$ and variance $$p(1-p)$$, so we must have
 * $$ \operatorname{Var}(Y)= \lambda p^2+\lambda p (1-p) = \lambda p $$.
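This can also be checked empirically. The sketch below (parameter values and sample size are illustrative) simulates the random sum with $$\lambda = 4$$ and $$p = 0.3$$ using only the standard library, drawing $$N$$ by counting exponential interarrival times in a unit interval, and compares the sample variance of $$Y$$ with $$\lambda p = 1.2$$.

```python
import random
from statistics import pvariance

def poisson(lam, rng):
    """Draw N ~ Poisson(lam) by counting exponential arrivals in [0, 1)."""
    n, t = 0, rng.expovariate(lam)
    while t < 1.0:
        n += 1
        t += rng.expovariate(lam)
    return n

rng = random.Random(0)
lam, p = 4.0, 0.3          # illustrative parameters
trials = 200_000

ys = []
for _ in range(trials):
    n = poisson(lam, rng)
    ys.append(sum(1 for _ in range(n) if rng.random() < p))  # Bernoulli summands

sample_var = pvariance(ys)
# Blackwell-Girshick: Var(Y) = lam * p**2 + lam * p * (1 - p) = lam * p
assert abs(sample_var - lam * p) < 0.05
```

With 200,000 trials the Monte Carlo error in the sample variance is on the order of a few thousandths, so the tolerance of 0.05 is comfortable.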

Application and related concepts
The Blackwell-Girshick equation is used in actuarial mathematics to calculate the variance of composite distributions, such as the compound Poisson distribution. Wald's equation provides similar statements about the expectation of composite distributions.
