Stein's lemma

Stein's lemma, named in honor of Charles Stein, is a theorem of probability theory that is of interest primarily because of its applications to statistical inference (in particular, to James–Stein estimation and empirical Bayes methods) and to portfolio choice theory. The theorem gives a formula for the covariance of one random variable with the value of a function of another, when the two random variables are jointly normally distributed.

Note that the name "Stein's lemma" is also commonly used to refer to a different result in the area of statistical hypothesis testing, which connects the error exponents in hypothesis testing with the Kullback–Leibler divergence. This result is also known as the Chernoff–Stein lemma and is not related to the lemma discussed in this article.

Statement of the lemma
Suppose X is a normally distributed random variable with expectation μ and variance σ². Further suppose g is a differentiable function for which the two expectations E(g(X)(X − μ)) and E(g′(X)) both exist. (The existence of the expectation of any random variable is equivalent to the finiteness of the expectation of its absolute value.) Then


 * $$E\bigl(g(X)(X-\mu)\bigr)=\sigma^2 E\bigl(g'(X)\bigr).$$
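
The identity can be checked numerically by Monte Carlo. The following sketch uses the arbitrary choice g(x) = x², together with illustrative values μ = 1, σ = 2 (none of these particular numbers come from the statement above):

```python
# Monte Carlo check of E[g(X)(X - mu)] = sigma^2 * E[g'(X)]
# for X ~ N(mu, sigma^2), with the illustrative choice g(x) = x^2.
import random

random.seed(0)
mu, sigma = 1.0, 2.0
n = 200_000

g = lambda x: x * x          # g(x) = x^2
g_prime = lambda x: 2 * x    # g'(x) = 2x

samples = [random.gauss(mu, sigma) for _ in range(n)]
lhs = sum(g(x) * (x - mu) for x in samples) / n          # E[g(X)(X - mu)]
rhs = sigma**2 * sum(g_prime(x) for x in samples) / n    # sigma^2 * E[g'(X)]
# For this g, both sides equal 2 * mu * sigma^2 = 8 exactly,
# so the two Monte Carlo estimates should be close to 8.
```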

In general, suppose X and Y are jointly normally distributed. Then


 * $$\operatorname{Cov}(g(X),Y)= \operatorname{Cov}(X,Y)E(g'(X)).$$
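
The bivariate identity follows from the univariate one by decomposing $$Y$$ into its linear regression on $$X$$ plus a residual $$W$$ independent of $$X$$, i.e. $$Y = E(Y) + \frac{\operatorname{Cov}(X,Y)}{\sigma^2}(X-\mu) + W$$ with $$\sigma^2 = \operatorname{Var}(X)$$, so that

 * $$\operatorname{Cov}(g(X),Y)=\frac{\operatorname{Cov}(X,Y)}{\sigma^2}\,E\bigl(g(X)(X-\mu)\bigr)=\operatorname{Cov}(X,Y)\,E\bigl(g'(X)\bigr).$$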

For a general multivariate Gaussian random vector $$(X_1, \ldots, X_n) \sim N(\mu, \Sigma)$$ it follows that
 * $$E\bigl(g(X)(X-\mu)\bigr)=\Sigma\cdot E\bigl(\nabla g(X)\bigr).$$
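
A numerical sketch of the multivariate identity, with a hand-picked mean, covariance, and test function (these particular numbers are illustrative, not part of the statement):

```python
# Monte Carlo check of E[g(X)(X - mu)] = Sigma . E[grad g(X)]
# for X ~ N(mu, Sigma) in two dimensions.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.5, -1.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
X = rng.multivariate_normal(mu, Sigma, size=400_000)

# Illustrative choice g(x) = x1 * x2, so grad g(x) = (x2, x1).
g = X[:, 0] * X[:, 1]
grad_g = X[:, ::-1]                          # swaps the two columns

lhs = ((X - mu) * g[:, None]).mean(axis=0)   # E[g(X)(X - mu)]
rhs = Sigma @ grad_g.mean(axis=0)            # Sigma . E[grad g(X)]
# Here E[grad g(X)] = (E[X2], E[X1]) = (-1, 0.5),
# so both sides should be close to Sigma @ (-1, 0.5) = (-0.85, 0.7).
```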

Proof
The probability density function of the univariate normal distribution with expectation 0 and variance 1 is


 * $$\varphi(x)={1 \over \sqrt{2\pi}}e^{-x^2/2}$$

Since $$\int x \exp(-x^2/2)\,dx = -\exp(-x^2/2),$$ integration by parts gives (the boundary term vanishes, since $$g(x)\exp(-x^2/2) \rightarrow 0$$ as $$x \rightarrow \pm\infty$$ under the stated integrability assumptions):


 * $$E[g(X)X] = \frac{1}{\sqrt{2\pi}}\int g(x) x \exp(-x^2/2)\,dx = \frac{1}{\sqrt{2\pi}}\int g'(x) \exp(-x^2/2)\,dx = E[g'(X)].$$

The case of general variance $$\sigma^2$$ follows by substitution.
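
Explicitly, the substitution is $$X = \mu + \sigma Z$$ with $$Z$$ standard normal; applying the unit-variance case to the function $$z \mapsto g(\mu + \sigma z)$$, whose derivative is $$\sigma g'(\mu + \sigma z)$$, gives

 * $$E\bigl(g(X)(X-\mu)\bigr)=\sigma\,E\bigl(g(\mu+\sigma Z)Z\bigr)=\sigma\,E\bigl(\sigma g'(\mu+\sigma Z)\bigr)=\sigma^2 E\bigl(g'(X)\bigr).$$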

More general statement
Isserlis' theorem is equivalently stated as

 * $$\operatorname{E}(X_1 f(X_1,\ldots,X_n))=\sum_{i=1}^{n} \operatorname{Cov}(X_1,X_i)\operatorname{E}(\partial_{X_i}f(X_1,\ldots,X_n)),$$

where $$(X_1,\dots,X_n)$$ is a zero-mean multivariate normal random vector.
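
This form can also be checked by Monte Carlo. The sketch below uses an illustrative covariance matrix and the arbitrary test function f(x₁, x₂) = x₁²x₂ (neither is part of the statement):

```python
# Monte Carlo check of E[X1 f(X)] = sum_i Cov(X1, Xi) E[d f / d Xi]
# for a zero-mean bivariate normal vector.
import numpy as np

rng = np.random.default_rng(2)
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])
X = rng.multivariate_normal([0.0, 0.0], Sigma, size=500_000)
x1, x2 = X[:, 0], X[:, 1]

f = x1**2 * x2                 # f(x1, x2) = x1^2 * x2
d1 = 2 * x1 * x2               # partial f / partial x1
d2 = x1**2                     # partial f / partial x2

lhs = (x1 * f).mean()                                    # E[X1 f(X)]
rhs = Sigma[0, 0] * d1.mean() + Sigma[0, 1] * d2.mean()
# For this f the exact value is 3 * Sigma[0,0] * Sigma[0,1] = 1.8.
```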

Suppose the distribution of X belongs to an exponential family, that is, X has the density


 * $$f_\eta(x)=\exp(\eta'T(x) - \Psi(\eta))h(x).$$

Suppose this density has support $$(a,b)$$, where $$a$$ and $$b$$ may be $$-\infty$$ and $$\infty$$. Assume that $$\exp(\eta'T(x))h(x)g(x) \rightarrow 0$$ as $$x \rightarrow a$$ or $$x \rightarrow b$$, where $$g$$ is any differentiable function with $$E|g'(X)|<\infty$$; alternatively, if $$a$$ and $$b$$ are finite, it suffices that $$\exp(\eta'T(x))h(x) \rightarrow 0$$. Then


 * $$E\left[\left(\frac{h'(X)}{h(X)} + \sum \eta_i T_i'(X)\right)\cdot g(X)\right] = -E[g'(X)]. $$

The derivation is the same as in the special case, namely, integration by parts.
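
As an illustrative instance (not from the text above), take the exponential distribution with rate λ: here $$\eta = -\lambda$$, $$T(x) = x$$, $$h(x) = 1$$, and support $$(0,\infty)$$, so the identity reduces to $$\lambda E[g(X)] = E[g'(X)]$$ for suitable $$g$$. A Monte Carlo sketch with the arbitrary choice g(x) = x²:

```python
# Check of lam * E[g(X)] = E[g'(X)] for X ~ Exponential(lam),
# the exponential-family identity specialized to eta = -lam, T(x) = x, h = 1.
import random

random.seed(3)
lam = 1.5
n = 300_000
xs = [random.expovariate(lam) for _ in range(n)]

g = lambda x: x * x          # g(x) = x^2; note g(0) = 0, so the
g_prime = lambda x: 2 * x    # boundary term at 0 vanishes as required

lhs = lam * sum(g(x) for x in xs) / n     # lam * E[g(X)]
rhs = sum(g_prime(x) for x in xs) / n     # E[g'(X)]
# Both sides equal 2 / lam = 4/3 exactly for this g.
```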

If we only know that $$X$$ has support $$\mathbb{R}$$, then it can happen that $$E|g(X)| < \infty$$ and $$E|g'(X)| < \infty$$ and yet $$\lim_{x\rightarrow \infty} f_\eta(x) g(x) \not= 0$$. To see this, take $$g(x)=1$$ and let $$f_\eta(x)$$ have infinitely many spikes toward infinity while remaining integrable. One such example can be adapted from $$ f(x) = \begin{cases} 1 & x \in [n, n + 2^{-n}) \\ 0 & \text{otherwise} \end{cases} $$ by smoothing $$f$$.

Extensions to elliptically-contoured distributions also exist.