
Statement of the lemma
Suppose X is a normally distributed random variable with expectation μ and variance σ². Further suppose g is a differentiable function for which the two expectations E( g(X)(X − μ) ) and E( g′(X) ) both exist (the existence of the expectation of any random variable is equivalent to the finiteness of the expectation of its absolute value). Then


 * $$\begin{align*}
E\{g(X)(X-\mu)\} &= \int_{-\infty}^{\infty} g(x)(x-\mu)\, f(x)\, dx \\
&= \sigma^2 E\{g'(X)\},
\end{align*}$$

where f is the probability density function of X.
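As a quick sanity check (not part of the original statement), the identity E{g(X)(X − μ)} = σ²E{g′(X)} can be verified by Monte Carlo simulation; the choice g = sin and the parameters μ = 0.5, σ = 2 below are arbitrary illustrative assumptions:

```python
import numpy as np

# Monte Carlo check of the univariate Stein identity
# E[g(X)(X - mu)] = sigma^2 * E[g'(X)] for X ~ N(mu, sigma^2).
rng = np.random.default_rng(0)
mu, sigma = 0.5, 2.0          # illustrative parameters
x = rng.normal(mu, sigma, size=2_000_000)

g = np.sin                    # g(x) = sin(x)
g_prime = np.cos              # g'(x) = cos(x)

lhs = np.mean(g(x) * (x - mu))        # estimates E[g(X)(X - mu)]
rhs = sigma**2 * np.mean(g_prime(x))  # estimates sigma^2 * E[g'(X)]
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

With two million samples the two estimates typically agree to two or three decimal places.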

In general, suppose X and Y are jointly normally distributed. Then


 * $$\operatorname{Cov}(g(X),Y)=E(g'(X)) \operatorname{Cov}(X,Y).$$
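The covariance form can be checked numerically in the same spirit (a sketch, not from the original text; g = tanh and the covariance matrix below are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check of Cov(g(X), Y) = E[g'(X)] * Cov(X, Y)
# for jointly normal (X, Y).
rng = np.random.default_rng(1)
mu = np.array([0.5, -1.0])            # illustrative means
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])          # illustrative covariance matrix
xy = rng.multivariate_normal(mu, cov, size=1_000_000)
x, y = xy[:, 0], xy[:, 1]

g = np.tanh                           # g(x) = tanh(x)
g_prime = lambda t: 1.0 / np.cosh(t)**2  # g'(x) = sech^2(x)

lhs = np.cov(g(x), y)[0, 1]           # sample Cov(g(X), Y)
rhs = np.mean(g_prime(x)) * cov[0, 1] # E[g'(X)] * Cov(X, Y)
print(lhs, rhs)  # agree up to Monte Carlo error
```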

In order to prove the univariate version of this lemma, recall that the probability density function for the normal distribution with expectation 0 and variance 1 is


 * $$\varphi(x)={1 \over \sqrt{2\pi}}e^{-x^2/2}$$

and that for a normal distribution with expectation μ and variance σ² is


 * $${1\over\sigma}\varphi\left({x-\mu \over \sigma}\right).$$

Then use integration by parts.
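The integration-by-parts step can be spelled out for the standard normal case as follows (a sketch; it uses the identity φ′(x) = −xφ(x), which follows directly from the formula for φ above):

```latex
% Proof sketch for mu = 0, sigma = 1, using \varphi'(x) = -x\varphi(x):
\begin{align*}
E\{g(X)X\} &= \int_{-\infty}^{\infty} g(x)\, x\, \varphi(x)\, dx \\
           &= -\int_{-\infty}^{\infty} g(x)\, \varphi'(x)\, dx \\
           &= \bigl[-g(x)\varphi(x)\bigr]_{-\infty}^{\infty}
              + \int_{-\infty}^{\infty} g'(x)\, \varphi(x)\, dx \\
           &= E\{g'(X)\},
\end{align*}
% where the boundary term vanishes because g(x)\varphi(x) \to 0 as
% x \to \pm\infty whenever E|g'(X)| is finite.
```

The general case with expectation μ and variance σ² then follows by applying this identity to the standardized variable (X − μ)/σ.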