Wikipedia:Reference desk/Archives/Mathematics/2010 March 31

= March 31 =

Variance
Suppose we have identically distributed random variables $X_i$. Then $$\mathbb{V}\left[\sum_{i=1}^n X_i\right]=n\mathbb{V}[X].$$

Is the above true only if the $X_i$ are all independent? If they are not independent, can the variance of the sum differ from $n\mathbb{V}[X]$? —Preceding unsigned comment added by 70.68.120.162 (talk) 03:23, 31 March 2010 (UTC)


 * In general, Var(A+B) = Var(A) + Var(B) + 2 Cov(A, B). So for two identically distributed variables with non-zero covariance, the formula breaks down. However, it is possible to have random variables that are not independent but have zero covariance, so independence is sufficient for your formula to hold, but not necessary. —Preceding unsigned comment added by ConMan 05:58, 31 March 2010 (UTC)
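The "sufficient but not necessary" point above can be made concrete with a small exact example (my own construction, not from the thread): two identically distributed, *dependent* variables whose covariance is nevertheless zero, so the additive formula still holds.

```python
# Hypothetical example (not from the thread): (X, Y) is uniform over four
# outcomes. X and Y have the same distribution and are dependent (X = 1
# forces Y = 0), yet X*Y is always 0, so Cov(X, Y) = 0 and
# Var(X + Y) = Var(X) + Var(Y) exactly.
from fractions import Fraction

outcomes = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # each with probability 1/4
p = Fraction(1, 4)

def E(f):
    """Exact expectation of f(X, Y) over the joint distribution."""
    return sum(p * f(x, y) for x, y in outcomes)

ex  = E(lambda x, y: x)                              # E[X] = 0
ey  = E(lambda x, y: y)                              # E[Y] = 0
vx  = E(lambda x, y: x * x) - ex ** 2                # Var(X) = 1/2
vy  = E(lambda x, y: y * y) - ey ** 2                # Var(Y) = 1/2
cxy = E(lambda x, y: x * y) - ex * ey                # Cov(X, Y) = 0
vs  = E(lambda x, y: (x + y) ** 2) - (ex + ey) ** 2  # Var(X+Y) = 1

print(cxy == 0 and vs == vx + vy)
```

Exact rational arithmetic via `fractions.Fraction` avoids any floating-point caveats, so the equalities hold literally, not just approximately.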
 * The simplest counterexample is when all variables are equal. Then $$\mathbb{V}\left[\sum X_i\right]=\mathbb{V}[nX]=n^2\mathbb{V}[X]$$. -- Meni Rosenfeld (talk) 08:11, 31 March 2010 (UTC)
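A quick numerical sketch of that counterexample (the simulation setup is mine, not from the thread): with n copies of the *same* variable, the sum is $nX$, so the variance scales as $n^2$, not $n$.

```python
# Simulate Meni Rosenfeld's counterexample: all n "variables" are the
# same X, so sum(X_i) = n*X and Var(sum) = n^2 Var(X), not n Var(X).
import random

def var(xs):
    """Population (divide-by-n) variance of a sample."""
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

random.seed(0)
n = 5
x = [random.gauss(0, 1) for _ in range(10_000)]
s = [n * v for v in x]   # the sum of n identical copies of X

print(var(s) / var(x))   # n**2 = 25 up to rounding, not n = 5
```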

...and more generally if you have more than two random variables, you have something like this:
 * var(X + Y + Z) = var(X) + var(Y) + var(Z) + 2cov(X,Y) + 2cov(X,Z) + 2cov(Y,Z),

and so on. So if the sum of the covariances is 0, then the variance of the sum equals the sum of the variances, even if the individual covariances are not 0. Michael Hardy (talk) 18:59, 31 March 2010 (UTC)
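The three-variable expansion above can be checked numerically (a sketch with simulated data of my choosing, not from the thread). Using population (divide-by-n) moments, the identity is an exact algebraic fact, so it holds to rounding error on any sample:

```python
# Check: var(X+Y+Z) = var(X)+var(Y)+var(Z)
#                     + 2[cov(X,Y) + cov(X,Z) + cov(Y,Z)]
import random
from itertools import combinations

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)

random.seed(0)
x = [random.gauss(0, 1) for _ in range(10_000)]
y = [v + random.gauss(0, 1) for v in x]   # correlated with x
z = [random.gauss(0, 1) - w for w in y]   # correlated with x and y

cols = [x, y, z]
lhs = var([a + b + c for a, b, c in zip(x, y, z)])
rhs = sum(var(c) for c in cols) \
    + 2 * sum(cov(a, b) for a, b in combinations(cols, 2))
print(abs(lhs - rhs) < 1e-9)
```

Note the covariances here are deliberately non-zero and of mixed sign; the identity holds regardless, which is exactly Michael Hardy's point that only their *sum* matters for additivity.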