
Multidimensional Chebyshev's inequality
Let X be an N-dimensional random variable with expected value $$\mu=\mathbb{E} \left[ X \right] $$ and covariance matrix $$V=\mathbb{E} \left[ \left(X - \mu \right) \left( X - \mu \right)^T \right] $$.

If $$V$$ is invertible (equivalently, positive definite), then for any real number $$t>0$$:

$$ \mathrm{Pr}\left( \sqrt{\left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) } > t \right) \le \frac{N}{t^2} $$
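The bound can be checked empirically. The sketch below samples a multivariate normal $$X$$ (the inequality holds for any distribution with finite covariance; the normal is just a convenient test case) and compares the empirical tail probability of the Mahalanobis distance against $$N/t^2$$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 3
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)          # symmetric positive definite covariance
Vinv = np.linalg.inv(V)

X = rng.multivariate_normal(mu, V, size=200_000)
d = X - mu
# sqrt of the quadratic form (X-mu)^T V^{-1} (X-mu), one value per sample
mahal = np.sqrt(np.einsum('ij,jk,ik->i', d, Vinv, d))

for t in [1.5, 2.0, 3.0]:
    empirical = np.mean(mahal > t)
    bound = N / t**2
    print(f"t={t}: empirical Pr = {empirical:.4f} <= bound = {bound:.4f}")
```

Note the bound is loose: for small $$t$$ it can exceed 1 and is vacuous, which is typical of Chebyshev-type inequalities.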

Proof
$$V$$ is positive definite, so $$V^{-1}$$ is positive definite as well. Define the random variable

$$ y = \left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) $$

Since $$y$$ is nonnegative, Markov's inequality applies:

$$ \mathrm{Pr}\left( \sqrt{\left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) } > t\right) = \mathrm{Pr}\left( \sqrt{y} > t\right) =\mathrm{Pr}\left( y > t^2 \right) \le \frac{\mathbb{E}[y]}{t^2} $$

Since $$V$$ is symmetric, by the spectral theorem there exist an orthogonal matrix $$R$$ and a diagonal matrix $$D$$ such that

$$ V = R^T \, D \, R $$

Since $$V$$ is positive definite, $$D_{i,i}>0$$ for every $$i$$. Moreover,

$$ V^{-1} = R^{-1} \, D^{-1} \, (R^T)^{-1} = R^T \, D^{-1} \, R $$

where, clearly, $$\left[ D^{-1}\right]_{i,i} = \frac{1}{D_{i,i}}$$.
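These identities can be verified numerically. A sketch using `numpy.linalg.eigh`, which returns $$V = Q \, D \, Q^T$$ with orthonormal columns in $$Q$$, so $$R = Q^T$$ in the notation above:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
V = A @ A.T + 3 * np.eye(3)          # symmetric positive definite

w, Q = np.linalg.eigh(V)             # V = Q @ diag(w) @ Q.T
R = Q.T                              # orthogonal: R^{-1} = R^T
D = np.diag(w)

assert np.all(w > 0)                                           # D_{i,i} > 0
assert np.allclose(V, R.T @ D @ R)                             # V = R^T D R
assert np.allclose(np.linalg.inv(V), R.T @ np.diag(1 / w) @ R) # V^{-1} = R^T D^{-1} R
print("decomposition verified")
```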

Define $$Z = R \, \left( X-\mu\right)$$.

The following identities hold:

$$ \mathbb{E}\left[ Z \, Z^T \right] = R \, \mathbb{E}\left[ \left( X-\mu\right) \, \left( X-\mu\right)^T \right] \, R^T = R \, V \, R^T = R \, R^T \, D \, R \, R^T = D \quad \Rightarrow \quad \forall i \quad \mathbb{E}\left[ Z_i^2 \right] = D_{i,i} $$

and

$$ y = Z^T \, R \, V^{-1} \, R^T \, Z = Z^T \, D^{-1} \, Z = \sum\limits_{i=1}^N \frac{Z_i^2}{D_{i,i}} $$

Then

$$ \mathbb{E} [y] = \sum\limits_{i=1}^N \frac{\mathbb{E}\left[ Z_i^2 \right] }{D_{i,i}} = \sum\limits_{i=1}^N \frac{D_{i,i}}{D_{i,i}} = N $$

Substituting $$\mathbb{E}[y]=N$$ into Markov's inequality above gives the claimed bound.
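The key fact $$\mathbb{E}[y]=N$$ holds for any distribution with finite covariance; a quick Monte Carlo sketch (again using a multivariate normal only as a convenient test case):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 4
mu = np.zeros(N)
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)          # positive definite covariance

X = rng.multivariate_normal(mu, V, size=500_000)
d = X - mu
# y = (X-mu)^T V^{-1} (X-mu), one value per sample
y = np.einsum('ij,jk,ik->i', d, np.linalg.inv(V), d)

print(y.mean())   # sample mean, close to N = 4
```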