User:Memming/List of information theoretic relations

This page lists relations and non-relations among information-theoretic quantities such as entropy $$H$$, mutual information $$I$$, and Kullback-Leibler divergence $$D_{\mathrm{KL}}$$.

Equivalences

 * $$I(X;Y) = H(X) + H(Y) - H(X,Y) = D_{\mathrm{KL}}\big(p(x,y) \,\|\, p(x)p(y)\big)$$
 * $$H(X,Y) = H(Y) + H(X|Y)$$ (chain rule for entropy)
 * $$I(X_1, X_2 ; Y) = I(X_1 ; Y) + I(X_2;Y|X_1)$$ (chain rule for mutual information)
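The first two equivalences can be checked numerically. The sketch below uses a small, hypothetical joint pmf (the specific probabilities are illustrative, not from this page) and computes $$I(X;Y)$$ and $$H(X|Y)$$ directly from their definitions, then verifies the identities.

```python
import math

# A hypothetical joint pmf p(x, y) over binary X and Y (illustrative values).
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Marginals p(x) and p(y)
px, py = {}, {}
for (x, y), q in p.items():
    px[x] = px.get(x, 0) + q
    py[y] = py.get(y, 0) + q

H_XY = H(p.values())
H_X, H_Y = H(px.values()), H(py.values())

# H(X|Y) computed directly as sum_y p(y) * H(X | Y=y)
H_X_given_Y = sum(
    py[y] * H([p[(x, y)] / py[y] for x in px if (x, y) in p])
    for y in py
)

# I(X;Y) computed directly from the KL-divergence form
I_XY = sum(
    q * math.log2(q / (px[x] * py[y])) for (x, y), q in p.items() if q > 0
)

# The two equivalences above:
assert abs(I_XY - (H_X + H_Y - H_XY)) < 1e-12
assert abs((H_Y + H_X_given_Y) - H_XY) < 1e-12
```

The assertions pass for any valid joint pmf, since both identities follow from expanding the logarithms term by term.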

Inequalities

 * $$H(X) \ge H(X|Y)$$, or equivalently $$I(X;Y) \ge 0$$ (conditioning never increases entropy; equality holds iff $$X$$ and $$Y$$ are independent)
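A quick numerical sanity check of this inequality (not a proof): draw random 3×3 joint distributions and confirm $$H(X) \ge H(X|Y)$$ on each. The distribution size and sample count are arbitrary choices.

```python
import math
import random

random.seed(0)

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Spot-check H(X) >= H(X|Y) on random 3x3 joint distributions.
for _ in range(1000):
    w = [[random.random() for _ in range(3)] for _ in range(3)]
    total = sum(sum(row) for row in w)
    p = [[v / total for v in row] for row in w]  # joint p(x, y), rows indexed by x

    px = [sum(row) for row in p]
    py = [sum(p[x][y] for x in range(3)) for y in range(3)]

    H_X = entropy(px)
    # H(X|Y) = sum_y p(y) * H(X | Y=y)
    H_X_given_Y = sum(
        py[y] * entropy([p[x][y] / py[y] for x in range(3)])
        for y in range(3) if py[y] > 0
    )
    assert H_X >= H_X_given_Y - 1e-12  # tolerance for float rounding
```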

Non-relations

 * $$I(X_1,X_2;Y) \lesseqqgtr I(X_1;Y) + I(X_2;Y)$$ — the joint information can fall below the sum (redundancy), equal it, or exceed it (synergy)
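Both directions of this non-relation have standard witnesses: XOR of two independent fair bits (pure synergy: each input alone is uninformative about $$Y$$, yet jointly they determine it) and a copied bit $$X_1 = X_2 = Y$$ (pure redundancy). The sketch below verifies both cases numerically.

```python
import math
from itertools import product

def mi(pab):
    """I(A;B) in bits from a dict {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), q in pab.items():
        pa[a] = pa.get(a, 0) + q
        pb[b] = pb.get(b, 0) + q
    return sum(q * math.log2(q / (pa[a] * pb[b]))
               for (a, b), q in pab.items() if q > 0)

def pair(joint, f):
    """Marginalize a joint {(x1, x2, y): p} onto the pair selected by f."""
    out = {}
    for key, q in joint.items():
        k = f(*key)
        out[k] = out.get(k, 0) + q
    return out

# Synergy: independent fair bits X1, X2 with Y = X1 XOR X2.
xor = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
i_joint_xor = mi(pair(xor, lambda x1, x2, y: ((x1, x2), y)))  # 1 bit
i1_xor = mi(pair(xor, lambda x1, x2, y: (x1, y)))             # 0 bits
i2_xor = mi(pair(xor, lambda x1, x2, y: (x2, y)))             # 0 bits
assert i_joint_xor > i1_xor + i2_xor  # joint exceeds the sum

# Redundancy: a single fair bit copied twice, X1 = X2 = Y.
copy = {(b, b, b): 0.5 for b in (0, 1)}
i_joint_cp = mi(pair(copy, lambda x1, x2, y: ((x1, x2), y)))  # 1 bit
i1_cp = mi(pair(copy, lambda x1, x2, y: (x1, y)))             # 1 bit
i2_cp = mi(pair(copy, lambda x1, x2, y: (x2, y)))             # 1 bit
assert i_joint_cp < i1_cp + i2_cp  # sum double-counts the shared bit
```

Because both strict directions occur, no inequality between $$I(X_1,X_2;Y)$$ and $$I(X_1;Y)+I(X_2;Y)$$ can hold in general.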