One-symbol Information

In information theory, the one-symbol information of two random variables $$X$$ and $$Y$$ is a quantity that measures the contribution of a single symbol $$x \in \mathcal{X}$$ to the mutual information $$I(X;Y)$$, where $$\mathcal{X}$$ denotes the alphabet of $$X$$.
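One common way to make this precise (an assumption here, since the draft above does not fix a definition) is the "specific information" decomposition: the one-symbol information of a symbol $$x$$ is the conditional divergence term whose average over $$x$$ recovers the mutual information:

```latex
I(x;Y) = \sum_{y \in \mathcal{Y}} p(y \mid x) \,\log_2 \frac{p(y \mid x)}{p(y)},
\qquad
I(X;Y) = \sum_{x \in \mathcal{X}} p(x)\, I(x;Y).
```

Under this convention, each $$I(x;Y)$$ is a Kullback-Leibler divergence between $$p(y \mid x)$$ and $$p(y)$$, and is therefore non-negative.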

As with mutual information, the most common unit of measurement of one-symbol information is the bit, obtained when logarithms to base 2 are used.
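A minimal numerical sketch of this relationship, assuming the specific-information convention $$I(x;Y) = \sum_y p(y \mid x) \log_2 \bigl(p(y \mid x)/p(y)\bigr)$$ (the joint distribution and all names below are illustrative, not from this draft):

```python
import math

# Toy joint distribution p(x, y) over a 2x2 alphabet (illustrative values).
p_xy = {("a", 0): 0.4, ("a", 1): 0.1,
        ("b", 0): 0.1, ("b", 1): 0.4}

# Marginals p(x) and p(y).
p_x: dict = {}
p_y: dict = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

def one_symbol_information(x):
    """I(x;Y) in bits: sum over y of p(y|x) * log2(p(y|x) / p(y))."""
    total = 0.0
    for (xx, y), p in p_xy.items():
        if xx != x or p == 0.0:
            continue
        p_y_given_x = p / p_x[x]
        total += p_y_given_x * math.log2(p_y_given_x / p_y[y])
    return total

# Averaging over symbols with weights p(x) recovers the mutual information.
mi = sum(p_x[x] * one_symbol_information(x) for x in p_x)
```

Because each symbol's contribution is weighted by $$p(x)$$, rare symbols can carry large one-symbol information while contributing little to $$I(X;Y)$$ overall.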