Wikipedia:Reference desk/Archives/Mathematics/2013 November 3

= November 3 =

Entropy of uniform distribution (Melbourne, Australia)

Looking at the entries for the uniform distribution, the entropy is given as ln(n) in the discrete case and ln(b-a) in the continuous case. For a distribution normalised to the interval [0,1], the discrete case agrees with Shannon's definition of entropy, but the continuous case produces an entropy of ln(1) = 0. Is this correct? Rjeges (talk) 10:06, 3 November 2013 (UTC)
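
A quick numeric check of the two formulas (a minimal Python sketch; the function names are illustrative, not from any library):

```python
import math

# Discrete uniform over n outcomes: p_i = 1/n, so H = -sum p_i ln(p_i) = ln(n).
def discrete_uniform_entropy(n):
    p = 1.0 / n
    return -n * p * math.log(p)            # = ln(n)

# Continuous uniform on [a, b]: f(x) = 1/(b-a) on the support, so
# h = -(b-a) * f * ln(f) = ln(b-a).  Note f > 1 (hence h < 0) when b-a < 1.
def differential_entropy_uniform(a, b):
    f = 1.0 / (b - a)
    return -(b - a) * f * math.log(f)      # = ln(b-a)

print(discrete_uniform_entropy(2))           # ln(2) ≈ 0.693
print(differential_entropy_uniform(0, 1))    # 0.0, as observed in the question
print(differential_entropy_uniform(0, 0.5))  # -ln(2) ≈ -0.693, negative!
```

So an entropy of 0 on [0,1] is indeed what the formula gives; whether it deserves the name "entropy" is addressed in the replies below.
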
 * I think it would be more accurate to say that the entropy in the continuous case is undefined. Looie496 (talk) 14:30, 3 November 2013 (UTC)

 * From Differential entropy:
 * Let X be a random variable with a probability density function f whose support is a set $$\mathbb{X}$$. The differential entropy h(X) or h(f) is defined as
 * $$h(X) = -\int_\mathbb{X} f(x)\log f(x)\,dx$$.
 * and
 * One must take care in trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, Uniform(0,1/2) has negative differential entropy
 * $$\int_0^\frac{1}{2} -2\log(2)\,dx = -\log(2)$$.
 * Thus, differential entropy does not share all properties of discrete entropy.


 * Duoduoduo (talk) 15:58, 3 November 2013 (UTC)
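
To connect the two replies: the Shannon entropy of a quantized continuous variable diverges as the bin width shrinks, which is the sense in which the continuous-case Shannon entropy is undefined, while the differential entropy stays finite. A minimal Python sketch for Uniform(0, 1/2) (the function name is illustrative); it uses the relation H = h + ln(1/delta), which is exact here because the density is constant across bins:

```python
import math

# Quantize Uniform(0, 1/2) into n = (b-a)/delta equally likely bins.
# The discrete Shannon entropy is H = ln(n) = h + ln(1/delta), which grows
# without bound as delta -> 0, while the differential entropy h = -ln(2).
def quantized_entropy(a=0.0, b=0.5, delta=0.01):
    n = round((b - a) / delta)
    p = 1.0 / n                        # each bin has equal probability
    return -n * p * math.log(p)        # = ln(n)

for delta in (0.1, 0.01, 0.001):
    H = quantized_entropy(delta=delta)
    print(delta, H, H + math.log(delta))   # last column -> -ln(2) ≈ -0.693
```
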