Talk:Rényi entropy

Why α = 1 is special
Section doesn't feel like it's quite there yet. Maybe consider what happens when you're using a Renyi entropy as a measure of ecological diversity, and then realise that you need to split one species into two... -- Jheald 22:19, 23 January 2006 (UTC).

The first thing I saw was 1/(1-1) = inf Full Decent (talk) 16:07, 11 December 2009 (UTC)


 * That bothered me too until I realized that when α = 1 the sum Σ p_i^α = Σ p_i = 1 and log(1) = 0, while the prefactor 1/(1−α) blows up. Hence one wants to know how 0 times infinity is approached as α approaches 1.
 * For the case of all probabilities equal, this is answered by the fourth sentence of the definition, starting "If the probabilities are ...". It is easily seen that H_α(X) = log(n) independently of α (and independently of the base). So in that case the limit as α → 1 remains log(n).  Vaughan Pratt (talk) 11:12, 18 October 2020 (UTC)
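The two points above can be checked numerically. A minimal sketch (the function name `renyi_entropy` and the example distributions are my own, not from the article): for a uniform distribution H_α = log(n) for every α, and for a non-uniform distribution H_α approaches the Shannon entropy as α → 1.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha in nats; alpha = 1 falls back to Shannon entropy."""
    if alpha == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

# Uniform distribution on n = 4 outcomes: H_alpha = log(n) independently of alpha.
uniform = [0.25] * 4
print(renyi_entropy(uniform, 0.5), renyi_entropy(uniform, 2), math.log(4))

# Non-uniform case: H_alpha -> Shannon entropy as alpha -> 1.
p = [0.7, 0.2, 0.1]
for a in (0.9, 0.99, 0.999):
    print(a, renyi_entropy(p, a))
print("Shannon:", renyi_entropy(p, 1))
```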

To add
Rényi entropy was defined axiomatically in Rényi's Berkeley entropy paper. There, weakening one of the Shannon axioms yields Rényi entropy; that's why α = 1 is special. Also, some of Rényi entropy's applications - statistical physics, general statistics, machine learning, signal processing, cryptography (a measure of randomness, robustness), Shannon theory (generalizing, proving theorems), source coding - should be added with context. I don't have all this handy right now, but I'm sure each piece of this is familiar to at least one person reading this page.... Calbaer 05:59, 5 May 2006 (UTC)
 * Also the continuous case is missing. -- Zz 11:58, 24 October 2006 (UTC)

non-decreasing?
The statement that $$H_\alpha$$ is non-decreasing in $$\alpha$$ seems to contradict the statement that $$H_\infty < H_2$$. Also, should that be a weak inequality? LachlanA 23:24, 21 November 2006 (UTC)


 * Mathworld says they're non-decreasing. I think that's an error; it probably depends on whether you're using positive or negative entropies. I've fixed the inequalities; an obvious example of $$H_\infty = H_2 = 2H_\infty$$ is $$n=1, p_1 = 1$$. ⇌Elektron 18:58, 29 June 2012 (UTC)
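The corrected direction (non-increasing in α) and the inequality $$H_\infty \le H_2 \le 2H_\infty$$ can be spot-checked on random distributions. A quick sketch (helper name and test setup are mine, not from the article); α = 1 is skipped since the closed-form expression is undefined there:

```python
import math
import random

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha in nats; alpha = inf gives the min-entropy."""
    if alpha == float("inf"):
        return -math.log(max(p))
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

random.seed(0)
for _ in range(1000):
    w = [random.random() + 1e-9 for _ in range(5)]
    p = [x / sum(w) for x in w]
    hs = [renyi_entropy(p, a) for a in (0.5, 2, 3, float("inf"))]
    # H_alpha is non-increasing as alpha grows:
    assert all(hs[i] >= hs[i + 1] - 1e-12 for i in range(len(hs) - 1))
    # H_inf <= H_2 <= 2 * H_inf:
    h2, hinf = hs[1], hs[3]
    assert hinf - 1e-12 <= h2 <= 2 * hinf + 1e-12
print("all checks passed")
```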

Does the Heisenberg model need a mention in the intro?
I would suggest removing the sentence "In the Heisenberg XY spin chain model, the Rényi entropy as a function of α can be calculated explicitly by virtue of the fact that it is an automorphic function with respect to a particular subgroup of the modular group.[2][3]" from the beginning of the article. This is quite technical, and in no way an important fact about Rényi entropy, but rather about the Heisenberg model. It could be moved elsewhere in the article if so desired, although I personally don't see this as necessary. — Preceding unsigned comment added by 193.190.84.1 (talk • contribs) 10:32, 26 January 2018 (UTC)


 * Makes sense. Go for it!  Jheald (talk) 10:36, 26 January 2018 (UTC)

Inconsistency with f-divergence article
This article defines the alpha-divergence as the Renyi divergence of order alpha.

However, the F-divergence article defines the alpha-divergence and the Rényi divergence as different but closely related quantities, specifically

$$R_\alpha(P\| Q) = \frac{1}{\alpha - 1}\ln (1+\alpha(\alpha-1)D_\alpha(P\|Q))$$

Which definition is more correct?

Snackematician (talk) 18:36, 11 July 2024 (UTC)
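For what it's worth, the quoted relation holds exactly under one common (Amari-style) convention for the alpha-divergence, $$D_\alpha(P\|Q) = \frac{1 - \sum_i p_i^\alpha q_i^{1-\alpha}}{\alpha(1-\alpha)}$$ - that convention is an assumption on my part, not taken from either article. A quick numeric check:

```python
import math

def renyi_div(p, q, alpha):
    """Rényi divergence R_alpha(P||Q) in nats (alpha != 0, 1)."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

def amari_alpha_div(p, q, alpha):
    """Alpha-divergence under one Amari-style convention (an assumption here)."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return (1 - s) / (alpha * (1 - alpha))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
for a in (0.5, 2.0, 3.0):
    lhs = renyi_div(p, q, a)
    rhs = math.log(1 + a * (a - 1) * amari_alpha_div(p, q, a)) / (a - 1)
    assert abs(lhs - rhs) < 1e-12
print("relation verified for alpha in (0.5, 2.0, 3.0)")
```

Under this convention the two definitions are genuinely different quantities related by the quoted formula, so neither is "more correct"; the articles are just using different names for related objects.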