Talk:F-divergence

Instances of f-divergences
In the table, what is t? I guess t = P(x) - Q(x), but I couldn't tell for sure. Metaxal (talk) 12:14, 4 July 2013 (UTC)

Inconsistent with KL-divergence article
Article on KL-divergence defines it as:


 * $$D_{\mathrm{KL}}(P\|Q) = \sum_i \ln\left(\frac{P(i)}{Q(i)}\right) P(i)$$

while based on this article one would derive the KL to be:


 * $$D_{\mathrm{KL}}(P\|Q) = D_{f=-\ln}(P\|Q) = \sum_i -\ln\left(\frac{P(i)}{Q(i)}\right) Q(i) = \sum_i \ln\left(\frac{Q(i)}{P(i)}\right) Q(i)$$

These cannot both be right, because the divergence is not symmetric. Can somebody with the necessary knowledge decide which is the right formulation? Tomash (talk) 10:34, 16 August 2014 (UTC)

Instead of taking f(t) = -ln(t), take f(t) = t ln t:

$$ KL(P\|Q) = \sum_i P_i \ln(P_i/Q_i) = \sum_i Q_i (P_i/Q_i) \ln(P_i/Q_i) = \sum_{i} f(P_i/Q_i) Q_i = D_f(P\|Q) $$
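This identity is easy to check numerically. A minimal sketch (the distributions below are made up for illustration, not taken from the article), using the convention $$D_f(P\|Q) = \sum_i f(P_i/Q_i)\, Q_i$$:

```python
import math

# Hypothetical example distributions (illustration only).
P = [0.2, 0.5, 0.3]
Q = [0.4, 0.4, 0.2]

# KL(P||Q) as defined in the KL-divergence article.
kl = sum(p * math.log(p / q) for p, q in zip(P, Q))

# D_f(P||Q) = sum_i f(P_i/Q_i) Q_i with the generator f(t) = t ln t.
def f(t):
    return t * math.log(t)

df = sum(f(p / q) * q for p, q in zip(P, Q))

print(kl, df)  # the two sums agree, term by term
```

The agreement is term-by-term: $$f(P_i/Q_i)\,Q_i = (P_i/Q_i)\ln(P_i/Q_i)\,Q_i = P_i \ln(P_i/Q_i)$$.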

--David Pal (talk) —Preceding undated comment added 12:23, 16 May 2016 (UTC)

Add visual examples
This would be much easier to understand if there were a few examples of distributions, with the distances given between pairs of distributions via various measures. ★NealMcB★ (talk) 16:33, 1 March 2021 (UTC)

Jensen-Shannon imposter
Whatever this is, it does NOT generate the Jensen-Shannon divergence! It certainly generates a symmetric f-divergence, but I don't know what that divergence is.

$$ f(t)=  \frac{1}{2} [(t+1)\log \big(\frac{2}{t+1}\big)+t\log t ] $$

I explicitly calculated the $$D_f$$ between the two coin-toss distributions with heads probability 1/3 and 2/3. I got

$$D_f(1/3 \| 2/3) = \frac 2 3 \log\frac{32}{27}$$

which is not equal to

$$D_{JS}(1/3 \| 2/3) = \frac 1 6 \log 4$$

And there is no way to fix this: there are no constants $$a, c$$ such that $$ a(f(t) + c(t-1)) = f_{JS}(t)$$.

Whoever knows what it actually generates, please add it back with the proper name attached.

pony in a strange land (talk) 01:18, 24 May 2022 (UTC)
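For anyone who wants to recheck this claim, here is a sketch of a direct numerical evaluation, under the convention $$D_f(P\|Q) = \sum_i f(P_i/Q_i)\, Q_i$$ used elsewhere on this page, for the two coin-toss distributions above:

```python
import math

log = math.log
P = [1/3, 2/3]  # coin with heads probability 1/3
Q = [2/3, 1/3]  # coin with heads probability 2/3

# The disputed generator from the article's table.
def f(t):
    return 0.5 * ((t + 1) * log(2 / (t + 1)) + t * log(t))

# D_f(P||Q) = sum_i f(P_i/Q_i) Q_i
df = sum(f(p / q) * q for p, q in zip(P, Q))

# Jensen-Shannon divergence from its definition:
# JS(P,Q) = (1/2) KL(P||M) + (1/2) KL(Q||M), with M = (P+Q)/2.
M = [(p + q) / 2 for p, q in zip(P, Q)]

def kl(A, B):
    return sum(a * log(a / b) for a, b in zip(A, B))

js = 0.5 * kl(P, M) + 0.5 * kl(Q, M)

print(df, js)  # compare these against the closed-form values claimed above
```

Printing both values lets the hand calculations above be checked against the definitions directly.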