Talk:Conditional entropy

Untitled
Why does equivocation (information theory) redirect here? If the two terms are equivalent, can someone put equivocation in the lead? Stevage 11:27, 17 December 2005 (UTC)
 * This was done. 178.38.76.15 (talk) 15:02, 22 November 2014 (UTC)

WikiProject class rating
This article was automatically assessed because at least one WikiProject had rated the article as start, and the rating on other projects was brought up to start class. BetacommandBot 09:46, 10 November 2007 (UTC)

The article does not define the 'E' notation used. —Preceding unsigned comment added by 90.241.56.148 (talk) 12:55, 13 July 2008 (UTC)

The definition given in the last line of the "definition" section contradicts the definition given in the second line of the "Chain Rule" section. I think that the definition given in the second line of the "Chain Rule" section is correct. The problem is either the negative sign or the reciprocal inside the logarithm on the second math line of the "definition" section. — Preceding unsigned comment added by 150.231.246.1 (talk) 20:28, 9 January 2012 (UTC)
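For reference, the standard definition (the one consistent with the chain-rule form) can be written as

$$\Eta(Y|X) = -\sum_{x\in\mathcal X}\sum_{y\in\mathcal Y} \Pr(X=x, Y=y) \log_2 \Pr(Y=y|X=x),$$

i.e. with the negative sign outside the sum the logarithm takes the conditional probability itself, not its reciprocal; having both at once double-negates and flips the sign of the whole expression.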

Anything having to do with mutual information
I(A,B) -- the mutual information appears without motivation or explanation. It is great to link to mutual information, but it should be explained, at the least, why understanding the relation to mutual information helps one understand conditional entropy. Also, the picture at the top of the page makes no sense without understanding this relationship. — Preceding unsigned comment added by 136.159.160.249 (talk) 05:59, 21 October 2012 (UTC)
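For context, the relationship the Venn-style diagram at the top of the article encodes is the standard identity (using I(X;Y) for the mutual information):

$$\operatorname{I}(X;Y) = \Eta(Y) - \Eta(Y|X) = \Eta(X) - \Eta(X|Y),$$

so conditional entropy is exactly the entropy of one variable minus the information the other variable carries about it; spelling this out in the article would motivate both the link and the picture.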

Removed flag
I removed the "expert attention" flag from the "Bayes' rule" section. The complaint really concerned only one line, the statement about Bayes' rule being false for quantum conditional entropy. I moved this claim to the following section, which introduces quantum conditional entropy. The claim needs a citation. 178.38.76.15 (talk) 14:55, 22 November 2014 (UTC)

Conditional entropy not equal conditional expectation
I'm about to remove an incorrect equivalence, so I wanted to give a bit more detail here on the talk page. I wish conditional entropy were a conditional expectation, but unfortunately

$$\Eta(Y|X=x) \neq \mathbb{E}[\operatorname{I}(Y)|X=x] = -\sum_{y\in\mathcal Y} \Pr(Y=y|X=x) \log_2 \Pr(Y=y).$$

It's worth noting the contrast of

$$\Eta(Y|X) = \sum_{x\in\mathcal X} \Pr(X=x)\, \Eta(Y|X=x)$$

vs

$$\Eta(Y) = \mathbb{E}[\operatorname{I}(Y)] = \sum_{x\in\mathcal X} \Pr(X=x)\, \mathbb{E}[\operatorname{I}(Y)|X=x].$$
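The distinction above can be checked numerically. A minimal sketch (the joint distribution `p_xy` is a made-up example, not from the article): `cond_entropy_given(x)` computes H(Y|X=x) with the conditional probability inside the logarithm, while `cond_expectation_of_info(x)` computes E[I(Y)|X=x] with the unconditional Pr(Y=y) inside the logarithm. The two disagree pointwise, yet their Pr(X=x)-weighted averages give H(Y|X) and H(Y) respectively, as in the two displayed formulas.

```python
import math

# Hypothetical joint distribution Pr(X=x, Y=y) over x, y in {0, 1}
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals Pr(X=x) and Pr(Y=y)
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

def cond_entropy_given(x):
    # H(Y|X=x) = -sum_y Pr(Y=y|X=x) * log2 Pr(Y=y|X=x)
    return -sum((p_xy[(x, y)] / p_x[x]) * math.log2(p_xy[(x, y)] / p_x[x])
                for y in (0, 1))

def cond_expectation_of_info(x):
    # E[I(Y)|X=x] = -sum_y Pr(Y=y|X=x) * log2 Pr(Y=y)  (unconditional log!)
    return -sum((p_xy[(x, y)] / p_x[x]) * math.log2(p_y[y]) for y in (0, 1))

# Pointwise the two quantities differ:
print(cond_entropy_given(0))        # H(Y|X=0) ~ 0.7219 bits
print(cond_expectation_of_info(0))  # E[I(Y)|X=0] = 1.0 bit here

# But the Pr(X=x)-weighted averages recover H(Y|X) and H(Y) respectively:
h_y_given_x = sum(p_x[x] * cond_entropy_given(x) for x in (0, 1))
h_y = -sum(p_y[y] * math.log2(p_y[y]) for y in (0, 1))
avg_info = sum(p_x[x] * cond_expectation_of_info(x) for x in (0, 1))
print(h_y_given_x)  # H(Y|X)
print(h_y, avg_info)  # equal, by the tower property of expectation
```

The design point is that H(Y|X=x) conditions the distribution *inside* the logarithm as well, which is exactly where the attempted conditional-expectation identity breaks.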