Talk:Redundancy (information theory)

A lot of missing context

 * what is an ensemble X?
 * where does $$\mathcal{A}_{X}$$ come from?
 * what is $$\mathbb M$$? - I assume the alphabet of the sender, but it does not say.

This is a pattern in many of the information theory articles. — Preceding unsigned comment added by 37.201.147.172 (talk) 02:01, 18 July 2023 (UTC)

Definition of rate
Definition of rate (here and at Information theory) should agree with Entropy rate. 198.145.196.71 23:19, 13 September 2007 (UTC)
 * Done (by me). 198.145.196.71 01:12, 15 September 2007 (UTC)

Other notions of Redundancy
This paragraph:


 * Redundancy of compressed data refers to the difference between the expected compressed data length of $$n$$ messages $$L(M^n)$$ (or expected data rate $$L(M^n)/n$$) and the entropy $$nr$$ (or entropy rate $$r$$). (Here we assume the data is ergodic and stationary, e.g., a memoryless source.) Although the rate difference $$L(M^n)/n-r$$ can be made arbitrarily small as $$n$$ increases, the absolute difference $$L(M^n)-nr$$ cannot, though it can be theoretically upper-bounded by 1 in the case of finite-entropy memoryless sources.

actually seems to describe redundancy as the difference between the absolute rate and the rate as defined above in the article. As such, it should perhaps be more fully explained in the main part of the article, or in its own section, since it is not really "another notion of redundancy". It needs to be explained more clearly, too. 198.145.196.71 16:22, 18 September 2007 (UTC)
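 * To make the distinction concrete, here is a small numerical sketch (not from the article; the source distribution is a made-up example). For a memoryless binary source, it Huffman-codes blocks of $$n$$ symbols and compares the rate redundancy $$L(M^n)/n-r$$, which shrinks as $$n$$ grows, with the absolute redundancy $$L(M^n)-nr$$, which stays positive:

```python
import heapq, itertools, math

def entropy(p):
    # Shannon entropy in bits of a distribution {symbol: probability}
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    # Return Huffman codeword lengths for distribution p: {symbol: prob}.
    # Each heap entry carries the symbols under that subtree; merging two
    # subtrees deepens every symbol in them by one.
    heap = [(prob, i, (sym,)) for i, (sym, prob) in enumerate(p.items())]
    heapq.heapify(heap)
    depth = {sym: 0 for sym in p}
    counter = itertools.count(len(heap))  # unique tiebreaker for equal probs
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:
            depth[sym] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return depth

# Memoryless binary source with P('0') = 0.9, P('1') = 0.1 (example numbers).
# For a memoryless source the entropy rate r equals the per-symbol entropy.
p = {'0': 0.9, '1': 0.1}
r = entropy(p)

results = {}
for n in (1, 2, 4, 8):
    # Product distribution over all n-symbol blocks
    block = {''.join(w): math.prod(p[c] for c in w)
             for w in itertools.product(p, repeat=n)}
    lengths = huffman_lengths(block)
    L = sum(block[w] * lengths[w] for w in block)  # expected code length L(M^n)
    results[n] = (L / n - r, L - n * r)  # (rate redundancy, absolute redundancy)
    print(f"n={n}: L/n - r = {L/n - r:.4f}, L - nr = {L - n*r:.4f}")
```

The rate redundancy falls toward 0 (for Huffman codes it is bounded by $$1/n$$ for a memoryless source), while the absolute redundancy $$L(M^n)-nr$$ remains strictly positive at every $$n$$, which is the point the quoted paragraph is making. 37.201.147.172 would presumably appreciate this being spelled out in the article rather than here.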