Talk:Chernoff's inequality

Comments
I added an article on Chernoff bound which, though a special case of the inequality here, merits special attention. Perhaps the articles could be combined in some form. CSTAR 23:08, 21 Jun 2004 (UTC)

Do they really have to be discrete random variables? (At this moment I'm too lazy to figure this out for myself, but I'll probably do so soon.) Michael Hardy 22:51, 24 Jun 2004 (UTC)

Am I right in thinking that &mu; is the mean of X? -- Anon

I always thought that "Chernoff's inequality" refers to the inequality for a Gaussian variable $$\xi \sim N(0,1)$$ and a differentiable function $$g$$ with $${\mathrm E}\, g(\xi) = 0$$, which states that in this case $$\mathrm{E}\, g^2(\xi) \le \mathrm{E}\, (g'(\xi))^2$$. I promise to write on this subject later. Amir Aliev 11:18, 23 May 2006 (UTC)
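For anyone wanting a quick sanity check of that Gaussian inequality, here is a small numerical experiment (my own illustration, not from any article): it evaluates both sides by numerical integration for the test function $$g(x) = x^2 - 1$$, which satisfies $$\mathrm{E}\, g(\xi) = 0$$.

```python
import math

def gaussian_expectation(f, a=-8.0, b=8.0, n=4000):
    """E[f(xi)] for xi ~ N(0,1), by Simpson's rule on [a, b].
    The tails beyond |x| = 8 are negligible for polynomial f."""
    h = (b - a) / n
    phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    s = f(a) * phi(a) + f(b) * phi(b)
    for i in range(1, n):
        x = a + i * h
        s += (4 if i % 2 else 2) * f(x) * phi(x)
    return s * h / 3

# Test function chosen for illustration, with E[g(xi)] = 0:
g  = lambda x: x * x - 1     # E[g(xi)] = E[xi^2] - 1 = 0
dg = lambda x: 2.0 * x       # derivative

lhs = gaussian_expectation(lambda x: g(x) ** 2)    # E[g(xi)^2]  = 2
rhs = gaussian_expectation(lambda x: dg(x) ** 2)   # E[g'(xi)^2] = 4
print(lhs, rhs)   # lhs <= rhs, as the inequality claims
```

For this $$g$$ both sides are known in closed form ($$2 \le 4$$), so the quadrature is easy to cross-check by hand.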

Proof
Can we see a proof of this inequality? Mwilde 19:25, 5 Mar 2005 (UTC)

Mistake?
Is the inequality correct, as stated?


 * $$P(\left|X\right|\geq k\sigma)\leq 2e^{-k^2/4}$$

This article looks like a copy of Theorem 3 (page 6) of the paper

http://www.math.ucsd.edu/~fan/wp/concen.pdf

But in that paper we have:


 * $$P(\left|X\right|\geq k\sigma)\leq 2e^{-k^2/{4n}}$$

with an additional $$n$$. So which one is correct?

~Ruksis~


 * The second one, I think. Clearly it has to depend on n. Michael Hardy 20:01, 21 July 2006 (UTC)


 * No. The first one is correct. It depends on n through $$\sigma^2 = \sum_{i=1}^n \mathrm{Var}(X_i)$$. Daniel Hsu, 25 August 2006

The paper referred to above is likely the source of the confusion. It speaks of $$\sigma^2$$ as "the variance of $$X_i$$", which is ambiguous, since the individual variables $$X_i$$ could have distinct variances. Earlier versions of the Wikipedia article had the same mistake. If all the $$X_i$$ had the same variance, say if they were identically distributed, and $$\sigma^2$$ denoted the variance of each $$X_i$$, then the Chernoff bound would need the $$n$$, as in $$2 \exp(-k^2/(4n))$$. In its current, more general form with $$\sigma^2 = \sum_{i=1}^n \mathrm{Var}(X_i)$$, it does not. Peter van Rossum, 27 September 2006.
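To illustrate Daniel Hsu's point numerically, here is a small script (my own sketch, not taken from either article or the paper): for a sum of $$n$$ independent $$\pm 1$$ variables we have $$\sigma^2 = \sum_{i=1}^n \mathrm{Var}(X_i) = n$$, and the exact tail probability can be computed from the binomial distribution and compared against $$2e^{-k^2/4}$$ with no extra $$n$$ in the exponent.

```python
import math

def tail_prob(n, k):
    """Exact P(|X| >= k * sigma) where X = X_1 + ... + X_n,
    the X_i are independent +/-1 with probability 1/2 each,
    and sigma^2 = sum of Var(X_i) = n."""
    sigma = math.sqrt(n)
    total = 0.0
    for heads in range(n + 1):              # X = 2*heads - n
        if abs(2 * heads - n) >= k * sigma:
            total += math.comb(n, heads) / 2.0 ** n
    return total

# The bound 2*exp(-k^2/4), with sigma^2 the *sum* of the variances,
# holds without any n in the exponent:
for n in (10, 100, 1000):
    for k in (1.0, 2.0, 3.0):
        assert tail_prob(n, k) <= 2 * math.exp(-k ** 2 / 4)
print("bound holds for all tested n and k")
```

If one instead read $$\sigma^2$$ as the per-variable variance (here $$1$$), the same bound would fail badly for large $$n$$, which is exactly the ambiguity discussed above.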

Temporary solution
The claim as stated was not true. For example, take each $$X_i$$ to be $$\pm 1/n^2$$ with probability $$1/2-1/n^4$$ and $$\pm 1$$ with probability $$1/n^4$$. Then the variance of X is 1/n, but the tail of X does not decay exponentially as the bound requires: for example, the probability that $$X>1$$ is of order $$1/n^3$$ rather than $$e^{-n^2}$$.

I've corrected the mistake in a crude way. Whoever put this here originally should work out the correct version of the claim, in the generality they had in mind, and put it here.

Uffish

Bounds on $$X_i$$
Should


 * $$\left|X_i\right|= 1\, $$ for all i

actually be


 * $$\left|X_i\right|\leq 1\, $$ for all i?

Requiring $$\left|X_i\right| = 1$$ with $$E[X_i] = 0$$ is rather limiting: it forces $$X_i = \pm 1$$ with equal probability.

Parubin 22:29, 26 November 2006 (UTC)

Chernoff bounds
The bound in this article was known long before Chernoff (see e.g. S. Bernstein's book on probability, 192?). In my opinion it is only confusing to call this Chernoff's inequality; also, it is not reasonable to have three articles with more or less the same name. If no one objects, I will merge all three and revise the terminology. Sodin 02:29, 9 August 2007 (UTC)

Merger Proposal
In the current version, the inequality stated in the article is incorrect. In my opinion, all the (correct) content should be transferred to Chernoff bound. Please comment. Sasha 12 September 2008 —Preceding undated comment was added at 19:58, 12 September 2008 (UTC).