Talk:Kolmogorov's zero–one law

Sloppiness in language
There is a minor sloppiness in language. The term $$\sum_{k=1}^\infty X_k$$ is the limit of the series $$\big(\sum_{k=1}^N X_k\big)_{N=1}^\infty$$.


 * That's not a series; that's a sequence! Michael Hardy 00:36, 1 Nov 2004 (UTC)

The latter converges or not; the former exists or not.


 * What does it mean for a sum of random variables not to exist? I think you've misunderstood the point.


 * An infinite sum of random variables does not exist if the probability that the sum does not converge (or does not exist, the wording is up to you) is positive. GaborPete (talk) 22:07, 2 April 2024 (UTC)

Don't know how to correct this without an overhead of explanation.

Btw., what are the exact requirements on $$X_k$$ for the series to converge at all? Certainly $$\operatorname E (X_k) = 0$$ is needed, but not sufficient. I even believe the convergence of the series would be equivalent to $$\operatorname P\{X_k = 0\} = 1$$.


 * No on both counts. See e.g. the central limit theorem.


 * Ouch, now I see where I was wrong: in many theorems the random variables have to be identically distributed; not so here. 217.230.28.82 12:16, 31 Oct 2004 (UTC)
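For reference (an addition to the thread, not part of the original exchange): the exact criterion the question above asks for is Kolmogorov's three-series theorem. A standard statement, from memory, is that for independent $$X_k$$ the series $$\sum_k X_k$$ converges almost surely if and only if, for some (equivalently, every) $$A > 0$$, with the truncations $$Y_k = X_k \mathbf{1}_{\{|X_k| \le A\}}$$, all three of the following hold:

```latex
% Kolmogorov's three-series theorem, for independent X_k and truncations
% Y_k = X_k 1_{|X_k| <= A}:
\sum_{k=1}^\infty \operatorname{P}(|X_k| > A) < \infty, \qquad
\sum_{k=1}^\infty \operatorname{E}(Y_k) \ \text{converges}, \qquad
\sum_{k=1}^\infty \operatorname{Var}(Y_k) < \infty .
```

In particular, neither $$\operatorname E(X_k) = 0$$ nor $$\operatorname P\{X_k = 0\} = 1$$ is required; e.g. deterministic $$X_k = 2^{-k}$$ sums just fine.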


 * The discussion is badly edited, so I don't know who is saying what and when, but bringing up the Central Limit Theorem was off track:


 * 1) The example in the article is fine: the writer is talking about the almost sure convergence of the sum, which is indeed a tail event. Almost sure convergence is what people usually mean by the convergence of a random sum, so the writing is not awful (at least right now).
 * 2) The Central Limit Theorem does NOT talk about the almost sure convergence of a sum of random variables. It is about convergence in distribution, which has to do with the article only as a negative example: Kolmogorov's 0-1 law implies that we cannot have almost sure convergence in any CLT. GaborPete (talk) 22:07, 2 April 2024 (UTC)
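To illustrate point 1 above (this simulation is an addition for readers, not part of the original discussion): for i.i.d. fair signs $$\varepsilon_k = \pm 1$$, the random harmonic series $$\sum_k \varepsilon_k / k$$ converges almost surely by the three-series theorem, and its convergence is a tail event, so the zero-one law gives it probability 0 or 1. The seed and tolerance below are illustrative choices only.

```python
import random

# Convergence of sum_k eps_k/k with i.i.d. fair signs eps_k = +/-1 is a tail
# event: changing finitely many signs never affects whether the series
# converges.  Kolmogorov's 0-1 law forces the probability to be 0 or 1;
# the three-series theorem shows it is 1.

def partial_sum(n_terms, seed=12345):
    """Partial sum of the random harmonic series sum_{k<=n} eps_k / k."""
    rng = random.Random(seed)  # one fixed stream: partial sums are nested prefixes
    s = 0.0
    for k in range(1, n_terms + 1):
        eps = 1 if rng.random() < 0.5 else -1
        s += eps / k
    return s

s1 = partial_sum(100_000)
s2 = partial_sum(200_000)
# The tail increment s2 - s1 has standard deviation about 0.002, so the
# partial sums have visibly stabilized -- unlike a CLT-type normalized sum,
# which keeps fluctuating and converges only in distribution.
print(abs(s2 - s1))
```

The contrast with point 2 is exactly that no such stabilization occurs for $$\frac{1}{\sqrt{N}}\sum_{k=1}^N \varepsilon_k$$, whose partial "sums" wander forever.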

How about a rigorous definition for a tail event or tail sigma algebra? —Preceding unsigned comment added by 71.207.219.120 (talk) 07:48, 23 April 2009 (UTC)
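For anyone arriving at this question: the standard definition (which could indeed go into the article) is the following.

```latex
% Tail sigma-algebra of a sequence X_1, X_2, ... of random variables:
\mathcal{T} \;=\; \bigcap_{n=1}^{\infty} \sigma(X_n, X_{n+1}, X_{n+2}, \ldots)
% where sigma(X_n, X_{n+1}, ...) is the sigma-algebra generated by the
% variables from index n on.  A tail event is an element of T: an event
% unaffected by changing the values of any finite number of the X_k.
```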

Are $$F_k$$ and $$B_k$$ the same or do I miss something? Pr.elvis (talk) 14:15, 10 August 2009 (UTC)

A couple of issues I have with this article
First, I think you need to say more about the definition of a tail event, because consider:

Let $$X_i(\omega)\ :\ i \in \mathbb{N}$$ be i.i.d. Bernoulli random variables on $$\Omega$$ satisfying $$\operatorname{P}(X_i = 1) = 1/2$$. Define $$I_X \subset \mathbb{N}$$ as the set of all $$i \in \mathbb{N}$$ such that $$X_i = 1$$. Using the axiom of choice, choose a family $$\mathcal{A}$$ of subsets of $$\mathbb{N}$$ such that for each $$B \subset \mathbb{N}$$ there exists a unique $$A \in \mathcal{A}$$ such that the symmetric difference of $$A$$ and $$B$$ is finite. Now let $$\mathcal{A}'$$ be a subset of $$\mathcal{A}$$ of cardinality $$2^{\aleph_0}$$ whose complement is also of cardinality $$2^{\aleph_0}$$. Let $$Z$$ be the set $$\{\omega \in \Omega : \exists A \in \mathcal{A}' \text{ s.t. } I_X \bigtriangleup A \text{ is finite}\}$$. Let $$\mathcal{F}$$ be the σ-algebra on $$\Omega$$ generated by the $$X_i$$ and let $$\mathcal{G}$$ be the σ-algebra generated by $$\mathcal{F} \cup \{Z\}$$. An arbitrary element of $$\mathcal{G}$$ can be expressed in the form $$(A\cap Z)\cup (B\cap Z^c)$$ where $$A,B \in \mathcal{F}$$. Define the probability of such an event to be $$(\operatorname{P}(A) + \operatorname{P}(B)) / 2$$. Then the event $$Z$$ is independent of all finite subsets of the $$X_i$$ and is uniquely determined by the 'tail' $$(X_n, X_{n+1}, X_{n+2}, \ldots)$$ for any $$n$$, but has probability 1/2.

This doesn't contradict the theorem because the theorem should only be stated for events belonging to $$\mathcal{F}$$.

Another issue is that you define a tail event as being one that is independent of all finite subsets of the $$X_i$$, but this itself follows from the hypothesis that the $$X_i$$ are independent. In fact, under the article's definition of 'tail event', surely we don't even need to assume that the $$X_i$$ are independent for the result to be true (provided it's only stated for events in $$\mathcal{F}$$)? —Preceding unsigned comment added by 92.15.133.253 (talk) 10:38, 6 August 2010 (UTC)

More information needed
This article lacks a proof. 240F:7C:FC1A:1:6505:ECD6:F741:7B7E (talk) 01:06, 20 December 2014 (UTC)
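For anyone who wants to add one: the usual short argument (stated here from memory, so please check against a textbook before inserting it into the article) runs as follows.

```latex
% Sketch: let A be a tail event for independent X_1, X_2, ....  For every n,
% A lies in sigma(X_{n+1}, X_{n+2}, ...), which is independent of
% sigma(X_1, ..., X_n).  A monotone-class (Dynkin) argument then makes A
% independent of sigma(X_1, X_2, ...), which contains the tail sigma-algebra,
% so A is independent of itself:
\operatorname{P}(A) = \operatorname{P}(A \cap A) = \operatorname{P}(A)^2
\quad\Longrightarrow\quad \operatorname{P}(A) \in \{0, 1\}.
```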

Unclear sentence
An invertible measure-preserving transformation on a standard probability space that obeys the 0-1 law is called a Kolmogorov automorphism.

What "obeys" the 0-1 law? The space? The transformation? Neither makes sense. Indeed, it's hard to see how a thing can "obey" the 0-1 law since the 0-1 law is not formulated as a property. 76.118.180.76 (talk) 02:00, 15 December 2015 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified one external link on Kolmogorov's zero–one law. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:
 * Added archive https://web.archive.org/web/20050115091525/http://kolmogorov.com/ to http://www.kolmogorov.com/

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

Cheers.— InternetArchiveBot  (Report bug) 01:37, 12 December 2017 (UTC)