Talk:Infinite divisibility (probability)

Negative binomial
Negative binomial is infinitely divisible?? Boris Tsirelson (talk) 17:33, 8 December 2008 (UTC)


 * Yes. Bear in mind that we don't mean the number of trials to get a specified number of successes, but rather the number of failures strictly before that specified number of successes, so that it's supported on {0, 1, 2, 3, ...}.  And note that the "number of successes" need not be an integer.  See negative binomial distribution. Michael Hardy (talk) 18:12, 8 December 2008 (UTC)


 * Well, it starts at 0. Now, the geometric distribution is a special case of the negative binomial. You claim that it is infinitely divisible. In particular, that you can get it as the distribution of a sum of two independent identically distributed random variables. Really? How would you do it? Boris Tsirelson (talk) 19:28, 8 December 2008 (UTC)


 * Yes, really. See logarithmic distribution.  Or just substitute the exponent −1/2 as the appropriate parameter in the negative binomial distribution. Michael Hardy (talk) 19:40, 8 December 2008 (UTC)


 * PS: OK, go to negative binomial distribution. Look at the parameter r > 0.  Set r = 1/2. Michael Hardy (talk) 19:46, 8 December 2008 (UTC)


 * Thank you for the explanation. I must think more. Boris Tsirelson (talk) 19:54, 8 December 2008 (UTC)


 * Wow, I see! Quite interesting. Thank you again. I have gotten many new pieces of information from Wikipedia, but till now, not in probability theory... Boris Tsirelson (talk) 20:36, 8 December 2008 (UTC)
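For later readers: the r = 1/2 construction above can be checked numerically. The following sketch is mine, not from the discussion; the helper `nb_pmf` and the parameter choices are illustrative. It convolves the NB(1/2, p) pmf with itself and compares the result with the geometric pmf on {0, 1, 2, ...}:

```python
from math import lgamma, exp

def nb_pmf(k, r, p):
    """Negative binomial pmf on {0, 1, 2, ...}: number of failures before
    r "successes", where r > 0 need not be an integer."""
    return exp(lgamma(k + r) - lgamma(k + 1) - lgamma(r)) * p**r * (1 - p)**k

p, K = 0.3, 40  # success probability and truncation length (arbitrary choices)
half = [nb_pmf(k, 0.5, p) for k in range(K)]

# Convolve the NB(1/2, p) pmf with itself; entries 0..K-1 are exact
# because each uses only half[0..k].
conv = [sum(half[i] * half[k - i] for i in range(k + 1)) for k in range(K)]

# Compare with the geometric pmf on {0, 1, 2, ...}: P(X = k) = p (1-p)^k.
geom = [p * (1 - p)**k for k in range(K)]
max_err = max(abs(a - b) for a, b in zip(conv, geom))
print(max_err)
```

The maximal discrepancy is at floating-point level, confirming that a geometric random variable is the sum of two i.i.d. NB(1/2, p) variables.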

"Relevance to statistics" section - something is wrong
"normal distribution arises ... as, loosely speaking, the distribution of the average of an infinite number of identically distributed random variables, a closely related question is to ask which distributions can arise naturally as the sum of a finite number of identically distributed random variables." Really? An infinitely divisible distribution corresponds (loosely speaking) to an infinite number of summands in the same sense as the normal distribution. The distinction is not in the (finite or infinite) number of summands but in finiteness or infiniteness of the second moment. Boris Tsirelson (talk) 18:56, 20 February 2009 (UTC)
 * Yes, really. Did you want a complete rehearsal of all the other conditions besides independence under which a normal distribution arises? And did you want a complete statement of what is meant by an asymptotic distribution? Melcombe (talk) 09:54, 23 February 2009 (UTC)
 * I did not protest against the style of "loosely speaking", but rather against "sum of a finite number..." out of context. In principle I like both styles of presentation, informal and formal. And probably the best, in my opinion, is their combination: informal first, and formal closer to the end of the article, for an interested reader. On the other hand, the central limit theorem is given in detail elsewhere, and I'd prefer a link thereto rather than a "complete rehearsal of all conditions". Boris Tsirelson (talk) 12:06, 23 February 2009 (UTC)
 * What the article does need is an informal introduction. I will move the now-hidden text there, and edit it down for an initial version. Melcombe (talk) 12:15, 23 February 2009 (UTC)

Lede
The lede needs some restructuring. As per WP:LEDE, it should begin by saying what infinite divisibility is, rather than why it is of interest. Essentially, then, the first and second sentences need to be exchanged, mutatis mutandis. —SlamDiego←T 03:36, 26 August 2009 (UTC)

Is k 0-based or 1-based?
In the equation,
 * $$\lim_{n\to\infty} \max_k \; P( \left| X_{nk} \right| > \varepsilon ) = 0 \text{ for every }\varepsilon > 0.$$

what are the legal values for k? The interpretation would seem to be very different for 0-based and 1-based enumeration of the random variables, $$\{X_k : k = 0, 1, \ldots \}$$ vs. $$\{X_k : k = 1, 2, \ldots \}$$ because, as n goes to infinity, the value of nk for the lowest value of k is finite for the former but goes to infinity for the latter. I am guessing that the answer is that 1-based is right, but am hoping that someone who knows for sure will edit the article to clarify this. Thanks — Q uantling (talk | contribs) 21:05, 18 January 2011 (UTC)


 * No, sorry, the nk is not n times k, it is a pair of indices, n and k. Maybe it is better to write $$X_{n,k}$$, but the form $$X_{nk}$$ is quite usual. --Boris Tsirelson (talk) 05:58, 19 January 2011 (UTC)

Okay, a pair of indices then. What do they index? Is it the triangular array? In particular does that tell us that $$k \le n$$? Perhaps an explanation of this could go into the text. Also, the notation both earlier and later in the article has $$X$$ with only one index. Can we fix the article so that the number of indices for $$X$$ is consistent throughout the article? Or can we explain in the article why this apparent inconsistency is okay? Thanks! — Q uantling (talk | contribs) 20:52, 19 January 2011 (UTC)


 * Yes, a triangular array, as written: "the limit of the sum of independent uniformly asymptotically negligible (u.a.n.) random variables within a triangular array approaches", and then again "On the other hand, for a triangular array of independent...". But I agree that all this is clear only to a knowledgeable reader. --Boris Tsirelson (talk) 06:11, 20 January 2011 (UTC)
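For later readers, the setup under discussion is the standard triangular array (this display is added for clarity, not quoted from the article): row n contains the n variables $$X_{n,1}, X_{n,2}, \ldots, X_{n,n}$$, so indeed $$1 \le k \le n$$, and the row sums are $$S_n = X_{n,1} + \cdots + X_{n,n}.$$ The u.a.n. condition then reads $$\lim_{n\to\infty} \max_{1 \le k \le n} \; P( \left| X_{n,k} \right| > \varepsilon ) = 0 \text{ for every } \varepsilon > 0.$$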

I found a reference and now I get it. I made edits that would have made it clearer to my old self. Please comment or modify. — Q uantling (talk | contribs) 14:21, 20 January 2011 (UTC)


 * Nice! Much better than before. A small remark: no need to link a single notion more than once in the whole article (for instance, independent). --Boris Tsirelson (talk) 19:26, 20 January 2011 (UTC)

Irrational divisibility of a distribution?
It appears to me that, under reasonable conditions, infinite divisibility of a distribution F with characteristic function φ is equivalent to the statement that φ^α is the characteristic function of a probability distribution for every α ∈ {1, 1/2, 1/3, 1/4, …}. Is it a stronger statement or an equivalent statement to demand that φ^α is the characteristic function of a probability distribution for every real α ∈ (0,1]? I'm not trying to do original research (or homework) here, but am wondering whether the discrete nature of the infinite divisibility (into an integral number of parts) is fundamentally part of the definition of infinite divisibility or is just a convenient way of defining it.  If you can find me a reference to a discussion on this, I will summarize it for the article. — Q uantling (talk | contribs) 20:22, 20 January 2011 (UTC)


 * Hmm. Since φ will likely not be real and positive for some of its domain, it would probably be easier to work with the moment-generating function M(t) = E[e^{tX}] and its powers M^α, instead of the characteristic function φ(t) = E[e^{itX}].  But, same question there. — Q uantling (talk | contribs) 20:33, 20 January 2011 (UTC)


 * (a) Basically, yes, see Lévy process. (b) No, the moment generating function exists (that is, is finite in a neighborhood of the origin) only sometimes. --Boris Tsirelson (talk) 05:59, 21 January 2011 (UTC)
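A concrete instance of (a), added for later readers (λ, α, and the t-grid below are arbitrary choices of mine, not from the discussion): for the Poisson(λ) distribution, φ^α is again a Poisson characteristic function for every real α > 0, exactly as the Lévy-process picture suggests. The rate is kept below π so that the principal branch of the complex power causes no branch-cut trouble:

```python
import cmath

def poisson_cf(t, rate):
    # Characteristic function of Poisson(rate): exp(rate * (e^{it} - 1))
    return cmath.exp(rate * (cmath.exp(1j * t) - 1))

lam, alpha = 2.0, 0.37      # rate < pi keeps z**alpha on the principal branch
ts = [x * 0.1 for x in range(-30, 31)]
max_err = max(abs(poisson_cf(t, lam) ** alpha - poisson_cf(t, alpha * lam))
              for t in ts)
print(max_err)
```

The error is at floating-point level: raising the characteristic function to the real power α just rescales the Poisson rate, which is the time change t ↦ X_t of the corresponding Lévy process.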

Is there a unique distribution such that X = X_1 + X_2 and all variables have the same distribution?
The parenthetical remark "not usually from the same distribution" seems to be an understatement. If the distributions are the same then the characteristic function satisfies φ(t) = (φ(t))^n, which gives φ(t) = 1 where it is non-zero. Doesn't this mean that the distribution is degenerate at x = 0? — Preceding unsigned comment added by 121.74.101.3 (talk) 22:07, 4 November 2014 (UTC)


 * Sorry, I do not understand you. Where is the parenthetical remark "not usually from the same distribution"? Boris Tsirelson (talk) 22:39, 4 November 2014 (UTC)


 * Sorry, my mistake, I was looking at the other article on infinite divisibility, which made a variation on that statement. My question still stands, and it seems a nice remark that the other variables can only be from the same distribution in the special case when P(X = 0) = 1. TerryM--re (talk) 07:54, 5 November 2014 (UTC)


 * Well, I see. Yes; and your "phi(t) = 1 where it is non-zero" is another understatement; phi is continuous (anyway), and therefore it is 1 everywhere, and so, indeed, the distribution is degenerate at x = 0. Boris Tsirelson (talk) 08:41, 5 November 2014 (UTC)
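For later readers, the argument of this thread spelled out: if $$X = X_1 + \cdots + X_n$$ with all summands independent and distributed like X itself, then $$\varphi(t) = \varphi(t)^n,$$ and already n = 2 gives $$\varphi(t)\,(\varphi(t) - 1) = 0,$$ so φ(t) ∈ {0, 1} for every t. Since φ is continuous with φ(0) = 1, and {0, 1} is discrete, it follows that φ ≡ 1, which is the characteristic function of the distribution with P(X = 0) = 1.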

Generality of the converse
1) The article says that a Lévy process yields an infinitely divisible distribution and vice versa.

2) The article also says that an infinitely divisible distribution satisfying certain conditions yields an additive process.

How does 2) give any more information?

An additive process is more general than a Lévy process. And conditions are required on the infinitely divisible distribution in 2), but not in 1).

So it seems like 2) is strictly weaker than 1) and easily implied by it.

2001:171B:2274:7C21:B13D:6306:7B8E:4356 (talk) 19:19, 12 April 2022 (UTC)