Talk:Moment-generating function

Hyphenation
Spelling question: I've never (before now) seen the name spelled with a hyphen. Searches of Math Reviews (MathSciNet) and Current Index to Statistics show an overwhelming preference for no hyphen. Should the title, at least, be changed (move the article to "Moment generating function" with a redirect)? Zaslav 18:08, 8 December 2006 (UTC)


 * Sir Ronald Fisher always used the hyphen in "moment-generating function". This is an instance of the fact that in this era the traditional hyphenation rules are usually not followed in scholarly writing, nor in advertising or package labelling, although they're still generally followed in newspapers, magazines, and novels.  This particular term seldom appears in novels, advertisements, etc.  Personally I prefer the traditional rules because in some cases they are a very efficient disambiguating tool. Michael Hardy 20:03, 8 December 2006 (UTC)


 * Some literature sources also seem to use the hyphen. Example Paul L. Meyer's Introductory Probability and Statistical Applications. Apoorv020 (talk) 20:23, 10 October 2009 (UTC)


 * We really should have consistency on this throughout the article. I personally never use a hyphen, and my opinion is that we should probably just go with the more common usage. Perhaps this is something for RfC? Either way, we could simply state at the beginning of the article that both usages appear in the literature. Blahb31 (talk) 22:42, 28 March 2015 (UTC)

Terms
I would like _all_ the terms such as E to be defined explicitly. Otherwise these articles are unintelligible to the casual reader. I would have thought that all terms in any formula should be defined in every article, or else reference should be made to some common form of definition of terms for that context. How about a bit more help for the randomly browsing casual student? I would like to see a recommendation in the Wikipedia "guidelines for authors" defining some kind of standard for this; otherwise it is very arbitrary which terms are defined and which are expected to be known. —Preceding unsigned comment added by 220.253.60.249 (talk • contribs)


 * Seriously, this article is written for people who already know this stuff apparently. The article doesn't really say what E is, and for that matter what is t?  Seriously, for M(t) what is t?      misli  h  20:33, 10 July 2009 (UTC)


 * t is the dummy argument used to define the function, like the x when one defines the squaring function ƒ by saying ƒ(x) = x². And it says so in the second line of the article, where it says t ∈ R.  If you're not following that, then what you don't know that you'd need to know to understand it is the topics of other articles, not of this one.  In cases where the random variable is measured in some units, the units of t must be the reciprocal of the units of x.  As for what E is, there is a link to expectation in the next line.  Anyone who knows the basic facts about probability distributions has seen this notation. Michael Hardy (talk) 22:08, 10 July 2009 (UTC)

Certainly one could put in links to those things, but this article is the wrong place to explain what "E" is, just as an article about Shakespeare's plays is the wrong place to explain how to spell "Denmark", saying that the "D" represents the initial sound in "dog", etc.

This is not written for people who already know this material.

It is written for people who already know what probability distributions are and the standard basic facts about them. Michael Hardy (talk) 21:47, 10 July 2009 (UTC)

Thanks for the sarcasm, hope you feel a little better about yourself. misli h  23:32, 5 August 2009 (UTC)


 * It is a strong and unusual claim to say that the MGF of a rv X "...is an alternative definition of its probability distribution." —Preceding unsigned comment added by 71.199.181.122 (talk) 21:46, 28 September 2010 (UTC)

There is a link to the expected value article. It would probably clutter articles to have detailed explanations of each preceding idea necessary to discuss the current one, but it might be a good idea to list the articles that should be understood prior to reading the current one. —Preceding unsigned comment added by 141.225.193.194 (talk) 01:32, 31 January 2011 (UTC)

I agree with a lot of the above, but I think it should be stated explicitly that t is just a dummy variable with no intrinsic meaning. Thomas Tvileren (talk) 20:52, 15 November 2012 (UTC)

Definition
The definition of the n-th moment is wrong: the last equality is identically zero, since the nth derivative of 1 evaluated at t=0 is always zero. The evaluation bar must be placed at the end, so that it is clear we are differentiating M_X(t) n times and then evaluating at zero.


 * The only thing I find here that resembles a definition of the nth moment is where it says:


 * the nth moment is given by


 * $$E\left(X^n\right)\,$$


 * That definition is correct.


 * I take the expression


 * $$\left.\frac{\mathrm{d}^n}{\mathrm{d}t^n}\right|_{t=0} M_X(t)$$


 * to mean that we are first differentiating n times and then evaluating at zero. Unless you were referring to something else, your criticism seems misplaced. Michael Hardy 22:05, 8 April 2006 (UTC)
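The order-of-operations point above can be checked numerically. A minimal sketch in Python (the finite-difference step h is an arbitrary choice for illustration, not from the article): differentiating the standard normal MGF twice and then setting t = 0 recovers E[X²] = 1, whereas evaluating first would differentiate the constant 1 and give 0.

```python
import math

def M(t):
    """MGF of the standard normal: M(t) = exp(t^2 / 2)."""
    return math.exp(t * t / 2)

# Differentiate FIRST (here via a central difference), THEN evaluate at t = 0.
h = 1e-3
second_moment = (M(h) - 2 * M(0) + M(-h)) / h**2  # ~ E[X^2] = 1

# Evaluating first would differentiate the constant M(0) = 1, giving 0.
print(second_moment)
```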

Please provide a few examples, e.g. for a Gaussian distribution.
 * How about adding something like this?
 * For the Gaussian distribution
 * $$f(x)=\frac 1 {\sqrt{2\pi}\sigma} e^{-(x-\mu)^2/2\sigma^2}, $$
 * the moment-generating function is
 * $$M(t)=\int_{-\infty}^{\infty}e^{tx}f(x)\,dx=\frac 1 {\sqrt{2\pi}\sigma} \int_{-\infty}^{\infty} \exp \left[tx-\frac{(x-\mu)^2}{2\sigma^2}\right]dx$$
 * $$=\frac 1 {\sqrt{2\pi}\sigma}\int_{-\infty}^{\infty} \exp\left[\frac{-1}{2\sigma^2} \left((x-\mu)^2-2\sigma^2tx\right)\right]dx.$$


 * Completing the square and simplifying, one obtains
 * $$M(t) = \exp \left(\mu t + \frac 1 2 \sigma^2 t^2 \right).$$
 * (mostly borrowed from article normal distribution.) I don't know if there's enough space for a complete derivation.  The "completing the square" part is rather tedious.  -- 130.94.162.64 00:37, 17 June 2006 (UTC)
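The closed form above can be checked by brute force without the tedious completing-the-square step. A sketch in Python (the parameter values and the integration grid are arbitrary choices for illustration):

```python
import math

mu, sigma, t = 0.7, 1.3, 0.5

def integrand(x):
    # e^{tx} times the N(mu, sigma^2) density
    return math.exp(t * x) * math.exp(-(x - mu)**2 / (2 * sigma**2)) \
           / (sigma * math.sqrt(2 * math.pi))

# crude Riemann sum over [mu - 20, mu + 20]; the tails beyond are negligible
dx = 1e-3
numeric = sum(integrand(mu + k * dx) * dx for k in range(-20000, 20001))

closed_form = math.exp(mu * t + sigma**2 * t**2 / 2)
print(numeric, closed_form)  # agree to many decimal places
```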

I would also like to see some more in the article about some basic properties of the moment-generating function, such as convexity, non-negativity, the fact that M(0) always equals one, and also some other not-so-obvious properties (of which I lack knowledge) indicating what the mgf is used for. --130.94.162.64 00:55, 17 June 2006 (UTC)

Also, is it true that "Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann-Stieltjes integral"? When you calculate the MGF of the Poisson distribution, X ~ Poisson(lambda), the correct formula to use is M_X(t) = E[exp(tX)] = sum(exp(tx)*p(x), x=0..infinity). This is clearly not an integral. Does the Riemann-Stieltjes integral include summation as well? If not, the quoted statement is wrong and should be removed from the article. —Preceding unsigned comment added by Sjayzzang (talk • contribs) 20:02, 15 April 2009 (UTC)
 * Certainly a sum is a Riemann–Stieltjes integral. One would hope that that article would be crystal-clear about that.  I'll take a look. Michael Hardy (talk) 21:55, 10 July 2009 (UTC)
 * ...I see: that article is not explicit about that. Michael Hardy (talk) 21:56, 10 July 2009 (UTC)
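For what it's worth, the Poisson case can be checked directly: the Riemann–Stieltjes integral against a discrete CDF reduces to exactly the sum quoted above, and that sum matches the known closed form exp(λ(e^t − 1)). A sketch in Python (the values of λ and t are arbitrary):

```python
import math

lam, t = 2.0, 0.3

# the sum form: E[e^{tX}] = sum over k of e^{tk} * P(X = k),
# truncated once the remaining terms are negligible
mgf_sum = sum(math.exp(t * k) * math.exp(-lam) * lam**k / math.factorial(k)
              for k in range(60))

closed_form = math.exp(lam * (math.exp(t) - 1))
print(mgf_sum, closed_form)
```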

Vector of random variables or stochastic process
We should mention the case when X is a vector of random variables or a stochastic process. Jackzhp 22:29, 3 September 2006 (UTC)


 * I've added a brief statement about this. (Doubtless more could be said about it.) Michael Hardy 03:48, 4 September 2006 (UTC)

Properties would be nice
There are a whole bunch of properties of MGFs that it would be nice to include -- e.g. the MGF of a linear transformation of a random variable, MGF of a sum of independent random variables, etc.
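Two of those properties can be illustrated concretely with normal MGFs, for which everything is in closed form: M_{aX+b}(t) = e^{bt} M_X(at), and M_{X+Y}(t) = M_X(t) M_Y(t) for independent X, Y. A Python sketch (parameter values are arbitrary):

```python
import math

def mgf_normal(mu, sigma, t):
    # MGF of N(mu, sigma^2)
    return math.exp(mu * t + sigma**2 * t**2 / 2)

mu, sigma, a, b, t = 0.4, 1.1, 2.0, -0.5, 0.3

# linear transformation: aX + b is N(a*mu + b, a^2 * sigma^2)
lhs = mgf_normal(a * mu + b, abs(a) * sigma, t)
rhs = math.exp(b * t) * mgf_normal(mu, sigma, a * t)

# independent sum: X + Y is N(mu1 + mu2, sigma1^2 + sigma2^2)
mu2, sigma2 = -0.2, 0.7
sum_lhs = mgf_normal(mu + mu2, math.hypot(sigma, sigma2), t)
sum_rhs = mgf_normal(mu, sigma, t) * mgf_normal(mu2, sigma2, t)

print(lhs, rhs, sum_lhs, sum_rhs)  # lhs == rhs and sum_lhs == sum_rhs
```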

Discrete form of mgf
Something should be added about the discrete form of the mgf, no? $$M_X(s)=E(e^{sX})=\sum_{k=1}^\infty{p(k)e^{sk}}$$ 24.136.121.150 08:37, 20 January 2007 (UTC)

Common Moment Generating Functions
It would seem like a good and obvious thing to include a link to a Wikipedia page which tabulates common moment generating functions (i.e. the moment generating functions for common statistical distributions). The information is already there on Wikipedia; it would just be a case of organising it a little better.

Also, there is probably some efficient way in which the set of all functions that commonly occur when dealing with statistical distributions can be organised to highlight possible inter-relationships (perhaps some mgfs are compositions of other mgfs, so that the fact that $$Mgf_3(s)=Mgf_2(Mgf_1(s))$$

could be highlighted in a list of mgf interdependencies...).

ConcernedScientist (talk) 00:47, 18 February 2009 (UTC)


 * Hope the changes satisfy your concerns about the page. Apoorv020 (talk) 20:19, 10 October 2009 (UTC)

Uniqueness
We have a theorem that if two mgf's coincide in some region around 0, then the corresponding random variables have the same distribution. There is, however, a concern that this statement, while true from the mathematician's point of view, is not so reliable from the applied statistician's point of view. McCullagh (1994) gives the following example:
 * $$f_1(x) = \frac{1}{\sqrt{2\pi}}e^{-x^2/2}, \quad f_2(x) = f_1(x)\Big(1 + \frac{1}{2}\sin(2\pi x)\Big)$$

with cumulant generating functions
 * $$K_1(t) = t^2/2, \quad K_2(t) = K_1(t) + \ln\Big(1 + \frac{1}{2}e^{-2\pi^2}\sin(2\pi t)\Big)$$

Although the densities are visibly different, their corresponding cgfs are virtually indistinguishable, with a maximum difference of less than 1.34·10⁻⁹ over the entire range. Thus, from a numerical standpoint, mgfs can fail to uniquely determine a distribution.
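The quoted bound is easy to reproduce: K2(t) − K1(t) = ln(1 + ½e^{−2π²} sin(2πt)), and ½e^{−2π²} ≈ 1.34·10⁻⁹, so the maximum difference is about that constant. A quick check in Python (the grid of t values is an arbitrary choice):

```python
import math

c = 0.5 * math.exp(-2 * math.pi**2)  # ~ 1.34e-9

# K2(t) - K1(t) = ln(1 + c*sin(2*pi*t)); scan one period of t
max_diff = max(abs(math.log1p(c * math.sin(2 * math.pi * i / 1000)))
               for i in range(1000))
print(max_diff)  # ~ 1.34e-9, matching the figure quoted above
```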

On the other hand, Waller (1995) shows that the characteristic function does a much better job of determining the distribution.

Even from a mathematician's point of view, don't you need the MGF to be infinitely differentiable at 0 for uniqueness? Paulginz (talk) 14:45, 24 November 2009 (UTC)

Purpose
This page doesn't seem to explain the purpose of the function. —Preceding unsigned comment added by 129.16.204.227 (talk) 13:07, 10 September 2009 (UTC)


 * Hope your concerns are now satisfied. Apoorv020 (talk) 20:20, 10 October 2009 (UTC)

Thanks. Your edits very much improve the quality of the article. --129.16.24.201 (talk) 08:59, 10 November 2009 (UTC)


 * I would say that the topic of Moment Generating Functions falls under "Subject-specific common knowledge" so adding inline citations is just really unnecessary. Any intro college level probability course will cover this article as it exists now pretty exhaustively.
 * Mike409 (talk) 11:17, 3 February 2010 (UTC)


 * No such thing as "Subject-specific common knowledge" ... citations are always needed. Melcombe (talk) 13:56, 3 February 2010 (UTC)

Kimaaron (talk) 21:34, 20 February 2010 (UTC)
 * I added a section called "Why the moment-generating function is defined this way". If you want to rename that as "Purpose" go right ahead.

raw moments vs. central moments.
The MGF calculates the raw moments as opposed to the central moments. It already says "moments around the origin" on the page, but perhaps this should be noted explicitly at the end of the introduction ("The moment generating function is so named because of its intimate link to the raw moments") and again in the definition (only inserting "raw"), e.g.: ...where mn is the nth raw moment. Currently the first definition on the moment page is of central moments, so a bit of confusion could occur. /Jens — Preceding unsigned comment added by 203.110.235.1 (talk) 23:55, 21 January 2013 (UTC)
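The distinction matters whenever the mean is nonzero, and the two families of moments are related by a binomial sum. A sketch in Python, using the raw moments of N(μ, σ²) (the parameter values are arbitrary):

```python
import math

mu, sigma = 2.0, 1.5

# raw moments m_n = E[X^n] of N(mu, sigma^2), as the MGF produces them
raw = [1, mu, mu**2 + sigma**2, mu**3 + 3 * mu * sigma**2]

# central moments via mu_n = sum_k C(n, k) * m_k * (-m_1)^(n-k)
def central_from_raw(n):
    return sum(math.comb(n, k) * raw[k] * (-raw[1])**(n - k)
               for k in range(n + 1))

converted = [central_from_raw(n) for n in range(4)]
print(converted)  # [1.0, 0.0, 2.25, 0.0] -- i.e. 1, 0, sigma^2, 0
```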

Too much a mix of lay and advanced knowledge
Under "definitions" further up on this page, there is some concern that the article is not transparent to someone trying to learn about moment generating functions. Michael Hardy writes, "It is written for people who already know what probability distributions are and the standard basic facts about them." While this is true, I think it jumps somewhat between requiring only this basic amount of knowledge and requiring more substantial knowledge. I would imagine many of my students in my intro probability course would find this article impenetrable at enough places that they would give up trying to learn something from it. I would think it would be better to put anything that requires knowledge from courses at a comparable level of difficulty or more advanced than an intro probability course at the end. Specifically, I think it would improve the article to:
 * 1) Move the definition of multivariate moment generating functions and the corresponding vector notation.  Also move the vector-valued case under "calculation" to be next to it.
 * 2) Not make the computation in the "general case" the first item under computation.  Move anything concerning Riemann-Stieltjes integration further down.
 * 3) Move the theorem about equal moment generating functions implying CDFs equal at almost all points down -- the reference given, Grimmett (and Welsh), doesn't even state it in that generality.  Instead, give the restricted version actually given in Grimmett and Welsh and talk about the more general "outside a set of measure 0" case further down.

Does anyone feel that these changes would not be an improvement? Barryriedsmith (talk) 14:23, 6 October 2015 (UTC)

Assessment comment
Substituted at 20:07, 1 May 2016 (UTC)

Incorrect estimate in the section "Other Properties"
Consider


 * $$\Pr[X \ge t] \le \frac{E[X^k]}{t^k} \le \frac{k!}{t^{2k}} M_{X}(t) \le e\sqrt{k}\frac{k^k}{e^k t^{2k}} M_X(t) \le et\, e^{-t^2} M_X(t),$$

under "Other Properties". The estimate


 * $$\frac{k!}{t^{2k}} \le et\, e^{-t^2}$$

is simply false for many values of k and t (k fixed, t large). 178.38.132.48 (talk) 18:00, 4 December 2017 (UTC)


 * I believe the idea is that the second-to-last expression is valid for any k, so you can choose $$k=t^2$$ to get the final expression. I'm somewhat dubious about the case when X is not non-negative and k is not an integer. McKay (talk) 03:06, 11 December 2017 (UTC)
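The first inequality in the chain is just Markov's inequality applied to X^k, and it can at least be sanity-checked for a concrete non-negative case. A Python sketch, assuming X ~ Exponential(1), so that Pr[X ≥ t] = e^{−t} and E[X^k] = k! (this distribution is an illustrative assumption, not from the thread above):

```python
import math

t = 10.0
tail = math.exp(-t)  # Pr[X >= t] for X ~ Exponential(1)

# E[X^k] / t^k = k! / t^k must dominate the tail for every k
bounds = [math.factorial(k) / t**k for k in range(1, 11)]
ok = all(b >= tail for b in bounds)
print(tail, min(bounds), ok)  # the tightest bound here occurs near k ~ t
```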

Existence issues
The article claims that the expectation must exist in order for the MGF to exist. This is not true. The expectation can exist (e.g., Cauchy, or Log-Normal, where it is positive infinite) but not be finite -- and then there is no MGF.

Also, the link in the Cauchy example to Indeterminate form is wrong, since for the Cauchy distribution the expectation of e^tX is positive infinite. There are no indeterminate-form issues here. — Preceding unsigned comment added by 109.186.33.244 (talk) 14:44, 30 November 2021 (UTC)
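The Cauchy case is easy to see numerically: the truncated integral of e^{tx} against the Cauchy density grows without bound as the truncation window widens, so E[e^{tX}] = +∞ for every t > 0 and no MGF exists. A Python sketch (t and the truncation points are arbitrary choices):

```python
import math

t = 1.0

def integrand(x):
    # e^{tx} times the standard Cauchy density
    return math.exp(t * x) / (math.pi * (1 + x * x))

def truncated(L, n=100000):
    # midpoint-rule approximation of the integral over [-L, L]
    dx = 2 * L / n
    return sum(integrand(-L + (i + 0.5) * dx) * dx for i in range(n))

values = [truncated(L) for L in (10, 20, 30)]
print(values)  # blows up roughly like e^{tL} as L grows
```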