Talk:Law of the unconscious statistician

Examples
Why are there no examples???? — Preceding unsigned comment added by 2003:E5:1F0C:364F:1419:3A91:707D:104B (talk) 07:57, 29 May 2023 (UTC)

Start
Oops, the Mar 22 edit by 68.163.167.174 was me (forgot to log in). The original version of this page had no explanation and simply gave two equations (which were the definition of the expected value--someone copied the wrong equations from the reference). I added some explanation and an example. The theorem is simple so that should be enough for people to understand it, though including the proof of it may be a nice addition. digfarenough (talk) 23:38, 22 March 2009 (UTC)

Etymology
Anyone know why it is called this? Btyner (talk) 19:28, 4 April 2009 (UTC)


 * see DeGroot and Schervish, p. 214, first paragraph. "Theorem 4.1.1 is called the law of the unconscious statistician because many people treat equations (4.1.9) and (4.1.10) as the definition of E[r(X)] and forget that the definition of the mean of Y=r(X) is given in Definitions 4.1.2 and 4.1.4." Revgraves (talk) 02:35, 24 October 2013 (UTC)


 * ...or whether it is called that? I haven't encountered the name before.  The rest of this is just routine stuff. Michael Hardy (talk) 01:51, 5 April 2009 (UTC)


 * OK, I'm finding the phrase in various textbooks. I've taught this thing in classrooms a number of times, but never encountered that name for it. Michael Hardy (talk) 01:53, 5 April 2009 (UTC)


 * Yes, and IMHO such a trivial result hardly merits its own "law". But if you come across a reference for how it came to be known as this, I think it would be worth including in the article. Btyner (talk) 03:14, 5 April 2009 (UTC)


 * Some good coverage starting on page six of, including a brief note on the etymology. Perhaps we should move to "Expectation of a function of a random variable" ? Btyner (talk) 03:23, 5 April 2009 (UTC)


 * I concur. This is standard stuff for computing expectations.  I spoke with a few PhD statisticians -- and none of them had seen or heard of this "Law."  When I explained it, they insisted it was not a law -- just basic machinery for computing expectations.  Further, they noted, this neglects issues of measurability, continuity, and boundedness.  Just because Ross uses the term does not make it a law; I'd expect much wider use in the statistical (not just OR) community for this to stay.--Cumulant (talk) 03:21, 12 August 2009 (UTC)


 * The name isn't used in the OR community either, AFAIK. It's just routine stuff; let's delete/merge into the article on expectation. Shreevatsa (talk) 07:39, 13 November 2009 (UTC)

(outdent) "Further, they noted, this neglects issues of measurability, continuity, and boundedness." How does it "neglect"? That is the point of the statistician's unconsciousness, I believe, and the acronym LOTUS alludes to the same unconsciousness (or is it a transcendental state of bliss?). The statistician doesn't need to know the math because the mathematical stumbling blocks do not appear on the statistician's beaten track, only on the mathematician's way.

A proud and theoretically inclined professional statistician might bowdlerize this as the Law of the Unconscious Student (LOTUS), because the mere student statistician rests assured (by her professor?) that no mathematical stumbling block appears in this course, while the pro must navigate those stumbling blocks in daily life ;-)

Vaguely I feel there should be some interchange of the order of two expectations involved, or is it an interchange of some expectations and some limit? Until reading this article (November version) today, I would have said that LOTUS may be the law of iterated expectations? [I might be able to explain by reference to some lecture notes where LOTUS is ubiquitous, my notes as a student, which are not at hand. That would not be for wikipedia mainspace, however, as its referent is unpublished.]

Law means many things to many people. If that's a problem then call it the Lesson or the License in your next course of lectures! --P64 (talk) 23:07, 13 March 2010 (UTC)
 * Probably it is the Law of the Unconscious Statistician (LOTUS) where probability is taught in the mathematics department, but the Law of the Lucky Econometrician (LOTLE) or some such thing where probability is taught in the statistics department. Alternatively the statistics teacher is capable of a little self-deprecating humor, not one of those proud ones. --P64 (talk) 23:09, 13 March 2010 (UTC)

Ross
The only place I've seen this name is A First Course in Probability by Sheldon Ross. In my old edition (3rd), in the second exercise under "Theoretical Exercises" in the chapter on "Expectation" (chapter 7), he states:
 * There are some textbooks that "define" $$E[g(X)]=\int_{-\infty}^{\infty} g(x)f_X(x) dx$$. However, as $$g(X)$$ is itself a random variable, it is not clear that the above "definition" is logically consistent. For, it also follows from this "definition" that $$E[g(X)]=\int_{-\infty}^{\infty} y f_{g(X)}(y) dy$$, and it is not a priori evident that the two expectations for $$E[g(X)]$$ need be identical. That they are identical is, of course, the substance of the law of the unconscious statistician.

He goes on to ask the reader to show that a similar formula he calls the "S-pectation" of $$g(X)$$ (replacing $$f_X(x)$$ by its square in the first integral formula above) is logically inconsistent because it leads to a contradiction (for those who want to try this, he suggests letting X be uniform(0,1) and trying to compute $$S(X^2)$$ in two different ways). - dcljr (talk) 03:00, 7 August 2009 (UTC)
 * Hence, it would seem that the name refers to the fact that most people don't think too hard about the way the expected value of a function of a random variable is usually presented in textbooks. (BTW, the above quote is from pp. 313–314 of the aforementioned book.) - dcljr (talk) 03:08, 7 August 2009 (UTC)
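For anyone curious about the exercise Ross poses, both halves can be checked numerically: LOTUS gives the same answer by either route, while the "S-pectation" does not. A minimal sketch (the midpoint integrator is my own helper; Ross himself suggests X ~ Uniform(0, 1) and computing S(X²) two ways):

```python
# Numerical check of LOTUS and Ross's "S-pectation" exercise for
# X ~ Uniform(0, 1) and g(x) = x^2.  For Y = X^2 on (0, 1), the density
# is f_Y(y) = 1 / (2*sqrt(y)).  Illustrative sketch, not a proof.
import math

def integrate(f, a, b, n=20_000):
    """Midpoint-rule numerical integration of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f_X = lambda x: 1.0                          # density of Uniform(0, 1)
g = lambda x: x ** 2
f_Y = lambda y: 1.0 / (2.0 * math.sqrt(y))   # density of Y = X^2

lotus_lhs = integrate(lambda x: g(x) * f_X(x), 0, 1)  # E[g(X)] via LOTUS
lotus_rhs = integrate(lambda y: y * f_Y(y), 0, 1)     # E[Y] from Y's own density
print(lotus_lhs, lotus_rhs)  # both approach 1/3, as LOTUS asserts

# The "S-pectation" replaces each density by its square; the two routes
# now disagree, which is the contradiction Ross asks the reader to find:
s_via_x = integrate(lambda x: g(x) * f_X(x) ** 2, 0, 1)  # -> 1/3
s_via_y = integrate(lambda y: y * f_Y(y) ** 2, 0, 1)     # -> 1/4
print(s_via_x, s_via_y)
```

The disagreement (1/3 versus 1/4) is exactly why the squared-density "definition" is not logically consistent, whereas the genuine theorem guarantees the first pair of integrals agree.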

I have the 2nd edition, and although Ross is generally a good author, he does not give an adequate description of expectation. Expectation is a calculation of moment; everything about it follows in some way from that simple statement, including the Law of the Unconscious Statistician. When I teach probability to engineers, I show them that many of the definitions/theorems/methods have parallels in mechanics, particularly in applications of moment ... soon after that, they begin getting passing grades. Pf416 (talk) 17:42, 15 December 2015 (UTC)

Why have this page?
I don't understand why we would want to have this page. Can someone explain. 018 (talk) 13:14, 21 January 2010 (UTC)


 * See above under Etymology: "I'm finding the phrase in various textbooks". I reverted the change to a redirect because the term isn't used (and probably shouldn't be) on the target. What's left is enough to give an idea of the meaning of the term and some citations for it. My feeling is that if it is not worth keeping even the present brief version, then it probably shouldn't be a redirect either, but rather the name should be deleted entirely ... either would fit in with the discussion above. But leaving the present version might be helpful to some. Melcombe (talk) 14:06, 21 January 2010 (UTC)


 * Melcombe, I still don't understand. I see the current page and your rationale; this is more like a dictionary definition (really more of a glossary entry) than an encyclopedia entry. What am I missing? 018 (talk) 14:45, 21 January 2010 (UTC)


 * The point is there is no point in having a simple redirect of the (somewhat unusual) term "law of ..." if there is no mention of that term on the target page. As it stands this article at least has some citations for use of the term, but these would be intrusive in the other article. Melcombe (talk) 10:53, 5 February 2010 (UTC)


 * Okay, I've come around to your way of seeing it. 018 (talk) 14:26, 5 February 2010 (UTC)
 * By the way re: "more like a dictionary definition (really more of a glossary entry) than an encyclopedia entry":
 * Is there a Statistics or Proby & Stats glossary?
 * The page doesn't make sense without the acronym LOTUS, which is half the joke.
 * This discussion suggests that glossary entries should comply with wikipedia Article naming guidelines, or there should be compliant redirect pages that link to glossaries, so that one who searches wikipedia will find the glossary entry if there is no article. (talking to myself here) --P64 (talk) 23:23, 13 March 2010 (UTC)

Not the right place for the sketch of the proof of a measure-theoretic fact
In the section "From the perspective of measure" it is written: "This can be demonstrated by starting with simple functions, and extending the result first to bounded measurable functions, then to arbitrary nonnegative measurable functions, and finally to arbitrary measurable functions, following the usual steps for Lebesgue integration". That is true, but why is it here? This is the sketch of the proof of the "change of variables formula", and its rightful place is rather in "pushforward measure", isn't it? Boris Tsirelson (talk) 13:20, 22 March 2013 (UTC)
 * It derived from text hidden in a footnote that at least had a (not particularly reliable) source. But the real question is why this exotic maths appears at all in this article, when it contributes nothing substantive. I'll reduce it to something readable. 81.98.35.149 (talk) 12:45, 23 March 2013 (UTC)
 * I think the lede is also not quite readable to someone who comes across this term casually. Maybe a rewrite altogether is in order? Neutralphrasing (talk) 02:26, 12 December 2013 (UTC)
 * I think the measure theory section belongs in a section with more accessible proofs; it's a discussion of a fully formal way to derive the theorem, but we didn't talk about any accessible ways of deriving the theorem. Hopefully it is less out of place now. - Astrophobe (talk) 07:37, 6 November 2018 (UTC)
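For readers following this thread: the "usual steps" under discussion start from indicator functions, and that base case is a one-liner. A sketch only, under the standard measurability assumptions on g, writing $$P_X$$ for the distribution (pushforward) of X:

```latex
% Base case: for an indicator g = \mathbf{1}_B with B a Borel set,
% both sides of LOTUS reduce to the pushforward measure of B:
\operatorname{E}\left[\mathbf{1}_B(X)\right]
    = P(X \in B)
    = P_X(B)
    = \int_{\mathbb{R}} \mathbf{1}_B \,\mathrm{d}P_X .
% Linearity extends this to simple functions \sum_i c_i \mathbf{1}_{B_i};
% monotone convergence extends it to nonnegative measurable g;
% writing g = g^+ - g^- then handles arbitrary integrable g.
```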

Proof
LOTUS is a tremendously foundational tool in expectations, and it comes up in essentially every book and introductory course that introduces the idea of expectation. One major detail about this theorem, which is mentioned in the article and which informs its name, is that it is often treated as a definition rather than a proven theorem. However, there are straightforward proofs, at least of very slightly special cases. So, after thinking for a long while and consulting WikiProject Mathematics/Proofs at length, I decided that it would be very helpful in this page to include a short informal sketch of how LOTUS is proven. Therefore, I am about to add exactly that. - Astrophobe (talk) 07:29, 6 November 2018 (UTC)
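The discrete case described above can also be checked mechanically: group the terms of $$\sum_x g(x)\,P(X=x)$$ by the value $$y = g(x)$$ and the definitional sum $$\sum_y y\,P(g(X)=y)$$ falls out. A minimal sketch (the fair die and $$g(x)=(x-3)^2$$ are my own illustrative choices, not from the article):

```python
# Discrete LOTUS check: E[g(X)] computed two ways for a fair six-sided die.
# The die and g(x) = (x - 3)^2 are illustrative choices only.
from collections import defaultdict
from fractions import Fraction

pmf_X = {x: Fraction(1, 6) for x in range(1, 7)}  # P(X = x)
g = lambda x: (x - 3) ** 2

# Route 1 (LOTUS): sum over x, never finding the distribution of Y = g(X).
e_lotus = sum(g(x) * p for x, p in pmf_X.items())

# Route 2 (definition): build the pmf of Y = g(X), then take its mean.
pmf_Y = defaultdict(Fraction)
for x, p in pmf_X.items():
    pmf_Y[g(x)] += p
e_def = sum(y * p for y, p in pmf_Y.items())

print(e_lotus, e_def)  # both 19/6
```

Regrouping a finite sum is all that happens here, which is why the finite discrete case reads as "obvious" even though the general theorem needs the measure-theoretic machinery discussed above.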

Authoritative source for the name
What sources are there for the name besides a pdf document by Bengt Ringner that's never been published, except insofar as putting it on the author's web site constitutes a sort of publication? Michael Hardy (talk) 20:20, 25 November 2018 (UTC)
 * In the citation that I added, page 213 of DeGroot and Schervish's textbook, they write "Theorem 4.1.1 is called the law of the unconscious statistician because many people treat Eqs. (4.1.9) and (4.1.10) as the definition of E[r(X)] and forget that the definition of the mean of Y = r(X) is given in Definitions 4.1.2 and 4.1.4." I just checked the citation which was given by a previous editor, by Ross, and the name appears nowhere in that book, so I'll go remove that citation. - Astrophobe (talk) 20:46, 25 November 2018 (UTC)

Easier proof or calculation
For the continuous case, the input x in g(x) can be mapped to the linear 0 to 1 range of the CDF: solve $$F_X(x)$$ for x (i.e. take $$x = F_X^{-1}(u)$$) and plug that into g(x). Evaluated at X, a continuous CDF is always uniformly distributed on its 0 to 1 range. I think this is sufficient to directly write down the expected value, then use the chain rule to prove the standard form.


 * $$\operatorname{E}[g(X)] = \int_{0}^{1} g\bigl(F_X^{-1}(u)\bigr) \, \mathrm{d}u, \qquad u = F_X(x) $$

This is a lot easier to solve than the standard form for my application (feedback control of the expected value, which usually needs to use $$F_X$$ as part of g(x) to correctly estimate the error from the target value in each sample of x). It seems equally easy as the standard form for other applications. You can use u-substitution on the standard form to derive this equation. Ywaz (talk) 15:21, 1 June 2022 (UTC)
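Read with $$u = F_X(x)$$ (so $$x = F_X^{-1}(u)$$), the claim above is the probability integral transform form $$\operatorname{E}[g(X)] = \int_0^1 g(F_X^{-1}(u))\,\mathrm{d}u$$. A quick numerical sanity check under my own illustrative choices (the Exp(1) distribution, $$g(x)=x^2$$, and the midpoint integrator are not from the comment above):

```python
# Check E[g(X)] = ∫_0^1 g(F_X^{-1}(u)) du against the standard LOTUS
# integral for X ~ Exponential(1) and g(x) = x^2, where E[X^2] = 2.
# Distribution and g are illustrative choices only.
import math

def integrate(f, a, b, n=20_000):
    """Midpoint-rule numerical integration of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

g = lambda x: x ** 2
f_X = lambda x: math.exp(-x)             # density of Exp(1)
quantile = lambda u: -math.log1p(-u)     # F_X^{-1}(u) = -ln(1 - u)

# Standard LOTUS form, truncated where the exponential tail is negligible.
e_standard = integrate(lambda x: g(x) * f_X(x), 0, 40)
# Inverse-CDF form, always over the fixed interval (0, 1).
e_inverse_cdf = integrate(lambda u: g(quantile(u)), 0, 1)

print(e_standard, e_inverse_cdf)  # both near E[X^2] = 2
```

One practical difference the comment hints at: the inverse-CDF form always integrates over (0, 1), regardless of the support of X, at the cost of needing the quantile function in closed form.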
 * I'm sure improvements are possible, but any modifications will have to be cited to a reliable source. Even mathematics that might seem perfectly obvious should not be independently generated by Wikipedia editors, but needs to come from somewhere. - Astrophobe  (talk) 16:20, 1 June 2022 (UTC)