Talk:History of entropy

von Neumann, Shannon, and Entropy

Jheald, in the entropy history section you changed John von Neumann’s quotes around, in a sense putting words in his mouth that he did not say. I would appreciate it if you would go back and restore the original quotes. Editing is one thing; changing history is another. Thanks: --Sadi Carnot 04:59, 10 April 2006 (UTC)

The quotation first appears in:

  • M. Tribus, E.C. McIrvine, Energy and information, Scientific American, 224 (September 1971).

Variants do appear on the internet, but I believe I have rendered the original correctly.

All best, Jheald 10:43, 10 April 2006 (UTC).

Version according to (Jheald):
Claude Shannon introduced the very general concept of information entropy, used in information theory, in 1948. Initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and the earlier work in thermodynamics; but the mathematician John von Neumann certainly was. "You should call it entropy, for two reasons," von Neumann told him. "In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."
Version according to (John Avery):
An analog to thermodynamic entropy is information entropy. In 1948, while working at Bell Telephone Laboratories electrical engineer Claude Shannon set out to mathematically quantify the statistical nature of “lost information” in phone-line signals. To do this, Shannon developed the very general concept of information entropy, a fundamental cornerstone of information theory. Initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and earlier work in thermodynamics. In 1949, however, when Shannon had been working on his equations for some time, he happened to visit the mathematician John von Neumann, who asked him how he was getting on with his theory of missing information. Shannon replied that the theory was in excellent shape, except that he needed a good name for “missing information”. “Why don’t you call it entropy”, von Neumann suggested. “In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage.”[1]
Reference
  1. ^ Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 9812384006.

Does this revised version sound better? I've cleaned it up a bit; it is sourced to Nobel Prize-winning author John Avery, in what is essentially a small textbook on information theory. The chapter from which the above paragraph is copied, word for word, cites seven sources by Shannon, from the years '48 to '93. I find it hard to believe that a famous 1949 story about the "father of information theory" first appeared in print only in 1971, 22 years after the event. Either we can work together to reach a compromise, or we can put both our versions on the entropy talk page to see what other editors think.--Sadi Carnot 16:02, 10 April 2006 (UTC)

If you want to put it to the talk page, that's fine by me. It looks to me that Avery is paraphrasing from memory the quotation in the Tribus article from 32 years earlier. Unless Avery gives a printed source for his wording of the quotation earlier than 1971, I would assume that is what happened.
Secondly, the expression "lost information in phone-line signals" is poor. Shannon entropy is much better thought of as a measure of uncertainty -- the uncertainty which is removed (or could be removed) if the recipient receives particular information.
This is also of course a very reasonable way to think about thermodynamic entropy; though it really took E.T. Jaynes to push that point of view (and its consequences for how we think about ensemble assignment).
In summary: I believe it is appropriate to go with the Tribus version of the quote, which appeared in print 32 years earlier, and IMO also reads better. -- Jheald 17:02, 10 April 2006 (UTC).
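For reference, the formal parallel under discussion is easy to state in modern notation; the constant and the base of the logarithm below are the usual conventions, not taken from either source:
<math>H = -K \sum_i p_i \log p_i</math> (Shannon's uncertainty function for a distribution <math>p_1, \dots, p_n</math>)
<math>S = -k_\mathrm{B} \sum_i p_i \ln p_i</math> (the Gibbs entropy of statistical mechanics)
This is the sense in which the uncertainty function "has been used in statistical mechanics under that name".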
The measure-of-uncertainty idea sounds fine; the remaining points that I rather dislike in your recommended version are newly bolded: I cannot imagine a polymath like von Neumann using such poor syntax (repeating the word “name” twice, or declaring that "entropy" is a mystery). Let me know if you can re-word these.--Sadi Carnot 17:18, 10 April 2006 (UTC)
I am not going to re-word a direct quotation from a printed journal. -- Jheald 17:26, 10 April 2006 (UTC).
For context, here is an extended version of the Sci Am quotation:
“My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”
-- Jheald 17:44, 10 April 2006 (UTC).
I added both points of view into the article, at least until someone can find a source that is more accurate as to the date [1948].--Sadi Carnot 01:27, 11 April 2006 (UTC)

For the record, another largely similar version of the quote appears earlier in M. Tribus, "Information theory and thermodynamics", in Harold A. Johnson (ed.), Heat Transfer, Thermodynamics and Education: Boelter Anniversary Volume. New York: McGraw-Hill, 1964, page 354.

"When Shannon discovered this function he was faced with the need to name it, for it occurred quite often in the theory of communication he was developing. He considered naming it "information" but felt that this word had unfortunate popular interpretations that would interfere with his intended uses of it in the new theory. He was inclined towards naming it "uncertainty" and discussed the matter with the late John Von Neumann. Von Neumann suggested that the function ought to be called "entropy" since it was already in use in some treatises on statistical thermodynamics... Von Neumann, Shannon reports, suggested that there were two good reasons for calling the function "entropy". "It is already in use under that name," he is reported to have said, "and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway." Shannon called the function "entropy" and used it as a measure of "uncertainty," interchanging the two words in his writings without discrimination.

This is sometimes quoted in the shortened form:

"It is already in use under that name and besides it will give you a great edge in debates, because nobody really knows what entropy is anyway"

Jheald 15:23, 11 July 2006 (UTC)

Yes, well, in Jeremy Campbell’s 1982 book Grammatical Man – Information, Entropy, Language, and Life we find, on page 22, at the beginning of chapter two (“The Noise of Heat”), the following paragraph:
At first, Shannon did not intend to use such a highly charged term for his information measure. He thought “uncertainty” would be a safe word. But he changed his mind after a discussion with John von Neumann, the mathematician whose name is stamped upon some of the most important theoretical work of the first half of the twentieth century. Von Neumann told Shannon to call his measure entropy, since “no one really knows what entropy is, so in a debate you will always have the advantage.”
Since this duplicates the wording of the separate source I mentioned previously, I am going to assume this version (as bolded) is most likely the original, particularly because it reads fluidly, as though it would happen in a real conversation. Thanks, though, for the source; if you find more, put them here so I can check into the book.--Sadi Carnot 16:38, 11 July 2006 (UTC)
Erm, actually it's not. It's a slight misquotation of the Tribus (1971) Scientific American source, which was the one I originally cited!! :-) Jheald 17:34, 11 July 2006 (UTC)
Well, whatever the case, I hope all of this talk is getting us somewhere? --Sadi Carnot 03:02, 12 July 2006 (UTC)

Year entropy was coined?

From my readings, I have come across three different supposed “dates” for when the word entropy was coined. Mendoza’s Carnot + Clapeyron + Clausius compendium states that the word was coined in an 1852 paper; Perrot’s A to Z Dictionary of Thermodynamics states that it was proposed by Clausius in 1868; and Cengel’s textbook on thermodynamics states that in 1865 he chose to name the property entropy. Additionally, I’ve also read parts of an original copy of Clausius’ 1860 book (in the sacred copies room at the UIC library), with un-cut pages (believe that), and it might have the word entropy in it? I’m still digging around; if anyone has any tips for me, leave them here.--Sadi Carnot 01:23, 11 April 2006 (UTC)

FYI, I found the answers I was looking for:

(1850) - stated that an expression was needed to account for the experimental fact that "loss of heat occurs when work is done" (which Carnot had assumed did not occur).
(1854) - he defines the ratio Q/T and calls it "equivalence-value" (so as to relate it to Joule's 1843 paper on the mechanical equivalent of heat)
(1856) - calls it "equivalence-value of all uncompensated transformations involved in a cyclical process" (and gives it the symbol -N)
(1862) - he relates the integral of dQ/T to something he calls the "disgregation" of the body, which has to do with the arrangement of the molecules of the working body
(1865) - lets dS = dQ/T and first calls S the "transformation-content" of the working body, but then changes it to "transformational-energy", or entropy, so as to resemble the word energy (see the sketch just below).
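In modern notation the 1865 definition reads, roughly (a sketch in present-day symbols, not Clausius' own):
<math>dS = \frac{\delta Q_\mathrm{rev}}{T}, \qquad \Delta S = \int \frac{\delta Q_\mathrm{rev}}{T},</math>
with <math>T</math> the absolute temperature. For example, a body that reversibly absorbs 1000 J of heat at a constant 500 K gains <math>\Delta S = 1000\ \mathrm{J} / 500\ \mathrm{K} = 2\ \mathrm{J/K}</math> of entropy.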

See:

  • Mechanical Theory of Heat – Nine Memoirs on the development of concept of "Entropy" by Rudolf Clausius [1850-1865]

Adios:--Sadi Carnot 15:19, 3 September 2006 (UTC)

Repetition?

Just wondering if the content is repetitive in places, especially in the sections "Historical Definitions" and "Classical Thermodynamic Views". This gives an article that flows back and forth. As it is an article on history, maybe presenting it in a chronological frame is a good option?

section 'Classical thermodynamic views'

The section of the article headed 'Classical thermodynamic views' contains the following:

"An early formulation of the Second Law by Thomas Aquinas, in the Summa Theologica (1274), is: "It is impossible for an effect to be stronger than its cause.".<http://www.newadvent.org/summa/2029.htm#article3> Here, "be stronger than" in modern terminology corresponds to "have less entropy than." Another early formulation is that "a cause must be equal to or greater than its effect."<http://www.jstor.org/discover/10.2307/4181986>"

I think it is wrong to say that Aquinas' statement is an early formulation of the second law. I think the first of the sentences just quoted should be deleted.

The claim that "be stronger than" corresponds to "have less entropy than" is fanciful and inaccurate. Thomas is talking about moral things in the cited source. It is taking the source out of context to apply the doctrine to thermodynamics.

The claim about Descartes' view is contextual. Descartes' reason is about what today we would call transfer of energy as heat. That is indeed relevant to the second law. It is still at best only marginally reasonable to say that Descartes was anticipating the second law, because he offers, alongside the argument about heating, also an argument that something that is not blue cannot cause something to turn blue; this is not about entropy or the second law.

If one really wants to find an early indication of the second law in context in the classical literature, one can find it in Aristotle's physics. His usually ridiculed view, that a body needs a driver to keep it in motion, is hardly different from the currently accepted fact that the motion of a body is always accompanied by friction with the medium in which it moves. This is dissipation of energy as observed by Kelvin, and is one of the present-day recognized mechanisms of entropy production, which is the basis of the second law. The other main kind of mechanism of entropy production is dispersal of matter. It is fair to point out that Aristotle did not identify friction as the reason for the need for a driver. I am not proposing to put this into the article. I am just proposing to delete the sentence about Thomas. Chjoaygame (talk) 17:19, 27 March 2014 (UTC)

Another point. The claim that it is impossible for an effect to be stronger than its cause is vague and hard to interpret. What about the butterfly effect? This has been recognized in physics for over a century. A small fluctuation here can trigger a giant effect there, provided there is enough other stuff there to feed the giant effect. One might say that this refers more to the first than to the second law of thermodynamics. Is triggering a matter of causation? The context matters very much. A brief discussion of the butterfly effect is in the March edition of Physics Today. Chjoaygame (talk) 11:02, 28 March 2014 (UTC)

entropy

What about entropy? Who can give the right definition for it? Fabin Benny (talk) 05:47, 21 February 2017 (UTC)

Clausius is misquoted

This quote from Clausius:

...then the generations of the quantity of heat Q from work at the temperature T, has the equivalence-value: Q/T

is a misquote. The temperature is lower-case t, and T stands for "a function of the temperature, independent of the nature of the process..." as quoted a bit later.

Another quote:

"T is now the unknown function of the temperature involved in the equivalence-values"

My emphasis.

A few pages later:

According to this, T is nothing more than the temperature counted from a°, or about 273° C. below the freezing-point; [...] T is simply the absolute temperature.
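In other words (a paraphrase in modern notation, not Clausius' own symbols): the equivalence-value is <math>Q/T</math>, where <math>T</math> is initially an unknown function of the ordinary temperature <math>t</math>, and only later is identified as the absolute temperature, <math>T \approx t + 273</math>.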

— Preceding unsigned comment added by Dotanrs (talk • contribs) 15:10, 28 June 2019 (UTC)

Rankine and entropy

I came across a curious piece of history, which is that William Rankine and Clausius were converging on entropy around the same time. Rankine gave it the rather poor name "thermodynamic function", and Clausius certainly beat him in the marketing department by coining "entropy" later on in the 1860s. It sounds like Clausius was also more general and rigorous. The consensus certainly is that Clausius deserves the main bit of credit.

(I noticed this in Maxwell's book Theory of Heat: when introducing entropy midway through, he credits Clausius primarily but also gives credit to Rankine. See also Hutchinson 1981, "W. J. M. Rankine and the Rise of Thermodynamics", https://doi.org/10.1017/S0007087400018264 .)

It could be an interesting topic for inclusion in this article. Science history isn't my forte, so I'll leave it at that. --Nanite (talk) 13:30, 11 July 2020 (UTC)