Talk:Entropy/Archive 14

Conserved quantity?
I read in Steven Weinberg's famous book The First Three Minutes that conservation of energy and of entropy "per particle" is central to the standard model of the Big Bang. A PhD cosmologist also told me that entropy appears to be conserved because there is no heat transfer on the large scale; he said he always assumes it is conserved, but that technically it is an open question. There is no such thing as an isolated system (see Feynman) because there is always black-body radiation, which could explain the need to refer to "isolated" systems when "proving" entropy is not conserved. It appears that as space expands with time, entropy is emitted so as to keep the entropy per comoving volume of space constant, which we inaccurately see as entropy increasing in an "isolated" system. Is there an example of entropy increasing that can't be shown to eventually result in black-body radiation? Ywaz (talk) 12:47, 24 December 2019 (UTC)
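For the radiation-dominated case, the "entropy per comoving volume" point can be sketched numerically: for blackbody radiation the entropy density scales as T³, and T falls as 1/a as the scale factor a grows, so s·a³ stays constant. A minimal sketch in arbitrary units (illustrative only, not cosmological data):

```python
# Entropy per comoving volume for blackbody radiation:
# entropy density s ∝ T^3, and T ∝ 1/a as space expands,
# so s * a^3 (entropy per comoving volume) is constant.
def comoving_entropy(a, T0=1.0):
    T = T0 / a            # radiation temperature falls as the scale factor grows
    s = T ** 3            # entropy density (proportionality constant dropped)
    return s * a ** 3     # entropy per comoving volume

values = [comoving_entropy(a) for a in (1.0, 2.0, 10.0, 100.0)]
# all values ≈ 1.0: expansion dilutes entropy density exactly as fast as
# the comoving volume grows
```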

Tribo Fatigue part of the article is suspicious
In the "Interdisciplinary applications of entropy" section of the article there are numerous explanations associated with something called "tribo-fatigue entropy". However, all sources refer to papers and books by the same author, a Russian scientist named Sosnovskiy, and I have not been able to find any other authors who discuss or mention this concept. I am not exactly sure how to flag this; I would flag it as "non-mainstream science" or something like that, or maybe "fringe source" would be appropriate. In any case, I seriously think we should consider deleting this section from the article. IgnacioPickering (talk) 00:31, 27 February 2020 (UTC)

I maintain that the section should be deleted. I tried to find mainstream, reliable scientific sources for this part of the article but couldn't find any. It seems to me that this particular interpretation is not widely accepted by mainstream science. I suspend judgement about its correctness, but I believe it should be deleted from the Wikipedia page. If no one is able to find any other sources, I will delete this part of the page in a couple of days. IgnacioPickering (talk) 17:10, 15 May 2020 (UTC)


 * Thanks for spotting this. I agree tribo-fatigue is of undue weight and I could not find any independent sourcing. I will boldly delete this. -- 17:28, 15 May 2020 (UTC)

Today, May 16, I accidentally discovered a promise made on May 15 to remove the text on tribo-fatigue entropy in the "Interdisciplinary applications of entropy" section of the Entropy article two days later, i.e. May 17, if no one answered the comments made. It turned out, however, that this text was deleted on the day of the promise, i.e. May 15. Why such haste? I will express my opinion ahead of schedule, on May 16. I bring to your attention information that, I hope, will be taken seriously. I'll start with the concept. Classical thermodynamic entropy is a characteristic of scattering. And here is a new view: tribo-fatigue entropy is the absorption of energy in a solid deformable body. Analogy: external and internal friction. And here are just some additional sources with the necessary links:

1. JY Jang, M Mehdizadeh and MM Khonsari. Nondestructive estimation of remaining fatigue life without the loading history // International Journal of Damage Mechanics. 0(0) 1–21. 2019. doi: 10.1177/1056789519860242
2. Junhong Zhang, Xi Fu, Jiewei Lin, Zhiyuan Liu, Nuohao Liu and Bin Wu. Study on Damage Accumulation and Life Prediction with Loads below Fatigue Limit Based on a Modified Nonlinear Model // Materials 2018, 11, 2298; doi:10.3390/ma11112298
3. Huimin Zhao, Rui Yao, Ling Xu, Yu Yuan, Guangyu Li and Wu Deng. Study on a Novel Fault Damage Degree Identification Method Using High-Order Differential Mathematical Morphology Gradient Spectrum Entropy // Entropy 2018, 20, 682; doi:10.3390/e20090682
4. Hong-Hu Chu, Humaira Kalsoom, Saima Rashid, Muhammad Idrees, Farhat Safdar, Yu-Ming Chu and Dumitru Baleanu. Quantum Analogs of Ostrowski-Type Inequalities for Raina’s Function correlated with Coordinated Generalized Φ-Convex Functions // Symmetry 2020, 12, 308; doi:10.3390/sym12020308
5. Noushad Bin Jamal M, Aman Kumar, Chebolu Lakshmana Rao and Cemal Basaran. Low Cycle Fatigue Life Prediction Using Unified Mechanics Theory in Ti-6Al-4V Alloys // Entropy 2020, 22, 24; doi:10.3390/e22010024
6. Maruschak, P. O., Panin, S. V., Zakiev, I. M., Poltaranin, M. A., Sotnikov, A. L. Scale levels of damage to the raceway of a spherical roller bearing (2015). doi:10.1016/j.engfailanal.2015.11.01
7. Zaleski, K. The effect of vibratory and rotational shot peening and wear on fatigue life of steel. Eksploatacja i Niezawodnosc – Maintenance and Reliability 2017; 19 (1): 102–107. http://dx.doi.org/10.17531/ein.2017.1.14
8. Jundong Wang and Yao Yao. An Entropy Based Low-Cycle Fatigue Life Prediction Model for Solder Materials // Entropy 2017, 19, 503; doi:10.3390/e19100503
9. Xiong Gan, Hong Lu and Guangyou Yang. Fault Diagnosis Method for Rolling Bearings Based on Composite Multiscale Fluctuation Dispersion Entropy. Entropy 2019, 21, 290; doi:10.3390/e21030290
10. Gao Wanzhen, Jin Xuesong, Liu Qiyue. Overview of Progress in the Study of Wheel-Rail System in China using Tribology and Tribo-Fatigue Methods // Tribo-Fatigue: proc. of VI Intern. Symposium on Tribo-Fatigue ISTF 2010, Minsk, Oct. 25 – Nov. 1, 2010: in 2 p. / Belarusian State University; ed. board: M. A. Zhuravkov (prev.) [et al.]. – Minsk: BSU Press, 2010. – V. 1. – P. 165–175. (in Russian)
11. Stodola, J. Up-to-Date Tribo-Fatigue tests possibilities and methods // Advances in Military Technology. – 2012. – V. 7. – № 2. – P. 5–15
12. Makhutov, N. A., Gadenin, M. M. Research of the resource of elements of a thermonuclear power plant taking into account the parameters of Tribo-Fatigue // Mechanics of machines, mechanisms and materials. – 2017. – No. 3 (40). – P. 33–40. (in Russian)
13. Sosnovskiy, L. A., Sherbakov, S. S. Mechanothermodynamic Entropy and Analysis of Damage State of Complex Systems. Entropy 2016, 18, 268
14. Sosnovskiy, L. A., Sherbakov, S. S. A Model of Mechanothermodynamic Entropy in Tribology. Entropy 2017, 19, 115
15. Sosnovskiy, L. A., Sherbakov, S. S. On the Development of Mechanothermodynamics as a New Branch of Physics. Entropy 2019, 21, 1188
16. Sherbakov, S. S., Zhuravkov, M. A. Interaction of several bodies as applied to solving tribo-fatigue problems. Acta Mech. 2013, 224, 1541–1553
17. Sherbakov, S. S., Zhuravkov, M. A., Sosnovskiy, L. A. Contact interaction, volume damageability and multicriteria limiting states of multielement tribo-fatigue systems. In Selected Problems on Experimental Mathematics; Wydawnictwo Politechniki Slaskiej: Gliwice, Poland, 2017; pp. 17–38
18. Sherbakov, S. S. Spatial stress-strain state of tribo-fatigue system in roll–shaft contact zone. Strength Mater. 2013, 45, 35–43
19. Sosnovskiy, L., Sherbakov, S. Mechanothermodynamics; 2016; ISBN 978-3-319-24979-7
20. Sosnovskiy, L. A., Sherbakov, S. S. Mechanothermodynamic entropy and analysis of damage state of complex systems. Entropy 2016, 18, 1–34
21. Sosnovskiy, L. A., Sherbakov, S. S. Mechanothermodynamical system and its behavior. Contin. Mech. Thermodyn. 2012, 24, 239–256
22. Sosnovskiy, L. A., Senko, V. I. Tribo-Fatigue; 2005; ISBN 3-540-23153-6
23. Sosnovskiy, L. A., Senko, V. I. Tribo-Fatigue. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition, Tribology, Orlando, Florida, USA, 2005, 141–148
24. Sosnovskiy, L. A., Bogdanovich, A. V., Yelovoy, O. M., et al. (2014) Methods and main results of tribo-fatigue tests. International Journal of Fatigue 66: 207–219
25. Sosnovskiy, L. A., Komissarov, V. V., Sherbakov, S. S. A method of experimental study of friction in an active system. Journal of Friction and Wear 2012; 33: 136–145

Let me also remind you that back in 1996, 17 well-known scientists from Russia, Belarus and Ukraine wrote the book “A Word on Tribo-Fatigue” [«A Word on Tribo-Fatigue», edited and compiled by A. V. Bogdanovich. Authors: Strazhev, V. I., Frolov, K. V., Vysotsky, V. S., Troshchenko, V. T., Sosnovskiy, L. A., Makhutov, N. A., Kukharev, A. V., Gruntov, P. S., Starovoytov, E. I., Marchenko, V. A., Koreshkov, V. N., Shurinov, V. A., Botvina, L. R., Drozdov, Yu. N., Gorbatsevich, M. I., Pavlov, V. G., Efros, D. G. – Gomel, Minsk, Moscow, Kiev: Remika, 1996. – 132 p. (in Russian).] Let me also remind you that to date, the proceedings of seven International Symposia on Tribo-Fatigue (ISTF), held in Minsk, Moscow, Beijing and other cities, have been published. In them you can find a very large number of articles on the issue under discussion. Of course, articles on Wikipedia give preference to links to the work of the author of the concept. This is natural, because it is he who can present a given scientific result in the most qualified and understandable way (in simple language). This, as I understand it, is what is required for Wikipedia articles. Taking into account that the required information has been provided ahead of schedule, the text on tribo-fatigue entropy should be returned to the «Interdisciplinary applications of entropy» section. If there are any other questions, please ask them of me (preferably today or tomorrow). Barejsha02 (talk) 19:28, 16 May 2020 (UTC)


 * Wikipedia is not a place for self-publishing. The author seems to have a strong conflict of interest (WP:COI) with the subject matter and the essentially sole author(s) of the many primary references used in support. If a topic has no secondary sources, it may not be notable (WP:Notability) for inclusion in Wikipedia. Inclusion in this general article for entropy appears entirely WP:UNDUE and perhaps even WP:FRINGE. It should remain deleted. Kbrose (talk) 21:17, 16 May 2020 (UTC)

“Wikipedia is not a place for self-publishing ...” That's right! But Wikipedia is not a place for narcissism either: did you not read any of the explanations!? I am truly sorry for Wikipedia readers who cannot get up-to-date information on tribo-fatigue entropy. — Preceding unsigned comment added by Barejsha02 (talk • contribs) 15:17, 20 May 2020 (UTC)

Entropy is not energy
The word energy appears 68 times in the article. In places the article comes close to saying, or implying, that entropy is energy. For example, in the section Entropy it says "He called this non-usable energy 'Entropy'".

This is misleading. Entropy is not energy; they don’t even have compatible units: The SI unit of energy is the joule whereas the unit of entropy is the joule per kelvin.
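The mismatch follows directly from Clausius' definition; a one-line dimensional check (standard textbook material, not from the article):

```latex
\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
\quad\Longrightarrow\quad
[S] = \frac{[\,Q\,]}{[\,T\,]} = \frac{\mathrm{J}}{\mathrm{K}}
\neq \mathrm{J} = [E]
```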

The article should be refined to remove explanations of entropy that suggest it is energy. Dolphin ( t ) 12:23, 25 October 2020 (UTC)


 * Sometimes that happens in these articles. I've recently spent a great amount of time erasing all implications of conservation of mass from the mass-energy equivalence article, which spent paragraphs misinforming readers of the nature of mass conservation, implying many times that it is always conserved. This is, of course, absurd. Mass is not conserved and entropy is clearly not a form of energy. Please feel free to get to work on the article and try to make it a little more clear and less confusing for lay-readers (or at the least, less inaccurate). Thanks! Footlessmouse (talk) 17:12, 25 October 2020 (UTC)


 * I have made a start - see my diff. Dolphin ( t ) 05:08, 31 October 2020 (UTC)

Wikipedia:Use plain English
Just the first paragraph borders on gibberish for most people. --Wikipedia Wonderful 698-D (talk) 00:10, 6 November 2020 (UTC)

Notation for expected value
I've made a change to the notation for expectation value in one of the derivations (I forget exactly where, but I think it was looking at different forms of the Gibbs entropy). Since it was reverted and then I just re-reverted, I thought it'd be a good idea to put a note here so that we don't have too much back and forth on an admittedly minor issue. In the context of physics, I've almost exclusively seen either bracket notation (e.g. $$\langle x\rangle$$) or bar notation if it's a simple expression (e.g. $$\bar{x}$$); subscripts, etc. might also be used to denote an average quantity. I've never seen $$\operatorname{E}(x)$$ or $$\operatorname{E}[x]$$, probably because it's too easy to get this mixed up with energy (as I initially did until I read the text more carefully). Since this is an article about a physical rather than a mathematical idea, I'd suggest that the physics notation is more appropriate here. (Especially since the concept of entropy is so closely related to energy.) DrPippy (talk) 18:58, 30 November 2020 (UTC)
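For reference, the two notations denote the same quantity; with a discrete probability distribution $$p_i$$, the Gibbs entropy can be written in either style:

```latex
\langle x \rangle \;=\; \operatorname{E}[x] \;=\; \sum_i p_i\, x_i ,
\qquad
S = -k_{\mathrm{B}} \sum_i p_i \ln p_i = -k_{\mathrm{B}}\,\langle \ln p \rangle .
```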

the quantity and name 'entropy'
I have made an edit for which the following quotes from Truesdell and Brush are relevant.

Truesdell

Page 215:
 * If we look only upon the positive aspects of the papers of CLAUSIUS and RANKINE published in 1850, we may summarize their achievements as follows:
 * 1. CLAUSIUS constructed the thermodynamics of ideal gases; for those gases he discovered the internal energy. He took HOLTZMANN's Assertion as one of his assumptions regarding ideal gases.
 * 2. RANKINE obtained the basic constitutive restrictions of the thermodynamics of fluids; he expressed them in terms of a function that differs only inessentially from the entropy. He proved HOLTZMANN's Assertion as a theorem about ideal gases.
 * CLAUSIUS was to come upon the entropy in his own way many years later, and years after that he was to coin the name.

Pages 217–218:
 * Had RANKINE separated his phenomenology from his vortices, his work would have been short, clear, and final. His basic phenomenological formulae (8G.1) and (8G.2) suffice to deliver the entire formal structure of classical thermodynamics, freed of CLAUSIUS' restriction to ideal gases and "subsidiary hypotheses" regarding them. Had RANKINE been content to analyse and develop those formulae, he would have earned the rank of first, best, and entire discoverer of classical thermodynamics, leaving to CLAUSIUS the honor of codiscovery of the thermodynamics of ideal gases. History was to be otherwise. RANKINE has gained respect but not factual recognition; his discoveries have never until now been disentangled from his illusions. To read one of RANKINE's papers today requires much patience and much training. CLAUSIUS' first paper, despite its poor organization, vague exposition, and insecure mathematics, is deservedly regarded as a classic second only to CARNOT's treatise.

In a footnote on page 217, Truesdell writes
 * [2] CLAUSIUS [1863, §5] justly defends his own presentation as superior to RANKINE's: "I laid very particular weight upon basing... my development... not upon special aspects of the molecular nature of matter but only upon general fundamental principles .... "

Referring to Rankine's 1851 paper, written after Rankine had read Clausius' 1850 paper, Truesdell writes on page 223:


 * ...to within choice of $$\theta_0$$ the function $$H$$ is what CLAUSIUS later, much later, was to rediscover and call the entropy. Therefore (11) shows that to within a constitutive function of temperature, the heat-potential is the entropy. Moreover, RANKINE knows how to use it!

Brush

Brush dates Clausius' use of the quantity to 1854. On page 576, Brush writes:
 * It would seem that one should date the discovery or invention of the entropy concept from this 1854 paper, since the change in terminology from "equivalence-value of a transformation" to "entropy" can have no effect on the physical meaning of the concept itself.

The term 'entropy' was invented by Clausius and published in 1865. On page 577, Brush writes:
 * Presumably it was this experience that encouraged him to replace the original clumsy phrase by a handy new one, and so in 1865 we see at last the famous term "entropy" introduced for the first time by the equation $$\mathrm d S = \mathrm d Q/T$$.

Chjoaygame (talk) 06:14, 11 March 2021 (UTC)

Scope of article
The lead claims that this article covers the general concept of entropy. Except for the last paragraph of the lead, the restored section, and § Approaches to understanding entropy § Information theory, this article is entirely about thermodynamic entropy. Entropy (thermodynamics) redirects here. I would suggest that this article ideally belongs under Entropy (thermodynamics). The thermodynamic concept should have an article entirely to itself. —Quondum 20:36, 29 April 2021 (UTC)

Entropy (disambiguation) gives a nice indication of how the application of the word "entropy" splits. There is no "general concept" of entropy; it is only that the term gets used in thermodynamics and in information science to describe two similar but incompatible quantities. As such, it should be clear that these are two different quantities; they have different definitions, different units, and are applied in completely different domains. It would be a challenge to translate the one into the other. I think we should recognize that these are (from the perspective of WP) only homonyms. No article should include both in a shared scope. —Quondum 20:47, 29 April 2021 (UTC)

There is enough reliable source material available to show that entropy in thermodynamics, statistical mechanics, and information theory are actually not distinct subjects; they are all related to the concept of missing information. The difference in units between thermodynamic and information-theoretic entropy is merely a choice of unit systems, as has been shown in those sources. kbrose (talk) 22:27, 29 April 2021 (UTC)
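Without taking sides in the dispute, the unit-system point can be sketched: the Gibbs entropy $$S = -k_{\mathrm B}\sum_i p_i \ln p_i$$ and the Shannon entropy $$H = -\sum_i p_i \log_2 p_i$$ of the same distribution differ only by the constant factor $$k_{\mathrm B}\ln 2$$ (joules per kelvin per bit). A minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def shannon_bits(p):
    """Shannon entropy in bits: H = -sum p log2 p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_jk(p):
    """Gibbs entropy in J/K: S = -k_B sum p ln p."""
    return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]  # an example distribution
# The two quantities differ only by the constant conversion factor k_B*ln(2):
# gibbs_jk(p) ≈ shannon_bits(p) * K_B * math.log(2)
```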


 * The text in the body of the article indicates that this is contentious. Thus, a statement in the lead to the effect that they are not distinct seems inappropriate.  —Quondum 22:42, 29 April 2021 (UTC)

obliterating a useful distinction under the loosely worded edit note "c/e"
This edit, with the edit note rm "itself", the quantity cannot do anything itself, misc c/e has obliterated the distinction between the quantity and the term that belong to the notion that Clausius considered. The edit has conflated the quantity and the term in the word concept. The edit has further clouded the picture by removing the fact, explicitly emphasised by the source Truesdell, that Rankine (usually, and in the sources, referred to by that name, not the "Macquorn Rankine" of the edit) knew how to use the quantity, not merely "referring to" it as written in the edit. Moreover, the edit note offers the otiose remark that the quantity cannot do anything itself. Whatever the quantity itself can or cannot actively do, it can itself be passively used by a writer.

The edit also removes the substantial fact that Clausius used the quantity in 1854.

The edit also, without justification, removes detail given in the source, that "[Clausius] did introduce in 1862 the concept of "disgregation," defined as a quantity dependent on molecular arrangements".

Moreover, the edit indicates the Greek etymology as if it were referred to in the Brush source, which it is not. Chjoaygame (talk) 18:46, 3 May 2021 (UTC)


 * Clausius' activities in 1864 vs. 1865 are completely irrelevant in the context of an article lede, which is supposed to summarize the article in its major points. Write a useful section first, if you like. A lede should not even have to provide all the detailed references for its statements. Similarly, the distinction "quantity" vs. "concept" is nothing but hair-splitting in this context; in the 1850s and '60s, this was all more or less an idea or phenomenon without rigor, and it was only Clausius who gave it a lasting definition and identity, which is really the major point in the historical record. It is common practice in articles to name a person first with given name and surname, and later only by surname; what others do elsewhere is not really relevant here. WP also doesn't often use common qualifiers such as Dr., Prof., Lord, etc., which are very common elsewhere. Disgregation was not removed, but I did notice that it does not occur anywhere else in the article. kbrose (talk) 19:51, 3 May 2021 (UTC)

Explaining Clausius' Thermodynamic Entropy
Entropy was discovered and defined by Clausius. It has been defined, but its definition has never been explained. The common practice is to mention Clausius as the discoverer, misrepresent his definition S = Q/T as ΔS = S2 − S1 ≥ 0, and avoid explaining S = Q/T. Rather, the common official practice is to move quickly on to other types of invented entropies, which are easy to explain because they are formed out of common knowledge. I propose that Clausius' thermodynamic entropy can be explained once the property of temperature is finally formally defined by the historical method of writing an equation that expresses the property in terms of other properties previously introduced to us by direct empirical evidence. Clausius did this; however, the lack of such a defining equation for the property of temperature is what prevents Clausius' entropy from being physically explained. The equation can be formed, but not with references, since all references avoid direct answers to what is physically meant by both temperature T and entropy S = Q/T. The explanation also involves formally mathematically defining, and thereby also explaining, mass. Their definitions result from following the leads provided to us by their direct empirical evidence. James Putnam (talk) 02:31, 21 June 2021 (UTC) James A Putnam

undoing a good faith edit
I am undoing this good faith edit because it is faulty in several respects.

Its source citation is lacking in adequate bibliographic detail. No author, no date, no publisher, no ISBN.

The edit is a barely informed and marginally accurate paraphrase of the cited source, which itself makes only a passing comment without an attempt at historical discussion.

I think the source is wrong in its comment. My reason for saying this is to be found in §9B on page 236 of Truesdell's detailed and thorough historical study. Truesdell says there that Kelvin did not identify entropy in 1851. Truesdell credits the next mention of an entropy-like concept to Reech in 1853. I infer that mentioning Kelvin at this point in our article is inappropriate. If the IP editor would like to do some research on this topic, perhaps we may be enlightened by it. Chjoaygame (talk) 23:43, 2 November 2021 (UTC)

Proposal to change log to ln
The article does not specify the base of the logarithm. The base only matters when a value is given for Boltzmann's constant, since the prefactor must take a different value for each choice of base. One of the links in the article was to an article that gives the base as 2, but the value given in this article is for base e.

I know that Boltzmann's (relocated) headstone gives the equation as k log(W), which is fine as long as one does not give a value for k. Since this article does give a value for k, I propose changing log to ln to correspond to the given value of k.

An alternative would be to state the base at the one place where a value for k is given. Something to the effect of "Assuming natural logarithms, i.e. when the base is e, the value of k is ..." Vaughan Pratt (talk) 19:34, 18 January 2022 (UTC)


 * Changing it to ln makes a lot of sense, and makes the meaning crystal-clear; otherwise k would need to be restated as k·ln(base) if log denoted just some general logarithm. kbrose (talk) 14:20, 19 January 2022 (UTC)
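The base-dependence of the prefactor can be checked numerically: with the SI value of Boltzmann's constant, S = k ln W; restating the formula in base 2 requires rescaling the constant to k·ln(2). A minimal sketch (W is an illustrative microstate count, not data from the article):

```python
import math

k = 1.380649e-23  # Boltzmann's constant in J/K; this value pairs with the natural log

W = 10 ** 6  # number of microstates (illustrative)

S_natural = k * math.log(W)                    # S = k ln W, matching the SI value of k
S_base2 = (k * math.log(2)) * math.log2(W)     # same entropy with the constant restated as k*ln(2)
# S_natural == S_base2: changing the log base rescales the prefactor, not the physics
```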

Entropy and the propagation of causality produce the arrow of time
We must elaborate on that:

Entropy and the propagation of causality produce the arrow of time

Entropy and causality should be included in the article's mathematical formulas. — Preceding unsigned comment added by 2A02:2149:8200:6E00:1CD2:E750:1821:8A8A (talk) 12:31, 25 November 2021 (UTC)
 * For this look at Entropy as an arrow of time --Northumber (talk) 12:32, 25 November 2021 (UTC)


 * What do you mean by mathematically including entropy and causality in formulae? How would you do this? Yodo9000 (talk) 21:37, 8 November 2022 (UTC)
 * This is actually one of the unsolved problems in physics, since this behavior of entropy is observed statistically. A great deal of theoretical research is still under way to answer "Why did the universe have such low entropy in the past, resulting in the distinction between past and future and the second law of thermodynamics?" Fauzul Uzul (talk) 12:26, 14 November 2022 (UTC)

About "quantitative geometrical thermodynamics"
This is a very new and specialized topic and does not belong in a general article about entropy. So far, only the originators of the idea have published peer-reviewed papers about it. An article by a journalist based on the primary sources is not the type of secondary source we need. (That article describes the ideas as controversial, another reason for not including it here without scientific sources independent of the original researchers.) See Identifying reliable sources (science). StarryGrandma (talk) 19:40, 1 January 2022 (UTC)


 * I agree. This topic should have its own page rather than being covered here. Fauzul Uzul (talk) 12:27, 14 November 2022 (UTC)

Article organization
IMHO, too much historical information is provided at the article's beginning that should be transferred to the section on history. Second, is there any way to provide a cogent, layperson-accessible definition of entropy at the article's start? Currently one needs to dig far into the article to find such a definition. Kaplanovitchskyite (talk) 20:06, 2 December 2022 (UTC)

Entropy is not always about disorder. It is about unbalanced energy potentials dissipating.
The original Clausius statement of the Second Law of Thermodynamics said (translated from German): "Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time." The modern statement of the Second Law (see Laws of Thermodynamics) "states that in a natural thermodynamic process, the sum of the entropies of the interacting thermodynamic systems never decreases." Now, note Clausius' words "connected therewith" and "occurring at the same time" and, in the modern statement, the word "interacting". There is no mention of "net" effects of non-interacting processes occurring at different times; the law relates only to one process or a set of interacting (connected and simultaneous) systems. There is no reference to temperature or heat in the modern version. Nor is there any reference to an "isolated system", because no natural thermodynamic process is exempt from obeying the Second Law; we just have to ascertain which, if any, other systems are interacting with it simultaneously. This is because entropy is all about unbalanced energy potentials dissipating. Those energy potentials can be the sum of any or all forms of internal energy including, for example, gravitational potential energy, phase-change energy and other forms of potential energy such as that associated with a wound-up clock spring or air compressed in a cylinder. Why refer to disorder? When there is a storm on part of a large lake, there will subsequently be an increase in entropy as conditions calm down and gravity spreads the new rainwater over the whole lake. The resulting calm, still lake represents the state of maximum entropy; is that disorder?

In regard to the article's reference to climate change, my paper "Planetary Core and Surface Temperatures" at https://ssrn.com/author=2627605 explains how the tropospheric temperature gradient seen in all planets with atmospheres is the state of maximum entropy. This is because the sum of molecular kinetic energy and gravitational potential energy is constant over altitude. Hence, at higher altitudes the PE is greater and so the KE is less, meaning the temperature is cooler; vice versa for lower altitudes. This is what the brilliant physicist Josef Loschmidt explained in 1876, but it was ignored by climatologists, who think back-radiation causes the surface to be warmer than the temperature which solar radiation supports at the so-called radiating altitude. Instead, when you understand that in calm conditions at night the observed temperature gradient is the state of thermodynamic equilibrium (maximum entropy), then you should also understand that new thermal energy absorbed in the upper troposphere from solar radiation the next morning will spread out in all directions (including downwards to warmer regions) due to gravity, just as the new rainwater on that lake spread out. This can only happen because the temperature gradient is the state of maximum entropy that had been disturbed by the new energy, just like the effect of that storm on the lake. So, this is how the surface temperature rises: the whole graph of temperature vs. altitude rises to a new, but parallel, position, all due to entropy increasing as unbalanced energy potentials diminish. It is thus helpful to think of entropy as a cumulative measure of progression in this process of the dissipation of unbalanced energy potentials.

Douglas Cotton, B.Sc.(physics), B.A.(econ) et al Centre for the Refutation of False Science Author of the website http://climate-change-theory.com and seven linked papers. 2001:8003:26F3:D200:9C37:19E8:240C:9A7A (talk) 23:08, 26 February 2023 (UTC)
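For what it's worth, and without adjudicating the climate claims above, the gradient being described corresponds to a standard textbook quantity: if the specific enthalpy per unit mass, c_p·T + g·h, is taken as constant with altitude, the implied gradient is the dry adiabatic lapse rate dT/dh = -g/c_p, about -9.8 K per km. A minimal sketch under that assumption (the values T0, g and c_p are illustrative):

```python
g = 9.81      # gravitational acceleration, m/s^2
c_p = 1005.0  # specific heat of dry air at constant pressure, J/(kg K)

def temperature(h, T0=288.0):
    """Temperature (K) at altitude h (m), assuming c_p*T + g*h is held constant."""
    return T0 - (g / c_p) * h

# Dry adiabatic lapse rate over 1 km, about 9.8 K:
lapse_per_km = temperature(0.0) - temperature(1000.0)
```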

The definition of "entropy".
The article doesn't provide a clear, brief description of (modern) entropy. For those other editors out there, especially those specializing in physics articles, what is your answer?

This will improve this Top-importance article. - S L A Y T H E - (talk) 14:49, 23 April 2023 (UTC)

Reversible process: "in the Carnot cycle, while the heat flow from the hot reservoir to the cold reservoir represents an increase in entropy, the work output, if reversibly and perfectly stored in some energy storage mechanism, represents a decrease in entropy"

I don't think this is true: the entropy increase of the cold reservoir matches the entropy decrease of the hot reservoir exactly. This is because each isothermal change involves a reversible exchange of entropy between the engine and a reservoir. The work done is just work that can be used to drive a reversible process, and doesn't have associated entropy. However, if all heat flows directly from the hot to the cold reservoir, then overall entropy increases. Dylanmenzies (talk) 16:10, 28 May 2023 (UTC)
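The comment's point can be checked with numbers: in an ideal Carnot cycle the reservoir entropy changes cancel exactly (Q_hot/T_hot = Q_cold/T_cold), while direct conduction of the same heat produces a net increase. A minimal sketch with illustrative temperatures:

```python
T_hot, T_cold = 600.0, 300.0  # reservoir temperatures, K (illustrative)
Q_hot = 1000.0                # heat drawn from the hot reservoir, J

# Reversible Carnot engine: Q_cold/T_cold = Q_hot/T_hot
Q_cold = Q_hot * T_cold / T_hot          # 500 J rejected; the remainder leaves as work
dS_hot = -Q_hot / T_hot                  # entropy lost by the hot reservoir
dS_cold = Q_cold / T_cold                # entropy gained by the cold reservoir
dS_carnot = dS_hot + dS_cold             # = 0: the work output carries no entropy

# Direct (irreversible) conduction of the same 1000 J from hot to cold:
dS_direct = -Q_hot / T_hot + Q_hot / T_cold   # > 0: net entropy increase
```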