
Introduction to Entropy (Explanation section)

Explanation
The concept of thermodynamic entropy arises from the second law of thermodynamics. Systems can change, becoming simpler (fewer options) or more complex (more possibilities). This law of entropy increase describes, in mathematical terms, a system's scope for change – whether a thermodynamic process may occur. For example, heat always flows spontaneously from a region of higher temperature to one of lower temperature until the temperature becomes uniform.

Entropy is calculated in two ways. The first is the entropy change (ΔS) of a system containing a sub-system which undergoes heat transfer to its surroundings (inside the system of interest). The second calculates the absolute entropy (S) of a system based on the microscopic behaviour of its individual particles.

Relative Entropy
This is based on the macroscopic relationship between heat flow into the sub-system and the temperature at which it occurs, summed over the boundary of that sub-system. Following the formalism of Clausius, this first calculation can be stated mathematically as:


 * $$\delta S = \frac{\delta q}{T},$$

where δS is the increase or decrease in entropy, δq is the heat added to or removed from the system, and T is the temperature. The equals sign indicates that the change is reversible; Clausius showed a proportional relationship between entropy change and heat flow for such processes, in which heat energy can be transformed into work, and work can be transformed into heat, through a cyclical process. If the temperature is allowed to vary, the equation must be integrated over the temperature path (a short worked sketch of this integration appears below). This calculation of entropy change permits only the determination of differences, not absolute values. In this context, the second law of thermodynamics may be stated: for heat transferred over any valid process, for any system, whether isolated or not,


 * $$\delta S \ge \frac{\delta q}{T}.$$
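
As a worked illustration of the inequality (with values chosen purely for illustration, not taken from the text above), suppose a quantity of heat q = 100 J flows irreversibly from a hot reservoir at TH = 400 K to a cold reservoir at TC = 300 K. Each reservoir is large enough that its temperature stays fixed, so the reversible formula applies to each one separately:

 * $$\Delta S_{\text{total}} = \frac{q}{T_C} - \frac{q}{T_H} = \frac{100\ \text{J}}{300\ \text{K}} - \frac{100\ \text{J}}{400\ \text{K}} \approx +0.083\ \text{J/K}.$$

The cold reservoir gains more entropy than the hot reservoir loses, so the total entropy increases, as the second law requires.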

According to the first law of thermodynamics, which deals with the conservation of energy, the loss of heat δq will result in a decrease in the internal energy of the thermodynamic system. Thermodynamic entropy provides a comparative measure of the decrease in the system's internal energy and the corresponding increase in the internal energy of the surroundings at a given temperature. A simple and more concrete visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of spontaneous process: how much energy has flowed, or how widely it has become spread out, at a specific temperature.
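
To make the integration over a temperature path concrete, the following sketch (in Python, with illustrative values that are assumptions, not figures from this article) computes the entropy change of 1 kg of water heated reversibly from 293 K to 353 K. Since δq = mc dT, integrating δq/T gives ΔS = mc ln(T2/T1):

    import math

    # Illustrative values (assumed for this sketch, not from the article):
    m = 1.0                 # mass of water, kg
    c = 4186.0              # specific heat capacity of water, J/(kg*K)
    T1, T2 = 293.0, 353.0   # initial and final temperatures, K

    # Integrating dS = (m * c / T) dT from T1 to T2 gives m*c*ln(T2/T1).
    delta_S = m * c * math.log(T2 / T1)
    print(f"Entropy change: {delta_S:.1f} J/K")   # about 780.0 J/K

Because entropy is a state function, this result does not depend on how the heating between the two temperatures is actually carried out.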

Absolute Entropy
This is based on the natural logarithm of the number of microstates possible in a particular macrostate, written W or Ω and called the thermodynamic probability. Roughly speaking, it gives the relative probability of the system being in that macrostate. In this sense it defines entropy independently of its effects, that is, of changes which may involve heat and mechanical, electrical or chemical energies, and it also encompasses logical states such as information.

The second calculation defines entropy in absolute terms and comes from statistical mechanics. The entropy of a particular macrostate is defined to be Boltzmann's constant times the natural logarithm of the number of microstates corresponding to that macrostate, or, mathematically,


 * $$S = k_{\mathrm{B}} \ln \Omega,$$

where S is the entropy, kB is Boltzmann's constant, and Ω is the number of microstates.

The macrostate of a system is what we know about the system, for example, the temperature, pressure, and volume of a gas in a box. For each set of values of temperature, pressure, and volume, there are many arrangements of molecules which result in those values. The number of such arrangements is the number of microstates.
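
A small sketch can make the counting explicit. Assume, purely for illustration, a toy "gas" of N = 20 two-state particles, where the macrostate is just the number n of particles in the excited state; the number of microstates is then the binomial coefficient C(N, n), and Boltzmann's formula gives the entropy of each macrostate:

    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K

    # Toy model (assumed for illustration): N two-state particles;
    # the macrostate is the number n of particles in the excited state.
    N = 20
    for n in (0, 1, 10):
        omega = math.comb(N, n)      # microstates consistent with this macrostate
        S = k_B * math.log(omega)    # S = k_B ln(omega)
        print(f"n = {n:2d}: omega = {omega:6d}, S = {S:.3e} J/K")

The evenly mixed macrostate (n = 10) has far more microstates than the others and therefore the highest entropy, which is why an unconstrained system is overwhelmingly likely to be found in it.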

The concept of entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. Information entropy takes the mathematical concepts of statistical thermodynamics into areas of probability theory unconnected with heat and energy.
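
For example, the Shannon entropy of a probability distribution has the same logarithmic form, with Boltzmann's constant replaced by a choice of logarithm base (base 2 measures the result in bits). A minimal sketch, using assumed example distributions:

    import math

    def shannon_entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.469 bits

When all W outcomes are equally likely, this reduces to H = log2 W, mirroring the Boltzmann formula S = kB ln Ω.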