User:David Shear/My Sandbox

(1/30/06, 10:03 pm)

The Helmholtz free energy is a modified internal energy whose change automatically includes the minimum possible entropy change in the surroundings due to the heat exchange required to maintain a system at constant temperature. It is applicable not only to homogeneous "bulk" systems, but even to the most highly structured physical bodies. The only requirement, other than constant T, is that a system be closed to matter exchange. Any undetermined constant in the energy, and therefore in the free energy, drops out in taking differences and in differentiating.

Free energy functions are Legendre transforms of the internal energy, thermodynamic state functions of a system alone. These are mathematical "tricks" designed to be useful under certain conditions, and free energies are not fundamental new physical quantities. The Helmholtz function is defined as the internal energy U minus the absolute temperature T times the entropy: A = U − TS, and represents the isothermal "useful energy". The Gibbs free energy also adds the pressure times the volume: G = A + PV, where P is the external pressure and V is the volume of a system. This strategy allows one to ignore PdV work whenever this is deemed not to be "useful".

The experimental usefulness of these functions is restricted to conditions where certain variables — T, and perhaps P — are held constant, although they are mathematically useful in deriving Maxwell's relations. Work other than PdV must be carried along as the situation requires; e.g., for electrochemical cells, or f·dx work in elastic materials and in muscle contraction. Other important forms of work include stress-strain, magnetic (as in adiabatic demagnetization used in the approach to absolute zero), and work due to electric polarization. These are described by tensors.

State functions can also be evaluated for electromagnetic radiation (Guggenheim, Chapter 12; Planck), although this is often overlooked in treating the thermodynamics of molecular systems. Light, or heat radiation, is the photonic partner to matter, and at thermodynamic equilibrium is described by Planck's blackbody law.

Both A and F have been used to denote the Helmholtz free energy, while G and F have been used for the Gibbs function. IUPAC endorsed F for the Helmholtz function in 1947, followed by the International Union of Pure and Applied Physics in 1948. As of 1997 IUPAC has dropped F altogether and uses A for the Helmholtz function.

Heat, work, and entropy
Consider the first law of thermodynamics for a system closed to matter exchange with the outside, using the sign convention that Q and W are the heat added to and the work done on a system, respectively


 * $$ dU = Q + W \,.$$

Any non-insulated system can divest itself of entropy by shedding heat, or acquire more entropy by absorbing heat, as expressed by


 * $$ Q = T (dS)_{heat} \,$$

In the general case, the entropy change inside a system will be larger than this because of irreversible processes. These may include chemical reactions, diffusion, corrosion and flaking of metals, electrical currents, viscous flow, and any number of phenomena inside either a homogeneous, isotropic system, or a highly-structured one like an automobile battery. Irreversible processes are the very crux of the second law of thermodynamics, which states that all spontaneous processes create entropy. This may be expressed as


 * $$ (dS)_{internal} \ge 0 \,$$

in which the forward direction of time is implicit. This quantity asymptotically approaches zero for "reversible" or "quasi-static" processes; and, of course, is zero at equilibrium. But this is not yet the differential of the state function S, which is the sum of the two different ways in which entropy can change


 * $$ dS= (dS)_{heat} +(dS)_{internal} \,.$$

It is possible for a system to shed heat, and therefore entropy, faster than it may be produced by any exothermic processes, including viscosity and friction, and most chemical reactions, so S will not necessarily increase. In any case, it is clear that the entire entropy change of a system is at least Q/T


 * $$ Q \le TdS \,$$

so that


 * $$ dU \le TdS + W \,.$$

The excess on the right is T times the entropy produced. It was Clausius who enunciated the second law (1850) and originated the concept of entropy (1865), and he called this extra amount the "uncompensated heat" (Prigogine & Defay, p. 34). Since heat can be exchanged at a finite rate only down a temperature gradient, the heat flow between system and surroundings required to maintain temperature constancy itself creates even more entropy.
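Clausius's bookkeeping can be illustrated with a short numerical sketch (the temperatures and the amount of heat below are illustrative assumptions, not taken from the text): heat flowing from warmer surroundings into a cooler system increases the system's entropy by more than the surroundings lose, and the difference is the entropy created by the gradient itself.

```python
# Entropy bookkeeping for heat flowing down a finite temperature gradient.
# The temperatures and amount of heat are illustrative assumptions.

Q = 100.0       # J, heat passed from the surroundings to the system
T_hot = 310.0   # K, temperature of the surroundings
T_cold = 300.0  # K, temperature of the system

dS_system = Q / T_cold                    # entropy gained by the system
dS_surroundings = -Q / T_hot              # entropy lost by the surroundings
dS_created = dS_system + dS_surroundings  # entropy produced by the gradient itself

print(f"system gains      {dS_system:.5f} J/K")
print(f"surroundings lose {Q / T_hot:.5f} J/K")
print(f"entropy created   {dS_created:.5f} J/K")  # > 0 whenever T_hot > T_cold
```

As T_hot approaches T_cold the created entropy vanishes, which is the quasi-static, reversible limit.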

Processes, including chemical reactions
Let ξ be a progress variable representing the degree of advancement or extent of reaction of a process, identified as a (real or conceptual) product, one molecule of which is produced each time the reaction occurs (Prigogine & Defay, p. 18; Prigogine, pp. 4–7; Guggenheim, pp. 37 & 62). Integer numbers of molecules of certain kinds are always converted into integer numbers of molecules of other kinds


 * $$ dN_i = \nu_i\, d\xi \,.$$

By convention the stoichiometric coefficients νi are taken as negative for "reactants" (which are consumed) and positive for "products", although the choice of the "forward" direction is arbitrary.
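As a concrete sketch of this bookkeeping (the reaction 2 H2 + O2 → 2 H2O and the starting amounts are illustrative assumptions, not taken from the text), a single extent variable updates every species count at once:

```python
# Bookkeeping dN_i = nu_i * dxi for a single reaction, illustrated with the
# assumed reaction 2 H2 + O2 -> 2 H2O (not a reaction from the text).

nu = {"H2": -2, "O2": -1, "H2O": +2}       # negative for reactants, positive for products
N0 = {"H2": 10.0, "O2": 10.0, "H2O": 0.0}  # assumed initial amounts (molecules or moles)

def advance(N, nu, dxi):
    """Advance the extent of reaction by dxi, updating every species at once."""
    return {i: N[i] + nu[i] * dxi for i in N}

N_after = advance(N0, nu, 3.0)  # three units of reaction
print(N_after)                  # {'H2': 4.0, 'O2': 7.0, 'H2O': 6.0}

# Atoms are conserved automatically: H appears in H2 and H2O, O in O2 and H2O.
H_atoms = 2 * N_after["H2"] + 2 * N_after["H2O"]
O_atoms = 2 * N_after["O2"] + 1 * N_after["H2O"]
print(H_atoms, O_atoms)         # 20.0 20.0, the same as before the reaction
```

Because every dN_i is tied to the one variable ξ, the amounts cannot be varied independently, which is the point of the extent-of-reaction notation.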


The entropy produced as the process advances may then be written

 * $$ T(dS)_{internal} = \mathbb{A}\, d\xi \,$$

where we introduce the symbol A and (odd) name "Affinity" for the entropy production per unit process increment (De Donder; Prigogine & Defay, p. 69; Guggenheim, pp. 37 & 240). The progress variable and Affinity can be expressed either per molecule or per mole of reaction by shifting Avogadro's number from ξ to A. (T appears because the Affinity is usually defined in energy units.) If there are a number of processes such as chemical reactions going on simultaneously, as is often the case


 * $$ T(dS)_{internal} = \sum_k\mathbb{A}_k\, d\xi_k \,.$$

The entire change in S is then given by


 * $$ TdS= Q + \sum_k\mathbb{A}_k\, d\xi_k \,$$

in which


 * $$ \mathbb{A}_k = T (\partial S/\partial \xi_k)_{\xi_{j \ne k}, Q, rev} \,.$$

This can refer to the entire set of physically independent processes, or if desired, a subset consisting of linearly independent processes. The partial derivative should be interpreted operationally: the subscript Q means that there is no accompanying heat exchange, and rev that any work exchanged is reversible. Formally speaking, the subscripts should include U and all the extensive variables whose displacements correspond to the exchange of work between system and surroundings. The fact that this is difficult to picture is one reason the argument is often framed in terms of constant T and P. But the sense of the matter is better conveyed as written above.

In addition, for the preceding expressions to be correct, there can be no constraints which couple any internal process to an external displacement by which work is exchanged with the surroundings. If that occurs, the independent variable is for the work, which to the extent it is reversible contributes nothing to entropy production. This happens, for example, in an electrochemical cell, in which passage of charge in an external circuit directly controls a redox reaction, both in drawing work and in recharging. (There can, of course, be some "leakage" due to less-than-perfect coupling.)

We can now write


 * $$ dU = TdS - \sum_k\mathbb{A}_k\, d\xi_k + W \,$$

in which we have subtracted out the entropy production part of dS, leaving only the contribution of Q to dU.

If the processes are chemical reactions, we may introduce stoichiometric coefficients which specify the changes in the numbers of molecules (or moles) of the various chemical species, given the variations in the extents of reaction


 * $$ dN_i = \sum_k \nu_{ik} d\xi_k \,$$

which allows us to write dU in terms of the chemical potentials


 * $$ dU = TdS + \sum_i \mu_i dN_i + W \,$$

on the understanding that the Ni cannot be varied arbitrarily. We now have an expression for the Affinity

 * $$ \mathbb{A}_k = -\sum_i \nu_{ik}\, \mu_i \,.$$

Legendre transforms
The strategy behind the use of Legendre transforms in thermodynamics is to shift the dependence of a state function from an extensive variable to its intensive partner by subtracting (or in the case of PV, adding) their product. Under conditions where that intensive variable is held constant, these two variables drop out of the equations. Work or heat transfer still exists, but the targeted change in the energy of the surroundings has now been tucked into the differential of a new state function for the system alone.

The differential of the Legendre transform will not necessarily cover the full amount of the quantity being ignored, but rather the smallest change consistent with keeping an intensive variable constant. For example, a system's temperature constancy could (ideally) be maintained by quasi-isothermal heat exchange with a large heat bath. If the system and the heat bath were at (virtually) the same temperature, all of the entropy lost by one due to heat flow would be gained by the other. However, real heat flow down a thermal gradient always generates more entropy, and most isothermal systems outside of the laboratory operate at a higher temperature than their molecular environment — i.e., not counting ambient light, which is very hot.

It is somewhat of an historical accident that the energy representation gained ascendancy over the entropy representation in the formulation of thermodynamics. In cases where there is no "useful" (i.e., interesting) work being done, the latter would be much more instructive; although the former is, of course, germane to engineering. Legendre transforms of the entropy were invented by Massieu in 1869 and actually predated the transforms of the energy introduced by Gibbs in 1875 ... The Massieu functions [are] particularly useful in the theory of irreversible thermodynamics, and they also arise naturally in statistical mechanics and in the theory of thermal fluctuations. (Callen, p. 101)

Helmholtz free energy
The Helmholtz function is defined as


 * $$ A = U - TS \, .$$

Taking the differential


 * $$ dA = dU - TdS - SdT \,. $$

Since


 * $$ dU - TdS \le W \,$$

it follows that


 * $$ dA \le W - SdT \, $$

so that at constant temperature


 * $$ (dA)_T \le W \,. $$

Thus, A can increase at most by the amount of work done on a system. Turning this about


 * $$ -W \le -(dA)_T \,. $$

The decrease in the Helmholtz free energy is the upper limit to the amount of work of all kinds that can be done by a system under isothermal conditions. Hence the alternate name "work function" (A from the German, Arbeit, for work). This is a completely general result; no mention has yet been made of any variables which may be involved in the work exchanged by a system and its surroundings. The Helmholtz function is therefore of interest in engineering. It also has a special theoretical importance in physics since it is proportional to the logarithm of the partition function for the canonical ensemble in statistical mechanics.
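The connection to the canonical ensemble can be sketched numerically using the standard statistical-mechanical relation A = −kT ln Z, here applied to a hypothetical two-level system (the energy gap and temperature are illustrative assumptions, not from the text). The entropy recovered from the definition A = U − TS agrees with the thermodynamic derivative S = −(∂A/∂T)V:

```python
import math

# Helmholtz free energy from the canonical partition function, A = -kT ln Z,
# for a hypothetical two-level system (levels 0 and eps). All numbers are
# illustrative assumptions.

k = 1.380649e-23    # J/K, Boltzmann constant
eps = 1.0e-21       # J, energy gap of the two-level system (assumed)

def helmholtz(T):
    """A(T) = -kT ln Z with Z = 1 + exp(-eps/kT)."""
    Z = 1.0 + math.exp(-eps / (k * T))
    return -k * T * math.log(Z)

def internal_energy(T):
    """U(T) = <E>, the canonical ensemble average energy."""
    w = math.exp(-eps / (k * T))    # Boltzmann weight of the upper level
    return eps * w / (1.0 + w)

T = 300.0                                   # K
A = helmholtz(T)
U = internal_energy(T)
S_from_definition = (U - A) / T             # rearranging A = U - TS
dT = 1e-2                                   # K, step for a numerical derivative
S_from_derivative = -(helmholtz(T + dT) - helmholtz(T - dT)) / (2 * dT)  # S = -(dA/dT)_V

print(S_from_definition, S_from_derivative)  # the two values agree
```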

The "natural variables" of A are (T, V, { Ni }, and possibly others), in which the { Ni } are the numbers of molecules (alternatively, moles) of the various kinds in the system.

Gibbs free energy
The preceding expressions are true not only for simple bulk systems of one or many components, but also for the most highly structured and intricate machines. They make no assumptions about the nature of a system except that it be closed to matter exchange with the surroundings; and, as indicated, all changes in the system are isothermal. As noted, the inequalities above are due to the production of entropy.

For compositional changes inside a system at constant T and P, the differential of the Gibbs free energy is especially useful because, in addition to subsuming any entropy transferred to (or from) the surroundings, it does the same for any ± PdV work needed to "make space for additional molecules". Hence its utility to solution-phase chemists, including biochemists, and in the study of phase transitions. It is useful to those who want to consider only non-PdV work. Because of the preponderance of solvent, most solution phase reaction systems experience only a trivial volume change at constant T.

The Gibbs free energy is defined as


 * $$ G = U - TS + PV \,$$

The "natural variables" of G are (T, P, { Ni }, and possibly others). For non-bulk, structured systems, such as an electrical battery or animal muscle, there can be a number of extensive variables whose changes represent ways the system and surroundings can exchange work. However, it is common to distinguish PdV from all the other kinds of work, which we denote by W'


 * $$ W = -PdV + W' \,.$$

The minus sign is present because a volume decrease (dV < 0) corresponds to compressional work done on a system. Thus


 * $$ dG = dU - TdS - SdT + PdV + VdP \,.$$

Making the substitution


 * $$ dU - TdS \le -PdV + W' \,$$

it follows that


 * $$ dG \le - SdT + VdP + W' \,$$

so that at constant T and P


 * $$ (dG)_{T,P} \le W' \,.$$

Turning this about


 * $$ -W' \le -(dG)_{T,P}\,.$$

The isobaric, isothermal change in G allows us to ignore (quasi-reversible) pressure-volume work as well as heat exchange with the surroundings. There are no other restrictions; in particular, the system does not have to be "bulk": homogeneous and isotropic. In order for the Legendre transform "trick" to work, P must be the pressure of the surroundings at its interface with a system, the place where PdV work is exchanged. P does not have to be uniform throughout a system — which makes G a "funny" kind of state function since it depends on an intensive variable found almost nowhere within a system; and even in the surroundings, only at one particular altitude.

The hydrostatic pressure in a column of liquid varies with depth (P = −ρgz), but only the pressure at the variable-height meniscus in contact with the atmosphere counts: this is what moves if the volume of the liquid changes. There is a small ΔP across a meniscus due to surface tension, but the meniscus is part of the system. The pressure inside a contracting muscle can get very high, but this doesn't matter as the pressure of the surroundings remains approximately constant. A contracting muscle shows a volume change of perhaps one part in 105, and we could equally well use the Helmholtz function.
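The claim that the muscle's volume change is thermodynamically negligible is easy to check with rough arithmetic (every number below is an order-of-magnitude assumption, not a measurement):

```python
# Order-of-magnitude check that PdV work is negligible for a contracting muscle.
# Every number below is a rough illustrative assumption.

P = 101325.0        # Pa, pressure of the surroundings (1 atm)
V = 1.0e-4          # m^3, muscle volume of roughly 100 cm^3 (assumed)
dV = V * 1.0e-5     # volume change of about one part in 10^5, as in the text

PdV = P * dV        # pressure-volume work exchanged with the surroundings
W_useful = 1.0      # J, an assumed modest amount of contractile work

print(f"PdV work ~ {PdV:.1e} J")            # about 1e-4 J
print(f"fraction ~ {PdV / W_useful:.0e}")   # roughly 1e-4 of the useful work
```

On these assumptions the PdV term is some four orders of magnitude smaller than the useful work, so G and A differ trivially here.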

All forms of work other than PdV can still be pulled out of dG and exhibited; e.g., external electrical work done by an electrochemical cell. If no non-PdV work is done, then (dG)T,P is merely a stand-in for (−T)(dS)internal, the entropy produced by spontaneous internal processes such as chemical reactions, crumbling of materials, intermixing of different substances, etc.

Chemical reactions
In most cases of interest there are internal degrees of freedom and processes, such as chemical reactions and phase transitions, which always create entropy, unless they are at equilibrium, or are maintained at a “running equilibrium” through “quasi-static” changes by being coupled to constraining devices, such as pistons or electrodes, to deliver and receive external work. Even for homogeneous "bulk" materials, the free energy functions depend on the composition, as do all the extensive thermodynamic potentials, including the internal energy. If the quantities { Ni } are omitted from the formulae, it is impossible to describe compositional changes.

For an unstructured, homogeneous "bulk" system, the compositional variables { Ni } are the last remaining extensive variables on which G depends. They specify the composition: the amounts of each chemical substance, expressed as the numbers of molecules present or (dividing by Avogadro's number) the numbers of moles


 * $$ G = G(T,P,\{N_i\}) \,.$$

For the case where only PdV work is possible


 * $$ dG = -SdT + VdP + \sum_i \mu_i dN_i \,$$

in which μi is the chemical potential for the i-th component in the system


 * $$ \mu_i = \left( \frac{\partial G}{\partial N_i}\right)_{T,P,N_{j\ne i},etc. } \,.$$

The expression for dG is especially useful at constant T and P, conditions which are easy to achieve experimentally and which approximate the conditions in living creatures


 * $$ (dG)_{T,P} = \sum_i \mu_i dN_i\,.$$

While this formulation is mathematically defensible, it is not particularly transparent since one does not simply add or remove molecules from a system. There is always a process involved in changing the composition; e.g., a chemical reaction (or many), or movement of molecules from one phase (liquid) to another (gas or solid). We should find a notation which does not seem to imply that the amounts of the components { Ni } can be changed independently. All real processes obey conservation of mass, and in addition, conservation of the numbers of atoms of each kind. Whatever the molecules are transferred to or from should be considered part of the "system".

Consequently we introduce an explicit variable to represent the degree of advancement of a process, a progress variable ξ for the extent of reaction (Prigogine & Defay, p. 18; Prigogine, pp. 4–7; Guggenheim, pp. 37 & 62), and use the partial derivative ∂G/∂ξ (in place of the widely used "ΔG", since the quantity at issue is not a finite change). The result is an understandable expression for the dependence of dG on chemical reactions (or other processes). If there is just one reaction


 * $$(dG)_{T,P} = \left( \frac{\partial G}{\partial \xi}\right)_{T,P} d\xi\,$$

If we introduce the stoichiometric coefficient for the i–th component in the reaction


 * $$\nu_i = \partial N_i / \partial \xi \,$$

which tells how many molecules of i are produced or consumed, we obtain an algebraic expression for the partial derivative


 * $$ \left( \frac{\partial G}{\partial \xi} \right)_{T,P} = \sum_i \mu_i \nu_i = -\mathbb{A}\,$$

where we introduce (De Donder; Prigogine & Defay, p. 69; Guggenheim, pp. 37 & 240) a concise (but odd) name for this quantity, the "Affinity", A. The minus sign comes from the fact the Affinity was defined to represent entropy increase rather than free energy decrease. The differential for G takes on a simple form which displays its dependence on compositional change


 * $$(dG)_{T,P} = -\mathbb{A}\, d\xi \,.$$
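A minimal numerical sketch of the Affinity, assuming an ideal-gas isomerization X ⇌ Y with an invented standard chemical-potential difference (none of these numbers come from the text): since (dG)T,P = −A dξ, the Affinity is positive when the reaction can proceed forward spontaneously, zero at equilibrium, and negative when the reaction would be driven backwards.

```python
import math

# Affinity for a hypothetical ideal-gas isomerization X <-> Y, with
# mu_i = mu0_i + RT ln(p_i / p0). The standard-potential difference is an
# invented illustrative number, not from the text.

R = 8.314       # J/(mol K), gas constant
T = 298.15      # K
dmu0 = -5000.0  # J/mol, mu0_Y - mu0_X (assumed)

def affinity(ratio):
    """Affinity A = -(dG/dxi)_{T,P} = -(dmu0 + RT ln(pY/pX)) for X -> Y."""
    return -(dmu0 + R * T * math.log(ratio))

K = math.exp(-dmu0 / (R * T))  # equilibrium ratio pY/pX, where the Affinity vanishes

print(affinity(1.0))       # positive: the reaction proceeds forward spontaneously
print(affinity(K))         # ~0: equilibrium
print(affinity(10.0 * K))  # negative: the reaction is driven backwards
```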

If there are a number of chemical reactions going on simultaneously, as is usually the case


 * $$(dG)_{T,P} = -\sum_k\mathbb{A}_k\, d\xi_k \,.$$

Here we have used a set of reaction coordinates { ξk }, avoiding the notion that the amounts of the components { Ni } can be changed independently. The expressions above are equal to zero at equilibrium, while in the general case for real systems, they are negative, due to the fact that all chemical reactions proceeding at a finite rate produce entropy. This can be made even more explicit by introducing the reaction rates dξk/dt. For each and every physically independent process (Prigogine & Defay, p. 38; Prigogine, p. 24)


 * $$ \mathbb{A}\, \dot{\xi} \ge 0 \,.$$

This is a remarkable result since the chemical potentials are intensive system variables, depending only on the local molecular milieu. They cannot "know" whether the temperature and pressure (or any other system variables) are going to be held constant over time. It is a purely local criterion and must hold regardless of any such constraints. Of course, it could have been obtained by taking partial derivatives of any of the other fundamental state functions, but nonetheless is a general criterion for T times the entropy production from that spontaneous process; or at least any part of it that is not captured as external work. (See Constraints below.)

We now relax the requirement of a homogeneous “bulk” system by letting the chemical potentials and the Affinity apply to any locality in which a chemical reaction (or any other process) is occurring. By accounting for the entropy production due to irreversible processes, the inequality for dG is now replaced by an equality


 * $$ dG = - SdT + VdP -\sum_k\mathbb{A}_k\, d\xi_k + W'\,$$

or


 * $$ (dG)_{T,P} = -\sum_k\mathbb{A}_k\, d\xi_k + W'\,$$



Any decrease in the Gibbs function of a system is the upper limit for any isothermal, isobaric work that can be captured in the surroundings. Alternatively, the free energy may simply be dissipated, appearing as T times a corresponding increase in the entropy of the system and/or its surroundings. Or it may go partly toward doing external work and partly toward creating entropy. The important point is that the extent of reaction for a chemical reaction may be coupled to the displacement of some external mechanical or electrical quantity in such a way that one can advance only if the other one also does. The coupling may occasionally be rigid, but it is often flexible and variable.

In solution chemistry and biochemistry, the Gibbs free energy decrease (∂G/∂ξ, in molar units, denoted cryptically by ΔG) is commonly used as a surrogate for (−T times) the entropy produced by spontaneous chemical reactions in situations where there is no work being done; or at least no "useful" work; i.e., other than perhaps some ± PdV. The assertion that all spontaneous reactions have a negative ΔG is merely a restatement of the second law of thermodynamics, giving it the physical dimensions of energy and somewhat obscuring its significance in terms of entropy. It tends to lend credence to the mistaken impression that there is a principle of minimum energy, while in fact there is no such law of nature. When there is no useful work being done, it would be less misleading to use the Legendre transforms of the entropy appropriate for constant T, or for constant T and P: the Massieu functions −A/T and −G/T, respectively.

Constraints
In this regard, it is crucial to understand the role of walls and other constraints, and the distinction between independent processes and coupling. Contrary to the clear implications of many reference sources, the previous analysis is not restricted to homogeneous, isotropic bulk systems which can deliver only PdV work to the outside world, but applies even to the most structured systems. There are complex systems with many chemical "reactions" going on at the same time, some of which are really only parts of the same, overall process. An independent process is one that could proceed even if all others were unaccountably stopped in their tracks. Establishing this may require a "thought experiment" in chemical kinetics, but actual examples exist.

A gas reaction which results in an increase in the number of molecules will lead to an increase in volume at constant external pressure. If it occurs inside a cylinder closed with a piston, the equilibrated reaction can proceed only by doing work against an external force on the piston. The extent variable for the reaction can increase only if the piston moves, and conversely, if the piston is pushed inward, the reaction is driven backwards.

Similarly, a redox reaction might occur in an electrochemical cell with the passage of current in wires connecting the electrodes. The half-cell reactions at the electrodes are constrained if no current is allowed to flow. The current might be dissipated as joule heating, or it might in turn run an electrical device like a motor doing mechanical work. An automobile lead-acid battery can be recharged, driving the chemical reaction backwards. In this case as well, the reaction is not an independent process. Some, perhaps most, of the Gibbs free energy of reaction may be delivered as external work.
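The electrical work available from such a cell can be sketched with the familiar relation −ΔG = nFE (the voltage used here is a rough illustrative value for one lead-acid cell, not a figure from the text):

```python
# Maximum electrical (non-PdV) work from a cell reaction: -dG = n F E.
# The voltage is a rough illustrative value for one lead-acid cell.

F = 96485.0   # C/mol, Faraday constant
n = 2         # electrons transferred per unit of cell reaction
E = 2.05      # V, approximate open-circuit cell voltage (assumed)

dG = -n * F * E   # J per mole of reaction, Gibbs free energy change
print(f"dG ~ {dG / 1000:.0f} kJ/mol")  # about -396 kJ/mol of reaction
```

Drawing current through a finite internal resistance delivers less than this bound, the shortfall appearing as entropy production, in line with (dG)T,P ≤ W'.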

The hydrolysis of ATP to ADP and phosphate can drive the force times distance work delivered by living muscles, and synthesis of ATP is in turn driven by a redox chain in mitochondria and chloroplasts, which involves the transport of ions across these cellular organelles. The coupling of processes here, and in the previous examples, is often not complete. Gas can leak slowly past a piston, just as it can slowly leak out of a rubber balloon. Some reaction may occur in a battery even if no external current is flowing. There is usually a coupling coefficient, which may depend on relative rates, which determines what percentage of the driving free energy is turned into external work, or captured as "chemical work", a misnomer for the free energy of another chemical process.