User:David Shear/Entropy of mixing

The entropy of mixing is the uncertainty about the spatial locations of the various kinds of molecules in a mixture. In a pure condensed phase, there is no spatial uncertainty: everywhere we look, we find the same kind of molecule. A single-component gas is mostly empty space, but when we do encounter a molecule, there is no doubt about what kind it is. When two or more substances are interdispersed, we may know the various proportions, but we have no way of knowing which kind of molecule is where. The notion of "finding" a molecule in a given location is a thought experiment, since we cannot actually examine spatial locations the size of molecules. Moreover, individual molecules of a given kind are all identical, so we never ask which one is where. "Interchanging" two identical objects is not a real process; it does not lead to a physically distinct condition.

Assume a mixing process has reached thermodynamic equilibrium so that the final material is homogeneous. Some liquids will mix while others are immiscible. Gases always intermix, since free molecules will always move into empty space. Solid mixtures can be prepared by cooling liquid mixtures, or in some cases by slow interdiffusion. Solutions are mixtures in which one component, the solvent, predominates. Many mixtures combine materials initially in different states of matter; e.g., liquids in which solids or gases are dissolved.

Derivations of the entropy of mixing usually begin with free energy functions and chemical potentials, resulting in elaborate and perplexing lines of reasoning done separately for gases and liquids. While energy effects show up in the enthalpy of mixing through intermolecular forces, and in molecular partition functions, it is not necessary to raise such considerations here. We are concerned only with the effect of intermingling the molecules of different chemical species, which leads to an increased uncertainty about where they are located. A single, unified explanation is possible.

Strategies
Imagine space to be subdivided into a lattice whose cells are the size of the molecules. It does not have to be a square lattice; any lattice will do, including close-packing. The molecules obey a classical exclusion principle: only one object can be in a given place at any one time. This can be traced to interatomic forces which generate the steric effects, but the use of geometrized constraints is common in theoretical physics and physical chemistry.

Boltzmann's method
We want to calculate the number of spatial patterns or configurations (microstates) that are possible given the numbers of molecules of each kind which are present. Since the fundamental (and reasonable) assumption of statistical mechanics is that each possible way of achieving a macroscopic state is equally likely, the logarithm of this number will give us the configurational entropy.

Boltzmann's equation for the entropy is


 * $$ S = k \; \log W $$

in which $$W$$ is intended to be the number of unobservable, microscopic "ways" the molecules can be assigned to different conditions or states consistent with the overall macroscopic thermodynamic condition of a system and $$k$$ is Boltzmann’s constant. (The logarithm is taken to the natural base $$e$$, and we will denote it by $$\ln$$.) While this reasoning yields the correct ideal gas equation of state, it also leads to Gibbs paradox, in which it (erroneously) appears that mixing two samples of the same kind of gas leads to an increase in entropy. We will apply the same combinatorial formula, but only to spatial arrangements, and obtain correct results while avoiding Gibbs paradox. $$W$$ will be the number of "ways" the molecules of the different kinds can be arranged in space, and its logarithm will give us the spatial uncertainty introduced by intermixing molecules of different kinds. Picture an enormous checkerboard and imagine the possible rearrangements of the black and white squares, like a puzzle whose pieces slide around. Relax the condition that the differently colored squares are equal in number, increase the number of colors (one for each different chemical component), and generalize to three dimensions. For a gas, imagine that the white squares are empty spaces, and amplify their number many orders of magnitude.
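The checkerboard counting can be made concrete in a few lines of Python. This is an illustrative sketch (the function name and board sizes are not from the original); it computes $$S = k \ln W$$ with $$W$$ the multinomial count of spatial arrangements, using log-gamma to avoid enormous factorials.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_spatial_entropy(counts):
    """S = k ln W for the spatial arrangements of molecules on a lattice.

    `counts` lists how many identical objects there are of each kind
    (including empty sites, if any); W = N! / (N_1! N_2! ...), evaluated
    through log-gamma so that large N does not overflow.
    """
    n_sites = sum(counts)
    ln_w = math.lgamma(n_sites + 1) - sum(math.lgamma(c + 1) for c in counts)
    return k_B * ln_w

# 32 black and 32 white squares on a 64-site "checkerboard":
s_mixed = boltzmann_spatial_entropy([32, 32])

# A board of a single color has exactly one arrangement, so W = 1 and S = 0:
s_pure = boltzmann_spatial_entropy([64])
```

The single-color case illustrates the central point: a pure phase has only one spatial pattern, hence zero configurational entropy.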

Shannon's method
A shorter and more logically transparent method, not requiring Stirling's approximation, is to use Shannon's definition of entropy in calculating the compositional uncertainty


 * $$ -k \sum_{i=1}^r p_i \ln \,p_i $$

We employ the same (real or conceptual) lattice, where


 * $$ p_i = N_i/N \,$$

is the probability that a molecule of $$i$$ is in any given lattice site, equal to the number of molecules of $$i$$, $$N_i$$, divided by the number of lattice sites, $$N$$. The summation is over all the chemical species present, so this is the uncertainty about which kind of molecule (if any) is in any one site. It must be multiplied by the total number of sites to get the spatial uncertainty for the whole system.
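A minimal sketch of this per-site calculation (the function name is illustrative), showing the Shannon sum over species and the multiplication by the number of sites:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def per_site_uncertainty(molecule_counts, n_sites):
    """Shannon uncertainty -k sum_i p_i ln p_i about which kind of
    molecule (if any) occupies one lattice site, with p_i = N_i / N."""
    h = 0.0
    for n_i in molecule_counts:
        p = n_i / n_sites
        if p > 0.0:  # an absent species contributes nothing to the sum
            h -= p * math.log(p)
    return k_B * h

# Equimolar two-component condensed phase: every site is occupied,
# half by each kind, so the uncertainty per site is k ln 2.
h_site = per_site_uncertainty([50, 50], 100)

# Multiply by the total number of sites for the whole system:
h_total = 100 * h_site
```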

Condensed phases
A crystal has its own intrinsic lattice to specify molecular locations. It is completely ordered and, for a pure substance, has zero spatial entropy: the same kind of molecule occupies every site. Mixed crystals can be formed with isotopic substitutions and from closely related molecules. Although localization within a crystal lattice would seem to be a foregone conclusion, there are no perfect crystals; there are always dislocations and impurities.

For less ordered condensed phases we will impose an imaginary geometrical lattice to assign locations to the molecular centers of mass. We are ignoring the free volume and other factors due to greater molecular disorder in liquids and amorphous solids as compared to crystals. Free volume is reflected in the fact that a liquid is (usually) less dense than its own solid phase, with less certainty regarding the locations of the molecular centers of mass. An artificial lattice, and the assumption that different kinds of molecules have essentially the same size, are the geometric approximations we will use.

In each of two separate condensed phases, there is little spatial uncertainty. Everywhere we look in component $$1$$, one of its molecules is present, and likewise for component $$2$$. In a mixture of the two, the material is still dense with molecules, but now there is uncertainty about what kind of molecule we will find in any given location.

Let us proceed first by using the traditional Boltzmann formula. The simplest case is a mixture with only two components, $$1$$ and $$2$$. We are after the number of possible patterns or configurations achievable with $$N_1$$ molecules of $$1$$ and $$N_2$$ molecules of $$2$$ arranged on a lattice with $$N$$ sites. Since we are considering a condensed phase, the number of sites is equal to the total number of molecules.


 * $$N = N_1 + N_2 \,$$

The number of distinct configurations $$W$$ is given by the formula for the permutations of $$N$$ things subject to the condition that $$N_1$$ of them are identical, and likewise for $$N_2$$.


 * $$ W = \frac{N!}{N_1!N_2!} \,$$

Plugging this algebraic form into Boltzmann's equation and applying Stirling's approximation for the logarithms of factorials, the entropy of mixing turns out to be


 * $$ \Delta S_m = -k [\,N_1 \ln \,(N_1/N) + N_2 \ln \,(N_2/N)\,] \,$$

which has been written using the conventional $$\Delta S$$ notation ($$\Delta$$ denotes a change), suggesting that the mixture has been formed by a mixing process from two separate pure phases, each of which originally had no spatial uncertainty. (This is not necessarily the case; for example, the second component may have formed from the first one by a chemical reaction.) This expression can be generalized to a mixture of $$r$$ components, with $$i = 1,\mbox{ } 2,\mbox{ } 3, \mbox{ }... \mbox{ }r$$


 * $$ \Delta S_m =-k \sum_{i=1}^r N_i \ln \,(N_i/N) = -k \sum_{i=1}^r N_i \ln x_i \,\! $$

We have introduced the mole fractions, which are also the probabilities of finding any particular component in a given lattice site.


 * $$ x_i = p_i =  N_i/N \, $$

For the two-component case,


 * $$ \Delta S_m = - R\,(n_1 \ln x_1 + n_2 \ln x_2) = -nR\,(x_1 \ln x_1 + x_2 \ln x_2)\, $$

where $$R$$ is the gas constant, equal to $$k$$ times Avogadro's number, $$n_1$$ and $$n_2$$ are the numbers of moles of the components, and $$n$$ is the total number of moles. Since the mole fractions are necessarily less than one, the values of the logarithms are negative. The minus sign reverses this, giving a positive entropy of mixing, as expected.
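As a numerical check of the two-component molar formula (the function name is illustrative), an equimolar mixture of one mole each gives $$2R \ln 2 \approx 11.5$$ J/K:

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def entropy_of_mixing(n1, n2):
    """dS_m = -R (n1 ln x1 + n2 ln x2) for a two-component mixture, in J/K."""
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -R * (n1 * math.log(x1) + n2 * math.log(x2))

ds = entropy_of_mixing(1.0, 1.0)  # one mole of each component
```

Since each mole fraction is less than one, each logarithm is negative and the result is positive, as the text requires.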

Shannon's formula yields the desired result directly.


 * $$ -k \, \sum_{i=1}^r p_i \ln \,p_i $$

The summation is over the various chemical species, so this is the uncertainty about which kind of molecule is in any one site. It must be multiplied by the number of sites $$N$$ to get the uncertainty for the whole system. Doing this, and using the fact that $$p_i = N_i/N$$, we obtain


 * $$ \Delta S_m = -k N\sum_{i=1}^r (N_i/N) \ln \,(N_i/N)\,\! $$

which is the same as the result obtained using Boltzmann's formula. The two methods are essentially equivalent. (But see the Discussion.)
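The claimed equivalence is easy to probe numerically: the exact Boltzmann count (via log-gamma) and the Shannon form $$N(-\sum_i p_i \ln p_i)$$ agree to better and better relative accuracy as the system grows. A sketch, with illustrative function names:

```python
import math

def ln_w_exact(counts):
    """ln W = ln( N! / (N_1! N_2! ...) ), computed with log-gamma."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

def ln_w_shannon(counts):
    """The Shannon/Stirling form: -sum_i N_i ln(N_i/N) = N * H."""
    n = sum(counts)
    return -sum(c * math.log(c / n) for c in counts if c > 0)

# Agreement improves with system size (the Shannon form slightly
# overcounts, since Stirling drops logarithmic correction terms):
for n1 in (10, 1_000, 100_000):
    exact = ln_w_exact([n1, n1])
    approx = ln_w_shannon([n1, n1])
    rel_err = (approx - exact) / exact
```

For small systems the discrepancy is noticeable; in the thermodynamic limit it vanishes, which is the point taken up again in the Discussion.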

Solutions
If the solute is a crystalline solid, the argument is much the same. A crystal has no spatial uncertainty at all, except for crystallographic defects, and a (perfect) crystal allows us to localize the molecules using the crystal symmetry group. The fact that volumes do not add when dissolving a solid in a liquid is not important for condensed phases. If the solute is not crystalline, we can still use a spatial lattice, as good an approximation for an amorphous solid as it is for a liquid.

The Flory-Huggins solution theory provides the entropy of mixing for polymer solutions, in which the macromolecules are huge compared to the solute molecules. In this case, the assumption is made that each monomer subunit in the polymer chain occupies a lattice site.

Note that solids in contact with each other also slowly interdiffuse, and solid mixtures of two or more components may be made at will (alloys, semiconductors, etc.). Again, the same equations for the entropy of mixing apply, but only for homogeneous, uniform phases.

Gases
In a gas there is a huge amount of spatial uncertainty because most of its volume is empty space. For a single-component gas, there is only one question: is a lattice site empty or does it contain the center of mass of a gas molecule? Almost everywhere we look there is nothing. It may come as a surprise to realize that we have already answered this question, since a gas may be regarded as a mixture of its molecules with another component: empty space, which plays the role of “solvent”. In a gas mixture, there is a second question which arises only for occupied sites: which kind of molecule is it?

In order to get the total entropy of a gas, we must also calculate the contingent uncertainty about the momentum of a molecule for each lattice site that is found to contain one. We obtain a Boltzmann distribution over energies and a partition function which depend on (and define) the temperature, and add this to the spatial uncertainty. But in regard to mixing, we are concerned only with spatial entropy.

For all materials of interest at room temperature, the thermal de Broglie wavelength is much less than intermolecular distances; in fact, it is less than actual molecular diameters. Under such conditions, classical mechanics is not merely adequate, but by far the superior mathematical model. Heisenberg's uncertainty principle may be ignored, and we may treat the determination of position separately from translational momentum and kinetic energy. In fact, we may ignore translational kinetic energy altogether. The separation of position-momentum phase space into disjoint position and momentum "phase spaces" is specifically justified in the classical limit. Moreover, using the classical picture, we can talk about gas expanding from a corner of an enclosure to fill the entire volume. This has no sensible meaning in quantum mechanics, at least not using the Schrödinger time-independent wave equation.
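The classical-limit criterion can be checked with standard constants, e.g. for nitrogen near room temperature and atmospheric pressure (the specific gas and conditions are illustrative, not from the original):

```python
import math

h   = 6.62607015e-34     # Planck's constant, J s
k_B = 1.380649e-23       # Boltzmann's constant, J/K
amu = 1.66053906660e-27  # atomic mass unit, kg

def thermal_wavelength(mass_kg, temp_k):
    """Thermal de Broglie wavelength: Lambda = h / sqrt(2 pi m k T)."""
    return h / math.sqrt(2.0 * math.pi * mass_kg * k_B * temp_k)

# Nitrogen (N2, about 28 amu) at 300 K:
lam = thermal_wavelength(28.0 * amu, 300.0)

# Mean intermolecular distance at roughly 1 atm and 300 K, where the
# number density is about 2.4e25 molecules per cubic metre:
vol_per_molecule = 1.0 / 2.4e25
d = vol_per_molecule ** (1.0 / 3.0)
```

Here $$\Lambda$$ comes out around two hundredths of a nanometre, a couple of orders of magnitude below the intermolecular spacing, so the classical treatment of positions is safe.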

If we have a pure gas consisting of $$N_1$$ molecules, we want to calculate the number of ways, or occupancy patterns, $$W$$ of arranging $$N_1$$ occupied sites and $$N_0$$ empty sites on a lattice with $$N$$ total sites.


 * $$N = N_1 + N_0 \,$$

and


 * $$ W = \frac{N!}{N_1! N_0!} \,$$

But we have just performed this calculation above, with $$N_0$$ now playing the role of $$N_2$$. Clearly, the spatial uncertainty in gas entropy is just the entropy of mixing of gas molecules and empty space. For a pure gas, considering just the spatial uncertainty part of the entropy,


 * $$ S = -k [\,N_1 \ln \,(N_1/N) + N_0 \ln \,(N_0/N)\,] \approx -k N_1 \ln \,(N_1/N) $$

The simplification is possible because $$N_0/N$$ is just slightly less than one and its log is negligible; most of the space in a gas is empty lattice sites. Note that $$\rho_1 = (N_1/Nv) = (N_1/V)$$ is the molecular concentration, or number density, of the gas molecules, where $$v$$ is the volume of a single lattice site and $$V$$ is the total volume of the system. The reciprocal of this quantity is the volume per molecule, $$V/N_1$$. So long as this is large with respect to $$\Lambda^3$$, the cube of the thermal de Broglie wavelength, we can be sure that the "wave packets" for the molecules hardly ever touch, and the classical mechanical treatment is the appropriate one. For all real gases at room temperature, this condition is more than satisfied.
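The size of the dropped vacancy term can be checked directly. A sketch (the dilution ratio is illustrative): since $$\ln(N_0/N) = \ln(1 - N_1/N) \approx -N_1/N$$, the vacancy term is close to $$N_1$$ in units of $$k$$, smaller than the molecule term by a factor of roughly $$\ln(N/N_1)$$.

```python
import math

def spatial_terms(n1, n_sites):
    """The two terms of -[N1 ln(N1/N) + N0 ln(N0/N)], in units of k."""
    n0 = n_sites - n1
    molecule_term = -n1 * math.log(n1 / n_sites)
    vacancy_term = -n0 * math.log(n0 / n_sites)
    return molecule_term, vacancy_term

# A dilute gas: one occupied site per million lattice sites.
mol_term, vac_term = spatial_terms(1_000, 1_000_000_000)
```

At this dilution the vacancy term is a few percent of the molecule term, and the approximation improves as the gas becomes more dilute.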

In the ideal gas approximation, which is pretty good for dilute gases at normal temperatures, volumes are additive for two samples of different gases combined at constant $$T$$ and $$P$$. In any case, let $$N_2$$ be the number of molecules of a second type of gas in a mixture. The spatial part of the entropy of the mixture is $$k$$ times the log of


 * $$ W = \frac{N!}{N_1! N_2! N_0!} \,$$

We can regard the mixing of two kinds of gas (at constant $$T$$ and $$P$$) as simply conjoining the two containers. The two lattices which allow us to conceptually localize molecular centers of mass also join. The total number of empty cells is the sum of the numbers of empty cells in the two components prior to mixing. Consequently, that part of the spatial uncertainty concerning whether any molecule is present in a lattice cell is the sum of the initial values, and does not increase upon mixing.

Almost everywhere we look, we find empty lattice sites. But for those few sites which are occupied, there is a contingent uncertainty about which kind of molecule it is. Using conditional probabilities, it turns out that the analytical problem for the small subset of occupied cells is exactly the same as for mixed liquids, and the increase in the entropy, or spatial uncertainty, has exactly the same form as obtained previously. Obviously the subset of occupied cells is not the same at different times.

See also: Gibbs Paradox, in which it would seem that mixing two samples of the same gas would produce entropy. The derivation given here avoids this "paradox", since if the molecules are all of the same kind, there is no entropy increase. Underlying this success is the fact that we are drawing a distinction between "identical" and "indistinguishable". Identical objects are distinguishable if they are in different places, even though they cannot be intrinsically labeled. Identical objects can be treated as distinguishable if their wavefunctions do not sensibly overlap.
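The absence of a paradox can be seen directly from the mixing formula: with a single species the only mole fraction is $$1$$, and $$\ln 1 = 0$$. A minimal sketch (the function name is illustrative):

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def mixing_entropy(moles):
    """dS_m = -R sum_i n_i ln x_i over the *distinct* species present."""
    n = sum(moles)
    return -R * sum(n_i * math.log(n_i / n) for n_i in moles)

# Two different gases, one mole each: positive entropy of mixing.
different = mixing_entropy([1.0, 1.0])

# Two one-mole samples of the *same* gas form a single two-mole species,
# so x = 1 and the "entropy of mixing" vanishes identically:
same = mixing_entropy([2.0])
```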

The entropy increase accompanying the free expansion of a gas into an evacuated space may be regarded as the entropy of mixing of a gas with a region of vacuum. For an ideal gas, there is no temperature change when this happens.

Discussion
The preceding analysis is only an approximation, except for dilute gases. It is not too bad for mixtures of denser gases, or for liquids or amorphous solids with molecules of about the same size. Likewise for crystalline mixtures. We have not considered intermolecular forces (energies). Mixing substances whose molecules interact with each other differently than they do in their pure phases results in a (positive or negative) heat (or enthalpy) of mixing, in addition to considerations of entropy. We have ignored any correlations in the dispositions of neighboring molecules, including angular orientations due to molecular shapes, or due to any other geometrical or energetic reason, such as clouds of counter-ions surrounding charged colloidal particles. The fundamental assumption is that all occupancy patterns, or spatial "microstates", are counted as equally likely. But biasing effects of near neighbor interactions could perhaps be incorporated into the theory.

It is desirable to maintain the form of the equations derived above, even if correction factors (activity coefficients) are required. Whenever possible, deviations from ideality in both entropy and energy are managed by multiplying mole fractions (or concentrations) by experimentally or theoretically determined activity coefficients. Mixing substances with gross dissymmetries in size requires a better mathematical model. For long-chain polymers, see the Flory-Huggins solution theory.

There is a tacit mathematical assumption involved in using the Shannon entropy which might have escaped notice, which makes it differ in an interesting way from the Boltzmann formula. If we "find" a molecule of type $$2$$ in the first location we examine, there are only $$N_2 - 1$$ molecules of $$2$$ left to be found in the $$N - 1$$ remaining lattice sites. That is, one site and one molecule of $$2$$ have each been "used up" and we should proceed only after taking that into account. This is possible but algebraically messy. However, it is not a problem in the thermodynamic limit of large systems, where we can regard our system as a smaller subsystem defined by geometrical "walls" through which molecules can pass. In this case, $$N_2 - 1$$ is a time-average value and not a rigid constraint. This is the idea behind Gibbs' grand canonical ensemble. But for systems of finite size, it is the original Boltzmann formulation for entropy in terms of factorials which is really correct, since it uses the actual particle numbers, making the presentation of the "long way" instructive. The use of Stirling's approximation also eliminates any mathematical distinction between the two ensembles, producing the same final results and making epistemological arguments about their inner meanings moot. For systems with a small number of particles, if it is possible to use the Boltzmann formula without Stirling's approximation, that would give the more accurate result. However, the idea that the canonical ensemble represents an external heat bath which maintains constant $$T$$ by maintaining an average internal energy still stands.