Electronic entropy

Electronic entropy is the entropy of a system attributable to electrons' probabilistic occupation of states. This entropy can take a number of forms. The first form can be termed a density-of-states-based entropy. The Fermi–Dirac distribution implies that each eigenstate of a system, $i$, is occupied with a certain probability, $p_{i}$. As the entropy is given by a sum over the probabilities of occupation of those states, there is an entropy associated with the occupation of the various electronic states. In most molecular systems, the energy spacing between the highest occupied molecular orbital and the lowest unoccupied molecular orbital is large, and thus the probabilities associated with the occupation of the excited states are small. Therefore, the electronic entropy in molecular systems can safely be neglected. Electronic entropy is thus most relevant for the thermodynamics of condensed phases, where the density of states at the Fermi level can be quite large, and the electronic entropy can therefore contribute substantially to thermodynamic behavior.

A second form of electronic entropy can be attributed to the configurational entropy associated with localized electrons and holes. This entropy is similar in form to the configurational entropy associated with the mixing of atoms on a lattice.

Electronic entropy can substantially modify phase behavior, as in lithium-ion battery electrodes, high-temperature superconductors, and some perovskites. It is also the driving force for the coupling of heat and charge transport in thermoelectric materials, via the Onsager reciprocal relations.

General formulation
The entropy due to a set of states that can be either occupied with probability $$p_i$$ or empty with probability $$1-p_i$$ can be written as:
 * $$S=-k_{\rm B}\sum_i \left [ p_i \ln p_i + (1-p_i) \ln( 1 - p_i ) \right ] $$,

where $k_{\rm B}$ is the Boltzmann constant.

For a continuously distributed set of states as a function of energy, such as the eigenstates in an electronic band structure, the above sum can be replaced by an integral over the possible energy values. The entropy can then be written as:


 * $$S=-k_{\rm B} \int n(E) \left [ p(E) \ln p(E) +(1- p(E)) \ln \left ( 1- p(E)\right ) \right ]dE $$

where $n(E)$ is the density of states of the solid. The probability of occupation of each eigenstate is given by the Fermi function, $f$:


 * $$p(E)=f=\frac{1}{e^{(E-E_{\rm F}) / k_{\rm B} T} + 1}$$

where $E_{F}$ is the Fermi energy and $T$ is the absolute temperature. One can then re-write the entropy as:
 * $$S=-k_{\rm B} \int n(E) \left [ f \ln f +(1- f) \ln \left ( 1- f \right ) \right ]dE $$

This is the general formulation of the density-of-states based electronic entropy.
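
As a concrete illustration, the integral above can be evaluated numerically. The sketch below (a minimal Python example, assuming a constant density of states of one state per eV near the Fermi level) computes the entropy by trapezoidal integration; the function names and the chosen DOS are illustrative, not from the source.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def fermi(E, E_F, T):
    """Fermi-Dirac occupation probability of a state at energy E (eV)."""
    return 1.0 / (math.exp((E - E_F) / (K_B * T)) + 1.0)

def entropy_per_state(f):
    """-k_B [f ln f + (1-f) ln(1-f)]: entropy contribution of one state."""
    if f <= 0.0 or f >= 1.0:
        return 0.0  # fully occupied or fully empty states carry no entropy
    return -K_B * (f * math.log(f) + (1.0 - f) * math.log(1.0 - f))

def electronic_entropy(dos, E_F, T, E_min, E_max, n_steps=20000):
    """Trapezoidal evaluation of S = -k_B ∫ n(E)[f ln f + (1-f) ln(1-f)] dE."""
    dE = (E_max - E_min) / n_steps
    total = 0.0
    for i in range(n_steps + 1):
        E = E_min + i * dE
        weight = 0.5 if i in (0, n_steps) else 1.0
        total += weight * dos(E) * entropy_per_state(fermi(E, E_F, T))
    return total * dE  # in eV/K, per whatever unit the DOS is normalized to

# Assumed constant DOS of 1 state/eV around E_F = 0; integrate well beyond ±k_B T.
S = electronic_entropy(lambda E: 1.0, E_F=0.0, T=300.0, E_min=-1.0, E_max=1.0)
```

For a constant density of states, this numerical result reproduces the closed-form low-temperature expression $S = \frac{\pi^2}{3} k_{\rm B}^2 T\, n(E_{\rm F})$ discussed in the next section, since only states within a few $k_{\rm B}T$ of the Fermi level contribute.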

Useful approximation
It is useful to recognize that only states within approximately $\pm k_{\rm B}T$ of the Fermi level contribute significantly to the entropy. Other states are either nearly fully occupied, $f \approx 1$, or nearly completely unoccupied, $f \approx 0$, and in either case contribute negligibly to the entropy. If one assumes that the density of states is constant within $\pm k_{\rm B}T$ of the Fermi level, one can derive the electronic heat capacity:


 * $$C_V=T\left(\frac{\partial S}{\partial T}\right)_{V}=\frac{\pi^2}{3} k_{\rm B}^2 T\, n(E_{\rm F})$$

where $n(E_{\rm F})$ is the density of states (number of levels per unit energy) at the Fermi level. Several other approximations can be made, but they all indicate that the electronic entropy should, to first order, be proportional to the temperature and the density of states at the Fermi level. As the density of states at the Fermi level varies widely between systems, this approximation is a reasonable heuristic for inferring when it may be necessary to include electronic entropy in the thermodynamic description of a system; only systems with large densities of states at the Fermi level should exhibit non-negligible electronic entropy (where large may be approximately defined as $n(E_{\rm F}) \geq (k_{\rm B}T)^{-1}$).
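
The approximation and the heuristic above can be packaged as a short numerical sketch. This is a minimal Python illustration; the function names are assumptions, and any specific DOS value plugged in would be illustrative rather than measured data.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def sommerfeld_heat_capacity(n_EF, T):
    """Electronic heat capacity C_V = (pi^2/3) k_B^2 T n(E_F).

    n_EF: density of states at the Fermi level, in states per eV.
    Returns C_V in eV/K, on the same per-atom/per-formula-unit basis as n_EF."""
    return (math.pi ** 2 / 3) * K_B ** 2 * T * n_EF

def entropy_nonnegligible(n_EF, T):
    """Heuristic from the text: electronic entropy matters when n(E_F) >= (k_B T)^-1."""
    return n_EF >= 1.0 / (K_B * T)

# The heuristic threshold at room temperature, in states per eV.
threshold_300K = 1.0 / (K_B * 300.0)
```

Note that at 300 K the threshold is roughly 39 states per eV, which shows why the heuristic singles out systems with unusually flat bands at the Fermi level.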

Application to different materials classes
Insulators have zero density of states at the Fermi level due to their band gaps. Thus, the density of states-based electronic entropy is essentially zero in these systems.

Metals have non-zero density of states at the Fermi level. Metals with free-electron-like band structures (e.g. alkali metals, alkaline earth metals, Cu, and Al) generally exhibit relatively low density of states at the Fermi level, and therefore exhibit fairly low electronic entropies. Transition metals, wherein the flat d-bands lie close to the Fermi level, generally exhibit much larger electronic entropies than free-electron-like metals.

Oxides have particularly flat band structures and thus can exhibit large $n(E_{F})$, if the Fermi level intersects these bands. As most oxides are insulators, this is generally not the case. However, when oxides are metallic (i.e. the Fermi level lies within an unfilled, flat set of bands), oxides exhibit some of the largest electronic entropies of any material.

Thermoelectric materials are specifically engineered to have large electronic entropies. The thermoelectric effect relies on charge carriers exhibiting large entropies: the gradient in electrical potential that develops under a temperature gradient is driven by the entropy associated with the charge carriers. In the thermoelectric literature, the term band structure engineering refers to the manipulation of material structure and chemistry to achieve a high density of states near the Fermi level. More specifically, thermoelectric materials are intentionally doped to exhibit only partially filled bands at the Fermi level, resulting in high electronic entropies. Instead of engineering band filling, one may also engineer the shape of the band structure itself via the introduction of nanostructures or quantum wells.

Configurational electronic entropy
Configurational electronic entropy is usually observed in mixed-valence transition metal oxides, as the charges in these systems are both localized (the system is ionic), and capable of changing (due to the mixed valency). To a first approximation (i.e. assuming that the charges are distributed randomly), the molar configurational electronic entropy is given by:
 * $$S \approx -n_\text{sites} R \left [ x \ln x + (1-x) \ln (1-x) \right ] $$

where $R$ is the gas constant, $n_\text{sites}$ is the fraction of sites on which a localized electron/hole could reside (typically a transition metal site), and $x$ is the concentration of localized electrons/holes. Of course, the localized charges are not distributed randomly, as the charges will interact electrostatically with one another, and so the above formula should only be regarded as an approximation to the configurational electronic entropy. More sophisticated approximations have been made in the literature.
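
Under the random-distribution assumption, the ideal-mixing expression above is straightforward to evaluate. The sketch below is a minimal Python illustration in molar units (using the gas constant); the function name and the half-occupied example are assumptions for illustration.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

def config_electronic_entropy(x, n_sites=1.0):
    """Ideal-mixing configurational entropy of localized electrons/holes.

    x: fraction of the available sites occupied by a localized carrier.
    n_sites: moles of such sites per mole of formula units.
    Returns molar entropy in J/(mol*K); assumes randomly distributed charges."""
    if x <= 0.0 or x >= 1.0:
        return 0.0  # fully ordered limits carry no mixing entropy
    return -n_sites * R * (x * math.log(x) + (1.0 - x) * math.log(1.0 - x))

# The entropy is maximal at half occupation, e.g. a 50/50 mixed-valence case:
S_max = config_electronic_entropy(0.5)  # R ln 2 per mole of sites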