User:XaosBits/Sandbox

Statistical mechanics is a branch of physics that establishes the observable properties of matter from the behavior of its atomic constituents. Statistical methods are used to characterize the mechanical properties of the large number of atomic constituents of matter. With the methods of statistical mechanics one can derive the thermodynamic properties of gases, the heat capacity of solids, or whether, at a given temperature, iron will be magnetic.

Overview
The mechanical properties of atoms are responsible for the thermodynamic properties of matter: how fast atoms move is related to the temperature; how hard molecules of a gas collide with its container is related to the pressure the gas exerts. The pressure and the temperature emerge as the average effect of a large number of particles (10<sup>23</sup>) and their interactions.

In principle, it should be possible to compute the pressure or other thermodynamic quantities from the equations of motion describing the particles (by using Newton's or Schrödinger's equations). The equations of motion would need to be solved, initial conditions determined, and the effects of the trajectory of every particle combined to obtain the thermodynamic quantity. As the particles move, the pressure or other thermodynamic quantities should be expected to rise and fall, but if the effect of a large enough number of particles is averaged, these fluctuations should become very small. Statistical mechanics simplifies the calculation of the average by re-arranging its order: it assumes from the start that each arrangement of positions and velocities of the particles contributes to the final average with a definite weight, and it provides a method for computing the average without solving the equations of motion.

Computing a thermodynamic quantity using methods from statistical mechanics is simpler than a direct computation using the equations of motion. Nevertheless, it does not eliminate all difficulties. The mechanism for the explanation of transitions between states of matter remained a challenge to the discipline for many years, and as of the early 21st century, it is still a challenge to compute the properties of liquids from molecular interactions.

Thermodynamic quantities are quasi-static: at any point in time the system could remain in that thermodynamic state for as long as needed. These states correspond to configurations in the equilibrium state of the system. There are properties of matter, such as the conductivity of metals or the viscosity of a fluid, that are related to nonequilibrium states. Nonequilibrium statistical mechanics, still in a state of development, aims at computing those properties.

The methods of statistical mechanics provide the foundations for many different disciplines. Different techniques for solving the averaging problems formulated from statistical mechanics are used in condensed matter, astrophysics, biophysics, and physical chemistry among many disciplines.

Basics
In statistical mechanics, matter is seen as a mechanical system described by a Hamiltonian, with the mechanical system composed of many identical sub-systems. The mechanical system composed of a liter of gas isolated in a container would have a phase space with roughly 10<sup>23</sup> dimensions, and the identical sub-systems would be the gas molecules. When the pressure of the gas is measured, a number is being associated to the state of the mechanical system. As a point in phase space describes a mechanical system, there is a function that assigns to each point in phase space a pressure.

Pressure is an example of an observable: a bounded function that associates a number to each point of phase space. Pressure was understood by Torricelli as a type of force, one capable of holding up a column of mercury. Pressure was later understood as a force related to energy changes with volume in a gas. This is a thermodynamic (and macroscopic) characterization of pressure. Bernoulli, however, gave a microscopic description of pressure as the average rate of momentum transfer to an area element of the container. When the molecules of the gas collide with the container, they have their momentum direction changed, transferring momentum to the container wall, therefore exerting a force over the area of the wall. The two descriptions of pressure&mdash;the microscopic and the macroscopic&mdash;need not agree and have to be shown equivalent within the statistical mechanics formalism. Volume, kinetic energy per particle, and particle number are other examples of observables that have a microscopic description. With time, as the mechanical system evolves, observables are expected to fluctuate. The time-averaged observable is a constant and a candidate for a thermodynamic quantity.
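Bernoulli's microscopic picture can be illustrated numerically. The sketch below is an illustration, not part of the original text: it samples wall-normal molecular velocities from a Maxwell&ndash;Boltzmann distribution and estimates the pressure as the average rate of momentum transfer per unit area, n m &lang;v<sub>x</sub><sup>2</sup>&rang;, which for an ideal gas should reproduce the macroscopic value n k<sub>B</sub>T. Units with m = k<sub>B</sub> = T = 1 and number density n = 1 are assumed.

```python
import math
import random

# Illustrative sketch: estimate the pressure of an ideal gas from the
# microscopic picture of momentum transfer to a wall (Bernoulli's view).
# Units are chosen so that m = k_B = T = 1 and number density n = N/V = 1.
random.seed(0)

m, kT, n = 1.0, 1.0, 1.0
samples = 200_000

# Each wall-normal velocity component is Maxwell-Boltzmann distributed:
# a Gaussian with variance k_B * T / m.
vx = [random.gauss(0.0, math.sqrt(kT / m)) for _ in range(samples)]

# Average momentum flux through the wall: P = n * m * <vx^2>.
p_micro = n * m * sum(v * v for v in vx) / samples

# Macroscopic (thermodynamic) prediction: P = n * k_B * T.
p_macro = n * kT

print(p_micro, p_macro)  # the two agree to within sampling error
```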

Except for the simplest of mechanical systems, computing the average of an observable is a daunting task. In statistical mechanics, the computation is simplified by replacing the time average by an average over the phase space, each point weighted by its probability of occurring. The probability is computed by dividing the phase space into small non-overlapping regions. Each of these regions determines a microstate of the mechanical system. The size of each cell is limited by imposing constraints on each degree of freedom of the mechanical system. If p and q are the momentum and coordinate of a degree of freedom, then within a cell their variations &Delta;p and &Delta;q should be limited by
 * $$ \Delta p \, \Delta q \geq h \,,$$

where h is the Planck constant. The cell c in phase space with n degrees of freedom will have a volume limited by
 * $$ \mathrm{vol}(c) \geq h^{n} \,.$$
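The bound vol(c) &ge; h<sup>n</sup> can be checked against quantum mechanics in a simple case. The sketch below is an illustration with unit constants, not part of the original text: it counts the cells of size h available to a particle in a one-dimensional box with energy at most E, and compares with the number of quantum particle-in-a-box levels below the same energy.

```python
import math

# Sketch: semiclassical state counting for a particle in a 1D box of
# length L and mass m, with energy at most E.  Units with h = 1 assumed.
h, m, L, E = 1.0, 1.0, 1.0, 50.0

# Classically accessible phase-space region: 0 <= q <= L, p^2/(2m) <= E.
# Its volume is L * 2*sqrt(2mE); dividing by h counts cells of minimal size.
cells = L * 2.0 * math.sqrt(2.0 * m * E) / h

# Quantum particle-in-a-box levels: E_n = n^2 h^2 / (8 m L^2), n = 1, 2, ...
# Count the levels with E_n <= E.
levels = math.floor(math.sqrt(8.0 * m * L * L * E) / h)

print(cells, levels)  # 20.0 and 20: the cell count matches the level count
```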

In each small cell, observables and other mechanical properties change very little. This allows an energy &epsilon;<sub>&sigma;</sub> to be assigned to each cell c<sub>&sigma;</sub> of phase space. The index &sigma; is used to enumerate all the cells. Boltzmann assigned equal probabilities to every cell and used that assumption to compute thermodynamic properties. Extending Boltzmann's calculations, Gibbs observed that thermodynamic relations could be reproduced for thermodynamic systems by assigning a probability proportional to
 * $$ e^{-\beta \varepsilon_\sigma} $$

to each cell, a term now called the Boltzmann factor. Gibbs identified the parameter &beta; as proportional to the inverse of the temperature T:
 * $$ \beta = \frac{1}{k_{B} T} $$

The proportionality constant k<sub>B</sub> was later named the Boltzmann constant by Planck.
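As an illustration of how the Boltzmann factor assigns probabilities, the sketch below (with made-up energy levels, not from the original text) normalizes the factors e<sup>&minus;&beta;&epsilon;<sub>&sigma;</sub></sup> over a small set of cells: the probabilities sum to one, and lower-energy cells are more probable.

```python
import math

# Sketch: probabilities from Boltzmann factors for a few cells with
# assumed (illustrative) energies, in units where k_B = 1.
T = 1.0
beta = 1.0 / T                  # beta = 1 / (k_B * T)
energies = [0.0, 1.0, 2.0]      # epsilon_sigma for three cells

weights = [math.exp(-beta * e) for e in energies]
Z = sum(weights)                # normalization (the partition function)
probs = [w / Z for w in weights]

print(probs)  # decreasing with energy; the list sums to 1
```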

Gibbs' approach to statistical mechanics calculations, the canonical ensemble, poses a difficulty. For an isolated mechanical system, energy is conserved, but Gibbs' method assigns non-zero probability to cells of any energy. Boltzmann's method, the microcanonical ensemble, assigns equal probability to the cells that are accessible from the initial conditions, and that therefore have the same energy. For both ensembles to describe the same physical reality, the resulting average observables must be the same. The equality can be proven in the limit of infinitely large systems, known as the thermodynamic limit.
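The agreement in the thermodynamic limit can be made plausible with a toy calculation. For N independent two-level sub-systems (an assumed illustrative model, not from the original text), the canonical energy distribution concentrates: the relative fluctuation of the total energy shrinks like 1/&radic;N, so a large canonical system effectively fixes the energy, as the microcanonical ensemble does.

```python
import math

# Sketch: relative energy fluctuations in the canonical ensemble for N
# independent two-level sub-systems with level spacing eps (illustrative
# model; units with k_B = T = 1).
beta, eps = 1.0, 1.0

# Probability that one sub-system occupies the excited level.
p = math.exp(-beta * eps) / (1.0 + math.exp(-beta * eps))

def relative_fluctuation(n):
    """std(E) / <E> for n independent two-level sub-systems."""
    mean = n * eps * p
    var = n * eps * eps * p * (1.0 - p)
    return math.sqrt(var) / mean

ratios = [relative_fluctuation(n) for n in (10, 1_000, 100_000)]
print(ratios)  # decreases roughly as 1/sqrt(n)
```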

The thermodynamic limit raises challenges to the formalism of statistical mechanics:
 * No longer is one mechanical system being considered, but a series of mechanical systems indexed by the size of the system.
 * Limits need to be verified to exist, as quantities may diverge with the thermodynamic limit.
 * Several properties of the mechanical system need to grow with the thermodynamic limit (volume, particle number, energy, etc.), while keeping densities constant.

A sequence of mechanical systems can be defined by choosing a method to specify the interactions among its sub-units. If q<sub>k</sub> is the vector of position coordinates of sub-unit k and p<sub>k</sub> the associated momenta, then the Hamiltonian H<sub>N</sub> for a system with N sub-units can be chosen as
 * $$ H_N = \sum_{0 \leq k < N} \frac{p_k^2}{2} + U_N(q_0, q_1, \ldots, q_{N-1}) \,.$$

More general choices are possible, but complicate the notation. The potential U is a symmetric function, so that the value of U is the same for any permutation of the position vectors q<sub>k</sub>. The common case is for the potential U to be a function of only the distances between the position vectors:
 * $$ U_N = \sum_{0 \leq i < j < N} \phi(\|q_i - q_j\|) \,.$$

For the thermodynamic limit to exist, the function &phi; needs to fall off fast enough at infinity (faster than $$ 1/r^{3+\epsilon} $$, for some &epsilon; > 0, in three dimensions) and &phi; cannot become too negative at very short distances. Systems for which these conditions do not hold are of physical interest, but need to be analyzed very carefully. This happens, for example, for gases that condense on the surface of their container.
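A common concrete choice of &phi; satisfying both conditions is the Lennard-Jones potential (an assumption for this sketch; the text above does not fix &phi;): it decays as 1/r<sup>6</sup> at infinity and is strongly repulsive at short distances. The sketch below evaluates U<sub>N</sub> as a sum over pairs.

```python
import itertools
import math

# Sketch: pairwise potential energy U_N for a small configuration, using
# the Lennard-Jones form as an assumed example of phi.  It falls off as
# 1/r^6 at infinity and diverges to +infinity at short distance.
def phi(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair potential, minimum -eps at r = 2**(1/6) * sigma."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

def U(positions):
    """U_N = sum over pairs i < j of phi(|q_i - q_j|)."""
    total = 0.0
    for qi, qj in itertools.combinations(positions, 2):
        total += phi(math.dist(qi, qj))
    return total

# Two particles at the minimum-energy separation: U is close to -eps.
r_min = 2.0 ** (1.0 / 6.0)
print(phi(r_min))
print(U([(0.0, 0.0, 0.0), (r_min, 0.0, 0.0)]))
```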

From the Boltzmann factors the free energy of a system can be computed. First the partition function Z<sub>N</sub> is computed:
 * $$ Z_N = \sum_{\sigma} e^{- \beta \varepsilon_\sigma} $$

Planck called this sum the Zustandssumme ("sum over states"), hence the Z; Fowler renamed it the partition function but kept the Z. The logarithm of the partition function is related to the free energy per particle f by
 * $$  -\beta f = \lim_{N \rightarrow \infty} \frac{1}{N} \log Z_N $$
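For a system of N independent two-level sub-systems (an illustrative model, not from the original text), the limit is reached already at finite N. The sketch below computes Z<sub>N</sub> by summing Boltzmann factors over all 2<sup>N</sup> microstates and checks that (1/N) log Z<sub>N</sub> equals log(1 + e<sup>&minus;&beta;&epsilon;</sup>), the value of &minus;&beta;f for this model.

```python
import itertools
import math

# Sketch: partition function Z_N by brute-force enumeration of microstates,
# for N independent two-level sub-systems (levels 0 and eps; illustrative
# model, units with k_B = 1).
beta, eps, N = 1.0, 1.0, 10

Z = 0.0
for state in itertools.product((0.0, eps), repeat=N):
    Z += math.exp(-beta * sum(state))   # one Boltzmann factor per microstate

minus_beta_f = math.log(Z) / N          # (1/N) log Z_N

# For independent sub-systems Z_N = (1 + e^{-beta*eps})^N, so the
# free-energy density is already at its limiting value at finite N.
exact = math.log(1.0 + math.exp(-beta * eps))
print(minus_beta_f, exact)
```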

 * Partition function and the Boltzmann factor $$ e^{-\beta H} $$
 * Thermodynamic limit
 * Gibbs states
 * Entropy and the relation to thermodynamics

Phase transitions

 * Several equilibrium states
 * Cluster expansion failure
 * Ising model and Baxter revolution
 * RG success

Classes of models
As statistical mechanics has developed, large classes of problems have been worked on, such as:

 * Kinetic theory of gases
 * Percolation
 * Spin glasses
 * Exactly solved models

Quantum statistical mechanics

 * Gibbs' paradox
 * Bosons and fermions
 * Bose condensate
 * Superfluids and superconductors

Nonequilibrium statistical mechanics

 * Linear response Kubo
 * Thermostated systems
 * Chaoticity hypothesis
 * SRB measures