User:Patrick0Moran/vault

Preserve version that looks like it will be deleted



Quantum mechanics is the body of scientific principles that explains the behaviour of matter and its interactions with energy on the scale of atoms and subatomic particles, and how these phenomena can be related to everyday life (see: Schrödinger's cat).

Classical physics explains matter and energy at the macroscopic level of the scale familiar to human experience, including the behaviour of astronomical bodies. It remains the key to measurement for much of modern science and technology. On the other hand, at the end of the 19th century scientists discovered phenomena in both the large (macro) and the small (micro) worlds that classical physics could not explain. Coming to terms with these limitations led to the development of quantum mechanics, a major revolution in physics. This article describes how physicists discovered the limitations of classical physics and developed the main concepts of the quantum theory that replaced them in the early decades of the 20th century. These concepts are described in roughly the order they were first discovered; for a more complete history of the subject, see History of quantum mechanics.

Some aspects of quantum mechanics can seem counter-intuitive or even paradoxical, because they describe behaviour quite different than that seen at larger length scales, where classical physics is an excellent approximation. In the words of Richard Feynman, quantum mechanics deals with "nature as She is – absurd."

Many types of energy, such as photons (discrete units of light), behave in some respects like particles and in other respects like waves. Radiators of photons (such as neon lights) have emission spectra that are discontinuous, in that only certain frequencies of light are present. Quantum mechanics predicts the energies, the colours, and the spectral intensities of all forms of electromagnetic radiation.

Quantum mechanics ordains that the more closely one pins down one measurement (such as the position of a particle), the less precise another measurement pertaining to the same particle (such as its momentum) must become. This is called the uncertainty principle, also known as the Heisenberg principle after the person who first proposed it.

Even more disconcerting, pairs of particles can be created as "entangled twins." As is described in more detail in the article on Quantum entanglement, entangled particles seem to exhibit what Einstein called "spooky action at a distance," matches between states that classical physics would insist must be random even when distance and the speed of light ensure that no physical causation could account for these correlations.

The first quantum theory: Max Planck and black body radiation
Thermal radiation is electromagnetic radiation emitted from the surface of an object due to the object's temperature. If an object is heated sufficiently, it starts to emit light at the red end of the spectrum – it is red hot. Heating it further causes the colour to change from red to yellow to white to blue, as light at shorter wavelengths (higher frequencies) begins to be emitted. It turns out that a perfect emitter is also a perfect absorber. When it is cold, such an object looks perfectly black, because it absorbs all the light that falls on it and emits none. Consequently, an ideal thermal emitter is known as a black body, and the radiation it emits is called black body radiation.

In the late 19th century, thermal radiation had been fairly well-characterized experimentally. How the wavelength at which the radiation is strongest changes with temperature is given by Wien's displacement law, and the overall power emitted per unit area is given by the Stefan–Boltzmann law. However, classical physics was unable to explain the relationship between temperatures and predominant frequencies of radiation. In fact, at short wavelengths, classical physics predicted that energy will be emitted by a hot body at an infinite rate. This result, which is clearly wrong, is known as the ultraviolet catastrophe. Physicists were searching for a single theory that explained why they got the experimental results that they did.

The first model that was able to explain the full spectrum of thermal radiation was put forward by Max Planck in 1900. He modeled the thermal radiation as being in equilibrium, using a set of harmonic oscillators. To reproduce the experimental results he had to assume that each oscillator produced an integer number of units of energy at its single characteristic frequency, rather than being able to emit any arbitrary amount of energy. In other words, the energy of each oscillator was "quantized." The quantum of energy for each oscillator, according to Planck, was proportional to the frequency of the oscillator; the constant of proportionality is now known as the Planck constant. The Planck constant, usually written as $h$, has the value $6.63\times10^{-34}\ \mathrm{J\,s}$, and so the energy $E$ of an oscillator of frequency $f$ is given by
 * $$E = nhf,\quad \text{where}\quad n = 1,2,3,\ldots$$

Planck's law was the first quantum theory in physics, and Planck won the Nobel Prize in 1918 "in recognition of the services he rendered to the advancement of Physics by his discovery of energy quanta." At the time, however, Planck's view was that quantization was purely a mathematical trick, rather than (as we now know) a fundamental change in our understanding of the world.
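Planck's quantisation rule lends itself to a short numerical sketch (the oscillator frequency below is an arbitrary illustrative value):

```python
# Numerical illustration of Planck's quantisation rule E = n*h*f.
PLANCK = 6.626e-34  # Planck constant, J*s

def oscillator_energy(n: int, f: float) -> float:
    """Energy of an oscillator holding n whole quanta at frequency f, in joules."""
    return n * PLANCK * f

f = 5.0e14  # Hz, roughly the frequency of green light (illustrative)
allowed = [oscillator_energy(n, f) for n in (1, 2, 3)]
# The allowed energies form a ladder with spacing h*f ~ 3.3e-19 J;
# intermediate energies such as 1.5*h*f are simply not available.
```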

Photons: the quantisation of light
In 1905, Albert Einstein took an extra step. He suggested that quantisation was not just a mathematical trick: the energy in a beam of light occurs in individual packets, which are now called photons. The energy of a single photon is given by its frequency multiplied by Planck's constant:
 * $$E = hf.$$

For centuries, scientists had debated between two possible theories of light: was it a wave or did it instead comprise a stream of tiny particles? By the 19th century, the debate was generally considered to have been settled in favour of the wave theory, as it was able to explain observed effects such as refraction, diffraction and polarization. James Clerk Maxwell had shown that electricity, magnetism and light are all manifestations of the same phenomenon: the electromagnetic field. Maxwell's equations, which are the complete set of laws of classical electromagnetism, describe light as waves: a combination of oscillating electric and magnetic fields. Because of the preponderance of evidence in favour of the wave theory, Einstein's ideas were met initially with great skepticism. Eventually, however, the photon model became favoured; one of the most significant pieces of evidence in its favour was its ability to explain several puzzling properties of the photoelectric effect, described in the following section. Nonetheless, the wave analogy remained indispensable for helping to understand other characteristics of light, such as diffraction.

The photoelectric effect


In 1887 Heinrich Hertz observed that light can eject electrons from metal. In 1902 Philipp Lenard discovered that the maximum possible energy of an ejected electron is related to the frequency of the light, not to its intensity; if the frequency is too low, no electrons are ejected regardless of the intensity. The lowest frequency of light that causes electrons to be emitted, called the threshold frequency, is different for every metal. This observation is at odds with classical electromagnetism, which predicts that the electron's energy should be proportional to the intensity of the radiation.

Einstein explained the effect by postulating that a beam of light is a stream of particles (photons), and that if the beam is of frequency $f$ then each photon has an energy equal to $hf$. An electron is likely to be struck only by a single photon, which imparts at most an energy $hf$ to the electron. Therefore, the intensity of the beam has no effect; only its frequency determines the maximum energy that can be imparted to the electron.

To explain the threshold effect, Einstein argued that it takes a certain amount of energy, called the work function, denoted by $φ$, to remove an electron from the metal. This amount of energy is different for each metal. If the energy of the photon is less than the work function then it does not carry sufficient energy to remove the electron from the metal. The threshold frequency, $f_{0}$, is the frequency of a photon whose energy is equal to the work function:
 * $$\varphi = h f_0.$$

If $f$ is greater than $f_{0}$, the energy $hf$ is enough to remove an electron. The ejected electron has a kinetic energy $E_{K}$ which is, at most, equal to the photon's energy minus the energy needed to dislodge the electron from the metal:
 * $$E_K = hf - \varphi = h(f - f_0).$$
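Einstein's relations can be sketched numerically; the work function used below is an illustrative value, roughly that of sodium:

```python
H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19  # joules per electronvolt

def max_kinetic_energy(f: float, work_function_ev: float) -> float:
    """Maximum kinetic energy (eV) of an ejected electron, or 0.0 below threshold."""
    ek = (H * f) / EV - work_function_ev
    return max(ek, 0.0)

phi = 2.28           # eV, illustrative work function (roughly that of sodium)
f0 = phi * EV / H    # threshold frequency, ~5.5e14 Hz
red = max_kinetic_energy(4.0e14, phi)     # below threshold: no electrons, however intense
violet = max_kinetic_energy(7.5e14, phi)  # above threshold: positive kinetic energy
```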

Einstein's description of light as being composed of particles extended Planck's notion of quantised energy: a single photon of a given frequency $f$ delivers an invariant amount of energy $hf$. In other words, individual photons can deliver more or less energy, but only depending on their frequencies. However, although the photon is a particle it was still being described as having the wave-like property of frequency. Once again, the particle account of light was being "compromised".

The relationship between the frequency of electromagnetic radiation and the energy of each individual photon is why ultraviolet light can cause sunburn, but visible or infrared light cannot. A photon of ultraviolet light will deliver a high amount of energy – enough to contribute to cellular damage such as occurs in a sunburn. A photon of infrared light will deliver a lower amount of energy – only enough to warm one's skin. So an infrared lamp can warm a large surface, perhaps large enough to keep people comfortable in a cold room, but it cannot give anyone a sunburn.

If each individual photon had identical energy, it would not be correct to talk of a "high energy" photon. Light of high or of low frequency could carry more energy only by flooding a surface with more photons arriving per second: doubling the rate of photon delivery would double the number of energy units arriving each second. Einstein rejected that wave-dependent classical approach in favour of a particle-based analysis in which the energy of each particle is fixed, varying with frequency in discrete steps (i.e. is quantised). All photons of the same frequency have identical energy, and photons of different frequencies have proportionally different energies.

In nature, single photons are rarely encountered. The sun emits photons continuously at all electromagnetic frequencies, so they appear to propagate as a continuous wave, not as discrete units. The emission sources available to Hertz and Lenard in the 19th century shared that characteristic. A sun that radiates red light, or a piece of iron in a forge that glows red, may both be said to contain a great deal of energy. It might be surmised that adding continuously to the total energy of a radiating body would make it radiate red light, orange light, yellow light, green light, blue light, violet light, and so on in that order. But that is not so, for then larger suns and larger pieces of iron in a forge would necessarily glow with colours nearer the violet end of the spectrum. To change the colour of such a radiating body it is necessary to change its temperature. An increase in temperature changes the quanta of energy available to excite individual atoms to higher levels, enabling them to emit photons of higher frequencies.

The total energy emitted per unit of time by a sun (or by a piece of iron in a forge) depends on both the number of photons emitted per unit of time, as well as the amount of energy carried by each of the photons involved. In other words, the characteristic frequency of a radiating body is dependent on its temperature. When physicists were looking only at beams of light containing huge numbers of individual and virtually indistinguishable photons, it was difficult to understand the importance of the energy levels of individual photons. So when physicists first discovered devices exhibiting the photoelectric effect, they initially expected that a higher intensity of light would produce a higher voltage from the photoelectric device. Instead, they discovered that strong beams of light toward the red end of the spectrum might produce no electrical potential at all, and that weak beams of light toward the violet end of the spectrum would produce higher and higher voltages. Einstein's idea that individual units of light may contain different amounts of energy, depending on their frequency, made it possible to explain such experimental results that had hitherto seemed quite counter-intuitive.

Although the energy imparted by photons is invariant at any given frequency, the initial energy state of the electrons in a photoelectric device prior to absorption of light is not necessarily uniform. Anomalous results may occur in the case of individual electrons. For instance, an electron that was already excited above the equilibrium level of the photoelectric device might be ejected when it absorbed uncharacteristically low frequency illumination. Statistically, however, the characteristic behaviour of a photoelectric device will reflect the behaviour of the vast majority of its electrons, which will be at their equilibrium level. This point is helpful in comprehending the distinction between the study of individual particles in quantum dynamics and the study of massed particles in classical physics.

The quantisation of matter: the Bohr model of the atom
By the dawn of the 20th century, evidence required a model of the atom with a diffuse cloud of negatively-charged electrons surrounding a small, dense, positively-charged nucleus. These properties suggested a model in which the electrons circle around the nucleus like planets orbiting a sun. However, it was also known that the atom in this model would be unstable: according to classical theory orbiting electrons are undergoing centripetal acceleration, and should therefore give off electromagnetic radiation, the loss of energy also causing them to spiral toward the nucleus, colliding with it in a fraction of a second.

A second, related, puzzle was the emission spectrum of atoms. When a gas is heated, it gives off light only at discrete frequencies. For example, the visible light given off by hydrogen consists of four different colours. By contrast, white light consists of a continuous emission across the whole range of visible frequencies.

In 1885 the Swiss mathematician Johann Balmer discovered that each wavelength $λ$ (lambda) in the visible spectrum of hydrogen is related to some integer $n$ by the equation
 * $$\lambda = B\left(\frac{n^2}{n^2-4}\right) \qquad\qquad n = 3,4,5,6$$

where $B$ is a constant which Balmer determined to be equal to 364.56 nm. Thus Balmer's constant was the basis of a system of discrete, i.e. quantised, integers.
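Balmer's formula is simple enough to evaluate directly; the sketch below reproduces the four visible hydrogen lines from his constant:

```python
B = 364.56  # nm, Balmer's empirical constant

def balmer_wavelength(n: int) -> float:
    """Wavelength (nm) of the visible hydrogen line for integer n >= 3."""
    return B * n**2 / (n**2 - 4)

visible = {n: balmer_wavelength(n) for n in (3, 4, 5, 6)}
# n=3 gives ~656 nm (red) and n=6 gives ~410 nm (violet):
# four discrete lines, exactly as observed in hydrogen's visible spectrum.
```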

In 1888 Johannes Rydberg generalized and greatly increased the explanatory utility of Balmer's formula. He predicted that $λ$ is related to two integers $n$ and $m$ according to what is now known as the Rydberg formula:
 * $$ \frac{1}{\lambda} = R \left(\frac{1}{m^2} - \frac{1}{n^2}\right),$$

where $R$ is the Rydberg constant, equal to 0.0110 nm$^{-1}$, and $n$ must be greater than $m$.

Rydberg's formula accounts for the four visible wavelengths of hydrogen by setting $m = 2$ and $n = 3, 4, 5, 6$. It also predicts additional wavelengths in the emission spectrum: for $m = 1$ and for $n > 1$, the emission spectrum should contain certain ultraviolet wavelengths, and for $m = 3$ and $n > 3$, it should also contain certain infrared wavelengths. Experimental observation of these wavelengths came two decades later: in 1908 Louis Paschen found some of the predicted infrared wavelengths, and in 1914 Theodore Lyman found some of the predicted ultraviolet wavelengths.
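The Rydberg formula can be evaluated the same way; the sketch below uses the value of $R$ quoted above to reproduce representative lines of the Lyman, Balmer and Paschen series:

```python
R = 0.0110  # nm^-1, Rydberg constant (value quoted in the text)

def rydberg_wavelength(m: int, n: int) -> float:
    """Wavelength (nm) predicted by 1/lambda = R*(1/m^2 - 1/n^2), with n > m."""
    if n <= m:
        raise ValueError("n must be greater than m")
    return 1.0 / (R * (1.0 / m**2 - 1.0 / n**2))

lyman = rydberg_wavelength(1, 2)    # ~121 nm: ultraviolet, found by Lyman
balmer = rydberg_wavelength(2, 3)   # ~655 nm: the red line of the visible series
paschen = rydberg_wavelength(3, 4)  # ~1870 nm: infrared, found by Paschen
```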

Bohr's model


In 1913 Niels Bohr proposed a new model of the atom that included quantized electron orbits. In Bohr's model, electrons could inhabit only certain orbits around the atomic nucleus. When an atom emitted (or absorbed) energy, the electron did not move in a continuous trajectory from one orbit around the nucleus to another, as might be expected classically. Instead, the electron would jump instantaneously from one orbit to another, giving off the emitted light in the form of a photon. The possible energies of photons given off by each element were determined by the differences in energy between the orbits, and so the emission spectrum for each element would contain a number of lines.

Bohr theorised that the angular momentum, $L$, of an electron is quantised:
 * $$L = n\frac{h}{2\pi},$$

where $n$ is an integer and $h$ is the Planck constant. Starting from this assumption, Coulomb's law and the equations of circular motion show that an electron with $n$ units of angular momentum will orbit a proton at a distance $r$ given by
 * $$r = \frac{n^2 h^2}{4 \pi^2 k_e m e^2}$$,

where $k_{e}$ is the Coulomb constant, $m$ is the mass of an electron, and $e$ is the charge on an electron. For simplicity this is written as
 * $$r = n^2 a_0,\!$$

where $a_{0}$, called the Bohr radius, is equal to 0.0529 nm. The Bohr radius is the radius of the smallest allowed orbit.

The energy of the electron can also be calculated, and is given by
 * $$E = -\frac{k_{\mathrm{e}}e^2}{2a_0} \frac{1}{n^2}$$.

Thus Bohr's assumption that angular momentum is quantised means that an electron can only inhabit certain orbits around the nucleus, and that it can have only certain energies. A consequence of these constraints is that the electron will not crash into the nucleus: it cannot continuously emit energy, and it cannot come closer to the nucleus than a0 (the Bohr radius).
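These two results can be tabulated directly; the ground-state energy $k_{\mathrm{e}}e^2/2a_0 \approx 13.6$ eV used below is the standard value for hydrogen:

```python
A0 = 0.0529  # nm, Bohr radius
E1 = -13.6   # eV, the value of -k_e*e^2/(2*a0) for hydrogen

def orbit_radius(n: int) -> float:
    """Radius (nm) of the nth allowed orbit: r = n^2 * a0."""
    return n**2 * A0

def orbit_energy(n: int) -> float:
    """Energy (eV) of an electron in the nth orbit: E = E1 / n^2."""
    return E1 / n**2

# The smallest orbit has radius a0; the energies climb toward zero as n grows,
# so the electron can neither radiate continuously nor spiral inside r = a0.
radii = [orbit_radius(n) for n in (1, 2, 3)]     # 0.0529, 0.2116, 0.4761 nm
energies = [orbit_energy(n) for n in (1, 2, 3)]  # -13.6, -3.4, ~-1.51 eV
```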

An electron loses energy by jumping instantaneously from its original orbit to a lower orbit; the extra energy is emitted in the form of a photon. Conversely, an electron that absorbs a photon gains energy, hence it jumps to an orbit that is farther from the nucleus.

Each photon from glowing atomic hydrogen is due to an electron moving from a higher orbit, with radius $r_{n}$, to a lower orbit, $r_{m}$. The energy $E_{\gamma}$ of this photon is the difference in the energies $E_{n}$ and $E_{m}$ of the electron:
 * $$E_{\gamma} = E_n - E_m = \frac{k_{\mathrm{e}}e^2}{2a_0}\left(\frac{1}{m^2}-\frac{1}{n^2}\right)$$

Since Planck's equation shows that the photon's energy is related to its wavelength by $E_{\gamma} = hc/\lambda$, the wavelengths of light that can be emitted are given by
 * $$\frac{1}{\lambda} = \frac{k_{\mathrm{e}}e^2}{2 a_0 h c}\left(\frac{1}{m^2}-\frac{1}{n^2}\right).$$

This equation has the same form as the Rydberg formula, and predicts that the constant $R$ should be given by
 * $$R = \frac{k_{\mathrm{e}}e^2}{2 a_0 h c} .$$

Therefore the Bohr model of the atom can predict the emission spectrum of hydrogen in terms of fundamental constants. However, it was not able to make accurate predictions for multi-electron atoms, or to explain why some spectral lines are brighter than others.
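This prediction can be checked numerically: substituting standard SI values for the constants into Bohr's expression reproduces the measured Rydberg constant.

```python
# Checking that R = k_e*e^2 / (2*a0*h*c) matches the empirical Rydberg constant.
KE = 8.988e9    # Coulomb constant, N*m^2/C^2
E = 1.602e-19   # elementary charge, C
A0 = 5.29e-11   # Bohr radius, m
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

R = KE * E**2 / (2 * A0 * H * C)  # in m^-1
R_per_nm = R * 1e-9               # convert to nm^-1
# R_per_nm comes out at ~0.0110 nm^-1, matching the measured value quoted earlier.
```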

Wave–particle duality
In 1924, Louis de Broglie proposed the idea that just as light has both wave-like and particle-like properties, matter also has wave-like properties. The wavelength, $\lambda$, associated with a particle is related to its momentum, $p$, through the Planck constant, $h$:
 * $$ p = \frac{h}{\lambda}.$$

The relationship, called the de Broglie hypothesis, holds for all types of matter. Thus all matter exhibits properties of both particles and waves.
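A quick calculation shows why these wave properties matter only at atomic scales (the masses and speeds below are arbitrary illustrative values):

```python
H = 6.626e-34  # Planck constant, J*s

def de_broglie_wavelength(mass_kg: float, speed_m_s: float) -> float:
    """Matter wavelength lambda = h / (m*v), in metres."""
    return H / (mass_kg * speed_m_s)

electron = de_broglie_wavelength(9.109e-31, 1.0e6)  # ~7.3e-10 m: comparable to atomic spacings
baseball = de_broglie_wavelength(0.145, 40.0)       # ~1e-34 m: far too small ever to observe
# This is why diffraction shows up for electron beams but never for everyday objects.
```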

Three years later, the wave-like nature of electrons was demonstrated by showing that a beam of electrons could exhibit diffraction, just like a beam of light. At the University of Aberdeen, George Thomson passed a beam of electrons through a thin metal film and observed the predicted diffraction patterns. At Bell Labs, Davisson and Germer guided their beam through a crystalline grid. Similar wave-like phenomena were later shown for atoms and even small molecules. De Broglie was awarded the Nobel Prize for Physics in 1929 for his hypothesis; Thomson and Davisson shared the Nobel Prize for Physics in 1937 for their experimental work.

The concept of wave–particle duality says that neither the classical concept of "particle" nor of "wave" can fully describe the behaviour of quantum-scale objects, either photons or matter. Indeed, astrophysicist A.S. Eddington proposed in 1927 that "We can scarcely describe such an entity as a wave or as a particle; perhaps as a compromise we had better call it a 'wavicle' ". (This term was later popularised by mathematician Banesh Hoffmann.) Wave–particle duality is an example of the principle of complementarity in quantum physics. An elegant example of wave–particle duality, the double slit experiment, is discussed in the section below.

De Broglie's treatment of quantum events served as a jumping off point for Schrödinger when he set about to construct a wave equation to describe quantum theoretical events.

The double-slit experiment
In the double-slit experiment, as originally performed by Thomas Young in 1803 (and by Augustin Fresnel a decade later), a beam of light is directed through two narrow, closely spaced slits, producing an interference pattern of light and dark bands on a screen. If one of the slits is covered up, one might naively expect that the intensity of the fringes would simply be halved everywhere. In fact, a much simpler pattern is seen: a diffraction pattern centred opposite the open slit. Exactly the same behaviour can be demonstrated in water waves, and so the double-slit experiment was seen as a demonstration of the wave nature of light.

The double-slit experiment has also been performed using electrons, atoms, and even molecules, and the same type of interference pattern is seen. Thus it has been demonstrated that all matter possesses both particle and wave characteristics.

Even if the source intensity is turned down so that only one particle (e.g. photon or electron) is passing through the apparatus at a time, the same interference pattern develops over time. The quantum particle acts as a wave when passing through the double slits, but as a particle when it is detected. This is a typical feature of quantum complementarity: a quantum particle will act as a wave when we do an experiment to measure its wave-like properties, and like a particle when we do an experiment to measure its particle-like properties. Where on the detector screen any individual particle shows up will be the result of an entirely random process.
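This build-up of a fringe pattern from individually random detections can be mimicked with a toy simulation; the $\cos^2$ density below is an idealised stand-in for a real two-slit intensity profile:

```python
import math
import random

def detect_position(rng: random.Random) -> float:
    """Sample one detection position x in [-1, 1] from an idealised
    two-slit probability density proportional to cos^2(3*pi*x)."""
    while True:  # rejection sampling; the density's maximum value is 1
        x = rng.uniform(-1.0, 1.0)
        if rng.random() < math.cos(3 * math.pi * x) ** 2:
            return x

rng = random.Random(0)  # fixed seed so the run is reproducible
hits = [detect_position(rng) for _ in range(5000)]
# Each hit is a single, random, particle-like detection, yet a histogram of
# many hits reproduces the wave-like cos^2 fringe pattern.
```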

Application to the Bohr model
De Broglie expanded the Bohr model of the atom by showing that an electron in orbit around a nucleus could be thought of as having wave-like properties. In particular, an electron will be observed only in situations that permit a standing wave around a nucleus. An example of a standing wave is a violin string, which is fixed at both ends and can be made to vibrate. The waves created by a stringed instrument appear to oscillate in place, moving from crest to trough in an up-and-down motion. The wavelength of a standing wave is related to the length of the vibrating object and the boundary conditions. For example, because the violin string is fixed at both ends, it can carry standing waves of wavelengths 2l/n, where l is the length and n is a positive integer. De Broglie suggested that the allowed electron orbits were those for which the circumference of the orbit would be an integer number of wavelengths.
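De Broglie's condition can be checked against the Bohr model directly: computing the orbital speed from Bohr's quantisation rule, the circumference of the $n$th orbit always holds exactly $n$ de Broglie wavelengths.

```python
import math

H = 6.626e-34   # Planck constant, J*s
M = 9.109e-31   # electron mass, kg
A0 = 5.29e-11   # Bohr radius, m

def wavelengths_around_orbit(n: int) -> float:
    """Number of de Broglie wavelengths fitting around the nth Bohr orbit."""
    r = n**2 * A0                      # orbit radius from the Bohr model
    v = n * H / (2 * math.pi * M * r)  # speed from L = m*v*r = n*h/(2*pi)
    lam = H / (M * v)                  # de Broglie wavelength h/(m*v)
    return 2 * math.pi * r / lam       # circumference divided by wavelength

# For every n the circumference holds exactly n wavelengths: de Broglie's
# standing-wave reading of Bohr's quantisation rule.
fits = [wavelengths_around_orbit(n) for n in (1, 2, 3)]
```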

Development of modern quantum mechanics
In 1925, building on de Broglie's hypothesis, Erwin Schrödinger developed the equation that describes the behaviour of a quantum mechanical wave. The equation, called the Schrödinger equation after its creator, is central to quantum mechanics, defines the permitted stationary states of a quantum system, and describes how the quantum state of a physical system changes in time. In the paper that introduced Schrödinger's cat, he says that the psi-function featured in his equation provides the "means for predicting probability of measurement results," and that it therefore provides "future expectation[s], somewhat as laid down in a catalog."

Schrödinger was able to calculate the energy levels of hydrogen by treating a hydrogen atom's electron as a classical wave, moving in a well of electrical potential created by the proton. This calculation accurately reproduced the energy levels of the Bohr model.

At a somewhat earlier time, Werner Heisenberg was trying to find an explanation for the intensities of the different lines in the hydrogen emission spectrum. By means of a series of mathematical analogies, Heisenberg wrote out the quantum mechanical analogue for the classical computation of intensities. Shortly afterwards, Heisenberg's colleague Max Born realised that Heisenberg's method of calculating the probabilities for transitions between the different energy levels could best be expressed by using the mathematical concept of matrices.

In May 1926, Schrödinger proved that Heisenberg's matrix mechanics and his own wave mechanics made the same predictions about the properties and behaviour of the electron; mathematically, the two theories were identical. Yet the two men disagreed on the interpretation of their mutual theory. For instance, Heisenberg saw no problem in the theoretical prediction of instantaneous transitions of electrons between orbits in an atom, but Schrödinger hoped that a theory based on continuous wave-like properties could avoid what he called (in the words of Wilhelm Wien) "this nonsense about quantum jumps."

Copenhagen interpretation
Bohr, Heisenberg and others tried to explain what these experimental results and mathematical models really mean. Their description, known as the Copenhagen interpretation of quantum mechanics, aimed to describe the nature of reality that was being probed by the measurements and described by the mathematical formulations of quantum mechanics.

The main principles of the Copenhagen interpretation are:
 * 1) A system is completely described by a wave function, $\psi$. (Heisenberg)
 * 2) How $\psi$ changes over time is given by the Schrödinger equation.
 * 3) The description of nature is essentially probabilistic. The probability of an event – for example, where on the screen a particle will show up in the two slit experiment – is related to the square of the absolute value of the amplitude of its wave function. (Born rule, due to Max Born, which gives a physical meaning to the wavefunction in the Copenhagen interpretation: the probability amplitude)
 * 4) It is not possible to know the values of all of the properties of the system at the same time; those properties that are not known with precision must be described by probabilities. (Heisenberg's uncertainty principle)
 * 5) Matter, like energy, exhibits a wave–particle duality. An experiment can demonstrate the particle-like properties of matter, or its wave-like properties; but not both at the same time. (Complementarity principle due to Bohr)
 * 6) Measuring devices are essentially classical devices, and measure classical properties such as position and momentum.
 * 7) The quantum mechanical description of large systems should closely approximate the classical description. (Correspondence principle of Bohr and Heisenberg)

Various consequences of these principles are discussed in more detail in the following subsections.

Uncertainty principle
Suppose that we want to measure the position and speed of an object – for example a car going through a radar speed trap. Naively, we assume that the car has a definite position and speed at a particular moment in time, and how accurately we can measure these values depends on the quality of our measuring equipment – if we improve the precision of our measuring equipment, we will get a result that is closer to the true value. In particular, we would assume that how precisely we measure the speed of the car does not affect the measurement of its position, and vice versa.

In 1927, Heisenberg proved that these assumptions are not correct. Quantum mechanics shows that certain pairs of physical properties, like position and speed, cannot both be known to arbitrary precision: the more precisely one property is known, the less precisely the other can be known. This statement is known as the uncertainty principle. The uncertainty principle isn't a statement about the accuracy of our measuring equipment, but about the nature of the system itself – our naive assumption that the car had a definite position and speed was incorrect. On a scale of cars and people, these uncertainties are too small to notice, but when dealing with atoms and electrons they become critical.

Heisenberg gave, as an illustration, the measurement of the position and momentum of an electron using a photon of light. The higher the photon's frequency, the more accurately the position of the impact can be measured, but the greater the disturbance of the electron, which absorbs a random amount of energy. This renders the measurement of its momentum increasingly uncertain, for one is necessarily measuring the post-impact, disturbed momentum from the collision products, not the electron's original momentum. With a photon of lower frequency the disturbance (and hence the uncertainty) in the momentum is less, but so is the accuracy of the measurement of the position of the impact.

The uncertainty principle shows mathematically that the product of the uncertainty in the position and the momentum of a particle (momentum being velocity multiplied by mass) can never fall below a certain value, and that this value is related to the Planck constant: $\Delta x \, \Delta p \geq \frac{h}{4\pi}$.
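The scale of this effect can be illustrated with a short calculation using the standard bound $\Delta x\,\Delta p \ge h/4\pi$; the masses and confinement lengths below are illustrative values:

```python
import math

H = 6.626e-34           # Planck constant, J*s
HBAR = H / (2 * math.pi)

def min_speed_uncertainty(mass_kg: float, position_uncertainty_m: float) -> float:
    """Smallest possible speed uncertainty (m/s) from dx*dp >= hbar/2."""
    dp = HBAR / (2 * position_uncertainty_m)
    return dp / mass_kg

electron = min_speed_uncertainty(9.109e-31, 1.0e-10)  # electron confined to ~atom size: ~5.8e5 m/s
car = min_speed_uncertainty(1000.0, 1.0e-6)           # 1000 kg car located to a micron: ~5e-32 m/s
# The same bound applies to both, but it is only consequential at atomic scales.
```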

Wave function collapse
Wave function collapse is a forced expression for whatever just happened when it becomes appropriate to replace the description of an uncertain state of a system by a description of the system in a definite state. Explanations for the nature of the process of becoming certain are controversial. At any time before a photon "shows up" on a detection screen it can only be described by a set of probabilities for where it might show up. When it does show up, for instance in the CCD of an electronic camera, the time and the space where it interacted with the device are known within very tight limits. However, the photon has disappeared, and the wave function has disappeared with it. In its place some physical change in the detection screen has appeared, e.g., an exposed spot in a sheet of photographic film, or a change in electric potential in some cell of a CCD.

Eigenstates and eigenvalues

 * For a more detailed introduction to this subject, see: Introduction to eigenstates

Because of the uncertainty principle, statements about both the position and momentum of particles can only assign a probability that the position or momentum will have some numerical value. Therefore it is necessary to formulate clearly the difference between the state of something that is indeterminate, such as an electron in a probability cloud, and the state of something having a definite value. When an object can definitely be "pinned-down" in some respect, it is said to possess an eigenstate.

The Pauli exclusion principle
In 1924, Wolfgang Pauli proposed a new quantum degree of freedom (or quantum number), with two possible values, to resolve inconsistencies between observed molecular spectra and the predictions of quantum mechanics. In particular, the spectrum of atomic hydrogen had a doublet, or pair of lines differing by a small amount, where only one line was expected. Pauli formulated his exclusion principle, stating that "There cannot exist an atom in such a quantum state that two electrons within [it] have the same set of quantum numbers."

A year later, Uhlenbeck and Goudsmit identified Pauli's new degree of freedom with a property called spin. The idea, originating with Ralph Kronig, was that electrons behave as if they rotate, or "spin", about an axis. Spin would account for the missing magnetic moment, and allow two electrons in the same orbital to occupy distinct quantum states if they "spun" in opposite directions, thus satisfying the exclusion principle. The quantum number represented the sense (positive or negative) of spin.

Application to the hydrogen atom
Bohr's model of the atom was essentially two-dimensional – an electron orbiting in a plane around its nuclear "sun." However, the uncertainty principle states that an electron cannot be viewed as having an exact location at any given time. In the modern theory the orbit has been replaced by an atomic orbital, a "cloud" of possible locations. It is often depicted as a three-dimensional region within which there is a 95 percent probability of finding the electron.

Schrödinger was able to calculate the energy levels of hydrogen by treating a hydrogen atom's electron as a wave, represented by the "wave function" $\Psi$, in an electric potential well, $V$, created by the proton. The solutions to Schrödinger's equation are distributions of probabilities for the electron's possible positions. Orbitals have a range of different shapes in three dimensions. The energies of the different orbitals can be calculated, and they accurately reproduce the energy levels of the Bohr model.

Within Schrödinger's picture, each electron has four properties:
 * 1) An "orbital" designation, indicating whether the particle wave is one that is closer to the nucleus with less energy or one that is farther from the nucleus with more energy;
 * 2) The "shape" of the orbital, spherical or otherwise;
 * 3) The "inclination" of the orbital, determining the magnetic moment of the orbital around the $z$-axis;
 * 4) The "spin" of the electron.

The collective name for these properties is the quantum state of the electron. The quantum state can be described by giving a number to each of these properties; these are known as the electron's quantum numbers. The quantum state of the electron is described by its wavefunction. The Pauli exclusion principle demands that no two electrons within an atom may have the same values of all four numbers.

The first property describing the orbital is the principal quantum number, $n$, which is the same as in Bohr's model. $n$ denotes the energy level of each orbital. The possible values for $n$ are integers:
 * $$n = 1, 2, 3\ldots$$

The next quantum number, the azimuthal quantum number, denoted $l$, describes the shape of the orbital. The shape is a consequence of the angular momentum of the orbital. The angular momentum represents the resistance of a spinning object to speeding up or slowing down under the influence of external force. The azimuthal quantum number represents the orbital angular momentum of an electron around its nucleus. The possible values for $l$ are integers from 0 to $n-1$:
 * $$l = 0, 1, \ldots, n-1.$$

The shape of each orbital has its own letter as well. The first shape is denoted by the letter $s$ (a mnemonic being "sphere"). The next shape is denoted by the letter $p$ and has the form of a dumbbell. The other orbitals have more complicated shapes (see atomic orbital), and are denoted by the letters $d$, $f$, and $g$.

The third quantum number, the magnetic quantum number, describes the magnetic moment of the electron, and is denoted by $m_l$ (or simply $m$). The possible values for $m_{l}$ are integers from $-l$ to $l$:
 * $$m_l = -l, -(l-1), \ldots, 0, 1, \ldots, l.$$

The magnetic quantum number measures the component of the angular momentum in a particular direction. The choice of direction is arbitrary, conventionally the z-direction is chosen.

The fourth quantum number, the spin quantum number (pertaining to the "orientation" of the electron's spin) is denoted $m_s$, with values $+1/2$ or $-1/2$.

The chemist Linus Pauling wrote, by way of example: "In the case of a helium atom with two electrons in the 1s orbital, the Pauli Exclusion Principle requires that the two electrons differ in the value of one quantum number. Their values of $n$, $l$, and $m_{l}$ are the same; moreover, they have the same spin, $s = 1/2$. Accordingly they must differ in the value of $m_{s}$, which can have the value of $+1/2$ for one electron and $-1/2$ for the other."

It is the underlying structure and symmetry of atomic orbitals, and the way that electrons fill them, that determines the organisation of the periodic table and the structure and strength of chemical bonds between atoms.
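As an illustrative sketch (not part of the original text), the counting rules for the four quantum numbers can be enumerated directly; the Pauli exclusion principle then caps each shell at $2n^2$ electrons:

```python
# Sketch: enumerate every allowed (n, l, m_l, m_s) combination for a shell.
# Since no two electrons may share all four numbers, the length of this list
# is the maximum number of electrons the shell can hold: 2 * n**2.

def shell_states(n):
    """List all allowed quantum-number combinations for principal number n."""
    states = []
    for l in range(n):                    # l = 0, 1, ..., n - 1
        for m_l in range(-l, l + 1):      # m_l = -l, ..., 0, ..., +l
            for m_s in (-0.5, +0.5):      # two spin orientations
                states.append((n, l, m_l, m_s))
    return states

for n in (1, 2, 3):
    print(n, len(shell_states(n)))  # shell capacities 2, 8, 18
```

This counting is what underlies the row lengths of the periodic table mentioned above.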

Dirac wave equation
In 1928, Paul Dirac extended the Pauli equation, which described spinning electrons, to account for special relativity. The result was a theory that dealt properly with events, such as the speed at which an electron orbits the nucleus, occurring at a substantial fraction of the speed of light. By using the simplest electromagnetic interaction, Dirac was able to predict the value of the magnetic moment associated with the electron's spin, and found the experimentally observed value, which was too large to be that of a spinning charged sphere governed by classical physics. He was able to solve for the spectral lines of the hydrogen atom, and to reproduce from physical first principles Sommerfeld's successful formula for the fine structure of the hydrogen spectrum.

Dirac's equations sometimes yielded a negative value for energy, for which he proposed a novel solution: he posited the existence of an antielectron and of a dynamical vacuum. This led to the many-particle quantum field theory.

Quantum entanglement


The Pauli exclusion principle says that two electrons in one system cannot be in the same state. Nature leaves open the possibility, however, that two electrons can have both states "superimposed" over each of them. Recall that the wave functions that emerge simultaneously from the double slits arrive at the detection screen in a state of superposition. Nothing is certain until the superimposed waveforms "collapse." At that instant an electron shows up somewhere in accordance with the probability that is the square of the absolute value of the sum of the complex-valued amplitudes of the two superimposed waveforms. The situation there is already very abstract. A concrete way of thinking about entangled photons, photons in which two contrary states are superimposed on each of them in the same event, is as follows:

Imagine that the superposition of a state that can be mentally labeled as blue and another state that can be mentally labeled as red will then appear (in imagination, of course) as a purple state. Two photons are produced as the result of the same atomic event. Perhaps they are produced by the excitation of a crystal that characteristically absorbs a photon of a certain frequency and emits two photons of half the original frequency. So the two photons come out "purple." If the experimenter now performs some experiment that will determine whether one of the photons is either blue or red, then that experiment changes the photon involved from one having a superposition of "blue" and "red" characteristics to a photon that has only one of those characteristics. The problem that Einstein had with such an imagined situation was that if one of these photons had been kept bouncing between mirrors in a laboratory on earth, and the other one had traveled halfway to the nearest star, when its twin was made to reveal itself as either blue or red, that meant that the distant photon now had to lose its "purple" status too. So whenever it might be investigated after its twin had been measured, it would necessarily show up in the opposite state to whatever its twin had revealed.
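The detection rule invoked above, that the probability is the squared magnitude of the sum of the complex-valued amplitudes, can be sketched numerically. The amplitude values below are illustrative assumptions, chosen to show that adding amplitudes is not the same as adding probabilities:

```python
# Sketch: superposed amplitudes interfere. The probability of detection is
# |a1 + a2|**2, not |a1|**2 + |a2|**2, so two equal-strength contributions
# of opposite phase can cancel completely.
import cmath

a1 = cmath.rect(0.5, 0.0)        # amplitude of one waveform (assumed value)
a2 = cmath.rect(0.5, cmath.pi)   # equal amplitude, opposite phase (assumed)

p_classical = abs(a1) ** 2 + abs(a2) ** 2  # adding probabilities: 0.5
p_quantum = abs(a1 + a2) ** 2              # adding amplitudes first: ~0.0
```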

In trying to show that quantum mechanics was not a complete theory, Einstein started with the theory's prediction that two or more particles that have interacted in the past can appear strongly correlated when their various properties are later measured. He sought to explain this seeming interaction in a classical way, through their common past, and preferably not by some "spooky action at a distance." The argument is worked out in a famous paper, Einstein, Podolsky, and Rosen (1935; abbreviated EPR), setting out what is now called the EPR paradox. Assuming what is now usually called local realism, EPR attempted to show from quantum theory that a particle has both position and momentum simultaneously, while according to the Copenhagen interpretation, only one of those two properties actually exists and only at the moment that it is being measured. EPR concluded that quantum theory is incomplete in that it refuses to consider physical properties which objectively exist in nature. (Einstein, Podolsky, & Rosen 1935 is currently Einstein's most cited publication in physics journals.) In the same year, Erwin Schrödinger used the word "entanglement" and declared: "I would not call that one but rather the characteristic trait of quantum mechanics." The question of whether entanglement is a real condition is still in dispute. The Bell inequalities are the most powerful challenge to Einstein's claims.

Quantum field theory


The idea of quantum field theory began in the late 1920s with British physicist Paul Dirac, when he attempted to quantise the electromagnetic field – a procedure for constructing a quantum theory starting from a classical theory.

A field in physics is "a region or space in which a given effect (such as magnetism) exists." Other effects that manifest themselves as fields are gravitation and static electricity. In 2008, physicist Richard Hammond wrote that

"Sometimes we distinguish between quantum mechanics (QM) and quantum field theory (QFT). QM refers to a system in which the number of particles is fixed, and the fields (such as the electromagnetic field) are continuous classical entities. QFT ... goes a step further and allows for the creation and annihilation of particles ..."

He added, however, that quantum mechanics is often used to refer to "the entire notion of quantum view."

In 1931, Dirac proposed the existence of particles that later became known as anti-matter. Dirac shared the Nobel Prize in physics for 1933 with Schrödinger, "for the discovery of new productive forms of atomic theory."

Quantum electrodynamics
Quantum electrodynamics (QED) is the name of the quantum theory of the electromagnetic force. Understanding QED begins with understanding electromagnetism. Electromagnetism can be called "electrodynamics" because it is a dynamic interaction between electrical and magnetic forces. Electromagnetism begins with the electric charge.

Electric charges are the sources of, and create, electric fields. An electric field is a field which exerts a force on any particles that carry electric charges, at any point in space. This includes the electron, proton, and even quarks, among others. As a force is exerted, electric charges move, a current flows, and a magnetic field is produced. The changing magnetic field, in turn, causes electric current (moving electrons). The interacting electric and magnetic fields together are called an electromagnetic field.

The physical description of interacting charged particles, electrical currents, electrical fields, and magnetic fields is called electromagnetism.

In 1928 Paul Dirac produced a relativistic quantum theory of electromagnetism. This was the progenitor to modern quantum electrodynamics, in that it had essential ingredients of the modern theory. However, the problem of unsolvable infinities developed in this relativistic quantum theory. Years later, renormalization solved this problem. Initially viewed as a suspect, provisional procedure by some of its originators, renormalization eventually was embraced as an important and self-consistent tool in QED and other fields of physics. Also, in the late 1940s Feynman's diagrams depicted all possible interactions pertaining to a given event. The diagrams showed that the electromagnetic force is the exchange of photons between interacting particles.

An example of a prediction of quantum electrodynamics which has been verified experimentally is the Lamb shift. This refers to an effect whereby the quantum nature of the electromagnetic field causes the energy levels in an atom or ion to deviate slightly from what they would otherwise be. As a result, spectral lines may shift or split.

In the 1960s physicists realized that QED broke down at extremely high energies. From this inconsistency the Standard Model of particle physics was discovered, which remedied the breakdown of the theory at higher energies. The Standard Model unifies the electromagnetic and weak interactions into one theory. This is called the electroweak theory.

Interpretations
The physical measurements, equations, and predictions pertinent to quantum mechanics are all consistent and hold a very high level of confirmation. However, the question of what these abstract models say about the underlying nature of the real world has received competing answers.

Applications
Applications of quantum mechanics include the laser, the transistor, the electron microscope, and magnetic resonance imaging. A special class of quantum mechanical applications is related to macroscopic quantum phenomena such as superfluid helium and superconductors. The study of semiconductors led to the invention of the diode and the transistor, which are indispensable for modern electronics.

In even the simple light switch, quantum tunnelling is absolutely vital, as otherwise the electrons in the electric current could not penetrate the potential barrier made up of a layer of oxide. Flash memory chips found in USB drives also use quantum tunnelling, to erase their memory cells.

The quantisation of matter: the Bohr model of the atom
By the early 20th century, it was known that atoms consisted of a diffuse cloud of negatively-charged electrons surrounding a small, dense, positively-charged nucleus. This suggested a model in which the electrons circled around the nucleus like planets orbiting the sun. Unfortunately, it was also known that the atom in this model would be unstable: the orbiting electrons should give off electromagnetic radiation, causing them to lose energy and spiral towards the nucleus, colliding with it in a fraction of a second.

A second, related, puzzle was the emission spectrum of atoms. When a gas is heated, it gives off light at certain discrete frequencies. For example, the visible light given off by hydrogen consists of four different colours, as shown in the picture below. By contrast, white light contains light at the whole range of visible frequencies.

In 1913, Niels Bohr proposed a new model of the atom that included quantized electron orbits. This solution became known as the Bohr model of the atom. In Bohr's model, electrons could inhabit only certain orbits around the atomic nucleus. When an atom emitted or absorbed energy, the electron did not move in a continuous trajectory from one orbit around the nucleus to another, as might be expected in classical theory. Instead, the electron would jump instantaneously from one orbit to another, giving off light in the form of a photon. The possible energies of photons given off by each element were determined by the differences in energy between the orbits, and so the emission spectrum for each element would contain a number of lines. The Bohr model was able to explain the emission spectrum of hydrogen, but wasn't able to make accurate predictions for multi-electron atoms, or to explain why some spectral lines are brighter than others.

Atomic emission spectra
By the end of the nineteenth century it was known that atomic hydrogen would glow when excited, for example in an electric discharge. This light was found to be made up of only four wavelengths: the visible portion of hydrogen's emission spectrum.

In 1885 the Swiss mathematician Johann Balmer discovered that each wavelength λ in the visible spectrum of hydrogen is related to some integer n by the equation
 * $$\lambda = B\left(\frac{n^2}{n^2-4}\right) \qquad\qquad n = 3,4,5,6$$

where B is a constant which Balmer determined to be equal to 364.56 nm.
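As a check (this sketch is illustrative, not part of the original text), evaluating Balmer's formula for n = 3 to 6 reproduces hydrogen's four visible wavelengths:

```python
# Sketch: Balmer's formula lambda = B * n**2 / (n**2 - 4) for the four
# visible lines of hydrogen, using Balmer's value of B from the text.
B = 364.56  # Balmer's constant, in nm

def balmer(n):
    """Wavelength (nm) of the visible hydrogen line for integer n >= 3."""
    return B * n**2 / (n**2 - 4)

for n in (3, 4, 5, 6):
    print(n, round(balmer(n), 1))  # ~656.2, 486.1, 434.0, 410.1 nm
```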

In 1888, Johannes Rydberg generalized and greatly increased the explanatory utility of Balmer's formula. He supposed that hydrogen will emit light of wavelength λ if λ is related to two integers n and m according to what is now known as the Rydberg formula:
 * $$ \frac{1}{\lambda} = R \left(\frac{1}{m^2} - \frac{1}{n^2}\right),$$

where R is the Rydberg constant, equal to 0.0110 nm⁻¹, and n must be greater than m.
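Setting m = 2 in the Rydberg formula reproduces Balmer's series, so the two constants should be related by B = 4/R. A brief numerical sketch (illustrative, using the rounded value of R quoted above):

```python
# Sketch: with m = 2, the Rydberg formula 1/lambda = R*(1/m**2 - 1/n**2)
# rearranges to lambda = (4/R) * n**2 / (n**2 - 4), i.e. Balmer's formula
# with B = 4/R.
R = 0.0110  # Rydberg constant, in nm^-1 (rounded value from the text)

def rydberg_wavelength(m, n):
    """Wavelength (nm) predicted by the Rydberg formula, n > m."""
    return 1.0 / (R * (1.0 / m**2 - 1.0 / n**2))

b_from_r = 4.0 / R                  # ~363.6 nm, close to Balmer's 364.56 nm
h_alpha = rydberg_wavelength(2, 3)  # red hydrogen line, ~655 nm with this R
```

The small discrepancy comes entirely from the rounding of R to three significant figures.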

Rydberg's formula accounts for the four visible wavelengths by setting m = 2 and n = 3, 4, 5, 6. It also predicts additional wavelengths in the emission spectrum: for m = 1 and n > 1, the emission spectrum should contain certain ultraviolet wavelengths, and for m = 3 and n > 3, it should also contain certain infrared wavelengths. Experimental observation of these wavelengths came two decades later: in 1908 Louis Paschen found some of the predicted infrared wavelengths, and in 1914 Theodore Lyman found some of the predicted ultraviolet wavelengths.

In 1908, Walter Ritz discovered what has come to be known as the Ritz combination principle that demonstrates how new intervals among frequencies in a bright line spectrum can be discovered because there are several differences of frequencies between the energy states (or orbits) of electrons that keep repeating themselves. This principle is implicit in Heisenberg's breakthrough formulation of the new quantum mechanics in 1925.

The Bohr model of the atom


In 1913, Niels Bohr applied the notion of quantisation to electron orbits, particularly in the case of the hydrogen atom. Bohr theorised that the angular momentum, L, of an electron is quantised:
 * $$L = n\frac{h}{2\pi},$$

where n is a positive integer and h is the Planck constant. Starting from this assumption, Coulomb's law and the equations of circular motion show that an electron with n units of angular momentum will orbit a proton at a distance r given by
 * $$r = \frac{n^2 h^2}{4 \pi^2 k_e m e^2}$$,

where ke is the Coulomb constant, m is the mass of an electron, and e is the charge on an electron. For simplicity this is written as
 * $$r = n^2 a_0,\!$$

where a0, called the Bohr radius, is equal to 0.0529 nm. The Bohr radius is the radius of the smallest allowed orbit.
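The formula for r can be checked numerically. The sketch below recomputes the Bohr radius from standard SI values of the constants (assumed values, not quoted in the text):

```python
# Sketch: evaluate a0 = h**2 / (4 * pi**2 * k_e * m_e * e**2) with
# standard SI constants and confirm it lands near 0.0529 nm.
import math

h   = 6.626e-34   # Planck constant, J s
k_e = 8.988e9     # Coulomb constant, N m^2 C^-2
m_e = 9.109e-31   # electron mass, kg
e   = 1.602e-19   # elementary charge, C

a0 = h**2 / (4 * math.pi**2 * k_e * m_e * e**2)
print(a0)  # ~5.29e-11 m, i.e. 0.0529 nm
```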

The energy of the electron can also be calculated, and is given by
 * $$E = -\frac{k_{\mathrm{e}}e^2}{2a_0} \frac{1}{n^2}$$.

Thus Bohr's assumption that angular momentum is quantised means that an electron can only inhabit certain orbits around the nucleus, and that it may have only certain energies. A consequence of these constraints is that the electron will not crash into the nucleus: it cannot continuously emit energy, and it cannot come closer to the nucleus than a0.
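Evaluating the energy formula at n = 1 with standard SI constants (assumed values, not quoted in the text) recovers the familiar hydrogen ground-state energy of about −13.6 eV; a brief sketch:

```python
# Sketch: E = -k_e * e**2 / (2 * a0 * n**2), evaluated for the lowest
# orbit n = 1 and converted from joules to electronvolts.
k_e = 8.988e9     # Coulomb constant, N m^2 C^-2
e   = 1.602e-19   # elementary charge, C
a0  = 5.29e-11    # Bohr radius, m

def energy(n):
    """Energy (J) of the electron in the nth Bohr orbit."""
    return -k_e * e**2 / (2 * a0 * n**2)

E1_eV = energy(1) / e  # ~ -13.6 eV, the hydrogen ionisation energy
```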

An electron can lose energy by jumping instantaneously from its original orbit to a lower orbit; the extra energy is emitted in the form of a photon. Conversely, an electron that absorbs a photon gains energy, so it jumps to an orbit that is farther from the nucleus.

Each photon from glowing atomic hydrogen is due to an electron moving from a higher orbit, with radius rn, to a lower orbit, rm. The energy Eγ of this photon is the difference in the energies En and Em of the electron:
 * $$E_{\gamma} = E_n - E_m = \frac{k_{\mathrm{e}}e^2}{2a_0}\left(\frac{1}{m^2}-\frac{1}{n^2}\right)$$

Since Planck's equation shows that the photon's energy is related to its wavelength by $E_{\gamma} = hc/\lambda$, the wavelengths of light which can be emitted are given by
 * $$\frac{1}{\lambda} = \frac{k_{\mathrm{e}}e^2}{2 a_0 h c}\left(\frac{1}{m^2}-\frac{1}{n^2}\right).$$

This equation has the same form as the Rydberg formula, and predicts that the constant R should be given by
 * $$R = \frac{k_{\mathrm{e}}e^2}{2 a_0 h c} .$$

Therefore the Bohr model of the atom can predict the emission spectrum of hydrogen in terms of fundamental constants.
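This can be confirmed numerically. The sketch below evaluates the expression for R with standard SI constants (assumed values, not quoted in the text) and compares the result with Rydberg's empirical 0.0110 nm⁻¹:

```python
# Sketch: R = k_e * e**2 / (2 * a0 * h * c), then converted from m^-1
# to nm^-1 for comparison with Rydberg's empirical value.
k_e = 8.988e9     # Coulomb constant, N m^2 C^-2
e   = 1.602e-19   # elementary charge, C
a0  = 5.29e-11    # Bohr radius, m
h   = 6.626e-34   # Planck constant, J s
c   = 2.998e8     # speed of light, m/s

R = k_e * e**2 / (2 * a0 * h * c)  # ~1.097e7 m^-1
R_nm = R * 1e-9                    # ~0.0110 nm^-1, matching Rydberg
```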