User:FT2/SM


The history of the Standard Model of particle physics describes the development, over time, of the current prevailing scientific theory of the known fundamental forces and elementary particles.

In particle physics, elementary particles and fundamental forces give rise to the world around us. Physicists explain the behaviour of these particles and how they interact using the Standard Model, a widely accepted and remarkably accurate framework based on quantum fields and symmetries.

This theory developed gradually and piecemeal, evolving from attempts to develop and unify initially separate field theories such as quantum electrodynamics (QED) and theories of the strong interaction. It reached something like its current form and status during the 1970s, once a quantum gauge field theory was developed that unified two of the four known fundamental forces, the electromagnetic and weak forces; the unified force became known as the electroweak interaction. Within a few years the electroweak model was merged with quantum chromodynamics, which very accurately describes the strong interaction, producing a mathematical model of three of the four known fundamental forces. (The fourth known force, gravity, is not yet part of a unified model.)

As a scientific theory, the Standard Model is continually explored and developed to take account of known possibilities and experimental findings, and is tested to ensure its predictions match what is measured in reality, a test of falsifiability. These tests have shown "remarkably accurate" predictions being met, yet there is a widespread sense that the Standard Model is incomplete, insufficient, or prone to inconsistencies, prompting a belief that "new" physics will need to be discovered or theorized in future to extend or replace it. Extensions and alternative theories that go further, known as "physics beyond the Standard Model", already exist, but it is not known which, if any, of these will be borne out by experimental data.

Pre-1950s physics
The direct precursors to the Standard Model are found in work from the 1950s and 1960s. Prior to around 1950, physics had gone through a number of stages, discoveries, and formulations making these possible.

The modern physics revolution (c.1890 - 1940)
Classical mechanics, thermodynamics, optics, and various conservation laws had been developed to a high level and formulated elegantly, to the point that until the final years of the 19th century these were widely believed to explain most or all important physical phenomena.

This belief was abruptly overturned by a series of new insights, theories, and discoveries: the paradigm-changing advances of the late 19th and early 20th centuries, such as the discovery of the quantization of energy, quantum mechanics, mass-energy equivalence, special and general relativity, closer study of condensed matter physics, and the discovery of subatomic particles. A revolution in knowledge took place. While some of these understandings remained controversial or debated for many years, the new principles and understandings in these fields gradually became the core of modern physics.

Reconceptualization of fundamental theories and the development of quantum field theory
The first field theories had been developed over many decades and showed great promise. Electromagnetism, and later gravity, were initially described within classical field theory; the discovery of special relativity and Lorentz covariance (now recognised as a fundamental aspect of nature) led to covariant classical field theory, in which Lagrangians give rise to field equations and conservation laws for the theory. Shortly after, around 1927, Paul Dirac made the first attempt to integrate quantum mechanics and field theory, leading to the initial development of early quantum field theory. Subsequent work included proof that quantum fields could be made consistent with special relativity (Jordan and Pauli showed Lorentz invariance in 1928), the Dirac equation of 1928, and the work of Ambartsumian and Ivanenko on the creation of massive particles and the birth and disappearance of particles during interactions (the Ambartsumian-Ivanenko hypothesis, 1930, among other works).

Field theories and the discovery of subatomic particles had, in turn, led to the concept of exchange particles, in which forces were now considered as sometimes being mediated by particles, and both forces and particles were at times recast as phenomena of more fundamental fields and their symmetries.

As a result of the above, by around 1950 physicists had developed simple yet powerful formulations of theory for entire areas of physics. These included electromagnetism and (in the 20th century) quantum electrodynamics, a relativistic quantum field theory of electrodynamics reconciling quantum mechanics and special relativity for electromagnetism, as well as some kinds of theories of superconductivity. The concept had long existed that a so-called "Grand Unified Theory" or "theory of everything" might exist that would bring all of physics into one theoretical framework.

Theoretical problems (c.1950)
Many of the theoretical advances described above were still relatively new or still being explored at the start of the 1950s. A number of areas remained disparate or undeveloped in modern terms, and in some it was unclear which theory might best explain observations. (For example, an ever-increasing number of particles was being identified, yet it was unclear what logic or rationale might explain or give structure to this knowledge.) Satisfactory theories had been developed in some areas but not others, and conceptually satisfactory explanations were also lacking in places.

While field theories seemed promising, approaches at the time were plagued by serious theoretical and mathematical difficulties. Even basic quantities, such as the self-energy of the electron, appeared to give nonsensical, infinite ("divergent") results when computed using the techniques available in the 1930s and most of the 1940s. (Some, such as the electron self-energy problem, were already serious issues in classical theory.) There was little or no formal evidence for which kinds of mathematical approaches might be "more correct" than others. The situation was dire, and had features reminiscent of the ultraviolet catastrophe (or "Rayleigh-Jeans difficulty") of classical theory. What made the situation in the 1940s so desperate and gloomy, however, was that the correct ingredients for the theoretical description of interacting photons and electrons (the second-quantized Maxwell-Dirac field equations) were well in place, and no major conceptual change seemed needed analogous to the one necessitated by a finite and physically sensible account of the radiative behaviour of hot objects, as provided by the Planck radiation law.

Inroads into this "divergence problem" were made during the late 1940s and early 1950s, and a solution was found, for quantum electrodynamics at least, through the procedure known as renormalization, although divergences remained a grave and unresolved problem for such theories generally.

1950s to c. 1961: Yang-Mills theory and Goldstone's theorem
By the early 1950s, particle physicists had turned from classical physics to instead study matter made from fundamental particles whose interactions were mediated by exchange particles known as force carriers, and a number of these particles had been discovered or proposed, along with theories suggesting how they relate to each other. Some of these theories had already been reformulated as field theories in which the objects of study are not particles and forces, but quantum fields and their symmetries. However, attempts to unify known fundamental forces such as the electromagnetic force and the weak nuclear force were known to be incomplete.

One known omission was that gauge invariant approaches, including non-abelian models such as Yang–Mills theory (1954), which held great promise for unified theories, seemed to predict known massive particles as massless and to predict the existence of massless spin-1 bosons. Goldstone's theorem, relating to continuous symmetries within some theories, also appeared to rule out many obvious solutions, since it too implied that zero-mass particles and new forces would have to exist that were "simply not seen". According to Guralnik, physicists had "no understanding" of how these problems could be overcome.
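The obstruction can be stated in one line. The following is a standard textbook illustration, not drawn from the sources discussed here: an explicit mass term for a gauge field is incompatible with gauge invariance.

```latex
% A naive mass term for a gauge field A_\mu,
\mathcal{L}_{\text{mass}} \;=\; \tfrac{1}{2}\, m^2\, A_\mu A^\mu ,
% is not invariant under the gauge transformation
%   A_\mu \to A_\mu + \partial_\mu \alpha(x),
% so an exact gauge symmetry appears to force m = 0,
% i.e. massless spin-1 bosons.
```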

Steven Weinberg described the mood more darkly, as a time of "frustration and confusion" following the success of quantum electrodynamics, in which attempts to apply these methods to other forces failed badly, mathematical divergences and nonsensical results seemed perennial, and deeper problems loomed: even when theories were attempted, there seemed "no rationale for any of these theories. The weak interaction theory was simply cobbled together to fit what experimental data was available and there was no evidence at all for any particular theory of the strong interaction." Theoreticians became disillusioned with quantum field theory and looked for other ways out of these problems, notably strong interaction theories largely derived from dynamics and S-matrix theory, and solutions based on symmetries, which at least allowed predictions to be made even in the absence of any real understanding. The so-called "mass problem", however, seemed insurmountable.

The Higgs mechanism and electroweak theory (c. 1960 - 1972)
The breakthrough on these problems, which ultimately led to the formulation of the Standard Model and the final acceptance of gauge field theories in the 1970s, took place in several stages over around 12 years. For much of that time the discoveries and theoretical steps involved received only minor mention, and only in hindsight were they seen as landmarks. For example, Weinberg's seminal paper creating the electroweak theory, which ultimately won him a Nobel Prize, was cited a mere handful of times in its first three years; similarly, the crucial 1964 PRL papers, which showed that a way might exist to bypass Goldstone's theorem and the mass problem, triggered a range of interested research but were not seen at the time as momentous. In the subsequent 30 years a number of Nobel Prizes were awarded for these works, and in 2010 the 1964 papers were finally recognized as milestone papers in PRL's 50th-anniversary celebration. Additionally, all six of the papers' authors were awarded the 2010 J. J. Sakurai Prize for Theoretical Particle Physics for this work.

Particle physicist and mathematician Peter Woit summarised the state of research in the early 1960s:


 * "Yang and Mills work on non-abelian gauge theory had one huge problem: in perturbation theory it has massless particles which don’t correspond to anything we see. One way of getting rid of this problem is now fairly well-understood, the phenomenon of confinement realized in QCD, where the strong interactions get rid of the massless “gluon” states at long distances. By the very early sixties, people had begun to understand another source of massless particles: spontaneous symmetry breaking of a continuous symmetry. What Philip Anderson realized and worked out in the summer of 1962 was that, when you have both gauge symmetry and spontaneous symmetry breaking, the Nambu-Goldstone massless mode can combine with the massless gauge field modes to produce a physical massive vector field. This is what happens in superconductivity, a subject about which Anderson was (and is) one of the leading experts." [text condensed]

According to Weinberg, the theoretical breakthroughs of particle physics in the 1960s were based on a few key ideas: the quark model, local ("gauge") symmetries, and spontaneous symmetry breaking (technically, symmetries of the Lagrangian may exist that are not symmetries of the vacuum). There had also been fundamental points of confusion as physicists grappled with 1950s results; the concept of 'approximate' symmetries was one such issue. These new theoretical approaches quickly led to a description of the Higgs mechanism (as it later became known), a process by which rest mass might arise and the massless particles (whether Yang-Mills or Goldstone) be avoided in a gauge theory without explicitly breaking gauge invariance.

The mechanism is the key element of the electroweak theory that forms part of the Standard Model of particle physics, and of many models, such as Grand Unified Theories, that go beyond it.

The Higgs mechanism is a process by which vector bosons can acquire rest mass without explicitly breaking gauge invariance. Such a spontaneous symmetry breaking mechanism was originally suggested in 1962 by Philip Warren Anderson and developed into a full relativistic model, independently and almost simultaneously, by three groups of physicists: by François Englert and Robert Brout in August 1964; by Peter Higgs in October 1964; and by Gerald Guralnik, C. R. Hagen, and Tom Kibble (GHK) in November 1964. Properties of the model were further considered by Guralnik in 1965, by Higgs in 1966, by Kibble in 1967, and further by GHK in 1967. The papers confirmed that Yang–Mills particle theories could work despite Goldstone's theorem: when a gauge theory is combined with an additional field that spontaneously breaks the symmetry, massless bosons need not appear. Instead, the long-range Yang–Mills gauge fields and their bosons can interact with the massless bosons from the broken symmetry, and the gauge bosons can then consistently acquire a finite mass. The requisite field would need an unusual "Mexican hat" potential: below a certain very high energy level, minimum energy is achieved not at the field's centre (or "zero point") but at a different, non-zero point, causing a transition in which the local symmetry spontaneously breaks; particles interacting with the field acquire a consistent mass, without additional massless particles being produced.CITE
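The "Mexican hat" shape can be illustrated with the standard quartic potential, a textbook form rather than notation taken from the 1964 papers themselves; here μ and λ are assumed positive parameters:

```latex
V(\phi) \;=\; -\,\mu^2\, \phi^\dagger \phi \;+\; \lambda\, (\phi^\dagger \phi)^2 ,
\qquad \mu^2,\; \lambda \;>\; 0 .
% V is minimised not at \phi = 0 but on the circle
%   |\phi| = v/\sqrt{2}, \qquad v = \sqrt{\mu^2/\lambda},
% so the vacuum picks out a direction and the symmetry
% breaks spontaneously.
```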

A parallel theory existed in Anderson's field, superconductivity: Nambu-Goldstone bosons were not observed when solid-state symmetries were spontaneously broken, so the idea was not completely unprecedented.CITE However, these new ideas took until the early 1970s to win full acceptance.CITE The first major step came in 1967, when Steven Weinberg and Abdus Salam independently showed how a Higgs mechanism could be used to break the electroweak symmetry of Sheldon Glashow's unified model of the weak and electromagnetic interactions, forming what became the Standard Model of particle physics. Weinberg had been working on Higgs-mechanism explanations of the strong interaction but then switched to leptons and the electroweak interaction. He was the first to observe that the mechanism would also provide mass terms for the fermions, and he believed, because of the theory's origins, that it should be renormalizable, but did not prove this.CITE

The seminal 1960s papers on spontaneous breaking of gauge symmetries were still largely ignored at first, because it was widely believed that the (non-Abelian gauge) theories in question were fundamentally flawed, as they were not renormalizable. This rapidly changed when Gerard 't Hooft and Tini Veltman explicitly demonstrated the renormalizability of spontaneously broken gauge theories in 1971-72, and Benjamin Lee developed a simpler proof of renormalization and, with collaborators, published a series of "influential" papers (1972-73) popularising the concepts;CITE the ideas were then quickly absorbed into the mainstream.

In 1961, Glashow had extended electroweak unification models due to Schwinger by including a short-range neutral current, the Z0. The resulting symmetry structure that Glashow proposed, SU(2) × U(1), forms the basis of the accepted theory of the electroweak interactions.
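In modern notation (standard conventions, not specific to Glashow's 1961 paper), the two neutral gauge fields of SU(2) × U(1) mix into the physical photon and the Z0, with the mixing fixed by the weak (Weinberg) angle:

```latex
% The neutral SU(2) field W^3_\mu and the U(1) field B_\mu
% combine into the photon A_\mu and the Z boson Z_\mu:
A_\mu \;=\; \phantom{-}B_\mu \cos\theta_W \;+\; W^3_\mu \sin\theta_W \quad \text{(photon)},
\qquad
Z_\mu \;=\; -\,B_\mu \sin\theta_W \;+\; W^3_\mu \cos\theta_W \quad \text{(Z boson)}.
```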

The mechanism, field, and theory of the 1964 papers became central to modern particle physics around eight years later (c. 1972), following subsequent work by a number of other theorists: Weinberg (and almost simultaneously Salam) applied the 1964 papers' theory to the electroweak interaction in his seminal 1967 paper "A Model of Leptons"; 't Hooft and Veltman overcame the last major perceived hurdle in 1971-72 by proving that the resulting theory was renormalizable and could produce "sensible" results; and Lee subsequently published a number of papers on these theories and popularised them around the same time, by which point they had begun to win mainstream recognition. These researches, now considered milestones, gained little recognition until renormalizability was confirmed. For example, Weinberg's 1967 paper, which as of 2012 ranks as the top-cited paper in the field of high energy physics, was cited only a handful of times in its first three years. Several of these works subsequently resulted in Nobel Prizes in their own right, and a further prize is widely expected to be awarded if the Higgs boson's existence is proven.

The three papers written in 1964 were each recognised as milestone papers during Physical Review Letters' 50th-anniversary celebration. Their six authors were also awarded the 2010 J. J. Sakurai Prize for Theoretical Particle Physics for this work. (A controversy also arose the same year because, in the event of a Nobel Prize, only up to three scientists could be recognised, while six were credited for the papers.) Two of the three PRL papers (by Higgs and by GHK) contained equations for the hypothetical field that would eventually become known as the Higgs field and its hypothetical quantum, the Higgs boson. Higgs's subsequent 1966 paper showed the decay mechanism of the boson; only a massive boson can decay, and its decays can prove the mechanism.

In the paper by Higgs the boson is massive, and in a closing sentence Higgs writes that "an essential feature" of the theory "is the prediction of incomplete multiplets of scalar and vector bosons". In the paper by GHK the boson is massless and decoupled from the massive states. In reviews dated 2009 and 2011, Guralnik states that in the GHK model the boson is massless only in a lowest-order approximation: it is not subject to any constraint and acquires mass at higher orders. He adds that the GHK paper was the only one to show that there are no massless Goldstone bosons in the model and to give a complete analysis of the general Higgs mechanism.

In addition to explaining how mass is acquired by vector bosons, the Higgs mechanism also predicts the ratio between the W boson and Z boson masses, as well as their couplings with each other and with the Standard Model quarks and leptons. Many of these predictions have since been verified by precise measurements at the LEP and SLC colliders, overwhelmingly confirming that some kind of Higgs mechanism does take place in nature; the exact manner in which it happens, however, has not yet been discovered. The results of searching for the Higgs boson are expected to provide evidence about how this is realized in nature.
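The mass-ratio prediction mentioned above follows directly from the symmetry-breaking pattern. These are the standard tree-level relations, with g and g' the assumed SU(2) and U(1) couplings and v the vacuum expectation value of the Higgs field:

```latex
m_W \;=\; \tfrac{1}{2}\, g\, v , \qquad
m_Z \;=\; \tfrac{1}{2}\, \sqrt{g^2 + g'^2}\;\, v ,
% so the ratio is fixed by the weak mixing angle alone:
\qquad
\frac{m_W}{m_Z} \;=\; \frac{g}{\sqrt{g^2 + g'^2}} \;=\; \cos\theta_W .
```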