Variable speed of light

A variable speed of light (VSL) is a feature of a family of hypotheses stating that the speed of light may in some way not be constant, for example, varying in space or time, or depending on frequency. Accepted classical theories of physics, and in particular general relativity, predict a constant speed of light in any local frame of reference; in some situations they also predict apparent variations of the speed of light depending on the frame of reference, but this article does not refer to such effects as a variable speed of light. Various alternative theories of gravitation and cosmology, many of them non-mainstream, incorporate variations in the local speed of light.

Attempts to incorporate a variable speed of light into physics were made by Robert Dicke in 1957, and by several researchers starting from the late 1980s.

VSL should not be confused with faster-than-light theories, with the dependence of the speed of light on a medium's refractive index, or with its measurement in a remote observer's frame of reference in a gravitational potential. In this context, the "speed of light" refers to the limiting speed c of the theory rather than to the velocity of propagation of photons.

Background
Einstein's equivalence principle, on which general relativity is founded, requires that in any local, freely falling reference frame, the speed of light is always the same. This leaves open the possibility, however, that an inertial observer inferring the apparent speed of light in a distant region might calculate a different value. Spatial variation of the speed of light in a gravitational potential as measured against a distant observer's time reference is implicitly present in general relativity. The apparent speed of light will change in a gravity field and, in particular, go to zero at an event horizon as viewed by a distant observer. In deriving the gravitational redshift due to a spherically symmetric massive body, a radial speed of light dr/dt can be defined in Schwarzschild coordinates, with t being the time recorded on a stationary clock at infinity. The result is
 * $$ \frac{dr}{dt} = 1 - \frac{2m}{r}, $$

where $$ m = GM/c^2 $$ and where natural units are used such that $$ c_0 $$ is equal to one.
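The behaviour of this coordinate speed can be illustrated numerically. The following sketch (the function name and sample values are illustrative, not from any source) evaluates $$ dr/dt = 1 - 2m/r $$ in the same natural units, showing that the value approaches one far from the mass and tends to zero near the horizon at r = 2m:

```python
# Coordinate radial speed of light in Schwarzschild coordinates, in
# natural units with c0 = 1, following dr/dt = 1 - 2m/r where m = GM/c^2.
# A minimal illustrative sketch; r and m are in the same geometric units.

def coordinate_light_speed(r, m):
    """Radial coordinate speed dr/dt as inferred by a distant observer."""
    if r <= 2 * m:
        raise ValueError("r must lie outside the event horizon r = 2m")
    return 1 - 2 * m / r

# Far from the mass the coordinate speed approaches c0 = 1 ...
print(coordinate_light_speed(1e6, 1.0))  # 0.999998
# ... and it tends to zero as r approaches the horizon at r = 2m.
print(coordinate_light_speed(2.000001, 1.0))
```

Locally, a freely falling observer at any of these radii still measures the speed of light to be exactly one; only the distant observer's inferred coordinate speed varies.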

Dicke's proposal (1957)
Robert Dicke, in 1957, developed a VSL theory of gravity, a theory in which (unlike general relativity) the speed of light measured locally by a free-falling observer could vary. Dicke assumed that both frequencies and wavelengths could vary, which, since $$ c = \nu \lambda $$, resulted in a relative change of c. Dicke assumed a refractive index $$ n= \frac{c}{c_0} = 1+\frac{2 GM}{r c^2} $$ (eqn. 5) and showed it to be consistent with the observed value for light deflection. In a comment related to Mach's principle, Dicke suggested that, while the second term on the right-hand side of eq. 5 is small, the first term, 1, could have "its origin in the remainder of the matter in the universe".
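The size of the effect in eq. 5 can be checked with a rough numerical sketch (not part of Dicke's paper; constants are standard approximate values), evaluating the refractive-index correction at the surface of the Sun and the corresponding deflection of a grazing light ray:

```python
# Evaluate the correction term 2GM/(r c^2) of Dicke's refractive index
# (eqn. 5) at the solar surface. Illustrative sketch with approximate
# physical constants, not a calculation from Dicke's paper.

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg
R_sun = 6.957e8    # solar radius, m

n_minus_1 = 2 * G * M_sun / (R_sun * c**2)
print(f"n - 1 at the solar surface: {n_minus_1:.2e}")  # ~4.2e-6

# The deflection of a grazing ray, 4GM/(R c^2), matches the
# general-relativistic value of about 1.75 arcseconds.
deflection_arcsec = (4 * G * M_sun / (R_sun * c**2)) * 180 / math.pi * 3600
print(f"deflection: {deflection_arcsec:.2f} arcsec")  # ~1.75
```

The smallness of the correction (a few parts per million even at the solar surface) is what makes Dicke's remark about the origin of the leading 1 significant.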

Given that in a universe with an increasing horizon more and more masses contribute to the above refractive index, Dicke considered a cosmology where c decreased in time, providing an alternative explanation to the cosmological redshift.

Subsequent proposals
Variable speed of light models, including Dicke's, have been developed which agree with all known tests of general relativity.

Other models make a link to Dirac's large numbers hypothesis.

Several hypotheses for varying speed of light, seemingly in contradiction to general relativity theory, have been published, including those of Giere and Tan (1986) and Sanejouand (2009). In 2003, Magueijo gave a review of such hypotheses.

Cosmological models with varying speeds of light have been proposed independently by Jean-Pierre Petit in 1988, John Moffat in 1992, and the team of Andreas Albrecht and João Magueijo in 1998 to explain the horizon problem of cosmology and propose an alternative to cosmic inflation.

Gravitational constant G
In 1937, Paul Dirac and others began investigating the consequences of natural constants changing with time. For example, Dirac proposed a change of only 5 parts in $$10^{11}$$ per year of the Newtonian constant of gravitation G to explain the relative weakness of the gravitational force compared to other fundamental forces. This has become known as the Dirac large numbers hypothesis.

However, Richard Feynman showed that the gravitational constant most likely could not have changed this much in the past 4 billion years based on geological and solar system observations, although this may depend on assumptions about G varying in isolation. (See also strong equivalence principle.)
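The scale of the drift Dirac proposed can be seen with back-of-the-envelope arithmetic (illustrative only; the rate and time span are the figures quoted above, and the decay model is a simplifying assumption):

```python
# Cumulative fractional change in G implied by Dirac's drift rate of
# 5 parts in 10^11 per year over the ~4 billion years covered by
# Feynman's geological and solar-system argument. Illustrative only.

import math

rate = 5e-11   # fractional change in G per year (Dirac's figure)
years = 4e9    # look-back time in Feynman's argument

linear_change = rate * years                    # simple linear estimate
compound_change = 1 - math.exp(-rate * years)   # exponential-decay model

print(f"linear:   {linear_change:.0%}")    # 20%
print(f"compound: {compound_change:.1%}")  # ~18.1%
```

A change of roughly a fifth in G over geological time would leave detectable traces in the history of the Earth and the Solar System, which is the basis of Feynman's objection.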

Fine-structure constant α
One group, studying distant quasars, has claimed to detect a variation of the fine-structure constant at the level of one part in $$10^5$$. Other authors dispute these results, and other groups studying quasars report no detectable variation at much higher sensitivities.

The natural nuclear reactor of Oklo has been used to check whether the fine-structure constant α might have changed over the past 2 billion years, because α influences the rate of various nuclear reactions. For example, samarium-149 captures a neutron to become samarium-150, and since the rate of neutron capture depends on the value of α, the ratio of the two samarium isotopes in samples from Oklo can be used to calculate the value of α from 2 billion years ago. Several studies have analysed the relative concentrations of radioactive isotopes left behind at Oklo, and most have concluded that nuclear reactions then were much the same as they are today, which implies that α was the same too.

Paul Davies and collaborators have suggested that it is in principle possible to disentangle which of the dimensionful constants of which the fine-structure constant is composed (the elementary charge, the Planck constant, and the speed of light) is responsible for the variation. However, this has been disputed by others and is not generally accepted.

Dimensionless and dimensionful quantities
To clarify what a variation in a dimensionful quantity actually means, since any such quantity can be changed merely by changing one's choice of units, John Barrow wrote:
 * "[An] important lesson we learn from the way that pure numbers like α define the world is what it really means for worlds to be different. The pure number we call the fine-structure constant and denote by α is a combination of the electron charge, e, the speed of light, c, and the Planck constant, h. At first we might be tempted to think that a world in which the speed of light was slower would be a different world. But this would be a mistake. If c, h, and e were all changed so that the values they have in metric (or any other) units were different when we looked them up in our tables of physical constants, but the value of α remained the same, this new world would be observationally indistinguishable from our world. The only thing that counts in the definition of worlds are the values of the dimensionless constants of Nature. If all masses were doubled in value [including the Planck mass mP] you cannot tell because all the pure numbers defined by the ratios of any pair of masses are unchanged."

Any equation of physical law can be expressed in a form in which all dimensional quantities are normalized against like-dimensioned quantities (a procedure called nondimensionalization), leaving only dimensionless quantities. Physicists can choose their units so that the physical constants c, G, ħ = h/(2π), 4πε0, and kB take the value one, so that every physical quantity is normalized against its corresponding Planck unit. For this reason, it has been claimed that specifying the evolution of a dimensional quantity is meaningless. When Planck units are used and equations of physical law are expressed in this nondimensionalized form, no dimensional physical constants such as c, G, ħ, ε0, or kB remain, only dimensionless quantities, as predicted by the Buckingham π theorem. Apart from their dependence on anthropometric units, no speed of light, gravitational constant, or Planck constant remains in the mathematical expressions of physical reality to be subject to such hypothetical variation. For example, in the case of a hypothetically varying gravitational constant G, the relevant dimensionless quantities that potentially vary ultimately become the ratios of the Planck mass to the masses of the fundamental particles. Some key dimensionless quantities thought to be constant that are related to the speed of light (among other dimensional quantities such as ħ, e, and ε0), notably the fine-structure constant and the proton-to-electron mass ratio, could in principle vary meaningfully, and their possible variation continues to be studied.
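Barrow's point can be made concrete with a small sketch (the rescaling factor is arbitrary and illustrative; constants are CODATA values): the fine-structure constant $$ \alpha = \frac{e^2}{4\pi\varepsilon_0 \hbar c} $$ keeps the same numerical value under any redefinition of units, even though the numerical values of e, ħ, c, and ε0 individually change.

```python
# The fine-structure constant alpha = e^2 / (4 pi eps0 hbar c) is a pure
# number, invariant under any change of units. Here we rescale the time
# unit by an arbitrary factor s; each constant's numerical value changes
# according to its power of time (c ~ T^-1, hbar ~ T^-1, eps0 ~ T^2,
# with the charge unit held fixed), yet alpha is unchanged.

import math

e    = 1.602176634e-19   # elementary charge, C
hbar = 1.054571817e-34   # reduced Planck constant, J s
c    = 2.99792458e8      # speed of light, m/s
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

def alpha(e, eps0, hbar, c):
    return e**2 / (4 * math.pi * eps0 * hbar * c)

print(1 / alpha(e, eps0, hbar, c))  # ~137.036

# Make the time unit s = 7 times larger and rescale numerical values:
s = 7.0
c2, hbar2, eps02 = c * s, hbar * s, eps0 / s**2

# The dimensionless combination is numerically identical: by Barrow's
# argument, the two descriptions are the same world.
assert math.isclose(alpha(e, eps0, hbar, c), alpha(e, eps02, hbar2, c2))
```

This is why only a variation in a dimensionless combination such as α, and not in c alone, is an observationally meaningful statement.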

General critique of varying c cosmologies
From a very general point of view, G. F. R. Ellis and Jean-Philippe Uzan expressed concerns that a varying c would require a rewrite of much of modern physics to replace the current system, which depends on a constant c. Ellis claimed that any varying-c theory (1) must redefine distance measurements; (2) must provide an alternative expression for the metric tensor in general relativity; (3) might contradict Lorentz invariance; (4) must modify Maxwell's equations; and (5) must be formulated consistently with respect to all other physical theories. VSL cosmologies remain outside mainstream physics.