Experiments of Rayleigh and Brace

The experiments of Rayleigh and Brace (1902, 1904) were intended to show whether length contraction leads to birefringence. They were among the first optical experiments on the relative motion of Earth and the luminiferous aether that were sufficiently precise to detect magnitudes of second order in v/c. The results were negative, which was of great importance for the development of the Lorentz transformation and, consequently, of the theory of relativity. See also Tests of special relativity.

The experiments
To explain the negative outcome of the Michelson–Morley experiment, George FitzGerald (1889) and Hendrik Lorentz (1892) introduced the contraction hypothesis, according to which a body is contracted during its motion through the stationary aether.

Lord Rayleigh (1902) interpreted this contraction as a mechanical compression that should lead to optical anisotropy of materials, so the different refractive indices would cause birefringence. To measure this effect, he installed a tube of length 76 cm upon a rotatable table. The tube was closed by glass at its ends and filled with carbon bisulphide or water, and the liquid was placed between two Nicol prisms. Through the liquid, light (produced by an electric lamp and, more importantly, by limelight) was sent to and fro. The experiment was sufficiently precise to measure retardations of $1⁄6000$ of a half wavelength, i.e. of the order $1.2\times10^{-11}$. Depending on the direction relative to Earth's motion, the expected retardation due to birefringence was of order $10^{-8}$, which was well within the accuracy of the experiment. It was therefore, besides the Michelson–Morley and Trouton–Noble experiments, one of the few experiments by which magnitudes of second order in v/c could be detected. However, the result was completely negative. Rayleigh repeated the experiments with layers of glass plates (although with precision diminished by a factor of 100), and again obtained a negative result.
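The orders of magnitude involved can be checked with a short back-of-the-envelope calculation (a sketch only; the wavelength is an assumed round value for visible light, since Rayleigh used several light sources):

```python
# Rough check of the orders of magnitude in Rayleigh's experiment.
# Assumed values: v = Earth's orbital speed, wavelength = generic visible light.
v = 3.0e4            # Earth's orbital speed, m/s (assumed)
c = 3.0e8            # speed of light, m/s
wavelength = 5.6e-7  # assumed visible wavelength, m
path = 2 * 0.76      # light sent to and fro through the 76 cm tube, m

# The contraction hypothesis predicts an anisotropy of second order in v/c:
expected = (v / c) ** 2
print(f"expected relative retardation ~ {expected:.0e}")    # 1e-08

# Smallest measurable retardation: 1/6000 of a half wavelength over the path:
measurable = (wavelength / 2) / 6000 / path
print(f"measurable relative retardation ~ {measurable:.0e}")  # 3e-11
```

This makes plain why the expected effect of order $10^{-8}$ was "well within the accuracy" of an apparatus sensitive to relative retardations of order $10^{-11}$.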

However, those experiments were criticized by DeWitt Bristol Brace (1904). He argued that Rayleigh had not properly considered the consequences of contraction (a retardation of $0.5\times10^{-8}$ instead of $10^{-8}$) or of the refractive index, so the results were not conclusive. Brace therefore conducted experiments of much higher precision. He employed an apparatus 4.13 m long, 15 cm wide, and 27 cm deep, which was filled with water and which could be rotated (depending on the experiment) about a vertical or a horizontal axis. Sunlight was directed into the water through a system of lenses, mirrors and reflection prisms, and was reflected 7 times, so that it traversed 28.5 m. In this way a retardation of the order $7.8\times10^{-13}$ was observable. However, Brace also obtained a negative result. Another experimental arrangement, with glass instead of water (precision: $4.5\times10^{-11}$), also yielded no sign of birefringence.
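Taking the quoted figures at face value, the margin between Brace's sensitivity and the expected effect can be made explicit (a rough sketch using the numbers above):

```python
# Margin of Brace's water apparatus over the expected contraction effect.
expected = 0.5e-8      # expected retardation, per Brace's corrected estimate
sensitivity = 7.8e-13  # smallest observable retardation in the water apparatus
margin = expected / sensitivity
print(f"margin ~ {margin:.0f}x")  # margin ~ 6410x
```

The expected effect thus exceeded the detection threshold by nearly four orders of magnitude, which is why Brace's null result carried such weight.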

The absence of birefringence was initially interpreted by Brace as a refutation of length contraction. However, Lorentz (1904) and Joseph Larmor (1904) showed that when the contraction hypothesis is maintained and the complete Lorentz transformation is employed (i.e. including the time transformation), the negative outcome can be explained. Furthermore, if the relativity principle is taken as valid from the outset, as in Albert Einstein's theory of special relativity (1905), the result follows immediately, since an observer in uniform translational motion can consider himself as at rest and consequently experiences no effect of his own motion. Length contraction is thus not measurable by a comoving observer, and must be supplemented by time dilation for non-comoving observers, which was subsequently confirmed by the Trouton–Rankine experiment (1908) and the Kennedy–Thorndike experiment (1932).