Phase-contrast imaging

Phase-contrast imaging is a family of imaging methods with a wide range of applications. It exploits differences in the refractive index of different materials to differentiate between the structures under analysis. In conventional light microscopy, phase contrast can be employed to distinguish between structures of similar transparency, and to examine crystals on the basis of their double refraction. This has uses in biological, medical and geological science. In X-ray tomography, the same physical principles can be used to increase image contrast by highlighting small details of differing refractive index within structures that are otherwise uniform. In transmission electron microscopy (TEM), phase contrast enables very high resolution (HR) imaging, making it possible to distinguish features a few ångströms apart (at present the highest resolution is about 40 pm).

Atomic Physics
Phase-contrast imaging is commonly used in atomic physics to describe a range of techniques for dispersively imaging ultracold atoms. Dispersion refers to the frequency dependence of the propagation of electromagnetic fields (light) in matter. In general, the refractive index of a material, which alters the phase velocity and refraction of the field, depends on the wavelength or frequency of the light. This is what gives rise to the familiar behavior of prisms, which are seen to split light into its constituent wavelengths. Microscopically, we may think of this behavior as arising from the interaction of the electromagnetic wave with the atomic dipoles. The oscillating field drives the dipoles to oscillate and thereby reradiate light with the same polarization and frequency, albeit delayed or phase-shifted from the incident wave. These waves interfere to produce the altered wave which propagates through the medium. If the light is monochromatic (that is, an electromagnetic wave of a single frequency or wavelength), with a frequency close to an atomic transition, the atom will also absorb photons from the light field, reducing the amplitude of the incident wave. Mathematically, these two interaction mechanisms (dispersive and absorptive) are commonly written as the real and imaginary parts, respectively, of a complex refractive index.
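The two roles of the complex refractive index can be made concrete with a short numerical sketch. All values below (wavelength, index, thickness) are illustrative assumptions, not data for any particular atomic species:

```python
import numpy as np

# A plane wave E(z) = E0 * exp(i*k0*n*z) traversing a medium of complex
# refractive index n = n_r + i*n_i. The values here are illustrative only.
n = 1.00002 + 1e-6j        # hypothetical complex refractive index
k0 = 2 * np.pi / 780e-9    # vacuum wavenumber for an assumed 780 nm probe
L = 1e-4                   # assumed medium thickness in metres

E0 = 1.0
E_out = E0 * np.exp(1j * k0 * n * L)

phase_shift = k0 * (n.real - 1) * L      # extra phase relative to vacuum (dispersion)
attenuation = np.exp(-k0 * n.imag * L)   # amplitude reduction (absorption)

print(phase_shift, attenuation, abs(E_out) / E0)
```

The real part of $$ n $$ sets the accumulated phase, which is the dispersive signal phase-contrast methods exploit, while the imaginary part sets the exponential attenuation.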

Dispersive imaging refers strictly to the measurement of the real part of the refractive index. In phase-contrast imaging, a monochromatic probe field, detuned far from any atomic transition to minimize absorption, is shone onto an atomic medium (such as a Bose-condensed gas). Since absorption is minimized, the only effect of the gas on the light is to alter the phase of various points along its wavefront. If we write the incident electromagnetic field as

$$ \mathbf{E}_{i} = \hat{\mathbf{x}}E_0 e^{i(\omega_0 t - kz)} $$

then the effect of the medium is to phase shift the wave by some amount $$ \Phi $$ which is in general a function of $$ (x,y) $$ in the plane of the object (unless the object is of homogeneous density, i.e. of constant index of refraction), where we assume the phase shift to be small, such that we can neglect refractive effects:

$$ \mathbf{E}_{i} \to \mathbf{E}_{PM} = \hat{\mathbf{x}}E_0 e^{i(\omega_0 t - kz + \Phi)} $$

We may think of this wave as a superposition of smaller bundles of waves each with a corresponding phase shift $$ \phi(x,y) $$:

$$ \mathbf{E}_{PM} = \hat{\mathbf{x}}\frac{E_0}{A_o} \int_{(x,y)} e^{i(\omega_0 t - kz + \phi(x,y))}dx dy $$

where $$ A_o $$ is a normalization constant and the integral is over the area of the object plane. Since $$ \phi(x,y) $$ is assumed to be small, we may expand that part of the exponential to first order such that

$$\begin{align} \mathbf{E}_{PM} &\to \hat{\mathbf{x}}\frac{E_0}{A_o} e^{i(\omega_0 t - kz)}\int_{(x,y)}(1 + i\phi(x,y)) dx dy\\ &= \hat{\mathbf{x}}E_0\bigg[\cos(\omega_0 t - kz) - \frac{\tilde{\phi}}{A_o} \sin(\omega_0 t - kz) + i\bigg(\frac{\tilde{\phi}}{A_o} \cos(\omega_0 t - kz) + \sin(\omega_0 t - kz)\bigg)\bigg] \end{align}$$

where $$ \tilde{\phi} = \int \phi(x,y) dxdy $$ represents the integral over all small changes in phase to the wavefront due to each point in the area of the object. Looking at the real part of this expression, we find the sum of a wave with the original unshifted phase $$ \omega_0t - kz $$, with a wave that is $$ \pi/2 $$ out of phase and has very small amplitude $$ \frac{\tilde{\phi}}{A_o} $$. As written, this is simply another complex wave $$ E_0 e^{i\xi} $$ with phase

$$ \xi = \arctan\bigg(\frac{\frac{\tilde{\phi}}{A_o} \cos(\omega_0 t - kz) + \sin(\omega_0 t - kz)}{\cos(\omega_0 t - kz) - \frac{\tilde{\phi}}{A_o} \sin(\omega_0 t - kz)}\bigg) $$

Since imaging systems detect only changes in the intensity of the electromagnetic waves, which is proportional to the square of the electric field, we have $$ I_{PM} \propto |\mathbf{E}_{PM}|^2 = |\hat{\mathbf{x}}E_0 e^{i\xi}|^2 = E_0^2 = |\mathbf{E}_{i}|^2 $$. We see that the incident wave and the phase-shifted wave are equivalent in this respect. Such objects, which only impart phase changes to the light that passes through them, are commonly referred to as phase objects, and are for this reason invisible to any imaging system. However, if we look more closely at the real part of our phase-shifted wave

$$ \Re[\mathbf{E}_{PM}] = \hat{\mathbf{x}}E_0\bigg[\cos(\omega_0 t - kz) - \frac{\tilde{\phi}}{A_o} \sin(\omega_0 t - kz)\bigg] $$

and suppose we could retard the term unaltered by the phase object (the cosine term) by $$ \pi / 2 $$, such that $$ \cos(\omega_0 t - kz) \to \cos(\omega_0 t - kz - \pi/2) = \sin(\omega_0 t - kz) $$, then we have

$$ \Re[\mathbf{E}_{PM}] = \hat{\mathbf{x}}E_0\bigg(1-\frac{\tilde{\phi}}{A_o}\bigg)\sin(\omega_0 t - kz) $$

The phase shifts due to the phase object are effectively converted into amplitude fluctuations of a single wave. These are detectable by an imaging system, since the intensity is now $$ I \propto E_0^2 (1-\tilde{\phi}/A_o)^2$$. This is the basic idea of phase-contrast imaging. As an example, consider the setup shown in the figure on the right.



A probe laser is incident on a phase object. This could be an atomic medium such as a Bose-Einstein condensate. The laser light is detuned far from any atomic resonance, such that the phase object only alters the phase of various points along the portion of the wavefront which passes through the object. The rays which pass through the phase object will diffract as a function of the index of refraction of the medium and diverge as shown by the dotted lines in the figure. The objective lens collimates this light, while focusing the so-called 0-order light, that is, the portion of the beam unaltered by the phase object (solid lines). This light comes to a focus in the focal plane of the objective lens, where a phase plate can be positioned to delay only the phase of the 0-order beam, bringing it back into phase with the diffracted beam and converting the phase alterations in the diffracted beam into intensity fluctuations at the imaging plane. The phase plate is usually a piece of glass with a raised center encircled by a shallower etch, such that light passing through the center is delayed in phase relative to that passing through the edges.
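This conversion can be sketched numerically. The snippet below assumes a hypothetical Gaussian phase profile and models the phase plate as a $$ \pi/2 $$ retardation of the 0-order (mean) component of the field; all values are illustrative:

```python
import numpy as np

# Split the transmitted field exp(i*phi) into its 0-order (mean) part and
# diffracted (spatially varying) part, retard the 0-order by pi/2, and
# recombine. The intensity then varies linearly with phi.
x = np.linspace(-1, 1, 400)
phi = 0.05 * np.exp(-x**2 / 0.05)   # assumed small Gaussian phase imprint

E = np.exp(1j * phi)                # field after the phase object

E_0order = np.mean(E)               # unscattered (0-order) component
E_diff = E - E_0order               # diffracted component

E_pc = E_0order * np.exp(-1j * np.pi / 2) + E_diff   # phase plate acts on 0-order only

I_flat = np.abs(E)**2               # without the plate: no intensity structure
I_pc = np.abs(E_pc)**2              # with the plate: contrast tracks phi

print(np.ptp(I_flat), np.ptp(I_pc))
```

Without the plate the intensity is uniform (the phase object is invisible); with the plate the intensity variation across the image is of order $$ 2\phi $$, as in the expression above.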

Polarization Contrast Imaging (Faraday Imaging)
In polarization contrast imaging, the Faraday effect of the light-matter interaction is leveraged to image the cloud using a standard absorption imaging setup modified with a far-detuned probe beam and an extra polarizer. The Faraday effect rotates the linear polarization of the probe beam as it passes through a cloud polarized by a strong magnetic field along the propagation direction of the probe beam.

Classically, a linearly polarized probe beam may be thought of as a superposition of two oppositely handed, circularly polarized beams. The rotating magnetic field of each circular component interacts with the magnetic dipoles of the atoms in the sample. If the sample is magnetically polarized in a direction with non-zero projection onto the light field k-vector, the two circularly polarized beams will interact with the magnetic dipoles of the sample with different strengths, corresponding to a relative phase shift between the two beams. This phase shift in turn maps to a rotation of the input beam's linear polarization.
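A minimal Jones-calculus sketch of this rotation, assuming an arbitrary differential phase $$ \delta $$ between the two circular components (the value is illustrative): decomposing a linear polarization into circular components, applying the relative phase, and recombining yields a linear polarization rotated by $$ \delta/2 $$.

```python
import numpy as np

delta = 0.2   # assumed relative phase between circular components (radians)

ex, ey = 1.0, 0.0                         # input Jones vector, linear along x
# Circular-basis amplitudes for basis vectors e_pm = (x -+ i*y)/sqrt(2)
e_plus = (ex + 1j * ey) / np.sqrt(2)
e_minus = (ex - 1j * ey) / np.sqrt(2)

# Differential phase imparted by the Faraday interaction
e_plus *= np.exp(1j * delta / 2)
e_minus *= np.exp(-1j * delta / 2)

# Reconstruct the linear-basis components
ex_out = (e_plus + e_minus) / np.sqrt(2)
ey_out = 1j * (e_minus - e_plus) / np.sqrt(2)

rotation = np.arctan2(ey_out.real, ex_out.real)
print(rotation)   # delta/2 = 0.1 rad
```

The output polarization is still linear (no intensity is lost), so an analyzing polarizer is needed to convert the rotation into a detectable intensity signal, as described below.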

The quantum physics of the Faraday interaction may be described by the interaction of the second-quantized Stokes parameters describing the polarization of a probe light field with the total angular momentum state of the atoms. Thus, if a BEC or other cold, dense sample of atoms is prepared in a particular spin (hyperfine) state polarized parallel to the imaging light propagation direction, both the density and the change in spin state may be monitored by feeding the transmitted probe beam through a polarizer before imaging onto a camera sensor. By adjusting the polarizer optic axis relative to the input linear polarization one can switch between a dark-field scheme (zero light in the absence of atoms) and variable phase-contrast imaging.

Dark-field and other methods
In addition to phase-contrast, there are a number of other similar dispersive imaging methods. In the dark-field method, the aforementioned phase plate is made completely opaque, such that the 0-order contribution to the beam is totally removed. In the absence of any imaging object the image plane would be dark. This amounts to removing the factor of 1 in the equation

$$ \Re[\mathbf{E}_{PM}] = \hat{\mathbf{x}}E_0\bigg(1-\frac{\tilde{\phi}}{A_o}\bigg)\sin(\omega_0 t - kz) \to \hat{\mathbf{x}}E_0\frac{\tilde{\phi}}{A_o}\sin(\omega_0 t - kz) $$

from above. Comparing the squares of the two expressions, one finds that in the dark-field case the range of contrast (or dynamic range of the intensity signal) is actually reduced. For this reason the method has fallen out of use.
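This reduction can be seen directly by comparing the two intensity expressions over a range of small phases (the range below is an assumed example):

```python
import numpy as np

# Phase contrast gives I ~ (1 - phi)^2 while dark-field gives I ~ phi^2.
# For small phi the dark-field signal, and its dynamic range, is far weaker.
phi = np.linspace(0.0, 0.1, 101)   # assumed range of small phase shifts

I_phase_contrast = (1 - phi)**2
I_dark_field = phi**2

range_pc = I_phase_contrast.max() - I_phase_contrast.min()
range_df = I_dark_field.max() - I_dark_field.min()

print(range_pc, range_df)   # dark-field dynamic range is much smaller
```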

In the defocus-contrast method, the phase plate is replaced by a defocusing of the objective lens. Doing so breaks the equivalence of parallel ray path lengths, such that a relative phase is acquired between parallel rays. By controlling the amount of defocus one can thus achieve an effect similar to that of the phase plate in standard phase contrast. In this case, however, the defocus scrambles the phase and amplitude modulation of the rays diffracted from the object in a way that does not capture the exact phase information of the object, but instead produces an intensity signal proportional to the amount of phase noise in the object.

There is also the bright-field balanced (BBD) method. This method leverages the complementary intensity changes of transmitted disks at different scattering angles to provide straightforward, dose-efficient, and noise-robust phase imaging from atomic resolution to intermediate length scales, for example resolving both light and heavy atomic columns as well as nanoscale magnetic phases in FeGe samples.

Light microscopy
Phase contrast takes advantage of the fact that different structures have different refractive indices, and bend, refract, or delay the light passing through the sample by different amounts. These changes cause some waves to arrive 'out of phase' with others. Phase-contrast microscopes transform this effect into amplitude differences that are observable in the eyepieces and appear as darker or brighter areas of the resulting image.

Phase contrast is used extensively in optical microscopy, in both biological and geological sciences. In biology, it is employed in viewing unstained biological samples, making it possible to distinguish between structures that are of similar transparency or refractive indices.

In geology, phase contrast is exploited to highlight differences between mineral crystals cut to a standardised thin section (usually 30 μm) and mounted under a light microscope. Crystalline materials are capable of exhibiting double refraction, in which light rays entering a crystal are split into two beams that may exhibit different refractive indices, depending on the angle at which they enter the crystal. The phase contrast between the two rays can be detected with the human eye using particular optical filters. As the exact nature of the double refraction varies for different crystal structures, phase contrast aids in the identification of minerals.
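As a simple worked example, the retardation accumulated between the two rays in a standard thin section is the birefringence times the thickness; the quartz value below is a typical textbook figure, used here for illustration:

```python
# Retardation between the two rays of a birefringent crystal:
# Gamma = birefringence * thickness.
birefringence = 0.009     # quartz, n_e - n_o (approximate textbook value)
thickness_um = 30.0       # standard thin-section thickness in micrometres

retardation_nm = birefringence * thickness_um * 1e3   # micrometres -> nanometres
print(retardation_nm)     # 270.0 nm, a first-order interference colour
```

Because each mineral has a characteristic birefringence, the retardation (and hence the interference colour seen between crossed polarizers) helps identify it.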

X-ray imaging
There are four main techniques for X-ray phase-contrast imaging, which use different principles to convert phase variations in the X-rays emerging from the object into intensity variations at an X-ray detector. Propagation-based phase contrast uses free-space propagation to obtain edge enhancement; Talbot and polychromatic far-field interferometry use a set of diffraction gratings to measure the derivative of the phase; refraction-enhanced imaging uses an analyzer crystal, also for a differential measurement; and X-ray interferometry uses a crystal interferometer to measure the phase directly. The advantages of these methods compared to normal absorption-contrast X-ray imaging are higher contrast for weakly absorbing materials (because phase shift is a different mechanism than absorption) and a contrast-to-noise relationship that increases with spatial frequency (because many phase-contrast techniques detect the first or second derivative of the phase shift), which makes it possible to see smaller details. One disadvantage is that these methods require more sophisticated equipment, such as synchrotron or microfocus X-ray sources, X-ray optics, and high-resolution X-ray detectors. This sophisticated equipment provides the sensitivity required to differentiate between small variations in the refractive index of X-rays passing through different media. For X-rays, the refractive index is normally smaller than 1, deviating from 1 by only a very small amount.

All of these methods produce images that can be used to calculate the projections (integrals) of the refractive index in the imaging direction. For propagation-based phase contrast there are phase-retrieval algorithms, for Talbot interferometry and refraction-enhanced imaging the image is integrated in the proper direction, and for X-ray interferometry phase unwrapping is performed. For this reason they are well suited for tomography, i.e. reconstruction of a 3D map of the refractive index of the object from many images at slightly different angles. For X-rays, the deviation of the refractive index from 1 is essentially proportional to the density of the material.
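The projection measured by these methods can be sketched as a line integral of the refractive-index decrement $$ \delta $$ (where $$ n = 1 - \delta $$) along the propagation direction; the decrement and geometry below are illustrative assumptions, not material data:

```python
import numpy as np

# Accumulated X-ray phase: phi(x, y) = -k * integral of delta(x, y, z) dz.
wavelength = 1e-10                 # assumed 0.1 nm hard X-rays
k = 2 * np.pi / wavelength

z = np.linspace(0.0, 1e-3, 1000)   # assumed 1 mm path through the object
delta = np.full_like(z, 1e-7)      # assumed uniform refractive-index decrement

# Trapezoidal line integral of the decrement along z
integral = np.sum(0.5 * (delta[1:] + delta[:-1]) * np.diff(z))
phi = -k * integral

print(phi)   # approximately -2*pi for these values
```

Inverting many such projections taken at different angles yields the 3D map of $$ \delta $$, and hence of the density.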

Synchrotron X-ray tomography can employ phase contrast imaging to enable imaging of the interior surfaces of objects. In this context, phase contrast imaging is used to enhance the contrast that would normally be possible from conventional radiographic imaging. A difference in the refractive index between a detail and its surroundings causes a phase shift between the light wave that travels through the detail and that which travels outside the detail. An interference pattern results, marking out the detail.

This method has been used to image Precambrian metazoan embryos from the Doushantuo Formation in China, allowing the internal structure of delicate microfossils to be imaged without destroying the original specimen.

Transmission electron microscopy
In the field of transmission electron microscopy, phase-contrast imaging may be employed to image columns of individual atoms. This ability arises from the fact that the atoms in a material diffract electrons as the electrons pass through them (the relative phases of the electrons change upon transmission through the sample), causing diffraction contrast in addition to the already present contrast in the transmitted beam. Phase-contrast imaging is the highest resolution imaging technique ever developed, and can allow for resolutions of less than one angstrom (less than 0.1 nanometres). It thus enables the direct viewing of columns of atoms in a crystalline material.

The interpretation of phase-contrast images is not a straightforward task. Deconvolving the contrast seen in an HR image to determine which features are due to which atoms in the material can rarely, if ever, be done by eye. Instead, because the combination of contrasts due to multiple diffracting elements and planes and the transmitted beam is complex, computer simulations are used to determine what sort of contrast different structures may produce in a phase-contrast image. Thus, a reasonable amount of information about the sample needs to be understood before a phase contrast image can be properly interpreted, such as a conjecture as to what crystal structure the material has.
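The role of such simulations can be illustrated with their simplest ingredient, the contrast transfer function (CTF) of the objective lens, which multiplies the spectrum of a weak phase object by an oscillating function of spatial frequency. The microscope parameters below are typical textbook values, not those of any specific instrument:

```python
import numpy as np

# Minimal CTF sketch: CTF(k) = sin(chi(k)) with aberration phase
# chi(k) = pi*lambda*defocus*k^2 + (pi/2)*Cs*lambda^3*k^4.
wavelength = 2.51e-12     # ~2.51 pm for 200 kV electrons (relativistic)
Cs = 1.0e-3               # assumed spherical aberration, 1 mm
defocus = -60e-9          # near Scherzer defocus for these values (assumed)

k = np.linspace(0, 8e9, 2000)   # spatial frequency in 1/m
chi = np.pi * wavelength * defocus * k**2 + 0.5 * np.pi * Cs * wavelength**3 * k**4
ctf = np.sin(chi)

# The recorded contrast of a weak phase object is its spectrum multiplied
# by this oscillating function, which is why raw HR images cannot be read
# off atom-by-atom without simulation.
print(ctf.min(), ctf.max())
```

The sign flips of the CTF mean that some spatial frequencies appear with inverted contrast, which is the core reason image simulation is needed for interpretation.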

Phase-contrast images are formed by removing the objective aperture entirely or by using a very large objective aperture. This ensures that not only the transmitted beam, but also the diffracted ones are allowed to contribute to the image. Instruments that are specifically designed for phase-contrast imaging are often called HRTEMs (high resolution transmission electron microscopes), and differ from analytical TEMs mainly in the design of the electron beam column. Whereas analytical TEMs employ additional detectors attached to the column for spectroscopic measurements, HRTEMs have little or no additional attachments so as to ensure a uniform electromagnetic environment all the way down the column for each beam leaving the sample (transmitted and diffracted). Because phase-contrast imaging relies on differences in phase between electrons leaving the sample, any additional phase shifts that occur between the sample and the viewing screen can make the image impossible to interpret. Thus, a very low degree of lens aberration is also a requirement for HRTEMs, and advances in spherical aberration (Cs) correction have enabled a new generation of HRTEMs to reach resolutions once thought impossible.