
Measurement Techniques
Determination of the D/H ratio can be performed with a combination of different preparation techniques and instruments, depending on the purpose. Several basic categories have been used: (i) organic hydrogen or water is first converted to H2 and then measured by IRMS (isotope-ratio mass spectrometry) with high precision; (ii) D/H and 18O/16O are measured directly on H2O by laser spectroscopy, also with high precision; (iii) the intact molecules are measured directly by NMR or mass spectrometry, with lower precision than IRMS.
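The delta notation in which all of these techniques report can be made concrete with a short sketch. The VSMOW D/H ratio of about 155.76 × 10−6 is the standard reference value for water; the function names are hypothetical:

```python
# Delta notation: dD (in per mil, ‰) = (R_sample / R_standard - 1) * 1000,
# where R = D/H. VSMOW is the conventional reference for water.

VSMOW_DH = 155.76e-6  # D/H ratio of the VSMOW reference water

def delta_d(r_sample, r_standard=VSMOW_DH):
    """Convert a measured D/H ratio to a dD value in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

def dh_ratio(delta, r_standard=VSMOW_DH):
    """Invert delta notation: recover the D/H ratio from dD in per mil."""
    return (delta / 1000.0 + 1.0) * r_standard

# A sample with D/H = 140e-6 is depleted relative to VSMOW,
# so its dD is negative (about -101 ‰):
sample_delta = delta_d(140.0e-6)
```

The quoted analytical uncertainties below (e.g. 1–2‰ for offline combustion/reduction) are differences on this per-mil scale.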

Offline Combustion/Reduction
Conversion to a simple molecule (H2 in the case of hydrogen) is required prior to IRMS measurement of stable isotopes. This is for several reasons specific to hydrogen: (i) organic molecules and some inorganic ones (e.g. CO2 + H2O) can undergo proton-exchange reactions in the ion source of the mass spectrometer, producing species such as H217O+ and H316O+ that cannot be distinguished; (ii) the isotope effects of ionization and transmission in the mass spectrometer vary with molecular form, which would require standards in every molecular form being measured, which is not practical.
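A related consequence of ion-source chemistry is the routine H3+ correction applied even when pure H2 is measured by IRMS: H3+ formed in the source contributes to the m/z 3 beam in proportion to the square of the m/z 2 signal. A minimal sketch with illustrative numbers (the H3 factor k and the signal values are hypothetical, not instrument specifications):

```python
# H3+ correction for H2-based IRMS: the measured m/z 3 intensity is
#   i3_measured = i3_HD + k * i2**2
# where k (the "H3 factor") is calibrated on reference gas at several
# inlet pressures. All numeric values below are illustrative.

def correct_h3(i2, i3_measured, k):
    """Remove the H3+ contribution from the m/z 3 signal."""
    return i3_measured - k * i2**2

i2 = 10.0                      # m/z 2 beam intensity (V), hypothetical
k = 8e-6                       # H3 factor (1/V), instrument-specific
i3_raw = 1.55e-3 + k * i2**2   # simulated raw m/z 3 signal (V)

i3_hd = correct_h3(i2, i3_raw, k)  # HD+ contribution only
ratio_3_2 = i3_hd / i2             # corrected 3/2 ion-beam ratio
```

The corrected 3/2 ratio, not the raw one, is what gets compared against a reference gas to derive δD.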

The classical offline preparation for this conversion is combustion over CuO at > 800 °C in sealed quartz tubes, followed by isolation of the resulting water and its reduction to H2 over hot metal at 400–1000 °C on a vacuum line. The product gas is then injected directly into a dual-inlet mass spectrometer for measurement. The metals used for the reduction to H2 include U, Zn, Cr, Mg and Mn. U and Zn were widely used from the 1950s until Cr was successfully employed in the late 1990s.

Offline combustion/reduction has the highest accuracy and precision for hydrogen isotope measurement and no restriction on sample type. The analytical uncertainty is typically 1–2‰ in δD, so the method is still used today when the highest levels of precision are required. However, the offline preparation procedure is time-consuming and complicated, and it requires a large sample (several hundred mg). Online preparation based on combustion/reduction coupled with a continuous-flow IRMS (CF-IRMS) system has therefore become more common. Chromium reduction and high-temperature conversion are the dominant online preparation methods for hydrogen isotope detection by IRMS.

TC/EA (High temperature conversion/Elemental Analyzer)
TC/EA (also HTC, high-temperature conversion; HTP, high-temperature pyrolysis; HTCR, high-temperature carbon reduction) is an 'online' or 'continuous-flow' preparation method typically followed by IRMS detection. It is a bulk technique that measures all of the hydrogen in a given sample and provides the average isotope signal. The weighed sample is placed in a tin or silver capsule and dropped into the pyrolysis tube of the TC/EA. The tube is made of glassy carbon and filled with glassy carbon so that oxygen isotopes can be measured simultaneously, without oxygen exchange on a ceramic (Al2O3) surface. The molecules are reduced to CO and H2 at high temperature (> 1400 °C) in the reactor. The gaseous products are separated by gas chromatography (GC) using helium as the carrier gas, pass through a split-flow interface, and are finally detected by IRMS. The TC/EA method can be problematic for organic compounds containing halogens or nitrogen because pyrolysis byproducts (e.g. HCl and HCN) compete with H2 formation. In addition, it is susceptible to contamination with water, so samples must be scrupulously dried.

An adaptation of this method can distinguish non-exchangeable hydrogen (C–H) from exchangeable hydrogen (bound to other elements, e.g. O, S and N) in organic matter. The samples are equilibrated with water in sealed autosampler carousels at 115 °C and then transferred to the pyrolysis EA for IRMS measurement.
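The mass balance behind this comparative-equilibration approach can be sketched as follows. The numbers are hypothetical, and the simplification that exchangeable hydrogen fully takes on the equilibration water's δD (i.e. ignoring the equilibrium fractionation factor) is an assumption made here for illustration:

```python
# Bulk dD after equilibration is a mixture of the two hydrogen pools:
#   d_total_i = x_ex * d_water_i + (1 - x_ex) * d_nonex
# Equilibrating aliquots of the same sample with two waters of known dD
# gives two equations in the two unknowns x_ex and d_nonex.

def solve_exchangeable(d_total_1, d_water_1, d_total_2, d_water_2):
    """Return (fraction of exchangeable H, dD of non-exchangeable H)."""
    x_ex = (d_total_1 - d_total_2) / (d_water_1 - d_water_2)
    d_nonex = (d_total_1 - x_ex * d_water_1) / (1.0 - x_ex)
    return x_ex, d_nonex

# Hypothetical bulk measurements after equilibration with two waters:
x_ex, d_nonex = solve_exchangeable(-90.0, 0.0, -105.0, -100.0)
# -> 15% exchangeable hydrogen; non-exchangeable dD of about -106 ‰
```

In practice an exchange fractionation factor between water and the organic hydrogen is also applied, which modifies the first equation but not the overall two-point logic.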

The TC/EA method is quick and has relatively high precision (~1‰). It was originally limited to solid samples, but liquid samples can now also be analyzed in a TC/EA-IRMS system by adapting an autosampler for liquids. One drawback of TC/EA is the relatively large sample size (~mg), smaller than for offline combustion/reduction but larger than for GC/pyrolysis. It also cannot separate different compounds as GC/pyrolysis does, so only an average for the whole sample is obtained, which is a drawback for some research.

GC/pyrolysis (Gas chromatography/pyrolysis)
A GC interface (combustion or pyrolysis) is another online preparation method followed by IRMS detection. This is a 'compound-specific' method: analytes are separated before measurement, providing the isotopic composition of each individual compound. Following GC separation, samples are converted to smaller gaseous molecules for isotope detection. GC/pyrolysis uses a pyrolysis interface between the GC and the IRMS to convert the H and O in the molecules into H2 and CO. GC-IRMS was first introduced by Matthews and Hayes in the late 1970s, and GC-interface-IRMS was subsequently used for successful measurement of δ13C, δ15N, δ18O and δ34S. Helium is used as the carrier gas in GC systems, but separating the HD (m/z 3) signal from the tail of the intense 4He+ beam was problematic. During the early 1990s, intense efforts were made to overcome the difficulties of measuring δD by GC/pyrolysis-IRMS. In 1999, Hilkert et al. developed a robust method by integrating high-temperature conversion (TC) into GC-IRMS and adding a pre-cup electrostatic sector and a retardation lens in front of the m/z 3 cup collector; several other groups were working on the problem at the same time. This TC-based GC/pyrolysis-IRMS is now widely used for δD measurement. Commercial GC-IRMS instruments include both combustion and pyrolysis interfaces so that δ13C and δD can be measured simultaneously.

The significant advantage of the GC/pyrolysis method for hydrogen isotope measurement is that it separates the different compounds in a sample. It requires the smallest sample size (typically ~200 ng) of the methods discussed here and has a high precision of 1–5‰. However, the method is relatively slow and limited to samples amenable to GC analysis.

Laser spectroscopy
Laser spectroscopy (in particular cavity ring-down spectroscopy, CRDS) can directly measure D/H, 17O/16O and 18O/16O isotope compositions in water or methane. The application of laser spectroscopy to hydrogen isotopes was first reported by Bergamaschi et al., who directly measured 12CH3D/12CH4 in atmospheric methane using lead-salt tunable diode laser spectroscopy. The development of CRDS was first reported by O'Keefe et al. in 1988, and in 1999 Kerstel et al. successfully applied the technique to determine D/H in water samples. The system consists of a laser and a cavity fitted with high-finesse, highly reflective mirrors. Laser light is injected into the cavity, where it resonates by constructive interference. The laser is then turned off and the decay of the light intensity is measured. In the presence of a water sample, photo-absorption by the water isotopologues follows a first-order kinetic law, shortening the ring-down time. An optical spectrum is obtained by recording the ring-down time of the H2O spectral features of interest as the laser wavelength is scanned, and the concentration of each isotopologue is proportional to the area under its measured spectral feature.
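The relationship between ring-down time and absorption can be sketched as follows; the numeric values are illustrative, not instrument specifications:

```python
# Ring-down decay: I(t) = I0 * exp(-t / tau). Comparing the ring-down
# time with a sample present (tau) against the empty cavity (tau0)
# gives the absorption coefficient:
#   alpha = (1/tau - 1/tau0) / c
# which is proportional to the concentration of the absorbing
# isotopologue at that wavelength.

C = 299_792_458.0  # speed of light (m/s)

def absorption_coefficient(tau, tau0):
    """Per-meter absorption from ring-down times (in seconds)."""
    return (1.0 / tau - 1.0 / tau0) / C

tau0 = 20e-6  # empty-cavity ring-down time (s), hypothetical
tau = 18e-6   # ring-down time with absorbing sample (s), hypothetical

alpha = absorption_coefficient(tau, tau0)  # sample shortens tau, alpha > 0
```

Repeating this at each point across an HDO and an H2O spectral feature and integrating the resulting alpha values gives the areas whose ratio yields D/H.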

Laser spectroscopy offers a quick and simple procedure at relatively low cost, and the equipment is portable, so it can be used in the field to measure water samples. D/H and 18O/16O can be determined simultaneously from a single injection, and the required sample size is small (< 1 μL of water). The typical precision is ~1‰. However, the instrument is compound-specific, i.e. only one particular compound can be measured, and coexisting organic compounds (e.g. ethanol) can interfere with the optical absorption features of water, resulting in cross-contamination.