Zero point (photometry)

In astronomy, the zero point of a photometric system is defined as the magnitude of an object that produces 1 count per second on the detector. The zero point calibrates a system to the standard magnitude system, since the flux detected from stars varies from detector to detector. Traditionally, Vega is used as the calibration star for the zero-point magnitude in specific pass bands (U, B, and V), although an average of multiple stars is often used for higher accuracy. Since it is not always practical to observe Vega itself, any star of known apparent magnitude may be used to calibrate the detector.
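The definition above implies a simple relation between a measured count rate and an apparent magnitude: an object producing 1 count per second has a magnitude equal to the zero point, and every factor of 100 in count rate shifts the magnitude by 5. A minimal sketch (the zero-point value 25.0 is purely illustrative, not tied to any real instrument):

```python
import math

def apparent_magnitude(count_rate, zero_point):
    """Apparent magnitude from a detector count rate (counts/s).

    By definition, an object producing exactly 1 count/s has a
    magnitude equal to the zero point, so
        m = ZP - 2.5 * log10(count_rate).
    """
    return zero_point - 2.5 * math.log10(count_rate)

# An object at exactly 1 count/s sits at the zero point itself.
print(apparent_magnitude(1.0, 25.0))    # -> 25.0
# A source 100x brighter is 5 magnitudes brighter (smaller number).
print(apparent_magnitude(100.0, 25.0))  # -> 20.0
```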

General formula
The magnitude of an object in a given band is $$M = -2.5\log_{10}\left(\int_0^\infty F(\lambda)\,S(\lambda)\,d\lambda\right) + C,$$ where $M$ is the magnitude of the object, $F(\lambda)$ is its spectral flux density at wavelength $\lambda$, and $S(\lambda)$ is the sensitivity function of the instrument. Under ideal conditions, the sensitivity is 1 inside the pass band and 0 outside it. The constant $C$ is determined from the zero-point star: since the calibration star's magnitude is defined to be 0, setting $M = 0$ for its observed flux fixes $C$.
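The integral above can be evaluated numerically for a tabulated spectrum. A minimal sketch using the idealized top-hat sensitivity the text describes (1 inside the band, 0 outside) and simple trapezoidal integration; the function name and the sampled flat spectrum are illustrative assumptions, not part of any real pipeline:

```python
import math

def band_magnitude(flux, wavelengths, band_lo, band_hi, C=0.0):
    """M = -2.5 * log10( integral of F(lambda) * S(lambda) d lambda ) + C,
    with an idealized top-hat sensitivity S: 1 inside [band_lo, band_hi],
    0 outside. Uses trapezoidal integration over the tabulated spectrum."""
    total = 0.0
    for i in range(len(wavelengths) - 1):
        lo, hi = wavelengths[i], wavelengths[i + 1]
        mid = 0.5 * (lo + hi)
        s = 1.0 if band_lo <= mid <= band_hi else 0.0  # top-hat S(lambda)
        total += s * 0.5 * (flux[i] + flux[i + 1]) * (hi - lo)
    return -2.5 * math.log10(total) + C

# Flat spectrum F = 1 over a 100-unit-wide band: the integral is 100,
# so M = -2.5 * log10(100) = -5 (with C = 0).
grid = [400.0 + i for i in range(301)]
print(band_magnitude([1.0] * 301, grid, 500.0, 600.0))  # -> -5.0
```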

Vega as calibration
Under most circumstances, Vega is used as the zero point, but in practice an elaborate "bootstrap" system is used to calibrate a detector. The calibration typically combines extensive observational photometry with theoretical atmospheric models.

Bolometric magnitude zero point
While the zero point for pass-band filters is defined by Vega, there is no universally defined zero point for bolometric magnitude, and traditionally the calibrating star has been the Sun. However, the IAU (2015 Resolution B2) defined the absolute and apparent bolometric magnitude zero points to correspond to 3.0128×10²⁸ W and 2.51802×10⁻⁸ W/m², respectively.
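With these fixed zero points, bolometric magnitudes follow directly from a luminosity or an irradiance. A minimal sketch using the constants above (the nominal solar luminosity 3.828×10²⁶ W is the IAU reference value; the function names are illustrative):

```python
import math

# IAU 2015 Resolution B2 zero-point constants
L0 = 3.0128e28      # W, absolute bolometric magnitude zero point
F0 = 2.51802e-8     # W/m^2, apparent bolometric magnitude zero point

def absolute_bolometric_magnitude(luminosity_w):
    """M_bol = -2.5 * log10(L / L0)."""
    return -2.5 * math.log10(luminosity_w / L0)

def apparent_bolometric_magnitude(irradiance_w_m2):
    """m_bol = -2.5 * log10(f / F0)."""
    return -2.5 * math.log10(irradiance_w_m2 / F0)

# The nominal solar luminosity (3.828e26 W) gives M_bol ~ 4.74
# by construction of the IAU scale.
print(round(absolute_bolometric_magnitude(3.828e26), 2))  # -> 4.74
```

The zero points were chosen precisely so that the Sun lands at M_bol ≈ 4.74, keeping the new scale consistent with the traditional solar calibration.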