Apparent magnitude



Apparent magnitude ($m$) is a measure of the brightness of a star or other astronomical object. An object's apparent magnitude depends on its intrinsic luminosity, its distance, and any extinction of the object's light caused by interstellar dust along the line of sight to the observer.

The word magnitude in astronomy, unless stated otherwise, usually refers to a celestial object's apparent magnitude. The magnitude scale dates to before the ancient Roman astronomer Claudius Ptolemy, whose star catalog popularized the system by listing stars from 1st magnitude (brightest) to 6th magnitude (dimmest). The modern scale was mathematically defined in a way to closely match this historical system.

The scale is reverse logarithmic: the brighter an object is, the lower its magnitude number. A difference of 1.0 in magnitude corresponds to a brightness ratio of $$\sqrt[5]{100}$$, or about 2.512. For example, a star of magnitude 2.0 is 2.512 times as bright as a star of magnitude 3.0, 6.31 times as bright as a star of magnitude 4.0, and 100 times as bright as one of magnitude 7.0.
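As an illustrative check (in Python; not part of the original text), the brightness ratios quoted above follow directly from Pogson's ratio:

```python
# Brightness ratio corresponding to a magnitude difference,
# using Pogson's ratio (the fifth root of 100, about 2.512).

def brightness_ratio(m_faint: float, m_bright: float) -> float:
    """How many times brighter the lower-magnitude object appears."""
    return 100 ** ((m_faint - m_bright) / 5)

print(round(brightness_ratio(3.0, 2.0), 3))  # 1 magnitude  -> 2.512
print(round(brightness_ratio(4.0, 2.0), 2))  # 2 magnitudes -> 6.31
print(round(brightness_ratio(7.0, 2.0), 1))  # 5 magnitudes -> 100.0
```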

The brightest astronomical objects have negative apparent magnitudes: for example, Venus at −4.2 or Sirius at −1.46. The faintest stars visible with the naked eye on the darkest night have apparent magnitudes of about +6.5, though this varies depending on a person's eyesight and with altitude and atmospheric conditions. The apparent magnitudes of known objects range from the Sun at −26.832 to objects in deep Hubble Space Telescope images of magnitude +31.5.

The measurement of apparent magnitude is called photometry. Photometric measurements are made in the ultraviolet, visible, or infrared wavelength bands using standard passband filters belonging to photometric systems such as the UBV system or the Strömgren uvbyβ system. Measurement in the V-band may be referred to as the apparent visual magnitude.

Absolute magnitude is a related quantity which measures the luminosity that a celestial object emits, rather than its apparent brightness when observed, and is expressed on the same reverse logarithmic scale. Absolute magnitude is defined as the apparent magnitude that a star or object would have if it were observed from a distance of 10 parsec. Therefore, it is of greater use in stellar astrophysics since it refers to a property of a star regardless of how close it is to Earth. But in observational astronomy and popular stargazing, references to "magnitude" are understood to mean apparent magnitude.

Amateur astronomers commonly express the darkness of the sky in terms of limiting magnitude, i.e. the apparent magnitude of the faintest star they can see with the naked eye. This can be useful as a way of monitoring the spread of light pollution.

Apparent magnitude is really a measure of illuminance, which can also be measured in photometric units such as lux.

History
The scale used to indicate magnitude originates in the Hellenistic practice of dividing stars visible to the naked eye into six magnitudes. The brightest stars in the night sky were said to be of first magnitude ($m = 1$), whereas the faintest were of sixth magnitude ($m = 6$), which is the limit of human visual perception (without the aid of a telescope). Each grade of magnitude was considered twice the brightness of the following grade (a logarithmic scale), although that ratio was subjective as no photodetectors existed. This rather crude scale for the brightness of stars was popularized by Ptolemy in his Almagest and is generally believed to have originated with Hipparchus. This cannot be proved or disproved because Hipparchus's original star catalogue is lost. The only preserved text by Hipparchus himself (a commentary to Aratus) clearly documents that he did not have a system to describe brightness with numbers: he always uses terms like "big" or "small", "bright" or "faint", or even descriptions such as "visible at full moon".

In 1856, Norman Robert Pogson formalized the system by defining a first-magnitude star as a star that is 100 times as bright as a sixth-magnitude star, thereby establishing the logarithmic scale still in use today. This implies that a star of magnitude $m$ is about 2.512 times as bright as a star of magnitude $m + 1$. This figure, the fifth root of 100, became known as Pogson's Ratio. The 1884 Harvard Photometry and 1886 Potsdamer Durchmusterung star catalogs popularized Pogson's ratio, and eventually it became a de facto standard in modern astronomy to describe differences in brightness.

Defining and calibrating what magnitude 0.0 means is difficult, and different types of measurements which detect different kinds of light (possibly by using filters) have different zero points. Pogson's original 1856 paper defined magnitude 6.0 to be the faintest star the unaided eye can see, but the true limit of naked-eye visibility varies with atmospheric conditions and how high a star is in the sky. The Harvard Photometry used an average of 100 stars close to Polaris to define magnitude 5.0. Later, the Johnson UBV photometric system defined multiple types of photometric measurements with different filters, where magnitude 0.0 for each filter is defined to be the average of six stars with the same spectral type as Vega. This was done so the color index of these stars would be 0. Although this system is often called "Vega normalized", Vega is slightly dimmer than the six-star average used to define magnitude 0.0, meaning Vega's magnitude is normalized to 0.03 by definition.

With the modern magnitude systems, brightness is described using Pogson's ratio. In practice magnitude numbers rarely go above 30 before stars become too faint to detect. While Vega is close to magnitude 0, there are four brighter stars in the night sky at visible wavelengths (and more at infrared wavelengths) as well as the bright planets Venus, Mars, and Jupiter, and since brighter means smaller magnitude, these must be described by negative magnitudes. For example, Sirius, the brightest star of the celestial sphere, has a magnitude of −1.4 in the visible. Negative magnitudes for other very bright astronomical objects can be found in the table below.

Astronomers have developed other photometric zero point systems as alternatives to Vega normalized systems. The most widely used is the AB magnitude system, in which photometric zero points are based on a hypothetical reference spectrum having constant flux per unit frequency interval, rather than using a stellar spectrum or blackbody curve as the reference. The AB magnitude zero point is defined such that an object's AB and Vega-based magnitudes will be approximately equal in the V filter band. However, the AB magnitude system is defined assuming an idealized detector measuring only one wavelength of light, while real detectors accept energy from a range of wavelengths.
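The AB convention can be made concrete with a short sketch. The formula below uses the standard definition of the AB zero point, $m_{\text{AB}} = -2.5 \log_{10}(f_\nu) - 48.60$ with $f_\nu$ in erg s⁻¹ cm⁻² Hz⁻¹; this constant is not stated in the text above, so treat it as the conventional value rather than something derived here:

```python
import math

# AB magnitude from flux density per unit frequency.
# Standard convention: m_AB = -2.5*log10(f_nu) - 48.60,
# with f_nu in erg s^-1 cm^-2 Hz^-1.

def ab_magnitude(f_nu_jansky: float) -> float:
    f_nu_cgs = f_nu_jansky * 1e-23   # 1 Jy = 1e-23 erg s^-1 cm^-2 Hz^-1
    return -2.5 * math.log10(f_nu_cgs) - 48.60

# A source of 3631 Jy sits at the AB zero point by construction:
print(round(ab_magnitude(3631), 2))  # ~0
```

Because the zero point is a pure flux-density convention, no reference star is needed: any detector calibrated in physical units can report AB magnitudes directly.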

Measurement
Precision measurement of magnitude (photometry) requires calibration of the photographic or (usually) electronic detection apparatus. This generally involves contemporaneous observation, under identical conditions, of standard stars whose magnitude using that spectral filter is accurately known. Moreover, as the amount of light actually received by a telescope is reduced due to transmission through the Earth's atmosphere, the airmasses of the target and calibration stars must be taken into account. Typically one would observe a few different stars of known magnitude which are sufficiently similar. Calibrator stars close in the sky to the target are favoured (to avoid large differences in the atmospheric paths). If those stars have somewhat different zenith angles (altitudes) then a correction factor as a function of airmass can be derived and applied to the airmass at the target's position. Such calibration obtains the brightness as would be observed from above the atmosphere, where apparent magnitude is defined.
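The airmass correction described above can be sketched as a first-order extinction fit. The function names and calibration numbers here are illustrative, not from the text; the technique (fitting the extinction coefficient $k$ from standard stars at several airmasses, then applying $m_0 = m_{\text{obs}} - kX$) is the standard first-order approach:

```python
# Sketch of a first-order atmospheric extinction correction.
# Observing calibration stars of known magnitude at several airmasses
# lets one fit the extinction coefficient k (mag per airmass), then
# correct the target to its above-atmosphere magnitude.

def fit_extinction(airmasses, observed_minus_true):
    """Least-squares slope of (m_obs - m_true) versus airmass X."""
    n = len(airmasses)
    mx = sum(airmasses) / n
    my = sum(observed_minus_true) / n
    num = sum((x - mx) * (y - my) for x, y in zip(airmasses, observed_minus_true))
    den = sum((x - mx) ** 2 for x in airmasses)
    return num / den

def correct_to_above_atmosphere(m_observed, airmass, k):
    """Remove extinction: m0 = m_obs - k * X."""
    return m_observed - k * airmass

# Hypothetical calibration data from three standard stars:
X = [1.0, 1.5, 2.0]
dm = [0.20, 0.30, 0.40]      # measured minus catalog magnitude
k = fit_extinction(X, dm)    # ~0.2 mag per airmass
print(round(correct_to_above_atmosphere(10.25, 1.25, k), 2))  # 10.0
```

In practice the fit would use many stars and include colour-dependent (second-order) extinction terms, but the linear Bouguer fit above captures the correction the paragraph describes.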

The apparent magnitude scale in astronomy reflects the received power of stars, not their amplitude. In astrophotography, exposure times between stars should therefore be scaled by their relative brightness (flux ratio) rather than by the magnitude difference directly. Apparent magnitude also integrates over the entire object, regardless of its focus, and this needs to be taken into account when scaling exposure times for objects with significant apparent size, like the Sun, Moon and planets. For example, directly scaling the exposure time from the Moon to the Sun works because they are approximately the same size in the sky. However, scaling the exposure from the Moon to Saturn would result in an overexposure if the image of Saturn takes up a smaller area on the sensor than the Moon did (at the same magnification, or more generally, f/#).

Calculations


The dimmer an object appears, the higher the numerical value given to its magnitude, with a difference of 5 magnitudes corresponding to a brightness factor of exactly 100. Therefore, the magnitude $m$, in the spectral band $x$, would be given by $$m_{x}= -5 \log_{100} \left(\frac {F_x}{F_{x,0}}\right),$$ which is more commonly expressed in terms of common (base-10) logarithms as $$m_{x} = -2.5 \log_{10} \left(\frac {F_x}{F_{x,0}}\right),$$ where $F_x$ is the observed irradiance using spectral filter $x$, and $F_{x,0}$ is the reference flux (zero-point) for that photometric filter. Since an increase of 5 magnitudes corresponds to a decrease in brightness by a factor of exactly 100, each magnitude increase implies a decrease in brightness by the factor $$\sqrt[5]{100} \approx 2.512$$ (Pogson's ratio). Inverting the above formula, a magnitude difference $m_1 - m_2 = \Delta m$ implies a brightness factor of $$ \frac{F_2}{F_1} = 100^\frac{\Delta m}{5} = 10^{0.4 \Delta m} \approx 2.512^{\Delta m}.$$
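The defining formula and its inverse translate into a few lines of Python (an illustrative sketch; $F$ and $F_0$ stand for the observed and zero-point irradiances in the same filter):

```python
import math

# The magnitude definition and its inverse.
# F is the observed irradiance; F0 is the zero-point flux for the filter.

def magnitude(F: float, F0: float) -> float:
    return -2.5 * math.log10(F / F0)

def flux_ratio(m1: float, m2: float) -> float:
    """F2/F1 for magnitudes m1 and m2 (delta_m = m1 - m2)."""
    return 10 ** (0.4 * (m1 - m2))

print(round(magnitude(0.01, 1.0), 1))  # 1% of reference flux -> m = 5.0
print(round(flux_ratio(6.0, 1.0), 1))  # 5 magnitudes -> factor 100.0
```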

Example: Sun and Moon
What is the ratio in brightness between the Sun and the full Moon?

The apparent magnitude of the Sun is −26.832 (brighter), and the mean magnitude of the full Moon is −12.74 (dimmer).

Difference in magnitude: $$ x = m_1 - m_2 = (-12.74) - (-26.832) = 14.09. $$

Brightness factor: $$ v_b = 10^{0.4 x} = 10^{0.4 \times 14.09} \approx 432\,513. $$

The Sun appears to be approximately 430,000 times as bright as the full Moon.
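The worked example can be reproduced in code (the small difference from the figure above comes from rounding the magnitude difference before exponentiating):

```python
# Reproducing the Sun/Moon brightness-ratio calculation above.
m_sun = -26.832
m_moon = -12.74

x = m_moon - m_sun        # magnitude difference, ~14.09
ratio = 10 ** (0.4 * x)   # brightness factor
print(f"difference = {x:.2f}, factor ~ {ratio:,.0f}")
```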

Magnitude addition
Sometimes one might wish to add brightness. For example, photometry on closely separated double stars may only be able to produce a measurement of their combined light output. To find the combined magnitude of that double star knowing only the magnitudes of the individual components, this can be done by adding the brightness (in linear units) corresponding to each magnitude. $$ 10^{-m_f \times 0.4} = 10^{-m_1 \times 0.4} + 10^{-m_2 \times 0.4}. $$

Solving for $$m_f$$ yields $$ m_f = -2.5\log_{10} \left(10^{-m_1 \times 0.4} + 10^{-m_2 \times 0.4} \right), $$ where $m_f$ is the resulting magnitude after adding the brightnesses referred to by $m_{1}$ and $m_{2}$.
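The combined-magnitude formula is a one-liner in code. As a sanity check, two equal components together are brighter than either alone by $2.5\log_{10}2 \approx 0.753$ magnitudes:

```python
import math

# Combined magnitude of two unresolved components,
# found by adding their fluxes in linear units.

def add_magnitudes(m1: float, m2: float) -> float:
    return -2.5 * math.log10(10 ** (-0.4 * m1) + 10 ** (-0.4 * m2))

# Two equal magnitude-1.0 components combine to 1.0 - 2.5*log10(2):
print(round(add_magnitudes(1.0, 1.0), 3))  # 0.247
```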

Apparent bolometric magnitude
While magnitude generally refers to a measurement in a particular filter band corresponding to some range of wavelengths, the apparent or absolute bolometric magnitude ($m_{\text{bol}}$) is a measure of an object's apparent or absolute brightness integrated over all wavelengths of the electromagnetic spectrum (also known as the object's irradiance or power, respectively). The zero point of the apparent bolometric magnitude scale is based on the definition that an apparent bolometric magnitude of 0 mag is equivalent to a received irradiance of $2.518\times10^{-8}$ watts per square metre (W·m⁻²).
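The zero point above makes bolometric magnitude a direct function of received irradiance. As a check, feeding in the solar constant (about 1361 W·m⁻², a known value not stated in the text) recovers the Sun's apparent magnitude quoted earlier:

```python
import math

# Apparent bolometric magnitude from received irradiance,
# using the zero point F0 = 2.518e-8 W/m^2 given above.

F0 = 2.518e-8  # W m^-2, irradiance at m_bol = 0

def m_bol(irradiance: float) -> float:
    return -2.5 * math.log10(irradiance / F0)

# The solar constant, ~1361 W/m^2, gives the Sun's m_bol:
print(round(m_bol(1361), 2))  # -26.83
```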

Absolute magnitude
While apparent magnitude is a measure of the brightness of an object as seen by a particular observer, absolute magnitude is a measure of the intrinsic brightness of an object. Flux decreases with distance according to an inverse-square law, so the apparent magnitude of a star depends on both its absolute brightness and its distance (and any extinction). For example, a star at one distance will have the same apparent magnitude as a star four times as bright at twice that distance. In contrast, the intrinsic brightness of an astronomical object does not depend on the distance of the observer or any extinction.

The absolute magnitude $M$ of a star or astronomical object is defined as the apparent magnitude it would have as seen from a distance of 10 pc. The absolute magnitude of the Sun is 4.83 in the V band (visual), 4.68 in the Gaia satellite's G band (green) and 5.48 in the B band (blue).

In the case of a planet or asteroid, the absolute magnitude $H$ rather means the apparent magnitude it would have if it were 1 AU from both the observer and the Sun, and fully illuminated at maximum opposition (a configuration that is only theoretically achievable, with the observer situated on the surface of the Sun).

Standard reference values
The magnitude scale is a reverse logarithmic scale. A common misconception is that the logarithmic nature of the scale is because the human eye itself has a logarithmic response. In Pogson's time this was thought to be true (see Weber–Fechner law), but it is now believed that the response is a power law.

Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range in daylight). The V band was chosen for spectral purposes and gives magnitudes closely corresponding to those seen by the human eye. When an apparent magnitude is discussed without further qualification, the V magnitude is generally understood.

Because cooler stars, such as red giants and red dwarfs, emit little energy in the blue and UV regions of the spectrum, their power is often under-represented by the UBV scale. Indeed, some L and T class stars have an estimated magnitude of well over 100, because they emit extremely little visible light, but are strongest in infrared.

Measures of magnitude need cautious treatment and it is extremely important to measure like with like. On early 20th century and older orthochromatic (blue-sensitive) photographic film, the relative brightnesses of the blue supergiant Rigel and the red supergiant variable star Betelgeuse (at maximum brightness) are reversed compared to what human eyes perceive, because this archaic film is more sensitive to blue light than it is to red light. Magnitudes obtained from this method are known as photographic magnitudes, and are now considered obsolete.

For objects within the Milky Way with a given absolute magnitude, 5 is added to the apparent magnitude for every tenfold increase in the distance to the object. For objects at very great distances (far beyond the Milky Way), this relationship must be adjusted for redshifts and for non-Euclidean distance measures due to general relativity.
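The "5 magnitudes per tenfold distance" rule is the distance modulus, $m = M + 5\log_{10}(d/10\,\text{pc})$ (for nearby objects, ignoring extinction and redshift). A minimal sketch:

```python
import math

# Distance modulus: apparent magnitude grows by 5 for each
# tenfold increase in distance (extinction and redshift ignored).

def apparent_from_absolute(M: float, distance_pc: float) -> float:
    return M + 5 * math.log10(distance_pc / 10)

# A star with the Sun's visual absolute magnitude (M_V = 4.83):
print(round(apparent_from_absolute(4.83, 10), 2))   # 4.83 at 10 pc
print(round(apparent_from_absolute(4.83, 100), 2))  # 9.83 at 100 pc
```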

For planets and other Solar System bodies, the apparent magnitude is derived from its phase curve and the distances to the Sun and observer.

List of apparent magnitudes
Some of the listed magnitudes are approximate. Telescope sensitivity depends on observing time, optical bandpass, and interfering light from scattering and airglow.