Effective number of bits

Effective number of bits (ENOB) is a measure of the dynamic range of an analog-to-digital converter (ADC), digital-to-analog converter (DAC), or their associated circuitry. The resolution of an ADC is specified by the number of bits used to represent the analog value. Ideally, a 12-bit ADC will have an effective number of bits of almost 12. However, real signals have noise, and real circuits are imperfect and introduce additional noise and distortion. Those imperfections reduce the number of bits of accuracy in the ADC. The ENOB describes the effective resolution of the system in bits. An ADC may have a 12-bit resolution, but its effective number of bits, when used in a system, may be only 9.5.

ENOB is also used as a quality measure for other blocks such as sample-and-hold amplifiers. Thus analog blocks may be included in signal-chain calculations. The total ENOB of a chain of blocks is usually less than the ENOB of the worst block.
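Under the common simplifying assumptions that each block's noise and distortion are uncorrelated and referred to the same full-scale signal, the relative noise-plus-distortion powers of the blocks simply add, which is why the chain's ENOB falls below that of the worst block. A minimal sketch of that calculation (the function names are illustrative, not from any standard library):

```python
import math

def enob_to_sinad_db(enob):
    # SINAD (dB) of an ideal converter with the given word length.
    return 6.02 * enob + 1.76

def sinad_db_to_enob(sinad_db):
    return (sinad_db - 1.76) / 6.02

def chain_enob(enobs):
    """ENOB of a cascade of blocks, assuming uncorrelated noise and
    distortion referred to the same signal level: the relative
    noise-plus-distortion powers of the blocks add."""
    total_rel_noise = sum(10 ** (-enob_to_sinad_db(b) / 10) for b in enobs)
    return sinad_db_to_enob(-10 * math.log10(total_rel_noise))

# Two blocks with 12 and 10 effective bits yield a chain ENOB
# slightly below 10 bits:
print(round(chain_enob([12, 10]), 2))
```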

The frequency band of a signal converter over which ENOB is still guaranteed is called the effective resolution bandwidth; it is limited by dynamic quantization problems. For example, an ADC has some aperture uncertainty: the instant at which it actually samples its input varies from sample to sample. Because the input signal is changing, that timing variation translates into an output voltage variation. Suppose an ADC samples 1 ns late. If the input signal is a 1 V sinewave at 1,000,000 radians per second (roughly 160 kHz), the input voltage changes by as much as 1 MV/s. A sampling-time error of 1 ns then causes a sampling error of about 1 mV (an error in the 10th bit). If the frequency were 100 times higher (about 16 MHz), the maximum error would be 100 times greater: about 100 mV on a 1 V signal (an error in the third or fourth bit).
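The worst-case error in this example follows from the maximum slew rate of a sinewave: for $v(t) = A \sin(\omega t)$, the slope never exceeds $A\omega$, so a timing error of $\Delta t$ causes a voltage error of at most $A\omega\,\Delta t$. A short sketch of the arithmetic above (the function name is illustrative):

```python
def max_sampling_error(amplitude_v, omega_rad_s, jitter_s):
    """Worst-case voltage error caused by a sampling-time error:
    for v(t) = A*sin(w*t), max |dv/dt| = A*w, so error <= A*w*jitter."""
    return amplitude_v * omega_rad_s * jitter_s

# 1 V sinewave at 1e6 rad/s (~160 kHz), sampled 1 ns late: 1 mV error.
err = max_sampling_error(1.0, 1e6, 1e-9)

# 100x the frequency (~16 MHz) gives 100x the error: 100 mV.
err_fast = max_sampling_error(1.0, 1e8, 1e-9)
```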

Definition
An often-used definition of ENOB is
 * $$\mathrm{ENOB} = \frac{\mathrm{SINAD} - 1.76}{6.02},$$

where
 * ENOB is given in bits,
 * SINAD (signal-to-noise-and-distortion ratio) is the ratio, expressed in dB, of signal power to noise-plus-distortion power,
 * the 6.02 divisor converts decibels (a log10 representation) to bits (a log2 representation), since $20 \log_{10} 2 \approx 6.02$,
 * the 1.76 term comes from quantization error in an ideal ADC, whose SINAD is $6.02 \cdot N + 1.76$ dB for an $N$-bit word.

This definition equates the measured SINAD of the ADC or DAC under test with the SINAD of an ideal ADC or DAC whose word length is ENOB bits.
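The conversion in both directions can be sketched directly from the formula above (the function names are illustrative):

```python
def enob(sinad_db):
    """Effective number of bits from a measured SINAD in dB,
    using ENOB = (SINAD - 1.76) / 6.02."""
    return (sinad_db - 1.76) / 6.02

def ideal_sinad(bits):
    """SINAD (dB) of an ideal converter with the given word length."""
    return 6.02 * bits + 1.76

# An ideal 12-bit converter has a SINAD of 74.0 dB; a real 12-bit
# converter measuring only 59 dB resolves about 9.5 effective bits:
print(round(ideal_sinad(12), 1))   # 74.0
print(round(enob(59.0), 2))        # 9.51
```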