Kell factor

The Kell factor, named after RCA engineer Raymond D. Kell, is a parameter used to limit the bandwidth of a sampled image signal so that beat-frequency patterns do not appear when the image is shown on a discrete display device; it is usually taken to be 0.7. Kell and his associates first measured the number in 1934 as 0.64, but it has since undergone several revisions, since it rests on image perception, and is therefore subjective, and also depends on the type of display. It was later revised to 0.85, and it can exceed 0.9 when fixed-pixel capture (e.g., CCD or CMOS sensors) and fixed-pixel displays (e.g., LCD or plasma) are used, or fall to about 0.7 for electron-gun scanning.

From a different perspective, the Kell factor defines the effective resolution of a discrete display device since the full resolution cannot be used without viewing experience degradation. The actual sampled resolution will depend on the spot size and intensity distribution. For electron gun scanning systems, the spot usually has a Gaussian intensity distribution. For CCDs, the distribution is somewhat rectangular, and is also affected by the sampling grid and inter-pixel spacing.

Kell factor is sometimes incorrectly stated to exist to account for the effects of interlacing. Interlacing itself does not affect Kell factor, but because interlaced video must be low-pass filtered (i.e., blurred) in the vertical dimension to avoid spatio-temporal aliasing (i.e., flickering effects), the Kell factor of interlaced video is said to be about 70% that of progressive video with the same scan line resolution.
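The claim above can be illustrated with a short arithmetic sketch. The numbers here are assumptions drawn from the text (a progressive Kell factor of 0.7 and 576 scan lines); the variable names are hypothetical:

```python
# Hedged illustration: if progressive video has a Kell factor of 0.7,
# interlaced video with the same scan-line count is said to deliver
# roughly 70% of that, due to the vertical low-pass filtering.
kell_progressive = 0.7
kell_interlaced = 0.7 * kell_progressive      # about 0.49

# Effective vertical resolution for 576 scan lines (values assumed):
lines = 576
progressive_res = kell_progressive * lines    # 403.2 lines
interlaced_res = kell_interlaced * lines      # about 282 lines
```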

The beat frequency problem
To understand how the distortion arises, consider an ideal linear chain from sampling to display. When a signal is sampled at a rate of at least twice its highest frequency component (the Nyquist rate), it can be fully reconstructed by low-pass filtering, since the first repeat spectrum does not overlap the baseband spectrum. In a discrete display, however, the image signal is not low-pass filtered, because the display takes discrete values as input; the displayed signal therefore contains all the repeat spectra. The proximity of the highest frequency of the baseband signal to the lowest frequency of the first repeat spectrum induces a beat-frequency pattern, which on screen can resemble a Moiré pattern. The Kell factor is the reduction in signal bandwidth needed so that no beat frequency is perceived by the viewer.
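The guard-band argument above can be sketched numerically. This is a minimal illustration with assumed values (a hypothetical sampling frequency of 576 samples per picture height and a Kell factor of 0.7), not a derivation from the text:

```python
# Assumed parameters for illustration only.
fs = 576.0               # samples per picture height (hypothetical)
nyquist = fs / 2         # finest pattern the sampling grid can carry
kell = 0.7               # typical Kell factor

f_max = kell * nyquist   # usable bandwidth after the Kell reduction

# The lowest component of the first repeat spectrum sits at fs - f_max;
# the gap between it and the top of the baseband is what keeps the two
# spectra from beating visibly against each other.
guard_band = (fs - f_max) - f_max
```

With these numbers the usable bandwidth is 201.6 cycles per picture height and the guard band is 172.8; with a Kell factor of 1 the guard band would shrink to zero and the beat pattern would appear.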

Examples

 * A 625-line analog (e.g., 50 Hz PAL) television picture is divided into 576 visible lines from top to bottom. Suppose a card featuring horizontal black and white stripes is placed in front of the camera. The effective vertical resolution of the TV system is equal to the largest number of stripes that can fit within the picture height and still appear as individual stripes. Since it is unlikely the stripes will line up perfectly with the camera's scan lines, the number is somewhat less than 576. Using a Kell factor of 0.7, the number can be determined to be 0.7×576 = 403.2 lines of resolution.
 * Kell factor can be used to determine the horizontal resolution that is required to match the vertical resolution attained by a given number of scan lines. For 576i at 50 Hz, given its 4:3 aspect ratio, the required horizontal resolution must be 4/3 times the effective vertical resolution, or (4/3)×0.7×576 = 537.6 pixels per line. Taken further, since 537.6 pixels is equal to a maximum of 268.8 cycles for an alternating pixel pattern, and given 576i 50 Hz has an active line period of 52 μs, its luminance signal requires a bandwidth of 268.8/52 = 5.17 MHz.
 * Kell factor applies equally to digital devices. Using a Kell factor of 0.9, a 1080p HDTV video system using a CCD camera and an LCD or plasma display has an effective resolution of only about 1728×972 (0.9×1920 by 0.9×1080) rather than the full 1920×1080.
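The three worked examples above reduce to a few lines of arithmetic; the constants below are taken directly from the text:

```python
# Example 1: effective vertical resolution of 576 visible lines
# with a Kell factor of 0.7.
KELL = 0.7
ACTIVE_LINES = 576
v_res = KELL * ACTIVE_LINES                 # 403.2 lines

# Example 2: matching horizontal resolution for a 4:3 aspect ratio,
# and the resulting luminance bandwidth for 576i at 50 Hz.
h_res = (4 / 3) * v_res                     # 537.6 pixels per line
cycles = h_res / 2                          # 268.8 cycles (pixel pairs)
active_line_us = 52                         # active line period, in μs
bandwidth_mhz = cycles / active_line_us     # about 5.17 MHz

# Example 3: fixed-pixel 1080p system with a Kell factor of 0.9.
KELL_FIXED = 0.9
eff_h = KELL_FIXED * 1920                   # 1728
eff_v = KELL_FIXED * 1080                   # 972
```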