Kell factor

The Kell factor, named after RCA engineer Raymond D. Kell,[1] is a parameter used to limit the bandwidth of a sampled image signal so that beat-frequency patterns do not appear when the image is shown on a discrete display device. It is usually taken to be 0.7.

The number was first measured in 1934 by Raymond D. Kell and his associates as 0.64, but it has undergone several revisions, since it is based on image perception, and is therefore subjective, and it also depends on the type of display.

For charge-coupled devices, the distribution is somewhat rectangular and is also affected by the sampling grid and the inter-pixel spacing.

Kell factor is sometimes incorrectly stated to exist to account for the effects of interlacing.

The Kell factor is the reduction in signal bandwidth necessary to ensure that no beat frequency is perceived by the viewer.
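In practice, the factor is often applied as a simple multiplier on a display's line count to estimate the picture detail a viewer can actually resolve. The following sketch illustrates that arithmetic, assuming the commonly quoted value of 0.7; the function name `effective_resolution` is hypothetical.

```python
def effective_resolution(scan_lines: int, kell_factor: float = 0.7) -> float:
    """Estimate the number of lines of picture detail actually resolvable,
    given the display's scan-line count and an assumed Kell factor."""
    return kell_factor * scan_lines

# For a 480-line display, roughly 0.7 * 480 = 336 lines of detail
# are resolvable without visible beat patterns.
print(effective_resolution(480))
```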

At 0.5 cycles/pixel, the Nyquist limit, signal amplitude depends on phase, as shown by the three medium-gray curves where the signal goes 90° out of phase with the pixels.
At 0.33 cycles/pixel, 0.66 times the Nyquist limit, amplitude can largely be maintained regardless of phase. Some artifacts are still visible, but they are minor.
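The phase dependence described above can be demonstrated numerically: sample a unit-amplitude sinusoid once per pixel and record the largest sample magnitude. This is an illustrative sketch, not from the source; the helper `sampled_peak` is hypothetical.

```python
import math

def sampled_peak(cycles_per_pixel: float, phase_deg: float, n_pixels: int = 600) -> float:
    """Largest |sample| of a unit-amplitude sinusoid sampled once per pixel."""
    phase = math.radians(phase_deg)
    return max(abs(math.sin(2 * math.pi * cycles_per_pixel * n + phase))
               for n in range(n_pixels))

# At the Nyquist limit (0.5 cycles/pixel), amplitude depends entirely on phase:
print(round(sampled_peak(0.5, 90.0), 3))  # 1.0 : samples land on the peaks
print(round(sampled_peak(0.5, 0.0), 3))   # 0.0 : samples land on zero crossings

# At 1/3 cycle/pixel (two-thirds of Nyquist), amplitude is largely maintained
# for any phase; the peak never drops below sin(60°) ≈ 0.866:
print(round(sampled_peak(1/3, 0.0), 3))
print(round(sampled_peak(1/3, 90.0), 3))
```

The collapse to zero at the Nyquist limit is exactly the phase sensitivity the Kell factor guards against: limiting the signal to well below 0.5 cycles/pixel keeps the reproduced amplitude nearly independent of where the detail falls on the pixel grid.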