However, this has the disadvantage that the choice it makes might require more bits while giving comparatively little quality benefit.
Rate-distortion optimization solves the aforementioned problem by acting as a combined cost metric, measuring both the deviation from the source material (distortion) and the bit cost (rate) for each possible decision outcome.
The deviation from the source is usually measured as the mean squared error (MSE), so that minimizing it maximizes the PSNR video quality metric.
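Both quantities are simple to compute per block. A minimal sketch of the distortion side follows; the 8×8 block size, 8-bit pixel range, and function names are illustrative choices, not tied to any particular codec.

```python
import numpy as np

def mse(source_block: np.ndarray, reconstructed_block: np.ndarray) -> float:
    """Mean squared error between a source block and its reconstruction."""
    diff = source_block.astype(np.float64) - reconstructed_block.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(source_block: np.ndarray, reconstructed_block: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher is better, infinite for identical blocks."""
    error = mse(source_block, reconstructed_block)
    if error == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / error)

# Example: an 8x8 source block and a slightly perturbed reconstruction.
rng = np.random.default_rng(0)
src = rng.integers(0, 256, size=(8, 8))
rec = np.clip(src + rng.integers(-2, 3, size=(8, 8)), 0, 255)
print(f"MSE = {mse(src, rec):.2f}, PSNR = {psnr(src, rec):.2f} dB")
```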
Calculating the bit cost is made more difficult by the entropy encoders in modern video codecs: to measure the actual bit cost, the rate-distortion optimization algorithm must pass each candidate block through the entropy coder.
In MPEG codecs, the full process consists of a discrete cosine transform, followed by quantization and entropy encoding.
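To make this concrete, the following sketch walks one 8×8 block through a toy version of that pipeline for several candidate quantizer steps. It assumes the commonly used Lagrangian formulation J = D + λ·R for combining distortion and rate; the rough per-coefficient bit estimate is only a stand-in for running a real entropy coder, and the quantizer steps and λ value are arbitrary illustrative choices.

```python
import numpy as np
from scipy.fft import dctn, idctn

def rd_cost(source_block, quant_step, lam):
    """Evaluate one candidate quantizer step: transform, quantize, reconstruct,
    then combine distortion and an approximate bit cost into J = D + lambda * R."""
    # Forward 2-D DCT of the source block (type-II, orthonormal).
    coeffs = dctn(source_block.astype(np.float64), norm="ortho")
    # Uniform scalar quantization of the transform coefficients.
    levels = np.round(coeffs / quant_step)
    # Stand-in for the entropy coder: a real encoder would code `levels` and
    # report the exact bit count; here we crudely estimate ~log2(|level|+1)+1
    # bits per nonzero coefficient. This estimate is purely illustrative.
    rate = float(np.sum(np.log2(np.abs(levels) + 1) + (levels != 0)))
    # Inverse transform of the dequantized coefficients gives the reconstruction.
    recon = idctn(levels * quant_step, norm="ortho")
    distortion = float(np.mean((source_block - recon) ** 2))  # MSE
    return distortion + lam * rate, distortion, rate

# Example: pick the best of several quantizer steps for one 8x8 block.
rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(8, 8)).astype(np.float64)
lam = 10.0  # trade-off between bits and distortion (hypothetical value)
best = min((rd_cost(block, q, lam), q) for q in (4, 8, 16, 32))
(cost, dist, bits), q = best
print(f"chose step {q}: J={cost:.1f}, MSE={dist:.1f}, ~{bits:.0f} bits")
```

The loop over quantizer steps mirrors the exhaustive character of rate-distortion optimization: each candidate is fully encoded (here, only approximately) and reconstructed before its combined cost is compared.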