In information theory, Shannon–Fano–Elias coding is a precursor to arithmetic coding, in which probabilities are used to determine codewords.[1] It is named for Claude Shannon, Robert Fano, and Peter Elias.
Given a discrete random variable X of ordered values to be encoded, let $p(x)$ be the probability for any $x \in X$. Define the modified cumulative distribution function

$$\bar F(x) = \sum_{x_i < x} p(x_i) + \frac{1}{2} p(x).$$

Algorithm: For each $x$ in $X$, let $Z$ be the binary expansion of $\bar F(x)$. Choose the codeword length $L(x) = \left\lceil \log_2 \frac{1}{p(x)} \right\rceil + 1$, and take as the codeword $\mathrm{code}(x)$ the first $L(x)$ bits after the binary point of $Z$.

Example: Let X = {A, B, C, D}, with probabilities p = {1/3, 1/4, 1/6, 1/4}.
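The following is a minimal sketch of the steps above, which reproduces this example; the function name sfe_code and the (symbol, probability) input format are illustrative choices, not from any library. Exact Fraction arithmetic keeps the truncated binary expansions free of floating-point rounding:

    from fractions import Fraction
    from math import ceil, log2

    def sfe_code(probs):
        # Shannon-Fano-Elias codewords for (symbol, probability) pairs,
        # taken in the fixed symbol order.
        codes = {}
        cdf = Fraction(0)                   # sum of p(x_i) for earlier symbols
        for symbol, p in probs:
            fbar = cdf + p / 2              # modified CDF value F-bar(x)
            length = ceil(log2(1 / p)) + 1  # L(x) = ceil(log2(1/p(x))) + 1
            bits, frac = "", fbar
            for _ in range(length):         # first L(x) bits after the binary point
                frac *= 2
                if frac >= 1:
                    bits += "1"
                    frac -= 1
                else:
                    bits += "0"
            codes[symbol] = bits
            cdf += p
        return codes

    print(sfe_code([("A", Fraction(1, 3)), ("B", Fraction(1, 4)),
                    ("C", Fraction(1, 6)), ("D", Fraction(1, 4))]))
    # -> {'A': '001', 'B': '011', 'C': '1010', 'D': '111'}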
Shannon–Fano–Elias coding produces a binary prefix code, allowing for direct decoding. Let bcode(x) be the rational number formed by adding a decimal point before a binary code; for example, if code(x) = 1010 then bcode(x) = 0.1010. For all x, if no y (with y ≠ x) exists such that $\mathrm{bcode}(x) \le \mathrm{bcode}(y) < \mathrm{bcode}(x) + 2^{-L(x)}$, then all of the codes form a prefix code.
By comparing $\bar F$ to the CDF of X, this property may be demonstrated graphically for Shannon–Fano–Elias coding.
By definition of L it follows that $2^{-L(y)} \le \frac{1}{2} p(y)$. And because the bits after $L(y)$ are truncated from $\bar F(y)$ to form code(y), it follows that $\bar F(y) - \mathrm{bcode}(y) \le 2^{-L(y)}$; thus bcode(y) must be no less than CDF(x) for every y following x. Since also $\mathrm{bcode}(x) \le \bar F(x) = \mathrm{CDF}(x) - \frac{1}{2} p(x)$, we get $\mathrm{bcode}(y) - \mathrm{bcode}(x) \ge \frac{1}{2} p(x) \ge 2^{-L(x)}$, so no bcode(y) falls inside the interval $[\mathrm{bcode}(x), \mathrm{bcode}(x) + 2^{-L(x)})$ and the code is prefix.
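As a sketch of this condition in code (the helper names bcode and is_prefix_code are hypothetical), the check below confirms that no example codeword's bcode value falls inside another codeword's interval:

    from fractions import Fraction

    def bcode(code):
        # Rational value of "0." + code read in binary, e.g. bcode("1010") = 5/8.
        return Fraction(int(code, 2), 2 ** len(code))

    def is_prefix_code(codes):
        # True iff no codeword's bcode lies in another codeword's interval
        # [bcode(x), bcode(x) + 2^(-L(x))), the condition stated above.
        for cx in codes.values():
            lo = bcode(cx)
            hi = lo + Fraction(1, 2 ** len(cx))  # interval width 2^(-L(x))
            for cy in codes.values():
                if cy != cx and lo <= bcode(cy) < hi:
                    return False                 # one codeword extends the other
        return True

    print(is_prefix_code({"A": "001", "B": "011", "C": "1010", "D": "111"}))  # True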
The expected codeword length is $LC(X) = \sum_x p(x) L(x)$, and since $\log_2 \frac{1}{p(x)} + 1 \le L(x) < \log_2 \frac{1}{p(x)} + 2$, it follows that for $H(X)$, the entropy of the random variable X,

$$H(X) + 1 \le LC(X) < H(X) + 2.$$

Shannon–Fano–Elias coding thus uses from 1 to 2 extra bits per symbol beyond the entropy, so the code is not used in practice.
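A quick numeric check of this bound on the example distribution (plain floating point suffices here):

    from math import ceil, log2

    p = [1/3, 1/4, 1/6, 1/4]
    H = -sum(q * log2(q) for q in p)                  # entropy H(X), about 1.9591
    LC = sum(q * (ceil(log2(1 / q)) + 1) for q in p)  # expected length, about 3.1667
    assert H + 1 <= LC < H + 2
    print(f"H(X) = {H:.4f}, LC(X) = {LC:.4f}")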