Information theory and measure theory

This article discusses how information theory (a branch of mathematics studying the transmission, processing and storage of information) is related to measure theory (a branch of mathematics related to integration and probability).

Many of the concepts in information theory have separate definitions and formulas for continuous and discrete cases. For example, entropy $H(X)$ is usually defined for discrete random variables, whereas for continuous random variables the related concept of differential entropy, written $h(X)$, is used. Both of these concepts are mathematical expectations, but the expectation is defined with an integral for the continuous case and a sum for the discrete case.

These separate definitions can be more closely related in terms of measure theory.

For discrete random variables, probability mass functions can be considered density functions with respect to the counting measure.

Thinking of both the integral and the sum as integration on a measure space allows for a unified treatment.
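As a concrete sketch of this unified view (the function below is illustrative, not part of any standard library): for a discrete variable, integrating $-p(x)\log p(x)$ against the counting measure reduces to a sum over the support, which recovers the usual Shannon entropy formula.

```python
import math

def entropy_counting_measure(pmf):
    """Entropy in bits: the 'integral' of -p(x) * log2 p(x) against the
    counting measure, which for a discrete pmf is simply a sum over the
    support (the points where p(x) > 0)."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# A fair coin: the pmf is its density with respect to the counting
# measure on the two-point space {"heads", "tails"}.
fair_coin = {"heads": 0.5, "tails": 0.5}
print(entropy_counting_measure(fair_coin))  # prints 1.0
```

For a continuous density, the same integrand would instead be integrated against Lebesgue measure, e.g. by numerical quadrature.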

Consider the formula for the differential entropy of a continuous random variable $X$ with range $\mathbb{R}$ and probability density function $f(x)$:

$$h(X) = -\int_{\mathbb{R}} f(x) \log f(x) \, dx.$$

This can be interpreted as the integral

$$h(X) = -\int_{\mathbb{R}} f(x) \log f(x) \, d\mu(x),$$

where $\mu$ is the Lebesgue measure. If instead $X$ is discrete, with range $\Omega$ a countable set and probability mass function $p(x)$, and $\nu$ is the counting measure on $\Omega$, we can write:

$$H(X) = -\int_{\Omega} p(x) \log p(x) \, d\nu(x) = -\sum_{x \in \Omega} p(x) \log p(x).$$

The integral expression, and the general concept, are identical to the continuous case; the only difference is the measure used.

In both cases, whenever $P$ is absolutely continuous with respect to a measure $Q$, the probability density function $dP/dQ$ (the Radon–Nikodym derivative) exists and the Kullback–Leibler divergence can be expressed in its full generality:

$$D_{\mathrm{KL}}(P \,\|\, Q) = \int_{\operatorname{supp} P} \frac{dP}{dQ} \log \frac{dP}{dQ} \, dQ,$$

where the integral runs over the support of $P$. Note that we have dropped the negative sign: the Kullback–Leibler divergence is always non-negative due to Gibbs' inequality.
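For discrete distributions the Radon–Nikodym derivative $dP/dQ$ is just the ratio of the two mass functions, so the general formula reduces to the familiar sum. A minimal Python sketch (the function name and the example distributions are invented for illustration):

```python
import math

def kl_divergence(p, q):
    """D(P||Q) in bits for two discrete distributions on the same finite
    support: sum of p(x) * log2(p(x)/q(x)).  Requires P absolutely
    continuous w.r.t. Q, i.e. q(x) > 0 wherever p(x) > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
print(kl_divergence(p, q))  # prints 0.25
print(kl_divergence(p, p))  # prints 0.0 (Gibbs: zero iff P == Q)
```

The second call illustrates Gibbs' inequality: the divergence is non-negative, and zero exactly when the two distributions coincide on the support.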

There is an analogy between Shannon's basic "measures" of the information content of random variables and a measure over sets.

Namely, the joint entropy, conditional entropy, and mutual information can be considered as the measure of a set union, set difference, and set intersection, respectively (Reza pp. 106–108).

If we associate the existence of abstract sets $\tilde{X}$ and $\tilde{Y}$ to arbitrary discrete random variables $X$ and $Y$, somehow representing the information borne by $X$ and $Y$, respectively, such that $\mu(\tilde{X} \cap \tilde{Y}) = 0$ whenever $X$ and $Y$ are unconditionally independent, and $\tilde{X} = \tilde{Y}$ whenever one of $X$ and $Y$ is completely determined by the other (i.e. by a bijection); where $\mu$ is a signed measure over these sets, and we set:

$$H(X) = \mu(\tilde{X}),$$
$$H(Y) = \mu(\tilde{Y}),$$
$$H(X,Y) = \mu(\tilde{X} \cup \tilde{Y}),$$
$$H(X \mid Y) = \mu(\tilde{X} \setminus \tilde{Y}),$$
$$I(X;Y) = \mu(\tilde{X} \cap \tilde{Y});$$

we find that Shannon's "measure" of information content satisfies all the postulates and basic properties of a formal signed measure over sets, as commonly illustrated in an information diagram.

This allows the sum of two measures to be written:

$$\mu(\tilde{X}) + \mu(\tilde{Y}) = \mu(\tilde{X} \cup \tilde{Y}) + \mu(\tilde{X} \cap \tilde{Y}),$$

and the analog of Bayes' theorem ($\mu(\tilde{X}) + \mu(\tilde{Y} \setminus \tilde{X}) = \mu(\tilde{Y}) + \mu(\tilde{X} \setminus \tilde{Y})$) allows the difference of two measures to be written:

$$\mu(\tilde{X}) - \mu(\tilde{Y}) = \mu(\tilde{X} \setminus \tilde{Y}) - \mu(\tilde{Y} \setminus \tilde{X}).$$

This can be a handy mnemonic device in some situations, e.g.

$$H(X,Y) = H(X) + H(Y \mid X),$$
$$I(X;Y) = H(X) - H(X \mid Y).$$
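As a sanity check, the chain-rule identity $H(X,Y) = H(X) + H(Y \mid X)$ and the mutual-information identity $I(X;Y) = H(X) - H(X \mid Y)$ can be verified numerically on a toy joint distribution (the pmf values below are invented for the demonstration):

```python
import math

# Joint pmf p(x, y) for a correlated pair of bits (made-up numbers).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals of X and Y.
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

H_xy = H(joint.values())
H_x, H_y = H(px.values()), H(py.values())

# Conditional entropies computed directly from conditional probabilities:
# H(Y|X) = -sum p(x,y) log2 p(y|x), with p(y|x) = p(x,y) / p(x).
H_y_given_x = -sum(p * math.log2(p / px[a]) for (a, _), p in joint.items())
H_x_given_y = -sum(p * math.log2(p / py[b]) for (_, b), p in joint.items())
I_xy = H_x + H_y - H_xy

# Set-measure analogy: union = one set plus the difference of the other;
# intersection = one set minus the difference.
assert abs(H_xy - (H_x + H_y_given_x)) < 1e-9
assert abs(I_xy - (H_x - H_x_given_y)) < 1e-9
```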

Note that measures (expectation values of the logarithm) of true probabilities are called "entropy" and generally represented by the letter H, while other measures are often referred to as "information" or "correlation" and generally represented by the letter I.

For notational simplicity, the letter I is sometimes used for all measures.

Certain extensions to the definitions of Shannon's basic measures of information are necessary to deal with the σ-algebra generated by the sets that would be associated to three or more arbitrary random variables. Namely, $H(X,Y,Z,\ldots)$ needs to be defined in the obvious way as the entropy of a joint distribution, and a multivariate mutual information $I(X;Y;Z;\ldots)$ defined in a suitable manner so that we can set:

$$H(X,Y,Z,\ldots) = \mu(\tilde{X} \cup \tilde{Y} \cup \tilde{Z} \cup \cdots),$$
$$I(X;Y;Z;\ldots) = \mu(\tilde{X} \cap \tilde{Y} \cap \tilde{Z} \cap \cdots),$$

in order to define the (signed) measure over the whole σ-algebra.

There is no single universally accepted definition for the multivariate mutual information, but the one that corresponds here to the measure of a set intersection is due to Fano (1966: pp. 57–59).

The definition is recursive. As a base case the mutual information of a single random variable is defined to be its entropy: $I(X) = H(X)$. For $n \geq 2$ we set

$$I(X_1; \ldots; X_n) = I(X_1; \ldots; X_{n-1}) - I(X_1; \ldots; X_{n-1} \mid X_n),$$

where the conditional mutual information is defined as

$$I(X_1; \ldots; X_{n-1} \mid X_n) = \mathbb{E}_{X_n}\bigl(I(X_1; \ldots; X_{n-1}) \mid X_n\bigr).$$

The first step in the recursion yields Shannon's definition $I(X_1; X_2) = H(X_1) - H(X_1 \mid X_2)$.

The multivariate mutual information (the same as interaction information but for a change in sign) of three or more random variables can be negative as well as positive: let $X$ and $Y$ be two independent fair coin flips, and let $Z$ be their exclusive or. Then $I(X;Y;Z) = -1$ bit.
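This XOR example can be checked numerically. The sketch below (helper names are illustrative) evaluates $I(X;Y;Z)$ by inclusion–exclusion over the entropy "measure", which is how the signed measure $\mu$ assigns a value to the triple intersection:

```python
import math
from itertools import product

def H(probs):
    """Entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint pmf of (x, y, z) with X, Y independent fair bits and Z = X xor Y:
# each of the four (x, y) pairs has mass 1/4 and determines z.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal_entropy(indices):
    """Entropy of the marginal distribution over the given coordinates."""
    pmf = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in indices)
        pmf[key] = pmf.get(key, 0.0) + p
    return H(pmf.values())

H_x, H_y, H_z = (marginal_entropy([i]) for i in (0, 1, 2))
H_xy = marginal_entropy([0, 1])
H_xz = marginal_entropy([0, 2])
H_yz = marginal_entropy([1, 2])
H_xyz = H(joint.values())

# Inclusion-exclusion for the triple "intersection" of the signed measure:
I_xyz = H_x + H_y + H_z - H_xy - H_xz - H_yz + H_xyz
print(I_xyz)  # prints -1.0
```

Any two of the three variables are pairwise independent, yet together they are fully dependent, which is what drives the intersection term negative.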

The quantity $I(X,Y;Z)$ is the mutual information of the joint distribution of $X$ and $Y$ relative to $Z$, and can be interpreted as $\mu((\tilde{X} \cup \tilde{Y}) \cap \tilde{Z})$.

Many more complicated expressions can be built this way, and still have meaning, e.g. $I(X;Y;Z \mid W)$ or $H(X,Z \mid W,Y)$.

Venn diagram for various information measures associated with correlated variables X and Y. The area contained by both circles is the joint entropy H(X,Y). The circle on the left (red and cyan) is the individual entropy H(X), with the red being the conditional entropy H(X|Y). The circle on the right (blue and cyan) is H(Y), with the blue being H(Y|X). The cyan is the mutual information I(X;Y).
Venn diagram of information-theoretic measures for three variables x, y, and z. Each circle represents an individual entropy: H(x) is the lower left circle, H(y) the lower right, and H(z) the upper circle. The intersection of any two circles represents the mutual information for the two associated variables (e.g. I(x;z) is yellow and gray). The union of any two circles is the joint entropy for the two associated variables (e.g. H(x,y) is everything but green). The joint entropy H(x,y,z) of all three variables is the union of all three circles. It is partitioned into 7 pieces: red, blue, and green are the conditional entropies H(x|y,z), H(y|x,z), and H(z|x,y) respectively; yellow, magenta, and cyan are the conditional mutual informations I(x;z|y), I(y;z|x), and I(x;y|z) respectively; and gray is the multivariate mutual information I(x;y;z). The multivariate mutual information is the only one of these that may be negative.