Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.[2]

The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as[3]: 16

$$\mathrm{H}(X,Y) = -\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} P(x,y)\log_2[P(x,y)]$$

where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x,y)$ is the joint probability of these values occurring together, and $P(x,y)\log_2[P(x,y)]$ is defined to be 0 if $P(x,y)=0$.
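As a sketch of how this sum can be evaluated in practice, the following Python snippet computes the joint entropy in bits from a joint probability table; the function name and the example distribution are chosen here purely for illustration.

```python
import numpy as np

def joint_entropy(pxy):
    """Joint Shannon entropy (in bits) of a joint probability table,
    where pxy[i, j] = P(X = x_i, Y = y_j)."""
    p = np.asarray(pxy, dtype=float)
    nz = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(nz * np.log2(nz))

# Hypothetical joint distribution of two binary variables:
#                 Y=0   Y=1
pxy = np.array([[0.25, 0.25],          # X=0
                [0.50, 0.00]])         # X=1
print(joint_entropy(pxy))              # 1.5 bits
```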

For more than two random variables $X_1,\ldots,X_n$ this expands to

$$\mathrm{H}(X_1,\ldots,X_n) = -\sum_{x_1\in\mathcal{X}_1}\cdots\sum_{x_n\in\mathcal{X}_n} P(x_1,\ldots,x_n)\log_2[P(x_1,\ldots,x_n)]$$

where $x_1,\ldots,x_n$ are particular values of $X_1,\ldots,X_n$, respectively, $P(x_1,\ldots,x_n)$ is the probability of these values occurring together, and $P(x_1,\ldots,x_n)\log_2[P(x_1,\ldots,x_n)]$ is defined to be 0 if $P(x_1,\ldots,x_n)=0$.
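Since the sum simply runs over every joint outcome, the same computation extends directly to an $n$-dimensional probability array; a minimal sketch, again with an illustrative distribution:

```python
import numpy as np

def joint_entropy_n(p):
    """Joint Shannon entropy (in bits) of an n-dimensional joint probability array."""
    p = np.asarray(p, dtype=float).ravel()   # one term per joint outcome
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

# Three independent fair coin flips: 8 equally likely outcomes -> 3 bits.
p_xyz = np.full((2, 2, 2), 1 / 8)
print(joint_entropy_n(p_xyz))                # 3.0
```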

The joint entropy of a set of random variables is a nonnegative number.

The joint entropy of a set of variables is greater than or equal to the maximum of all of the individual entropies of the variables in the set.

The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set.

This is an example of subadditivity.

This inequality is an equality if and only if $X_1,\ldots,X_n$ are statistically independent.[3]: 30
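These three properties (nonnegativity, the bound by each individual entropy, and subadditivity) can be checked numerically; the sketch below uses a hypothetical correlated joint table and the same entropy formula as above.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits); works for marginal vectors and joint tables alike."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

# Hypothetical correlated pair; marginals are obtained by summing the joint table.
pxy = np.array([[0.25, 0.25],
                [0.50, 0.00]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

H_xy, H_x, H_y = entropy(pxy), entropy(px), entropy(py)
print(H_xy >= 0)                   # True: nonnegativity
print(H_xy >= max(H_x, H_y))       # True: joint entropy >= each individual entropy
print(H_xy <= H_x + H_y)           # True: subadditivity (equality iff independent)
```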

Joint entropy is used in the definition of conditional entropy,[3]: 22 $\mathrm{H}(X|Y) = \mathrm{H}(X,Y) - \mathrm{H}(Y)$, and in the definition of mutual information,[3]: 21 $\mathrm{I}(X;Y) = \mathrm{H}(X) + \mathrm{H}(Y) - \mathrm{H}(X,Y)$. In quantum information theory, the joint entropy is generalized into the joint quantum entropy.
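As a worked check of these two identities, take the hypothetical binary joint table used in the sketches above, for which $\mathrm{H}(X)=1$, $\mathrm{H}(Y)\approx 0.811$ and $\mathrm{H}(X,Y)=1.5$ bits:

$$\mathrm{H}(X|Y) = \mathrm{H}(X,Y)-\mathrm{H}(Y) \approx 0.689 \text{ bits}, \qquad \mathrm{I}(X;Y) = \mathrm{H}(X)+\mathrm{H}(Y)-\mathrm{H}(X,Y) \approx 0.311 \text{ bits}.$$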

The above definition is for discrete random variables and is just as valid in the case of continuous random variables.

The continuous version of discrete joint entropy is called joint differential (or continuous) entropy.

Let $X$ and $Y$ be continuous random variables with a joint probability density function $f(x,y)$. The differential joint entropy $h(X,Y)$ is defined as[3]: 249

$$h(X,Y) = -\int_{\mathcal{X},\mathcal{Y}} f(x,y)\log f(x,y)\,dx\,dy$$

For more than two continuous random variables $X_1,\ldots,X_n$ the definition is generalized to:

$$h(X_1,\ldots,X_n) = -\int f(x_1,\ldots,x_n)\log f(x_1,\ldots,x_n)\,dx_1\cdots dx_n$$

The integral is taken over the support of $f$. It is possible that the integral does not exist, in which case we say that the differential entropy is not defined.
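As a worked example (not drawn from the text above), the joint differential entropy of a multivariate normal distribution has the well-known closed form $\tfrac12\ln\!\big((2\pi e)^n \det\Sigma\big)$ nats, where $\Sigma$ is the covariance matrix. A minimal Python sketch, with an illustrative covariance matrix:

```python
import numpy as np

def gaussian_joint_entropy(cov):
    """Differential joint entropy (in nats) of a multivariate normal
    with covariance matrix `cov`: 0.5 * ln((2*pi*e)^n * det(cov))."""
    cov = np.asarray(cov, dtype=float)
    n = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(cov))

# Illustrative example: two unit-variance Gaussians with correlation 0.8.
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
print(gaussian_joint_entropy(cov))   # about 2.33 nats
```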

As in the discrete case, the joint differential entropy of a set of random variables is less than or equal to the sum of the entropies of the individual random variables:

$$h(X_1,\ldots,X_n) \le \sum_{i=1}^n h(X_i)$$

The following chain rule holds for two random variables:

$$h(X,Y) = h(X|Y) + h(Y)$$

In the case of more than two random variables this generalizes to:[3]: 253

$$h(X_1,\ldots,X_n) = \sum_{i=1}^n h(X_i \mid X_1,\ldots,X_{i-1})$$

Joint differential entropy is also used in the definition of the mutual information between continuous random variables:

$$\mathrm{I}(X;Y) = h(X) + h(Y) - h(X,Y)$$
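As a worked check, assume $X$ and $Y$ are jointly normal with unit variances and correlation $\rho = 0.8$ (an illustrative choice matching the sketch above, not taken from the text). Then $h(X) = h(Y) = \tfrac12\ln(2\pi e) \approx 1.419$ nats and $h(X,Y) = \tfrac12\ln\!\big((2\pi e)^2(1-\rho^2)\big) \approx 2.327$ nats, so

$$h(X,Y) \approx 2.327 \le h(X)+h(Y) \approx 2.838, \qquad \mathrm{I}(X;Y) = h(X)+h(Y)-h(X,Y) = -\tfrac12\ln(1-\rho^2) \approx 0.511 \text{ nats}.$$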

A misleading[1] Venn diagram showing additive and subtractive relationships between various information measures associated with correlated variables X and Y. The area contained by both circles is the joint entropy H(X,Y). The circle on the left (red and violet) is the individual entropy H(X), with the red being the conditional entropy H(X|Y). The circle on the right (blue and violet) is H(Y), with the blue being H(Y|X). The violet is the mutual information I(X;Y).