It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take.
Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets.
For example, for any two random variables $X_1, X_2$, their joint entropy (the entropy of the pair) is at most the sum of the entropies of $X_1$ and of $X_2$:
$$H(X_1, X_2) \le H(X_1) + H(X_2).$$
Other information-theoretic measures such as conditional information, mutual information, or total correlation can be expressed in terms of joint entropy and are thus related by the corresponding inequalities.
Many inequalities satisfied by entropic vectors can be derived as linear combinations of a few basic ones, called Shannon-type inequalities.
However, it has been proven that for $n \ge 4$ variables, no finite set of linear inequalities is sufficient to characterize all entropic vectors.
Shannon's information entropy of a random variable $X$ is denoted $H(X)$. For a tuple of random variables $X_1, \dots, X_n$ and a subset $I \subseteq \{1, \dots, n\}$, the symbol $X_I$ can be understood as the random variable representing the tuple $(X_i)_{i \in I}$, and $H(X_I)$ denotes its joint entropy. A vector $h$ in $\mathbb{R}^{2^n - 1}$, indexed by the nonempty subsets of $\{1, \dots, n\}$, is called an entropic vector of order $n$ if there is a tuple of random variables $X_1, \dots, X_n$ such that $h_I = H(X_I)$ for every nonempty subset $I$. The set of all entropic vectors of order $n$ is denoted $\Gamma_n^*$. This set is not closed (for $n \ge 3$), but its closure, $\overline{\Gamma_n^*}$, is a convex cone and hence characterized by the (infinitely many) linear inequalities it satisfies. Describing the region $\overline{\Gamma_n^*}$ is thus equivalent to characterizing all possible inequalities on joint entropies.
Let $X, Y$ be two independent random variables with discrete uniform distribution over the set $\{0, 1\}$. Then $H(X) = H(Y) = 1$ (since each is uniformly distributed over a two-element set), and $H(X, Y) = 2$ (since the two variables are independent, which means the pair $(X, Y)$ is uniformly distributed over the four values in $\{0,1\}^2$). The corresponding entropic vector is thus $(H(X), H(Y), H(X,Y)) = (1, 1, 2)$. On the other hand, the vector $(1, 1, 3)$ is not entropic, because any pair of random variables (independent or not) should satisfy $H(X, Y) \le H(X) + H(Y)$.
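A minimal numerical sketch of this example (not part of the article; the helper names entropy_bits and entropic_vector and the bitmask encoding of subsets are ad-hoc choices for this illustration) computes the entropic vector directly from the joint distribution:

```python
from math import log2
from collections import defaultdict
from itertools import product

def entropy_bits(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def entropic_vector(joint, n):
    """Map each nonempty subset I of {0,...,n-1}, encoded as a bitmask, to H(X_I)."""
    h = {}
    for mask in range(1, 1 << n):
        marginal = defaultdict(float)
        for outcome, p in joint.items():
            marginal[tuple(outcome[i] for i in range(n) if mask & (1 << i))] += p
        h[mask] = entropy_bits(marginal)
    return h

# Joint distribution of two independent fair bits (X, Y)
joint = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}
h = entropic_vector(joint, 2)
print(h[0b01], h[0b10], h[0b11])   # 1.0 1.0 2.0 -> the entropic vector (1, 1, 2)
```

No choice of joint distribution can produce $(1, 1, 3)$, since the computed $H(X,Y)$ can never exceed $H(X) + H(Y)$.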
The Shannon inequality says that an entropic vector is submodular:
$$H(X_I) + H(X_J) \ge H(X_{I \cup J}) + H(X_{I \cap J}).$$
It is equivalent to the inequality stating that the conditional mutual information is non-negative:
$$I(X; Y \mid Z) \ge 0, \quad\text{where}\quad I(X; Y \mid Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z).$$
(For one direction, observe that the last form expresses Shannon's inequality for the subsets $\{X, Z\}$ and $\{Y, Z\}$ of the tuple $(X, Y, Z)$; for the other direction, substitute $X = X_{I \setminus J}$, $Y = X_{J \setminus I}$, $Z = X_{I \cap J}$.)
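As an illustration (not from the article; the random distribution and helper names are ad-hoc for this sketch), the non-negativity of conditional mutual information can be checked numerically for an arbitrary joint distribution of three variables:

```python
import random
from math import log2
from collections import defaultdict
from itertools import product

def H(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, coords):
    """Marginal distribution of the coordinates listed in coords."""
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[tuple(outcome[i] for i in coords)] += p
    return m

random.seed(0)
outcomes = list(product([0, 1], repeat=3))          # support of (X, Y, Z)
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
cmi = (H(marginal(joint, (0, 2))) + H(marginal(joint, (1, 2)))
       - H(marginal(joint, (0, 1, 2))) - H(marginal(joint, (2,))))
print(cmi >= -1e-12)   # True: conditional mutual information is non-negative
```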
Inequalities that can be derived as linear combinations of these basic inequalities are the Shannon-type inequalities mentioned above; the set of vectors satisfying all of them is denoted $\Gamma_n$, and it contains $\Gamma_n^*$. Software has been developed to automate the task of proving Shannon-type inequalities.
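Such provers are typically based on linear programming: an inequality $\sum_I c_I h_I \ge 0$ is Shannon-type exactly when it is non-negative over the polyhedral cone $\Gamma_n$ cut out by the elemental Shannon inequalities (tools such as ITIP follow this general approach). Below is a hedged sketch of that idea using scipy; the function names, the bitmask encoding of subsets, and the particular listing of elemental inequalities are choices made for this illustration, not the interface of any specific tool.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import linprog

def elemental_inequalities(n):
    """Yield coefficient vectors e such that e . h >= 0 is an elemental Shannon
    inequality; h is indexed by nonempty subsets of {0,...,n-1} (bitmask - 1)."""
    full = (1 << n) - 1
    dim = (1 << n) - 1
    # H(X_i | all other variables) >= 0
    for i in range(n):
        e = np.zeros(dim)
        e[full - 1] += 1.0
        rest = full & ~(1 << i)
        if rest:
            e[rest - 1] -= 1.0
        yield e
    # I(X_i ; X_j | X_K) >= 0 for all i < j and all K not containing i, j
    for i, j in combinations(range(n), 2):
        others = [k for k in range(n) if k not in (i, j)]
        for r in range(len(others) + 1):
            for K in combinations(others, r):
                kmask = sum(1 << k for k in K)
                e = np.zeros(dim)
                e[(kmask | (1 << i)) - 1] += 1.0
                e[(kmask | (1 << j)) - 1] += 1.0
                e[(kmask | (1 << i) | (1 << j)) - 1] -= 1.0
                if kmask:
                    e[kmask - 1] -= 1.0
                yield e

def is_shannon_type(c, n):
    """True if c . h >= 0 holds on all of Gamma_n: minimize c . h over the cone;
    the minimum is 0 if the inequality is valid, unbounded below otherwise."""
    A_ub = np.array([-e for e in elemental_inequalities(n)])   # e.h >= 0  ->  -e.h <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(A_ub)),
                  bounds=[(None, None)] * len(c))
    return res.status == 0          # 0: optimum (zero) found; 3: unbounded below

# Example: subadditivity H(X1,X2) <= H(X1) + H(X2) for n = 2,
# i.e. h_{1} + h_{2} - h_{1,2} >= 0 (coordinates ordered {1}, {2}, {1,2}).
print(is_shannon_type(np.array([1.0, 1.0, -1.0]), 2))   # True
```

Under this encoding, a non-Shannon-type inequality such as the Zhang–Yeung inequality discussed below would make the linear program unbounded, and the check would report it as not Shannon-type.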
The question of whether Shannon-type inequalities are the only ones, that is, whether they completely characterize the region $\Gamma_n^*$, was first asked by Te Sun Han in 1981[2] and more precisely by Nicholas Pippenger in 1986. It is not hard to show that this is true for two variables, that is, $\Gamma_2^* = \Gamma_2$. For three variables, Zhang and Yeung proved that $\Gamma_3^* \ne \Gamma_3$; however, it is still asymptotically true, meaning that the closure is equal: $\overline{\Gamma_3^*} = \Gamma_3$.
In 1998, Zhang and Yeung showed that $\overline{\Gamma_n^*} \ne \Gamma_n$ for all $n \ge 4$, by proving that the following inequality on four random variables (in terms of conditional mutual information) is true for any entropic vector, but is not Shannon-type:
$$2 I(C; D) \le I(A; B) + I(A; C, D) + 3 I(C; D \mid A) + I(C; D \mid B).$$
Further inequalities and infinite families of inequalities have been found.
In 2007, Matúš proved that no finite set of linear inequalities is sufficient (to deduce all valid ones as linear combinations), for $n \ge 4$ random variables.[11] Whether entropic vectors can be characterized in some other way (allowing one to effectively decide whether a vector is entropic or not) remains an open problem.
Analogous questions for von Neumann entropy in quantum information theory have been considered.
Some inner bounds of $\overline{\Gamma_n^*}$ are also known. One example is that $\overline{\Gamma_4^*}$ contains all vectors in $\Gamma_4$ which additionally satisfy the following inequality (and those obtained by permuting variables), known as Ingleton's inequality for entropy:[13]
$$I(X_1; X_2) \le I(X_1; X_2 \mid X_3) + I(X_1; X_2 \mid X_4) + I(X_3; X_4).$$

Consider a group $G$ and subgroups $G_1, G_2, \dots, G_n$ of $G$. For a subset $I \subseteq \{1, \dots, n\}$, let $G_I$ denote the intersection $\bigcap_{i \in I} G_i$, which is also a subgroup of $G$. It is possible to construct a probability distribution for $n$ random variables $X_1, \dots, X_n$ such that
$$H(X_I) = \log \frac{|G|}{|G_I|}$$
for every nonempty $I$ (essentially, take an element $a$ of $G$ uniformly at random and let $X_i$ be the corresponding coset $a G_i$). Thus any information-theoretic inequality implies a group-theoretic one; for example, the basic inequality $H(X_1, X_2) \le H(X_1) + H(X_2)$ implies that
$$|G| \cdot |G_1 \cap G_2| \ge |G_1| \cdot |G_2|.$$
It turns out the converse is essentially true.
More precisely, a vector is said to be group-characterizable if it can be obtained from a tuple of subgroups as above.
Group-characterizable vectors that come from an abelian group satisfy Ingleton's inequality.
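A minimal sketch of the coset construction above (illustrative only; the group $\mathbb{Z}_2 \times \mathbb{Z}_2$, its two subgroups, and the helper names are ad-hoc choices): drawing a uniform element $a$ of $G$ and letting $X_i$ be the coset $a G_i$ reproduces $H(X_I) = \log(|G| / |G_I|)$.

```python
from itertools import product
from math import log2
from collections import Counter

G = list(product([0, 1], repeat=2))        # the group Z2 x Z2, written additively
G1 = [(0, 0), (0, 1)]                      # subgroup of elements with first coordinate 0
G2 = [(0, 0), (1, 0)]                      # subgroup of elements with second coordinate 0

def coset(a, subgroup):
    """The coset a + subgroup, as a frozenset, used as the value of X_i."""
    return frozenset(((a[0] + s[0]) % 2, (a[1] + s[1]) % 2) for s in subgroup)

def entropy_bits(values):
    """Entropy in bits of a uniformly drawn index mapped through `values`."""
    counts = Counter(values)
    return -sum((c / len(values)) * log2(c / len(values)) for c in counts.values())

# X1, X2 as functions of a uniformly random group element a
X1 = [coset(a, G1) for a in G]
X2 = [coset(a, G2) for a in G]

print(entropy_bits(X1))                    # 1.0 = log2(|G| / |G1|)
print(entropy_bits(X2))                    # 1.0 = log2(|G| / |G2|)
print(entropy_bits(list(zip(X1, X2))))     # 2.0 = log2(|G| / |G1 ∩ G2|)
```

The resulting vector $(1, 1, 2)$ coincides with the entropic vector of the two independent fair bits in the earlier example.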
Kolmogorov complexity satisfies essentially the same inequalities as entropy.
Namely, denote the Kolmogorov complexity of a finite string $x$ as $K(x)$ (that is, the length of the shortest program that outputs $x$). The joint complexity of two strings $x, y$, defined as the complexity of an encoding of the pair $\langle x, y \rangle$, can be denoted $K(x, y)$. Similarly, the conditional complexity can be denoted $K(x \mid y)$ (the length of the shortest program that outputs $x$ given $y$). Andrey Kolmogorov noticed that these notions behave similarly to Shannon entropy, for example:
$$K(x, y) \le K(x) + K(y) + O(\log(|x| + |y|)).$$
In 2000, Hammer et al.[16] proved that indeed an inequality holds for entropic vectors if and only if the corresponding inequality in terms of Kolmogorov complexity holds up to logarithmic terms for all tuples of strings.
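Kolmogorov complexity itself is uncomputable, so it cannot be evaluated in code; as a loose, purely illustrative stand-in (an assumption of this sketch, not something from the article), one can use compressed length, under which the subadditivity relation above is typically visible:

```python
import zlib

def clen(s: bytes) -> int:
    """Compressed length in bytes: a crude, computable stand-in for K(s)."""
    return len(zlib.compress(s, 9))

x = b"abracadabra" * 100
y = b"hocus pocus" * 100
# The compressed length of the pair is typically at most the sum plus a small overhead.
print(clen(x), clen(y), clen(x + b"\x00" + y))
```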