Graph entropy

In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused.[1] This measure, first introduced by Körner in the 1970s,[2][3] has since also proven itself useful in other settings, including combinatorics.

Let $G = (V, E)$ be an undirected graph. The graph entropy of $G$, denoted $H(G)$, is defined as

$$H(G) = \min_{X, Y} I(X; Y),$$

where $X$ is chosen uniformly from $V$, $Y$ ranges over independent sets of $G$, the joint distribution of $X$ and $Y$ is such that $X \in Y$ with probability one, and $I(X; Y)$ is the mutual information of $X$ and $Y$.[4]

That is, if we let $\mathcal{I}$ denote the independent vertex sets in $G$, we wish to find the joint distribution $(X, Y)$ on $V \times \mathcal{I}$ with the lowest mutual information such that (i) the marginal distribution of the first term is uniform and (ii) in samples from the distribution, the second term contains the first term almost surely. The mutual information of $X$ and $Y$ is then called the entropy of $G$.
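
For example, if $G = K_n$ is the complete graph on $n$ vertices, every independent set contains at most one vertex, so condition (ii) forces $Y = \{X\}$; $Y$ then determines $X$ completely, giving

$$H(K_n) = I(X; Y) = H(X) = \log_2 n.$$

At the other extreme, if $G$ has no edges, the entire vertex set $V$ is independent, and taking $Y = V$ with probability one yields $I(X; Y) = 0$.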

Graph entropy is monotone: if $G_1$ is a subgraph of $G_2$ on the same vertex set, then $H(G_1) \leq H(G_2)$. It is also subadditive: if $G_1 = (V, E_1)$ and $G_2 = (V, E_2)$ share a vertex set, then their union $G_1 \cup G_2 = (V, E_1 \cup E_2)$ satisfies $H(G_1 \cup G_2) \leq H(G_1) + H(G_2)$. Additionally, simple formulas exist for certain families of graphs: edgeless graphs have entropy $0$, complete graphs on $n$ vertices have entropy $\log_2 n$, and a complete bipartite graph with $n$ vertices on one side and $m$ on the other has entropy $H_b\!\left(\frac{n}{n+m}\right)$, where $H_b$ is the binary entropy function; in particular, complete balanced bipartite graphs have entropy $1$.
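
For instance, the complete bipartite graph $K_{1,3}$ has entropy

$$H_b\!\left(\tfrac{1}{4}\right) = \tfrac{1}{4}\log_2 4 + \tfrac{3}{4}\log_2 \tfrac{4}{3} \approx 0.811,$$

below the value of $1$ attained by balanced bipartitions.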

Here, we use properties of graph entropy to provide a simple proof that a complete graph $G$ on $n$ vertices cannot be expressed as the union of fewer than $\log_2 n$ bipartite graphs.

Proof. By monotonicity, no bipartite graph can have graph entropy greater than that of a complete bipartite graph, which is bounded by $1$. Thus, by sub-additivity, the union of $k$ bipartite graphs cannot have entropy greater than $k$. By the properties listed above, $H(G) = \log_2 n$. Therefore, the union of fewer than $\log_2 n$ bipartite graphs cannot have the same entropy as $G$, so $G$ cannot be expressed as such a union.
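
To make the quantities in this argument concrete, the definition can be minimized numerically for small graphs. The sketch below is illustrative only, not part of the original text: it assumes NumPy and SciPy are available, the names independent_sets and graph_entropy are ad hoc, and the softmax parameterization is just one convenient way to keep the conditional distributions of $Y$ given $X$ valid.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

def independent_sets(n, edges):
    """All nonempty independent vertex sets of a graph on {0, ..., n-1}."""
    e = {frozenset(p) for p in edges}
    return [set(s)
            for r in range(1, n + 1)
            for s in itertools.combinations(range(n), r)
            if all(frozenset(p) not in e for p in itertools.combinations(s, 2))]

def graph_entropy(n, edges):
    """Numerically minimize I(X;Y) with X uniform on the vertices and Y an
    independent set containing X, following the definition above."""
    ind = independent_sets(n, edges)
    feas = [[j for j, s in enumerate(ind) if x in s] for x in range(n)]
    sizes = [len(f) for f in feas]

    def mutual_info(theta):
        q = np.zeros((n, len(ind)))        # q[x, j] = P(Y = ind[j] | X = x)
        pos = 0
        for x in range(n):
            w = np.exp(theta[pos:pos + sizes[x]])
            q[x, feas[x]] = w / w.sum()    # softmax keeps each row stochastic
            pos += sizes[x]
        p_xy = q / n                       # joint distribution (X uniform)
        p_y = p_xy.sum(axis=0)             # marginal distribution of Y
        denom = np.outer(np.full(n, 1.0 / n), p_y)   # product p(x) p(y)
        mask = p_xy > 0
        return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / denom[mask])))

    res = minimize(mutual_info, np.zeros(sum(sizes)), method="Nelder-Mead",
                   options={"maxiter": 20000, "fatol": 1e-10})
    return res.fun

# Complete graph K_4: only singletons are independent, so H = log2(4) = 2.
k4 = [(i, j) for i in range(4) for j in range(i + 1, 4)]
print(graph_entropy(4, k4))    # 2.0

# Complete bipartite K_{2,2}: entropy 1, consistent with the bound above.
k22 = [(0, 2), (0, 3), (1, 2), (1, 3)]
print(graph_entropy(4, k22))   # approaches 1.0 (optimum sits at the boundary)
```

Since the optimal joint distribution for $K_{2,2}$ is deterministic, the softmax parameterization only approaches it in the limit; the printed value converges toward $1$ rather than hitting it exactly.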