Shannon capacity of a graph

The Shannon capacity of a graph is a graph invariant that measures the Shannon capacity of a communications channel defined from the graph, and is upper bounded by the Lovász number, which can be computed in polynomial time.

However, the computational complexity of the Shannon capacity itself remains unknown.

The Shannon capacity models the amount of information that can be transmitted across a noisy communication channel in which certain signal values can be confused with each other.

For instance, suppose that a communications channel has five discrete signal values, any one of which can be transmitted in a single time step.

These values may be modeled mathematically as the five numbers 1, 2, 3, 4, or 5 in modular arithmetic modulo 5.

However, suppose that when the value i is sent across the channel, the value that is received is i + ξ (mod 5), where ξ represents the noise on the channel and may be any real number in the open interval from −1 to 1. Then the recipient may be unable to determine whether the value sent was i or one of its two neighbors i − 1 or i + 1 (mod 5), so the pairs of values that can be confused with each other form the edges of a five-vertex cycle graph, C5. A simple coding scheme that avoids all confusion restricts the sender to two values that can never be confused with each other, such as 1 and 3.

Even though the channel has five values that can be sent per time step, effectively only two of them can be used with this coding scheme.
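As a small illustration, the confusable pairs can be enumerated directly; the short Python sketch below lists them (they form the edges of the five-vertex cycle C5) and confirms that the values 1 and 3 can never be confused.

```python
# Two of the five values (taken modulo 5, so that 5 stands for 0) are
# confusable exactly when they differ by at most 1 modulo 5.
def confusable(i, j):
    return (i - j) % 5 in (0, 1, 4)

# Distinct confusable pairs: the edges of the five-vertex cycle C5.
edges = sorted((i, j) for i in range(1, 6) for j in range(i + 1, 6)
               if confusable(i, j))
print(edges)                 # [(1, 2), (1, 5), (2, 3), (3, 4), (4, 5)]

# A sender restricted to the two non-adjacent values 1 and 3 is never misread.
print(confusable(1, 3))      # False
```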

However, more complicated coding schemes allow a greater amount of information to be sent across the same channel by using code words of length greater than one. For instance, suppose that in two consecutive steps the sender transmits one of the five code words "11", "23", "35", "54", or "42".

(Here, the quotation marks indicate that these words should be interpreted as strings of symbols, not as decimal numbers.)

Each pair of these code words includes at least one position where its values differ by two or more modulo 5, so a recipient can always determine unambiguously which code word was sent. Five messages can thus be communicated in two time steps, and the effective number of values that can be transmitted per unit time step is √5, the square root of the number of messages.

In graph-theoretic terms, this means that the Shannon capacity of the 5-cycle is at least √5.

As Lovász (1979) showed, this bound is tight: it is not possible to find a more complicated system of code words that allows even more different messages to be sent in the same amount of time, so the Shannon capacity of the 5-cycle is exactly √5.
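That no two of these code words can be confused is easy to check mechanically: two length-two strings are confusable only if, in both positions, their symbols are equal or differ by one modulo 5. A brief Python sketch of this check (with the symbol 5 read as 0 modulo 5):

```python
words = ["11", "23", "35", "54", "42"]

def symbols_confusable(a, b):
    # Equal or differing by 1 modulo 5 (so "5" and "1" are adjacent, since 5 = 0 mod 5).
    return (int(a) - int(b)) % 5 in (0, 1, 4)

def words_confusable(u, v):
    # Two strings can be confused only if every position is confusable.
    return all(symbols_confusable(a, b) for a, b in zip(u, v))

# No pair of distinct code words is confusable, so all five are distinguishable.
print(any(words_confusable(u, v)
          for i, u in enumerate(words) for v in words[i + 1:]))   # False

# Five messages in two time steps: an effective rate of 5 ** 0.5 per step.
print(len(words) ** 0.5)                                          # 2.236...
```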

If a graph G represents a set of symbols and the pairs of symbols that can be confused with each other, then a subset S of the symbols avoids all confusable pairs if and only if S is an independent set in the graph, a subset of vertices that does not include both endpoints of any edge.

The maximum possible size of a subset of the symbols that can all be distinguished from each other is the independence number α(G) of the graph, the size of its maximum independent set.
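For the five-value channel, a brute-force enumeration over all subsets confirms that the independence number of the 5-cycle is 2, as in the following sketch:

```python
from itertools import combinations

symbols = range(1, 6)                                    # the values 1..5, modulo 5
edges = {frozenset((i, i % 5 + 1)) for i in symbols}     # the 5 confusable pairs of C5

def independent(subset):
    return not any(frozenset(p) in edges for p in combinations(subset, 2))

alpha = max(len(s) for r in range(len(symbols) + 1)
            for s in combinations(symbols, r) if independent(s))
print(alpha)   # 2: for example {1, 3}; no three symbols are pairwise distinguishable
```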

For codewords of longer lengths, one can use independent sets in larger graphs to describe the sets of codewords that can be transmitted without confusion.

For the five-symbol channel whose confusion graph is the 5-cycle, there are 25 strings of length two, which may be represented as the vertices of a graph with 25 vertices. In this graph, each vertex has eight neighbors, the eight strings that it can be confused with.

A subset of length-two strings forms a code with no possible confusion if and only if it corresponds to an independent set of this graph.
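This graph is small enough to construct explicitly. The sketch below builds it, confirms that every vertex has eight neighbors, and checks that the five code words above form an independent set:

```python
from itertools import product

def conf(a, b):                                  # equal or adjacent modulo 5
    return (a - b) % 5 in (0, 1, 4)

strings = list(product(range(1, 6), repeat=2))   # the 25 length-two strings
neighbours = {v: {w for w in strings if w != v
                  and all(conf(a, b) for a, b in zip(v, w))}
              for v in strings}

print({len(nbrs) for nbrs in neighbours.values()})          # {8}

code = [(1, 1), (2, 3), (3, 5), (5, 4), (4, 2)]             # "11", "23", "35", "54", "42"
print(any(w in neighbours[v] for v in code for w in code))  # False: an independent set
```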

If G is a graph representing the signals and confusable pairs of a channel, then the graph representing the length-two codewords and their confusable pairs is the strong product G ⊠ G of G with itself: two distinct codewords are adjacent in this product exactly when, in each position, their symbols are equal or adjacent in G. More generally, the k-fold strong product G^k describes the codewords of length k and the pairs of such codewords that can be confused with each other.

The effective number of signals transmitted per unit time step by a length-k code is the kth root of the number of codewords it contains, so the best rate achievable with codewords of length k is α(G^k)^(1/k). The Shannon capacity Θ(G) is defined as the supremum of these values over all k, which equals their limit as k tends to infinity.
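As an illustration of this definition, the sketch below computes the exact independence numbers of the first two strong powers of the 5-cycle and the corresponding per-step rates; a naive branch-and-bound search suffices at this size:

```python
from itertools import product

def conf(a, b):                                   # equal or adjacent modulo 5
    return (a - b) % 5 in (0, 1, 4)

def strong_power(k):
    """Adjacency sets of the k-fold strong product of the 5-cycle."""
    verts = list(product(range(1, 6), repeat=k))
    return {v: frozenset(w for w in verts if w != v
                         and all(conf(a, b) for a, b in zip(v, w)))
            for v in verts}

def alpha(adj, remaining=None):
    """Maximum independent set size, branching on one vertex at a time."""
    remaining = set(adj) if remaining is None else remaining
    if not remaining:
        return 0
    v = next(iter(remaining))
    return max(alpha(adj, remaining - {v}),                # leave v out
               1 + alpha(adj, remaining - {v} - adj[v]))   # put v in, drop its neighbours

for k in (1, 2):
    a = alpha(strong_power(k))
    print(k, a, a ** (1 / k))   # k=1: 2, rate 2.0   k=2: 5, rate 2.236...
```

The jump from a rate of 2 at k = 1 to √5 at k = 2 already shows that any finite number of terms of this sequence gives only a lower bound on its limit.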

The computational complexity of the Shannon capacity is unknown, and even its value for certain small graphs such as C7 (a cycle graph of seven vertices) remains unknown.[2][3]

A natural approach to this problem would be to compute a finite number of powers of the given graph G, find their independence numbers, and infer from these numbers some information about the limiting behavior of the sequence used to define the Shannon capacity.

However, even ignoring the computational difficulty of computing the independence numbers of these graphs (an NP-hard problem), the unpredictable behavior of the sequence of independence numbers of powers of G

implies that this approach cannot be used to accurately approximate the Shannon capacity.[4]

In part because the Shannon capacity is difficult to compute, researchers have looked for other graph invariants that are easy to compute and that provide bounds on the Shannon capacity.

The Lovász number ϑ(G) is a different graph invariant that can be computed numerically to high accuracy in polynomial time by an algorithm based on the ellipsoid method.

The Shannon capacity of a graph G is bounded from below by α(G) and from above by ϑ(G): α(G) ≤ Θ(G) ≤ ϑ(G).[5]

In some cases, ϑ(G) and the Shannon capacity coincide; for instance, for the graph of a pentagon, both are equal to √5.
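In practice, ϑ(G) can be approximated with any semidefinite programming solver. The sketch below uses one standard formulation of the Lovász number (maximize the total sum of the entries of a positive semidefinite matrix with unit trace that vanishes on every edge of the graph), expressed with the cvxpy modelling library and an installed SDP solver as assumed external tools; for the pentagon it returns a value close to √5.

```python
import cvxpy as cp

n = 5
edges = [(i, (i + 1) % n) for i in range(n)]       # the 5-cycle

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0, cp.trace(X) == 1]           # positive semidefinite, unit trace
constraints += [X[i, j] == 0 for i, j in edges]    # zero on every edge of the graph

problem = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
problem.solve()                                    # requires an SDP-capable solver (e.g. SCS)
print(problem.value)                               # approximately 2.236 = sqrt(5)
```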

However, there exist other graphs for which the Shannon capacity and the Lovász number differ.[6]

Haemers provided another upper bound on the Shannon capacity, which is sometimes better than the Lovász bound:[7] Θ(G) ≤ min_B rank(B), where the minimum ranges over n × n matrices B over some field such that bii ≠ 0 and bij = 0 whenever vertices i and j are distinct and not adjacent.
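Any single admissible matrix B already yields an upper bound. For the 5-cycle, one possible choice is B = I + A over the two-element field GF(2), where A is the adjacency matrix; its rank is 5, a valid but weak bound compared with ϑ(C5) = √5, illustrating that the strength of Haemers' bound depends on choosing a good field and matrix. A short sketch of this computation:

```python
import numpy as np

def gf2_rank(M):
    """Rank of a 0/1 matrix over the two-element field, by Gaussian elimination."""
    M = M.copy() % 2
    rank = 0
    for col in range(M.shape[1]):
        pivot = next((r for r in range(rank, M.shape[0]) if M[r, col]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]     # move the pivot row into place
        for r in range(M.shape[0]):
            if r != rank and M[r, col]:
                M[r] ^= M[rank]                 # clear the rest of the column
        rank += 1
    return rank

n = 5
A = np.zeros((n, n), dtype=int)
for i in range(n):                              # adjacency matrix of the 5-cycle
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

B = (np.eye(n, dtype=int) + A) % 2              # b_ii = 1; b_ij = 0 on non-edges
print(gf2_rank(B))                              # 5: an admissible, but weak, upper bound
```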

Figure: A five-vertex cycle, modeling a set of five values that can be transmitted across a noisy communications channel and the pairs of values that can be confused with each other.