Shannon (unit)

One shannon is defined as the information content of an event whose probability of occurring is 1/2. The unit is understood as such within the realm of information theory, and is conceptually distinct from the bit, a term used in data processing and storage to denote a single instance of a binary signal.

Using the unit shannon is an explicit reference to a quantity of information content, information entropy, or channel capacity, and is not restricted to binary data,[2] whereas 'bit' can also refer to the number of binary symbols involved, as the term is used in fields such as data processing.
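As a minimal illustrative sketch (the function name and example probabilities are assumptions made for illustration), the information content of a single event of probability p is -log2(p) shannons, so an event of probability 1/2 carries exactly one shannon:

```python
import math

def information_content_shannons(p):
    """Self-information of an event with probability p, in shannons (Sh).

    One shannon is the information gained on observing an event of
    probability 1/2, since -log2(1/2) = 1.
    """
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

print(information_content_shannons(0.5))   # 1.0 Sh
print(information_content_shannons(0.25))  # 2.0 Sh
print(information_content_shannons(0.1))   # about 3.32 Sh
```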

Although more limited in scope, Ralph Hartley's early work, preceding that of Shannon, has led to his recognition as a pioneer of information theory as well.

In information theory and derivative fields such as coding theory, one cannot quantify the 'information' in a single message (sequence of symbols) out of context; instead, reference is made to a model of the channel (such as its bit error rate) or to the underlying statistics of an information source.
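To sketch the second case, the following Python snippet computes the entropy of a discrete information source from its symbol probabilities, expressed in shannons per symbol; the four-symbol source and its probabilities are hypothetical values chosen for illustration, not data from any particular source.

```python
import math

def entropy_shannons(probabilities):
    """Entropy of a discrete source, in shannons (Sh) per symbol.

    H = -sum(p * log2(p)) over symbols with nonzero probability.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A hypothetical four-symbol source with unequal probabilities:
# its entropy is below the 2 Sh/symbol of a uniform four-symbol source.
print(entropy_shannons([0.5, 0.25, 0.125, 0.125]))  # 1.75 Sh per symbol
print(entropy_shannons([0.25, 0.25, 0.25, 0.25]))   # 2.0 Sh per symbol
```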

Although there are infinitely many possibilities for a real number chosen between 0 and 1, so-called differential entropy can be used to quantify the information content of an analog signal, for instance in connection with the improvement of a signal-to-noise ratio or the confidence of a hypothesis test.
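As a hedged sketch of how differential entropy can be stated in shannons (i.e. using base-2 logarithms), the snippet below evaluates the standard closed form for a Gaussian distribution, h = (1/2) log2(2*pi*e*sigma^2); the choice of a Gaussian example and the function name are assumptions made for illustration.

```python
import math

def gaussian_differential_entropy_shannons(sigma):
    """Differential entropy of a Gaussian with standard deviation sigma,
    in shannons: h = (1/2) * log2(2 * pi * e * sigma**2)."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

# A standard Gaussian (sigma = 1) has about 2.05 Sh of differential entropy;
# halving sigma removes exactly 1 Sh, since log2(2) = 1.
print(gaussian_differential_entropy_shannons(1.0))  # ~2.047 Sh
print(gaussian_differential_entropy_shannons(0.5))  # ~1.047 Sh
```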