History of information theory

Some of the oldest methods of telecommunications implicitly use many of the ideas that would later be quantified in information theory. The decisive event, however, was the publication of Claude Shannon's 1948 paper "A Mathematical Theory of Communication". In this revolutionary and groundbreaking paper, work that Shannon had substantially completed at Bell Labs by the end of 1944, he introduced for the first time the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point." With it came the ideas of the information entropy and redundancy of a source, the channel capacity of a noisy channel, and the bit as the fundamental unit of information.

Boltzmann argued mathematically that the effect of collisions between the particles of a gas would cause his H-function to inevitably decrease from any initial configuration until equilibrium was reached, and he further identified it as an underlying microscopic rationale for the macroscopic thermodynamic entropy of Clausius.
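Stated in modern notation (an assumption of this sketch, not Boltzmann's original 1872 formulation), the quantity and the monotonicity claim of his H-theorem read

    H(t) = \int f(\vec{v}, t) \, \ln f(\vec{v}, t) \, d^3v, \qquad \frac{dH}{dt} \le 0,

where f(v, t) is the one-particle velocity distribution of the gas; up to sign and a constant factor, -k_B H plays the role of the thermodynamic entropy.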

Boltzmann's definition was soon reworked by the American mathematical physicist J. Willard Gibbs into a general formula for statistical-mechanical entropy, no longer requiring identical and non-interacting particles, but instead based on the probability distribution p_i for the complete microstate i of the total system:

    S = -k_B \sum_i p_i \ln p_i .

This (Gibbs) entropy, from statistical mechanics, can be found to correspond directly to Clausius's classical thermodynamic definition.
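The same weighted sum of -p ln p terms reappears, with a different choice of constant, as Shannon's measure of information. A minimal Python sketch (the function name and the example distribution below are illustrative assumptions, not from any historical source) shows the correspondence numerically:

    import math

    def gibbs_entropy(probs, k=1.0):
        """Entropy -k * sum(p_i * ln p_i) of a discrete distribution.

        With k set to Boltzmann's constant this is the Gibbs entropy of a
        system whose microstate i occurs with probability p_i; with
        k = 1/ln(2) the same sum gives Shannon's entropy in bits.
        """
        return -k * sum(p * math.log(p) for p in probs if p > 0)

    # A uniform distribution over 4 microstates: entropy is maximal,
    # ln(4) in natural units, or exactly 2 bits in Shannon's units.
    uniform = [0.25, 0.25, 0.25, 0.25]
    print(gibbs_entropy(uniform))                   # ln(4), about 1.386
    print(gibbs_entropy(uniform, k=1/math.log(2)))  # 2.0 bits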

Shannon himself was apparently not particularly aware of the close similarity between his new measure and earlier work in thermodynamics, but John von Neumann was.

Many developments and applications of the theory have taken place since then, making possible modern devices for data communication and storage such as CD-ROMs and mobile phones.