The defining expression for entropy in the theory of statistical mechanics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s is of the form

S = −k_B Σ_i p_i ln p_i,

where p_i is the probability of the microstate i taken from an equilibrium ensemble and k_B is the Boltzmann constant. The defining expression for entropy in the theory of information established by Claude E. Shannon in 1948 is of the form

H = −Σ_i p_i log_b p_i,

where p_i is the probability of the message m_i taken from the message space M, and b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10, and the unit of entropy is shannon (or bit) for b = 2, nat for b = e, and hartley for b = 10.
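As a concrete illustration of how the two expressions differ only by the constant k_B and the choice of logarithm base, here is a minimal sketch (the helper names and the example distribution are illustrative assumptions, not taken from the article):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def information_entropy(probs, base=2.0):
    """H = -sum p_i log_b p_i, in units set by the logarithm base."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def gibbs_entropy(probs):
    """S = -k_B sum p_i ln p_i, in J/K."""
    return K_B * information_entropy(probs, base=math.e)

p = [0.5, 0.25, 0.125, 0.125]            # example distribution over four states

print(information_entropy(p, base=2))        # 1.75 shannons (bits)
print(information_entropy(p, base=math.e))   # ~1.213 nats
print(information_entropy(p, base=10))       # ~0.527 hartleys
print(gibbs_entropy(p))                      # ~1.67e-23 J/K
```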
It can be shown that the Gibbs entropy formula, with the natural logarithm, reproduces all of the properties of the macroscopic classical thermodynamics of Rudolf Clausius.
For a simple compressible system that can only perform volume work, the first law of thermodynamics becomes

dU = −p dV + T dS.

But one can equally well write this equation in terms of what physicists and chemists sometimes call the 'reduced' or dimensionless entropy, σ = S/k_B, so that

dU = −p dV + k_B T dσ.

Just as S is conjugate to T, so σ is conjugate to k_B T (the energy that is characteristic of T on a molecular scale).
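As a rough numerical illustration (a sketch; the entropy value below is an assumed example, approximately the standard molar entropy of graphite, and is not taken from the article), converting a conventional entropy in J/K into the reduced entropy simply means dividing by k_B, and dividing further by ln 2 gives the equivalent number of bits:

```python
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K

S = 5.74                    # assumed example entropy, J/(K*mol)
sigma_nats = S / K_B        # reduced (dimensionless) entropy, in nats
sigma_bits = sigma_nats / math.log(2)   # same quantity expressed in bits

print(f"sigma ~ {sigma_nats:.2e} nats ~ {sigma_bits:.2e} bits (per mole)")
```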
To be more concrete, in the discrete case using base-two logarithms, the reduced Gibbs entropy is equal to the average of the minimum number of yes–no questions that must be answered in order to fully specify the microstate, given that we know the macrostate.
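For instance (a toy example of my own, not from the article), a macrostate compatible with eight equally likely microstates has entropy log2(8) = 3 shannons, which is exactly the three yes–no questions of a binary search; for a non-uniform (here dyadic) distribution, an optimal question tree attains the same average:

```python
import math

def entropy_bits(probs):
    """H = -sum p_i log2 p_i, in shannons (bits)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Eight equally likely microstates: 3 bits = 3 yes/no questions (binary search).
print(entropy_bits([1/8] * 8))          # 3.0

# Dyadic distribution: ask about the most likely microstate first, and the
# average number of questions equals the entropy.
p = [0.5, 0.25, 0.125, 0.125]
depths = [1, 2, 3, 3]                   # questions needed to pin down each microstate
print(entropy_bits(p))                  # 1.75
print(sum(pi * d for pi, d in zip(p, depths)))  # 1.75
```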
Furthermore, the prescription to find the equilibrium distributions of statistical mechanics, such as the Boltzmann distribution, by maximising the Gibbs entropy subject to appropriate constraints (the Gibbs algorithm) can be seen as something not unique to thermodynamics, but as a principle of general relevance in statistical inference, if it is desired to find a maximally uninformative probability distribution, subject to certain constraints on its averages.[4]
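As a minimal numerical sketch of this maximum-entropy prescription (the energy levels and the inverse temperature below are assumptions for illustration, not values from the article), maximising −Σ p_i ln p_i subject to normalisation and a fixed mean energy recovers the Boltzmann form p_i ∝ exp(−βE_i) up to solver tolerance:

```python
import numpy as np
from scipy.optimize import minimize

E = np.array([0.0, 1.0, 2.0, 3.0])   # assumed toy energy levels (arbitrary units)
beta = 1.0                           # assumed inverse temperature

# Closed-form Boltzmann distribution: p_i = exp(-beta*E_i)/Z.
p_boltzmann = np.exp(-beta * E)
p_boltzmann /= p_boltzmann.sum()
mean_energy = float(p_boltzmann @ E)

def neg_entropy(p):
    """Negative Gibbs/Shannon entropy, sum p ln p (to be minimised)."""
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},         # normalisation
    {"type": "eq", "fun": lambda p: p @ E - mean_energy},   # fixed average energy
]
result = minimize(neg_entropy, x0=np.full(len(E), 0.25), method="SLSQP",
                  bounds=[(0.0, 1.0)] * len(E), constraints=constraints)

print(np.round(result.x, 4))      # numerically maximised distribution
print(np.round(p_boltzmann, 4))   # analytic exp(-beta*E_i)/Z for comparison
```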
Temperature is a measure of the average kinetic energy per particle in an ideal gas (T in kelvins equals 2/3 of the kinetic energy in joules divided by k_B, since ⟨E_k⟩ = (3/2) k_B T), so the J/K units of k_B are fundamentally dimensionless (joule/joule). If kinetic energy measurements per particle of an ideal gas were expressed in joules instead of kelvins, k_B in the above equations would be replaced by a pure number (the factor 3/2 that relates k_B T to the kinetic energy per particle).
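To make the conversion explicit (a sketch with an assumed temperature, not a value from the article):

```python
K_B = 1.380649e-23       # Boltzmann constant, J/K

T = 300.0                             # assumed temperature, kelvins
mean_kinetic_energy = 1.5 * K_B * T   # average kinetic energy per ideal-gas particle, joules

print(mean_kinetic_energy)                  # ~6.21e-21 J
print((2 / 3) * mean_kinetic_energy / K_B)  # back to 300 K: k_B just converts units
```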
This thought experiment has been physically demonstrated using a phase-contrast microscope equipped with a high-speed camera connected to a computer, with this apparatus acting as the demon.
This can only be achieved under information-preserving, microscopically deterministic dynamics if the uncertainty is somehow dumped somewhere else – i.e. if the entropy of the environment (or the non-information-bearing degrees of freedom) is increased by at least an equivalent amount, as required by the Second Law, by gaining an appropriate quantity of heat: specifically kT ln(2) of heat for every bit of randomness erased.
It is only logically irreversible operations – for example, the erasing of a bit to a known state, or the merging of two computation paths – which must be accompanied by a corresponding entropy increase.
In 1953, Brillouin derived a general equation[10] stating that changing the value of an information bit requires at least kT ln(2) of energy. This is the same energy as the work Leo Szilard's engine produces in the idealised case, and it in turn equals the quantity found by Landauer.
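For a sense of scale (a back-of-the-envelope sketch; the 300 K room temperature is an assumed value, not from the article), kT ln(2) works out to roughly 3 × 10⁻²¹ joules per bit:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

landauer_limit = K_B * T * math.log(2)   # minimum heat dissipated per bit erased
print(landauer_limit)                    # ~2.87e-21 J per bit
```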
In his book,[11] he further explored this problem, concluding that any cause of a bit value change (measurement, decision about a yes/no question, erasure, display, etc.) will require the same amount of energy.
Setting up a bit of information in a sub-system originally in thermal equilibrium results in a local entropy reduction.
In this way, Brillouin clarified the meaning of negentropy, which had been considered controversial because its earlier interpretation could yield a Carnot efficiency higher than one.
Additionally, the relationship between energy and information formulated by Brillouin has been proposed as a connection between the number of bits that the brain processes and the energy it consumes: Collell and Fauquet[12] argued that De Castro[13] analytically found the Landauer limit as the thermodynamic lower bound for brain computations.
However, even though evolution is supposed to have "selected" the most energetically efficient processes, the physical lower bounds are not realistic quantities in the brain.
Firstly, because the minimum processing unit considered in physics is the atom or molecule, which is far removed from the way the brain actually operates; and secondly, because neural networks incorporate important redundancy and noise factors that greatly reduce their efficiency.[14] Laughlin et al.[15] were the first to provide explicit quantities for the energetic cost of processing sensory information. Their findings in blowflies revealed that for visual sensory data, the cost of transmitting one bit of information is around 5 × 10⁻¹⁴ joules, or equivalently 10⁴ ATP molecules.
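As a rough comparison (my own arithmetic against the assumed 300 K Landauer limit from the sketch above, not a figure from the cited studies), this measured cost sits about seven orders of magnitude above the thermodynamic lower bound:

```python
import math

K_B = 1.380649e-23
T = 300.0                          # assumed temperature, K

landauer = K_B * T * math.log(2)   # ~2.87e-21 J per bit
blowfly_cost = 5e-14               # J per bit, figure quoted in the text

print(blowfly_cost / landauer)     # ~1.7e7, i.e. tens of millions of times the limit
```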
In 2009, Mahulikar & Herwig redefined thermodynamic negentropy as the specific entropy deficit of the dynamically ordered sub-system relative to its surroundings.[16] This definition enabled the formulation of the Negentropy Principle, which is mathematically shown to follow from the 2nd Law of Thermodynamics during the existence of order.
The Hirschman uncertainty relation shows that Heisenberg's uncertainty principle can be expressed as a particular lower bound on the sum of the classical distribution entropies of the probability distributions of a quantum mechanical state in coordinate space and in momentum space (the squared magnitude of the wave-function in each representation), when expressed in Planck units.
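A small numerical sketch of this entropic bound (my illustration, assuming natural units with ħ = 1): a Gaussian wave-packet saturates the sharpened form of the inequality, H_x + H_p ≥ ln(eπ) ≈ 2.14 nats, independently of its width:

```python
import math

# Gaussian wave-packet in units with hbar = 1: if |psi(x)|^2 has standard
# deviation sigma_x, its momentum-space density has sigma_p = 1/(2*sigma_x).
sigma_x = 0.7                      # arbitrary assumed width
sigma_p = 1.0 / (2.0 * sigma_x)

def gaussian_entropy(sigma):
    """Differential entropy of a Gaussian density, in nats: 0.5*ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

print(gaussian_entropy(sigma_x) + gaussian_entropy(sigma_p))  # ~2.1447 nats
print(math.log(math.e * math.pi))                             # the bound ln(e*pi)
```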