In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H (defined below) to decrease in a nearly-ideal gas of molecules.
It is thought to prove the second law of thermodynamics,[2][3][4] albeit under the assumption of low-entropy initial conditions.
The H-theorem has led to considerable discussion about its actual implications.[6] In his original publication, Boltzmann wrote the symbol E (as in entropy) for its statistical function.[1] Years later, Samuel Hawksley Burbury, one of the critics of the theorem,[7] wrote the function with the symbol H,[8] a notation that was subsequently adopted by Boltzmann when referring to his "H-theorem".[10] There has been discussion of how the symbol should be understood, but its intended meaning remains unclear owing to the lack of written sources from the time of the theorem.
H itself is defined as the integral H(t) = ∫ f(v, t) ln f(v, t) d³v over molecular velocities, where f(v, t) is the distribution of molecular velocities at time t. For an isolated ideal gas (with fixed total energy and fixed total number of particles), the function H is at a minimum when the particles have a Maxwell–Boltzmann distribution; if the molecules of the ideal gas are distributed in some other way (say, all having the same kinetic energy), then the value of H will be higher.
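A minimal numerical sketch of this statement, using a one-dimensional velocity distribution for simplicity and H = ∫ f ln f dv as above (the grid, variance, and the choice of a uniform comparison distribution are arbitrary illustrative assumptions, not part of Boltzmann's treatment):

```python
import numpy as np

# Compare H = integral of f ln f dv for a 1D Maxwell-Boltzmann (Gaussian)
# velocity distribution and a uniform distribution with the same mean
# kinetic energy (same variance). The Maxwellian gives the lower H.
v = np.linspace(-10, 10, 20001)
dv = v[1] - v[0]
sigma = 1.0                                   # sets the mean kinetic energy

f_mb = np.exp(-v**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

a = sigma * np.sqrt(3.0)                      # uniform on [-a, a] has the same variance
f_uni = np.where(np.abs(v) <= a, 1.0 / (2 * a), 0.0)

def H(f):
    # 0 * ln(0) is taken as 0 where f vanishes
    integrand = np.where(f > 0, f * np.log(f), 0.0)
    return np.sum(integrand) * dv             # simple Riemann sum for the integral

print("H (Maxwell-Boltzmann):   ", round(H(f_mb), 4))   # about -1.419
print("H (uniform, same energy):", round(H(f_uni), 4))  # about -1.242, i.e. higher
```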
From Boltzmann's kinetic equation (the Boltzmann equation), a natural outcome is that the continual process of collisions causes the quantity H to decrease until it has reached a minimum.
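A toy dynamical illustration of this decrease follows. It uses a relaxation-time ("BGK"-style) model as a stand-in for Boltzmann's full collision integral, with an arbitrary flat initial distribution, relaxation time, and time step; it is a sketch of the monotone decrease of H, not a solution of the Boltzmann equation itself:

```python
import numpy as np

# Relax a non-equilibrium 1D velocity distribution toward the Maxwellian
# with the same density and energy; H decreases monotonically along the way.
v = np.linspace(-10, 10, 4001)
dv = v[1] - v[0]

a = np.sqrt(3.0)
f = np.where(np.abs(v) <= a, 1.0 / (2 * a), 0.0)      # flat, non-equilibrium start
sigma2 = np.sum(v**2 * f) * dv                         # mean kinetic energy, conserved here
f_eq = np.exp(-v**2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

def H(f):
    integrand = np.where(f > 0, f * np.log(f), 0.0)
    return np.sum(integrand) * dv

tau, dt = 1.0, 0.1                                     # relaxation time and time step
for step in range(31):
    if step % 5 == 0:
        print(f"t = {step * dt:3.1f}   H = {H(f):+.4f}")
    f = f + dt * (f_eq - f) / tau                      # df/dt = (f_eq - f) / tau
# H falls monotonically from about -1.242 toward the Maxwellian value of about -1.419.
```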
In many cases the molecular chaos assumption is highly accurate, and the ability to discard complex correlations between particles makes calculations much simpler.
As Boltzmann would eventually go on to admit, the arrow of time in the H-theorem is not in fact purely mechanical, but really a consequence of assumptions about initial conditions.
The explanation is that Boltzmann's equation is based on the assumption of "molecular chaos", i.e., that it follows from, or at least is consistent with, the underlying kinetic model that the particles be considered independent and uncorrelated.
It turns out that this assumption breaks time reversal symmetry in a subtle sense, and therefore begs the question.
As a demonstration of Loschmidt's paradox, a modern counterexample (not to Boltzmann's original gas-related H-theorem, but to a closely related analogue) is the phenomenon of spin echo.
In the experiment, the spin system is initially perturbed into a non-equilibrium state (high H), and, as predicted by the H-theorem, the quantity H soon decreases to the equilibrium value.
At some point, a carefully constructed electromagnetic pulse is applied that reverses the motions of all the spins.
In some sense, the time-reversed states noted by Loschmidt turned out not to be completely impractical.
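A minimal numerical sketch of the refocusing effect, under strong idealizations (each spin precesses at a fixed random rate representing static field inhomogeneity, and the pulse at time T is taken to negate every accumulated phase exactly; the frequency spread, sample size, and helper name transverse_magnetization are choices made for this illustration):

```python
import numpy as np

# Idealized spin echo: spins dephase under random static precession rates,
# then a perfect "pi" pulse at time T negates every phase, so they refocus at 2T.
rng = np.random.default_rng(0)
omega = rng.normal(0.0, 1.0, size=10_000)   # random precession rates
T = 5.0                                      # time of the refocusing pulse

def transverse_magnetization(t):
    # Phase before the pulse: omega * t.  The pulse negates the phase
    # accumulated up to T, so afterwards phi = omega * (t - 2T).
    phi = np.where(t < T, omega * t, omega * (t - 2 * T))
    return abs(np.mean(np.exp(1j * phi)))

for t in [0.0, 2.0, 5.0, 8.0, 10.0]:
    print(f"t = {t:4.1f}   |M| = {transverse_magnetization(t):.3f}")
# |M| decays from 1 toward ~0 (the apparent relaxation), then returns to ~1
# at t = 2T: a realization of the reversed evolution for this system.
```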
The second law of thermodynamics states that the entropy of an isolated system always increases to a maximum equilibrium value. In a finite system, however, the entropy (and the quantity H) can fluctuate spontaneously about the equilibrium value; these fluctuations are only perceptible when the system is small and the time interval over which it is observed is not enormously large.
An additional way to calculate the quantity H is H = Σ P ln P, summed over the accessible microstates, where P is the probability of finding a given microstate in a system chosen at random from the specified microcanonical ensemble.
For a system of N statistically independent particles, H is related to the thermodynamic entropy S through S = −N k H, where k is the Boltzmann constant.[23] So, according to the H-theorem, S can only increase.
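A small numerical check of why the factor N appears: for statistically independent particles the joint probabilities factorize, so H computed from the joint distribution is just N times the single-particle H. The discrete single-particle probabilities below are an arbitrary example chosen for the sketch:

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Single-particle probabilities over a few discrete states (arbitrary example)
p = np.array([0.5, 0.3, 0.2])
H_single = np.sum(p * np.log(p))              # H = sum P ln P  (a negative number)

# Joint distribution of N = 3 statistically independent particles:
# probabilities multiply, P(i,j,k) = p_i * p_j * p_k.
N = 3
P_joint = np.einsum('i,j,k->ijk', p, p, p)
H_joint = np.sum(P_joint * np.log(P_joint))

print(H_joint, N * H_single)                  # equal: H is additive over independent particles
print("S =", -N * k_B * H_single, "J/K")      # S = -N k H, equivalently -k * H_joint
```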
In quantum statistical mechanics (which is the quantum version of classical statistical mechanics), the H-function is the function[24] H = Σ_i p_i ln p_i, where the summation runs over all possible distinct states of the system, and p_i is the probability that the system could be found in the i-th state.
For an isolated system the jumps will make contributions dp_α/dt = Σ_β ν_αβ (p_β − p_α) and dp_β/dt = Σ_α ν_αβ (p_α − p_β), where the reversibility of the dynamics ensures that the same transition constant ν_αβ appears in both expressions.
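A sketch of this argument for a small isolated system, with randomly chosen symmetric transition constants ν_αβ and a simple forward-Euler integration of the jump equations (the system size, rates, and time step are arbitrary choices for the illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
nu = rng.random((n, n))
nu = (nu + nu.T) / 2          # reversibility: the same nu_ab in both directions
np.fill_diagonal(nu, 0.0)

p = rng.random(n)
p /= p.sum()                  # initial non-equilibrium probabilities

def H(p):
    return np.sum(p * np.log(p))

dt = 0.01
for step in range(2001):
    if step % 500 == 0:
        print(f"t = {step * dt:5.2f}   H = {H(p):+.4f}")
    # dp_alpha/dt = sum_beta nu_ab * (p_beta - p_alpha)
    p = p + dt * (nu @ p - nu.sum(axis=1) * p)
# H decreases monotonically toward its minimum -ln(n), reached when all p_i = 1/n.
```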
The same mathematics is sometimes used to show that relative entropy is a Lyapunov function of a Markov process in detailed balance, and in other contexts in chemistry.
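For instance, the following sketch builds an arbitrary 4-state Metropolis-type chain satisfying detailed balance with respect to a chosen stationary distribution π, and shows the relative entropy D(p‖π) decreasing step by step (the state space, π, and proposal are illustrative assumptions):

```python
import numpy as np

# Target distribution pi and a Metropolis chain obeying detailed balance:
# pi_i * T[i, j] == pi_j * T[j, i].
pi = np.array([0.1, 0.2, 0.3, 0.4])
n = len(pi)
prop = np.full((n, n), 1.0 / n)               # symmetric proposal
T = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            T[i, j] = prop[i, j] * min(1.0, pi[j] / pi[i])
    T[i, i] = 1.0 - T[i].sum()                # remaining probability stays put

def rel_entropy(p, q):
    # D(p || q) = sum p ln(p/q), the relative entropy (Kullback-Leibler divergence)
    return np.sum(p * np.log(p / q))

p = np.array([1.0, 0.0, 0.0, 0.0]) + 1e-12    # start far from equilibrium
p /= p.sum()
for step in range(6):
    print(f"step {step}:  D(p || pi) = {rel_entropy(p, pi):.4f}")
    p = p @ T                                  # one step of the Markov chain
# D(p || pi) decreases monotonically toward 0 as p approaches pi.
```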
Josiah Willard Gibbs described another way in which the entropy of a microscopic system would tend to increase over time.
For almost any kind of realistic system, the Liouville evolution tends to "stir" the ensemble over phase space, a process analogous to the mixing of a dye in an incompressible fluid.
Liouville's equation is guaranteed to conserve Gibbs entropy since there is no random process acting on the system; in principle, the original ensemble can be recovered at any time by reversing the motion.
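A simple way to see the contrast is with an area-preserving toy map on the unit square standing in for Liouville evolution (here Arnold's cat map; the initial square, sample size, grid, and helper name coarse_grained_entropy are choices made for this sketch). The exact density is only rearranged and could be unwound by reversing the map, yet the entropy computed on a fixed coarse grid climbs toward its maximum:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ensemble of phase-space points initially confined to a small square.
pts = rng.random((100_000, 2)) * 0.05

def cat_map(xy):
    # Arnold's cat map: area-preserving (measure-preserving), like Liouville flow.
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack(((2 * x + y) % 1.0, (x + y) % 1.0))

def coarse_grained_entropy(xy, bins=20):
    # Shannon entropy (-sum P ln P) of the occupation of a fixed grid of cells.
    hist, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=bins, range=[[0, 1], [0, 1]])
    P = hist.ravel() / hist.sum()
    P = P[P > 0]
    return -np.sum(P * np.log(P))

for step in range(8):
    print(f"step {step}:  coarse-grained entropy = {coarse_grained_entropy(pts):.3f}")
    pts = cat_map(pts)
# The coarse-grained entropy rises toward ln(20**2) ~ 5.99, even though the
# fine-grained (Liouville) evolution is exactly reversible.
```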
In practice, however, if the system experiences a tiny uncontrolled interaction with its environment, the sharp coherence of the ensemble will be lost.
Edwin Thompson Jaynes argued that the blurring is subjective in nature, simply corresponding to a loss of knowledge about the state of the system.
[27] In any case, however it occurs, the Gibbs entropy increase is irreversible provided the blurring cannot be reversed.
To the extent that one accepts that the ensemble becomes blurred, Gibbs' approach provides a cleaner proof of the second law of thermodynamics than Boltzmann's H-theorem.
[29] In quantum mechanics, the ensemble cannot support an ever-finer mixing process, because of the finite dimensionality of the relevant portion of Hilbert space.