Boltzmann's entropy formula

In short, the Boltzmann formula, S = k_B ln W, shows the relationship between the entropy S and the number of ways W in which the atoms or molecules of a certain kind of thermodynamic system can be arranged.

The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but later put into its current form by Max Planck in about 1900.[2][3]

To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".

The value of W was originally intended to be proportional to the Wahrscheinlichkeit (the German word for probability) of a macroscopic state for some probability distribution of possible microstates—the collection of (unobservable microscopic single particle) "ways" in which the (observable macroscopic) thermodynamic state of a system can be realized by assigning different positions and momenta to the respective molecules.

For a given macrostate, he called the collection of all possible instantaneous microstates of a certain kind by the name monode, for which Gibbs' term ensemble is used nowadays.

For single particle instantaneous microstates, Boltzmann called the collection an ergode.

Boltzmann's paradigm was an ideal gas of N identical particles, of which N_i are in the i-th microscopic condition (range) of position and momentum.

Boltzmann writes: "The first task is to determine the permutation number, previously designated by

    W = N! / (N_1! N_2! ⋯ N_i! ⋯)

Since W is a product, it is easiest to determine the minimum of its logarithm, …" Therefore, by making the denominator small, he maximizes the number of states.
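The permutation counting described above can be sketched numerically. In this minimal Python illustration (the function name and the toy occupation numbers are assumptions of this sketch, not from the original), a uniform state distribution makes the denominator's factorials small and so maximizes the number of microstates W:

```python
from math import factorial, log

# Boltzmann constant in J/K (CODATA 2018 value)
k_B = 1.380649e-23

def permutation_count(occupations):
    """Permutation number W = N! / (N_1! N_2! ...) for occupation numbers N_i."""
    n = sum(occupations)
    w = factorial(n)
    for n_i in occupations:
        w //= factorial(n_i)  # identical particles in one condition are indistinguishable
    return w

# Hypothetical tiny "gas": 6 particles spread over 3 conditions.
uniform = [2, 2, 2]  # small denominator (2!*2!*2! = 8) -> many microstates
skewed  = [6, 0, 0]  # large denominator (6! = 720)     -> a single microstate

print(permutation_count(uniform))  # 90
print(permutation_count(skewed))   # 1

# Entropy of the uniform macrostate via S = k_B ln W
print(k_B * log(permutation_count(uniform)))
```

Working with ln W rather than W itself, as Boltzmann suggests, turns the product into a sum, which is much easier to extremize.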

The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle—i.e., assuming each particle has an identical independent probability distribution, and ignoring interactions and correlations between the particles.

This is exact for an ideal gas of identical particles that move independently apart from instantaneous collisions, and is an approximation, possibly a poor one, for other systems.[9]

The Boltzmann entropy is obtained if one assumes one can treat all the component particles of a thermodynamic system as statistically independent.

The probability distribution of the system as a whole then factorises into the product of N separate identical terms, one term for each particle; and when the summation is taken over each possible state in the 6-dimensional phase space of a single particle (rather than the 6N-dimensional phase space of the system as a whole), the Gibbs entropy simplifies to the Boltzmann entropy

    S_B = −N k_B Σ_i p_i ln p_i,

where the sum runs over the possible states i of a single particle and p_i is the probability that a particle occupies state i.
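This simplification can be checked numerically against the direct microstate count S = k_B ln W with W = N!/(N_1! N_2! ⋯). A minimal Python sketch (the function names and sample occupation numbers are illustrative assumptions, not from the original) shows the factorised per-particle form S_B = −N k_B Σ p_i ln p_i converging to the exact count for large occupation numbers, as Stirling's approximation predicts:

```python
from math import lgamma, log

# Boltzmann constant in J/K (CODATA 2018 value)
k_B = 1.380649e-23

def boltzmann_entropy(n, probs):
    """Factorised form: S_B = -N k_B * sum_i p_i ln p_i over single-particle states."""
    return -n * k_B * sum(p * log(p) for p in probs if p > 0)

def exact_entropy(occupations):
    """Direct count: S = k_B ln W with W = N! / prod_i N_i!, via log-gamma
    so that large occupation numbers do not overflow."""
    n = sum(occupations)
    ln_w = lgamma(n + 1) - sum(lgamma(n_i + 1) for n_i in occupations)
    return k_B * ln_w

# Hypothetical occupations; as N grows the two expressions converge.
occ = [3000, 2000, 1000]
n = sum(occ)
probs = [n_i / n for n_i in occ]

print(boltzmann_entropy(n, probs))  # factorised (per-particle) entropy
print(exact_entropy(occ))           # k_B ln W, slightly smaller at finite N
```

The two values agree to within a fraction of a percent here; the residual gap is the sub-leading term of Stirling's approximation, which vanishes relative to S as N grows.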

This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872.

For the special case of an ideal gas it exactly corresponds to the proper thermodynamic entropy.

For systems of interacting particles, however, this factorisation leads to increasingly wrong predictions of entropies and physical behaviours, because it ignores the interactions and correlations between different molecules.

Boltzmann's equation, S = k · log W, carved on his gravestone.[1]
Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.