The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts whether certain spontaneous processes are irreversible or impossible.
The statistical entropy perspective was introduced in the 1870s by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive link between the macroscopic observation of nature and the microscopic view, based on the rigorous treatment of the large ensembles of microstates that constitute thermodynamic systems.
Boltzmann defined entropy as a measure of the number of possible microscopic states (microstates) of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties, which together constitute the macrostate of the system.
A useful illustration is the state of a sample of gas in a container: the easily measurable parameters volume, pressure, and temperature of the gas describe its macroscopic condition (state).
At a microscopic level, the gas consists of a vast number of freely moving atoms or molecules, which randomly collide with one another and with the walls of the container.
The large number of particles of the gas provides an effectively infinite number of possible microstates for the sample, but collectively the particles exhibit a well-defined average configuration, which is observed as the macrostate of the system; the contribution of each individual microstate to the macrostate is negligibly small.
Therefore, the system can be described as a whole by only a few macroscopic parameters, called the thermodynamic variables: the total energy E, volume V, pressure P, temperature T, and so forth.
Equilibrium may be illustrated with a simple example of a drop of food coloring falling into a glass of water. The dye diffuses in a complicated manner that is difficult to predict in detail. However, after sufficient time has passed, the system reaches a uniform color, a state much easier to describe and explain.
Boltzmann formulated a simple relationship between entropy and the number of possible microstates of a system, a number denoted by the symbol Ω.
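Written out, the relationship is Boltzmann's entropy formula, where $k_\mathrm{B}$ is the Boltzmann constant:

$$ S = k_\mathrm{B} \ln \Omega $$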
Now, $\sum_i d(E_i p_i)$, where $p_i$ is the probability that the system occupies microstate $i$ and $E_i$ is the energy of that microstate, is the expectation value of the change in the total energy of the system.
If the changes are sufficiently slow, so that the system remains in the same microscopic state, but the state slowly (and reversibly) changes, then $\sum_i (dE_i)\, p_i$ is the expectation value of the work done on the system through this reversible process, $dw_{\mathrm{rev}}$.
The set of microstates (with probability distribution) on which the sum is done is called a statistical ensemble.
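For reference, the entropy implicit in this derivation is the Gibbs entropy of the ensemble (a standard formula, restated here because the surrounding derivation assumes it):

$$ S = -k_\mathrm{B} \sum_i p_i \ln p_i $$

Splitting the energy change as $dE = \sum_i E_i\, dp_i + \sum_i (dE_i)\, p_i$ and identifying the second term with $dw_{\mathrm{rev}}$, the remaining term $\sum_i E_i\, dp_i$ is the reversible heat $\delta q_{\mathrm{rev}} = T\, dS$, in agreement with the first law.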
Neglecting correlations (or, more generally, statistical dependencies) between the states of individual particles will lead to an incorrect probability distribution on the microstates and hence to an overestimate of the entropy.
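The overestimate can be made quantitative (a standard property of the Gibbs entropy, noted here for concreteness): for any two subsystems $A$ and $B$, the entropy is subadditive,

$$ S(A, B) \le S(A) + S(B), $$

with equality exactly when the subsystems are statistically independent. Neglecting correlations therefore replaces the true joint entropy $S(A, B)$ by the larger sum $S(A) + S(B)$.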
To illustrate this idea, consider a set of 100 coins, each of which is either heads up or tails up.
For the macrostates of 100 heads or 100 tails, there is exactly one possible configuration, so our knowledge of the system is complete.
At the opposite extreme, the macrostate which gives us the least knowledge about the system consists of 50 heads and 50 tails in any order, for which there are $\binom{100}{50} = 100891344545564193334812497256 \approx 10^{29}$ possible microstates.
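A short calculation makes the coin-counting concrete. The sketch below (plain Python; the helper names are my own) counts the microstates of each macrostate and reports the corresponding Boltzmann entropy in units of $k_\mathrm{B}$:

```python
import math

def microstates(n_coins: int, n_heads: int) -> int:
    """Number of microstates (orderings) realizing a macrostate with n_heads heads."""
    return math.comb(n_coins, n_heads)

def boltzmann_entropy(omega: int) -> float:
    """Boltzmann entropy S = ln(omega), in units of the Boltzmann constant k_B."""
    return math.log(omega)

for heads in (100, 50):
    omega = microstates(100, heads)
    print(f"{heads} heads: omega = {omega}, S/k_B = {boltzmann_entropy(omega):.3f}")

# Output:
# 100 heads: omega = 1, S/k_B = 0.000
# 50 heads: omega = 100891344545564193334812497256, S/k_B = 66.784
```

The fully ordered macrostates (all heads or all tails) carry zero entropy, while the 50/50 macrostate carries the maximum.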
Even when a system is entirely isolated from external influences, its microstate is constantly changing.
Suppose we prepare the system in an artificially highly ordered equilibrium state: for instance, imagine dividing a container in two with a partition and confining a gas to one side, with a vacuum on the other. If we then remove the partition, the microstate of the gas evolves in a chaotic, unpredictable way, and it is possible, though extremely unlikely, for the molecules to remain in one half of the container. It is overwhelmingly probable for the gas to spread out to fill the container evenly, which is the new equilibrium macrostate of the system.
This is an example illustrating the second law of thermodynamics: the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value. Since its discovery, this idea has been the focus of a great deal of thought, some of it confused.
A chief point of confusion is that the second law applies only to isolated systems.
For example, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight.
In contrast, the universe may be considered an isolated system, so that its total entropy is constantly increasing.
In classical statistical mechanics, the number of microstates is actually uncountably infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers.
If we want to define Ω, we have to come up with a method of grouping the microstates together to obtain a countable set, a procedure known as coarse graining. In the case of the ideal gas, we may count two states of an atom as the "same" state if their positions and momenta lie within δx and δp of each other. Since the values of δx and δp can be chosen arbitrarily, the entropy is not uniquely defined; it is defined only up to an additive constant.
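To see why the ambiguity amounts only to an additive constant, consider a rough count (my own sketch, assuming $N$ atoms in three dimensions and a total accessible phase-space volume $V_{\mathrm{ph}}$): the number of coarse-grained cells is

$$ \Omega \approx \frac{V_{\mathrm{ph}}}{(\delta x\, \delta p)^{3N}}, \qquad S = k_\mathrm{B} \ln \Omega \approx k_\mathrm{B} \ln V_{\mathrm{ph}} - 3N k_\mathrm{B} \ln(\delta x\, \delta p), $$

so changing δx or δp shifts the entropy by a constant that is independent of the thermodynamic state.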
An important result, known as Nernst's theorem or the third law of thermodynamics, states that the entropy of a system at zero absolute temperature is a well-defined constant.
Many systems, such as crystal lattices, have a unique ground state, and (since ln(1) = 0) this means that they have zero entropy at absolute zero.
Other systems, however, possess more than one state with the same, lowest energy and therefore retain a non-vanishing "zero-point entropy". For instance, ordinary ice has a zero-point entropy of 3.41 J/(mol⋅K), because its underlying crystal structure possesses multiple configurations with the same energy (a phenomenon known as geometrical frustration).
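For scale, Pauling's classic estimate of the residual entropy of ice, which counts roughly 3/2 allowed hydrogen-bond configurations per water molecule, reproduces this value:

$$ S_0 \approx R \ln\tfrac{3}{2} = 8.314\ \mathrm{J/(mol{\cdot}K)} \times 0.405 \approx 3.37\ \mathrm{J/(mol{\cdot}K)}, $$

close to the measured 3.41 J/(mol⋅K).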