The paradox allows the entropy of closed systems to decrease, violating the second law of thermodynamics.
If one takes the perspective that the definition of entropy must be changed so as to ignore particle permutation, the paradox is averted in the thermodynamic limit.
Closing the door then reduces the entropy again to S per box, in apparent violation of the second law of thermodynamics.
In particular, Gibbs' non-extensive entropy quantity for an ideal gas is not intended for situations where the number of particles changes.
This can be thought of as specifying a point in a 6N-dimensional phase space, where each of the axes corresponds to one of the momentum or position coordinates of one of the particles.
In classical physics, the number of states is infinitely large, but according to quantum mechanics it is finite.
One can see this qualitatively from Heisenberg's uncertainty principle: a volume in the 6N-dimensional phase space smaller than h^{3N} (where h is the Planck constant) cannot be specified.
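This counting can be checked in the simplest case. The sketch below (an illustrative check, not part of the original derivation; the mass, box length, and energy cutoff are arbitrary) compares the exact number of particle-in-a-box quantum levels below an energy E with the semiclassical phase-space area divided into cells of size h:

```python
import math

h = 6.626e-34   # Planck constant (J s)
m = 9.11e-31    # illustrative particle mass (kg)
L = 1e-8        # box length (m)
E = 1e-20       # energy cutoff (J)

# Quantum: particle-in-a-box levels E_n = n^2 h^2 / (8 m L^2);
# count the levels with E_n <= E.
n_quantum = math.floor(math.sqrt(8 * m * E) * L / h)

# Semiclassical: phase-space area with |p| <= sqrt(2 m E) and 0 <= x <= L
# is 2 L sqrt(2 m E); divide it into cells of size h.
n_cells = 2 * L * math.sqrt(2 * m * E) / h

print(n_quantum, n_cells)  # the counts agree to within one cell
```

The two counts coincide up to rounding, which is the content of the "one state per cell of volume h per degree of freedom" rule.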
This leads us to another problem: the volume seems to approach zero, since the region of phase space in which the system can lie is a surface of zero thickness.
In a generic system without symmetries, a full quantum treatment would yield a discrete non-degenerate set of energy eigenstates.
for large N. This means that the above "area" φ must be extended to a shell of thickness equal to the uncertainty in momentum
Using Stirling's approximation for the Gamma function, which omits terms of order less than N, the entropy for large N becomes:

This quantity is not extensive, as can be seen by considering two identical volumes with the same particle number and the same energy.
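The failure of extensivity can be verified numerically. The sketch below (illustrative; it sets k_B = 1 and drops the state-independent constants) uses the large-N form S = N[ln V + (3/2) ln(E/N)] obtained before the N! correction, and shows that doubling N, V, and E adds an extra 2N ln 2 instead of simply doubling S:

```python
import math

def S_gibbs(N, V, E):
    """Large-N ideal-gas entropy before the N! correction
    (k_B = 1, state-independent constants dropped)."""
    return N * (math.log(V) + 1.5 * math.log(E / N))

N, V, E = 1e3, 2.0, 5.0
excess = S_gibbs(2 * N, 2 * V, 2 * E) - 2 * S_gibbs(N, V, E)
print(excess, 2 * N * math.log(2))  # the two numbers agree: S is not extensive
```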
One can safely make this assumption provided the gas isn't at an extremely high density.
Under normal conditions, one can thus calculate the volume of phase space occupied by the gas by dividing Equation 1 by N!.
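A quick numerical check (again illustrative, with k_B = 1 and constants dropped) confirms that subtracting ln N! ≈ N ln N − N from the uncorrected entropy restores extensivity:

```python
import math

def S_corrected(N, V, E):
    """Ideal-gas entropy after dividing the phase-space volume by N!,
    using Stirling's approximation (k_B = 1, constants dropped)."""
    return N * (math.log(V / N) + 1.5 * math.log(E / N) + 1.0)

N, V, E = 1e3, 2.0, 5.0
# Doubling N, V, and E now exactly doubles the entropy.
print(S_corrected(2 * N, 2 * V, 2 * E), 2 * S_corrected(N, V, E))
```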
The two gases may be arbitrarily similar, but the entropy from mixing does not disappear unless they are the same gas – a paradoxical discontinuity.
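The discontinuity can be made concrete with the standard entropy-of-mixing formula ΔS = −N_total k_B Σ x_i ln x_i (a sketch under that standard formula, with k_B = 1; the helper name is ours): mixing equal amounts of two distinct gases yields 2N ln 2 no matter how similar the gases are, while for a single species the result is exactly zero.

```python
import math

def mixing_entropy(counts):
    """Entropy of mixing distinguishable species (k_B = 1):
    -N_total * sum(x_i * ln x_i) over the mole fractions x_i."""
    total = sum(counts)
    return -total * sum(c / total * math.log(c / total) for c in counts)

N = 100.0
different = mixing_entropy([N, N])   # two distinct gases: 2 N ln 2
identical = mixing_entropy([2 * N])  # one species: no mixing term
print(different, identical)
```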
As soon as we can distinguish the difference between gases, the work necessary to recover the pre-mixing macroscopic configuration from the post-mixing state becomes nonzero.
This line of reasoning is particularly informative when considering the concepts of indistinguishable particles and correct Boltzmann counting.
Just as the mixing paradox begins with two detectably different containers, and the extra entropy that results upon mixing is proportional to the average amount of work needed to restore that initial state after mixing, so the extra entropy in Boltzmann's original derivation is proportional to the average amount of work required to restore the simple gas from some "exchange macrostate" to its original "exchange macrostate".
It is often said that the resolution to the Gibbs paradox derives from the fact that, according to the quantum theory, like particles are indistinguishable in principle.
By Jaynes' reasoning, if the particles are experimentally indistinguishable for whatever reason, the Gibbs paradox is resolved; quantum mechanics only provides assurance that, in the quantum realm, this indistinguishability holds as a matter of principle, rather than being due to an insufficiently refined experimental capability.
In this section, we present in rough outline a purely classical derivation of the non-extensive entropy for an ideal gas considered by Gibbs before "correct counting" (indistinguishability of particles) is accounted for.
Finally, we present a third method, due to R. Swendsen, for an extensive (additive) result for the entropy of two systems if they are allowed to exchange particles with each other.
In contrast to the one-dimensional line integrals encountered in elementary physics, the contour of constant energy possesses a vast number of dimensions.
The justification for integrating over phase space using the canonical measure involves the assumption of equal probability.
The assumption can be justified by invoking the ergodic hypothesis as well as Liouville's theorem for Hamiltonian systems.
This may explain the difficulties in constructing a clear and simple derivation for the dependence of entropy on the number of particles.
Since our only purpose is to illuminate a paradox, we simplify notation by setting the particle mass and the Boltzmann constant to unity:
We represent points in phase space, and its x and v parts, by 2n- and n-dimensional vectors, respectively:

To calculate the entropy, we use the fact that the (n − 1)-sphere,
As indicated by the underbrace, the integral over velocity space is restricted to the "surface area" of the (n − 1)-dimensional hypersphere of radius
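For reference, the "hypersurface area" of the (n − 1)-sphere of radius R is A = 2 π^{n/2} R^{n−1} / Γ(n/2). The sketch below (an illustrative check, not part of the derivation) verifies this formula against the familiar circle and ordinary-sphere cases:

```python
import math

def sphere_surface_area(n, R):
    """Surface area of the (n-1)-sphere of radius R embedded in R^n:
    A = 2 * pi^(n/2) * R^(n-1) / Gamma(n/2)."""
    return 2 * math.pi ** (n / 2) * R ** (n - 1) / math.gamma(n / 2)

R = 3.0
print(sphere_surface_area(2, R), 2 * math.pi * R)      # circumference of a circle
print(sphere_surface_area(3, R), 4 * math.pi * R**2)   # surface of an ordinary sphere
```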
Entropy is defined only up to an arbitrary additive constant, because the area in phase space depends on the units used.
An alternative approach is to argue that the dependence on particle number cannot be trusted on the grounds that changing