Entropy of mixing

For ideal materials, the entropy of mixing is entirely accounted for by the diffusive expansion of each material into a final volume not initially accessible to it.
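
For ideal gases initially at a common temperature and pressure, this expansion can be made quantitative (a standard textbook relation, written here in conventional symbols rather than anything specific to this article): each gas i expands from its initial volume V_i into the common final volume V, and since V_i/V equals its mole fraction x_i,

\Delta S_{\text{mix}} = \sum_i n_i R \ln\frac{V}{V_i} = -nR \sum_i x_i \ln x_i ,

where n is the total amount of substance.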

In the general case of mixing non-ideal materials, however, the total final common volume may differ from the sum of the separate initial volumes, work or heat may be transferred to or from the surroundings, and the entropy of mixing may depart from that of the corresponding ideal case.
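
In the standard notation of solution thermodynamics (not specific to this article), that departure is the excess entropy of mixing,

S^{\mathrm{E}} = \Delta S_{\text{mix}} - \Delta S_{\text{mix}}^{\text{ideal}} ,

which vanishes for an ideal mixture and, together with the excess enthalpy and volume, characterizes the non-ideality.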

These energy and entropy variables and their temperature dependences provide valuable information about the properties of the materials.

The entropy of mixing provides information about constitutive differences of intermolecular forces or specific molecular effects in the materials.

In this ideal case, the increase in entropy is entirely due to the irreversible expansion of the two gases, and involves no flow of heat or work between the system and its surroundings.[3][4]

For binary mixtures, the entropy of random mixing can be considered as a function of the mole fraction of one component.
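
Written out for that single variable, in the usual ideal (random-mixing) form and conventional symbols,

\Delta S_{\text{mix}} = -nR\left[x \ln x + (1 - x)\ln(1 - x)\right],

a curve that vanishes for the pure components (x = 0 and x = 1) and is symmetric about x = 1/2.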

For example, triethylamine and water are miscible in all proportions below 19 °C, but above this critical temperature, solutions of certain compositions separate into two phases at equilibrium with each other.

This is due to the formation of attractive hydrogen bonds between the two components that prevent random mixing.

The mixing that occurs below 19 °C is due not to entropy but to the enthalpy of formation of the hydrogen bonds.[6]
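
The competition described here can be summarized, in standard notation, by the Gibbs energy of mixing,

\Delta G_{\text{mix}} = \Delta H_{\text{mix}} - T\,\Delta S_{\text{mix}} ,

which must be negative for spontaneous mixing. Below the critical temperature the exothermic hydrogen bonding makes ΔH_mix sufficiently negative; because the hydrogen-bonded arrangement is ordered rather than random, the entropy term is unfavorable, and raising the temperature strengthens the −TΔS_mix penalty until the solution separates.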

For polar systems such as polyacrylic acid in 1,4-dioxane, this is often due to the formation of hydrogen bonds between polymer and solvent.

For nonpolar systems such as polystyrene in cyclohexane, phase separation has been observed in sealed tubes (at high pressure) at temperatures approaching the liquid-vapor critical point of the solvent.

Mixing therefore requires contraction of the solvent for compatibility of the polymer, resulting in a loss of entropy.

Of course, any idea of identifying molecules in given locations is a thought experiment, not something one could do, but the calculation of the uncertainty is well-defined.

Claude Shannon introduced this expression for use in information theory, but similar formulas can be found as far back as the work of Ludwig Boltzmann and J. Willard Gibbs.

Multiplying by the number of particles N yields the change in entropy of the entire system from the unmixed case, in which all of the p_i were either 1 or 0.
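
Writing the per-particle uncertainty in this notation (a standard identification rather than anything peculiar to this article),

H = -\sum_i p_i \ln p_i ,

and taking p_i equal to the mole fraction x_i for a random mixture, multiplication by N and by the Boltzmann constant k_B gives

\Delta S_{\text{mix}} = -N k_{\mathrm{B}} \sum_i x_i \ln x_i = -nR \sum_i x_i \ln x_i ,

in agreement with the expression obtained from the expansion argument.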

Consequently, that part of the spatial uncertainty concerning whether any molecule is present in a lattice cell is the sum of the initial values, and does not increase upon "mixing".

Using conditional probabilities, it turns out that the analytical problem for the small subset of occupied cells is exactly the same as for mixed liquids, and the increase in the entropy, or spatial uncertainty, has exactly the same form as obtained previously.

The fact that volumes are not strictly additive when a solid is dissolved in a liquid is not important for condensed phases.

If the solute is not crystalline, we can still use a spatial lattice, as good an approximation for an amorphous solid as it is for a liquid.

For a polymer solute, the assumption is made that each monomer subunit in the polymer chain occupies a lattice site.[7]: 273
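
On this lattice picture, the standard Flory–Huggins result for the entropy of mixing of a polymer solution (quoted here in its conventional symbols for orientation) replaces mole fractions by volume fractions:

\Delta S_{\text{mix}} = -k_{\mathrm{B}}\left(N_1 \ln \phi_1 + N_2 \ln \phi_2\right),

where N_1 and N_2 are the numbers of solvent molecules and polymer chains and φ_1, φ_2 their volume fractions. Because each chain contributes a single logarithmic term for its many connected segments, the entropy gained per unit volume is much smaller than for mixing small molecules.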

According to Fowler and Guggenheim (1939/1965),[8] the conflating of the just-mentioned two mechanisms for the entropy of mixing is well established in customary terminology, but can be confusing unless it is borne in mind that the independent variables are the common initial and final temperature and total pressure; if the respective partial pressures or the total volume are chosen as independent variables instead of the total pressure, the description is different.[9][10]: 163–164 [11]: 217 [12][13][14]

This constant-volume kind of "mixing", in the special case of perfect gases, is referred to in what is sometimes called Gibbs' theorem.

The two distinct gases, in a cylinder of constant total volume, are at first separated by two contiguous pistons made respectively of two suitably specific ideal semipermeable membranes.

Ideally slowly and fictively reversibly, at constant temperature, the gases are allowed to mix in the volume between the separating membranes, forcing them apart, thereby supplying work to an external system.

Then, by externally forcing ideally slowly the separating membranes together, back to contiguity, work is done on the mixed gases, fictively reversibly separating them again, so that heat is returned to the heat reservoir at constant temperature.
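
For such an isothermal, reversible handling of ideal gases the internal energy does not change, so the energy bookkeeping (a reminder of standard relations rather than an addition to the argument) is simply

\Delta U = 0, \qquad Q = W, \qquad \Delta S_{\text{gases}} = \frac{Q_{\text{rev}}}{T} :

the heat drawn from the reservoir during the mixing stage equals the work delivered outward, and the same amounts flow back during the separating stage.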

The "paradox" arises because any detectable constitutive distinction, no matter how slight, can lead to a considerably large change in amount of entropy as a result of mixing.

But from a specifically thermodynamic viewpoint, it is not paradoxical, because in that discipline the degree of constitutive difference is not questioned; it is either there or not there.

Differences of constitution are explained by quantum mechanics, which postulates discontinuity of physical processes.

Complete prevention of passage through such a semipermeable membrane would require perfect efficacy over a practically infinite time, in view of the nature of thermodynamic equilibrium.

Such quantum phenomena as tunneling ensure that nature does not allow such membrane ideality as would support the theoretically demanded continuous decrease, to zero, of detectable distinction.

The entropy of mixing for an ideal solution of two species is maximized when the mole fraction of each species is 0.5.
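
Setting the derivative of the binary expression −nR[x ln x + (1 − x) ln(1 − x)] to zero shows this directly: d/dx gives ln[(1 − x)/x], which vanishes at x = 1/2, where the molar entropy of mixing reaches R ln 2 ≈ 5.76 J K⁻¹ mol⁻¹. A short numerical check (a plain-Python sketch; the helper name delta_s_mix is illustrative, not from any source) gives the same picture:

import math

R = 8.314  # molar gas constant, J/(mol*K)

def delta_s_mix(x):
    # Ideal molar entropy of mixing of a binary mixture, as a function of
    # the mole fraction x of one component (illustrative helper).
    if x <= 0.0 or x >= 1.0:
        return 0.0  # pure component: no entropy of mixing
    return -R * (x * math.log(x) + (1.0 - x) * math.log(1.0 - x))

# The curve is symmetric about x = 0.5 and peaks there at R*ln(2) ~ 5.76 J/(mol*K).
for x in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"x = {x:.1f}:  dS_mix = {delta_s_mix(x):.3f} J/(mol*K)")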