Introduction to entropy

In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time.

For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned".

The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or to a gradual decline into disorder.

A more physical interpretation of thermodynamic entropy refers to the spread of energy or matter, or to the extent and diversity of microscopic motion.

If a movie that shows coffee being mixed or wood being burned were played in reverse, it would depict processes that are highly improbable in reality.

While the second law, and thermodynamics in general, accurately predicts the aggregate behavior of complex physical systems, scientists are not content with simply knowing how a system behaves; they also want to know why it behaves the way it does.

The question of why entropy increases until equilibrium is reached was answered in 1877 by physicist Ludwig Boltzmann.
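
In modern notation, Boltzmann's answer can be summarized by relating the entropy S of a macroscopic state to the number W of microscopic arrangements (microstates) that realize it:

    S = k_B ln W,

where k_B is the Boltzmann constant. A state that can be realized in vastly more ways is vastly more probable, so an isolated system overwhelmingly tends toward the macrostate with the largest W, which is the state of equilibrium.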

For example, whenever there is a suitable pathway, heat spontaneously flows from a hotter body to a colder one.

The classical calculation quantifies such a transfer: when a quantity of heat Q flows into a body whose temperature T remains essentially constant, the entropy of that body changes by ΔS = Q/T. The conditions of this definition imply that the heat transfer should be so small and slow that it scarcely changes the temperature.

This calculation of entropy change does not allow the determination of absolute values, only of differences.
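
As a simple worked illustration, suppose a small quantity of heat Q passes from a hot body at temperature T_H to a colder body at temperature T_C. The hot body loses entropy Q/T_H while the cold body gains Q/T_C, so the total change is

    ΔS = Q/T_C − Q/T_H > 0,

since T_C < T_H. The spontaneous flow of heat from hot to cold therefore always increases the combined entropy of the two bodies.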

In many cases, a visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so.

When applicable, entropy increase is the quantitative measure of that kind of spontaneous process: how much energy has been effectively lost or become unavailable, by dispersing itself or spreading itself out, as assessed at a specific temperature.
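
In quantitative terms, if a spontaneous process increases the total entropy by ΔS and the dispersed energy ends up in surroundings at temperature T, then roughly T·ΔS of energy has been rendered unavailable for doing work.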

This is because a hotter body is generally more able to do thermodynamic work, other factors, such as internal energy, being equal.

Thermodynamics makes no assumptions about the atomistic nature of matter, but when matter is viewed in this way, as a collection of particles constantly moving and exchanging energy with each other, and which may be described in a probabilistic manner, information theory may be successfully applied to explain the results of thermodynamics.

An important concept in statistical mechanics is the idea of the microstate and the macrostate of a system.
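
As a minimal sketch of these two ideas (the coin system and the function names are illustrative only), consider a set of coins: a macrostate is the total number of heads, while a microstate is one particular sequence of heads and tails. The following Python snippet counts the microstates W belonging to each macrostate and evaluates the corresponding entropy in units of the Boltzmann constant.

    from math import comb, log

    def microstate_count(n_coins: int, n_heads: int) -> int:
        # Number of distinct head/tail sequences (microstates)
        # that realize the macrostate "n_heads out of n_coins".
        return comb(n_coins, n_heads)

    def entropy_in_units_of_k(multiplicity: int) -> float:
        # Boltzmann's relation S = k_B ln W, reported here as ln W
        # so that the result is dimensionless (units of k_B).
        return log(multiplicity)

    # The "all heads" macrostate of 100 coins has a single microstate,
    # while the "half heads" macrostate has about 1.0e29 of them, so it
    # has far higher entropy and is overwhelmingly more probable.
    for heads in (0, 25, 50):
        w = microstate_count(100, heads)
        print(heads, w, entropy_in_units_of_k(w))

Because the number of microstates grows combinatorially, even a modest system is dominated by its highest-multiplicity macrostates.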

The concept of information entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used.
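
The prototypical example is Shannon's measure: for a source that produces symbols with probabilities p_i, the information entropy is

    H = − Σ_i p_i log2 p_i

bits per symbol; a fair coin toss, with p = 1/2 for each outcome, carries one bit of entropy.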

When it is applied to the problem of a large number of interacting particles, along with some other constraints, like the conservation of energy, and the assumption that all microstates are equally likely, the resultant theory of statistical mechanics is extremely successful in explaining the laws of thermodynamics.
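
The link can be made explicit: with W equally likely microstates, each has probability p_i = 1/W, and the statistical (Gibbs) entropy S = −k_B Σ_i p_i ln p_i reduces to Boltzmann's S = k_B ln W, so the information-theoretic and thermodynamic descriptions agree.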

Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.

Originally, entropy was named to describe the "waste heat", or more accurately, energy loss, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work.

Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level.

In the late 19th century, the word "disorder" was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory to describe the increased molecular movement on the microscopic level.

For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the "motional" (i.e. kinetic) energy of molecules.

More recently, there has been a trend in chemistry and physics textbooks to describe entropy as energy dispersal.

Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.

Entropy, as originally conceived in thermodynamics, was described in macroscopic terms that could be directly measured, such as volume, temperature, or pressure.

Ice melting provides an example of entropy increasing.
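
As a rough worked example, melting one mole of ice at its melting point of about 273 K absorbs approximately 6.0 kJ of heat, so the entropy of the water increases by roughly ΔS = 6000 J / 273 K ≈ 22 J/K, reflecting the greater freedom of molecular motion in the liquid.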