We’ve all heard it. We think we understand it: entropy is a measure of disorder. Combined with the Second Law of Thermodynamics—that the total entropy of a closed system may never decrease—it seems we have a profound statement that the Universe is destined to become less ordered.

The consequences are unsettling. Sure, the application of energy can reverse entropy locally, but if our society enters an energy-scarce regime, how can we maintain order? It makes intuitive sense: an energy-neglected infrastructure will rust and crumble. And the Second Law stands as a sentinel, unsympathetic to deniers of this fact.

But this picture is **mostly wrong**! An unfortunate conflation of the concepts of **entropy** and **disorder** has resulted in widespread misunderstanding of what **thermodynamic entropy** actually means. And if you want to invoke the gravitas of the Second Law of Thermodynamics, you’d better make darned sure you’re talking about *thermodynamic* entropy—whose connection to order is not as strong as you might be led to believe. Entropy can be quantified, in Joules per Kelvin. Let’s build from there.

**The Measure of Entropy**

When we add a small amount of thermal energy, *ΔE* (measured in Joules, say), to a system at temperature *T* (measured on an absolute scale, like Kelvin), the entropy changes according to *ΔS = ΔE/T*. The units are Joules per Kelvin.

Entropy is therefore *very* closely related to the **heat capacity** of a system or object. If we measure for a substance how much the temperature changes when we add a bit of energy, the ratio is the heat capacity. Divide by the object’s mass and we have a property of the material: the **specific heat capacity**. For example, the specific heat capacity of water is *c*p ≈ 4184 J/kg/K. If we heat one liter (1 kg) of water by 10°C (the same as a change by 10 K), it takes 41,840 J of energy. Most everyday substances (air, wood, rock, plastic) have specific heat capacities around 1000 J/kg/K. Metals have lower specific heat capacities, typically in the few-hundred J/kg/K range.
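As a quick sanity check on that water-heating figure, here is a minimal sketch in Python using just the numbers quoted above:

```python
c_p = 4184.0   # specific heat capacity of water, J/kg/K
m = 1.0        # mass of one liter of water, kg
dT = 10.0      # temperature increase, K (same as a 10 degree C change)

E = c_p * m * dT   # energy required, in Joules
print(E)           # 41840.0
```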

Heating an object therefore changes its entropy by *ΔS = ΔE/T*, where *ΔE* = *c*p *m ΔT*, and *m* is the mass of the object of interest.

Adding up these increments from absolute zero to the object’s final temperature, the total (absolute) entropy comes out to roughly *c*p *m*, within a factor of a few. For a kilogram of ordinary matter, the total entropy therefore falls into the ballpark of 1000 J/K.
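Because *T* sits in the denominator of *ΔS = ΔE/T*, summing the small entropy increments while warming an object produces a logarithm. A sketch (assuming a constant *c*p, which is a fair approximation over modest temperature ranges but not down to absolute zero): the entropy change of warming our liter of water from 290 K to 300 K.

```python
import math

c_p = 4184.0            # specific heat capacity of water, J/kg/K
m = 1.0                 # kg
T1, T2 = 290.0, 300.0   # initial and final temperatures, K

# Summing dS = dE/T with dE = c_p * m * dT becomes, in the limit,
# the integral: delta_S = c_p * m * ln(T2/T1)
delta_S = c_p * m * math.log(T2 / T1)
print(round(delta_S))   # about 142 J/K
```

Note that this is a small fraction of the ~1000 J/K-scale absolute entropy, as expected for a modest 10 K warming.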

**Where is the Disorder?**

If we count the number of distinct microscopic states available to a system, calling that number *Ω*, then the absolute entropy can also be described as *S* = *k*B ln *Ω*, where *k*B = 1.38×10⁻²³ J/K is the Boltzmann constant (note that it has units of entropy), and ln() is the natural logarithm function. This relation is inscribed on Boltzmann’s tomb.
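The microstate counts implied by everyday entropies are staggering. A small sketch: inverting *S* = *k*B ln *Ω* for the ~1000 J/K kilogram of ordinary matter estimated above,

```python
k_B = 1.38e-23   # Boltzmann constant, J/K
S = 1000.0       # ballpark absolute entropy of 1 kg of ordinary matter, J/K

ln_Omega = S / k_B   # from S = k_B * ln(Omega)
print(f"ln(Omega) = {ln_Omega:.1e}")   # about 7.2e+25
```

So *Ω* itself is e raised to a power of order 10²⁶—a number far too large to ever write out.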

This microstate-counting view is the foundation of **statistical mechanics**, by which classical thermodynamics can be understood as the way energy distributes among microscopic states in the form of velocity distributions, collisions, vibrations, rotations, etc. Intuitively, the more ways energy can tuck into microscopic modes of motion, the less apparent it is to the outside world in the form of increased temperature. A system with deep pockets will not increase in temperature as much for a given injection of energy. Substances with higher heat capacities have deep pockets, and therefore more ways to spread out the energy internally. The states of these systems require a greater amount of information to describe (e.g., rotational and vibrational modes of motion in addition to velocities, intermolecular interactions, etc.): they are a mess. This is the origin of the notion of entropy as disorder. But we must always remember that it is in the context of how energy can be distributed into the *microscopic* states (microstates) of a system.

**Informational Entropy**

> You should call it entropy… nobody knows what entropy really is, so in a debate you will always have the advantage.
>
> — John von Neumann, advising Claude Shannon on naming his measure of information content

A deck of cards illustrates the distinction: the deck has the **same** thermodynamic properties (including thermodynamic entropy) *no matter how* the cards are sequenced within the deck. A shuffled deck has increased informational entropy, but is thermodynamically identical to the ordered deck.

**What’s the Difference?**

To decide whether two configurations of a system truly differ in *thermodynamic* entropy, ask two questions:

1. If I took the system to zero temperature and then added energy until getting back to the original temperature, would the amount of energy required be different for the two configurations?
2. Is there an intrinsic physical process by which one state may evolve to the other spontaneously? In other words, are the items continuously jostling about and changing configuration via collisions or some other form of agitation?

Any *particular* mixed state is just as special as the separated piles. If we had a removable barrier between separated piles and provided a random agitating process by which grains could rearrange themselves on relevant timescales, then we *could* describe the entropy difference between the *ensemble* of separated states *with* a barrier and the *ensemble* of mixed states *without* a barrier. But we can’t really get away with discussing the entropy of a particular non-thermalized (static) arrangement.

**NITPICKY DIFFERENCE**

What I said above is not *exactly* true. Going back to the coffee/salt grains example, a system of two species of particles *does* carry a finite and quantifiable entropic change associated with mixing—assuming some agitating mechanism exists. In the case where the number of grains per unit area is the same for the two clusters, the post-mixed arrangement (occupying the same area as the initial separate piles) has an entropy change of

*ΔS* = *k*B[*N*1 ln((*N*1 + *N*2)/*N*1) + *N*2 ln((*N*1 + *N*2)/*N*2)],

where *k*B is the tiny Boltzmann constant, and the *N* values count the number of particles or grains in group 1 and group 2. Simplifying to the case where each group contains the same number of particles, *N*, just gives *ΔS* = 2*N* *k*B ln 2, or about 1.4 *N* *k*B.

In chemical thermodynamics we typically deal with **moles** of particles, so that *N* ≈ 10²⁴ particles (around the Avogadro number), and the mixing entropy comes out to something of order 10 J/K (compare to absolute entropy often around 1000 J/K). But dealing with macroscopic items, like grains of salt or coffee, we might have *N* ≈ 10,000, in which case the entropy difference in mixing is about 10⁻¹⁹ J/K.
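Both estimates follow directly from the mixing formula. A sketch in Python (the function name is my own; each argument is the particle count of one group):

```python
import math

k_B = 1.38e-23   # Boltzmann constant, J/K

def mixing_entropy(N1, N2):
    """Entropy of mixing two equal-density groups of N1 and N2 particles."""
    N = N1 + N2
    return k_B * (N1 * math.log(N / N1) + N2 * math.log(N / N2))

# Two half-mole-scale groups (10**24 particles total): of order 10 J/K
print(mixing_entropy(0.5e24, 0.5e24))   # about 9.6

# Ten thousand macroscopic grains total: of order 1e-19 J/K
print(mixing_entropy(5000, 5000))       # about 9.6e-20
```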

So there *can be* a real thermodynamic difference between the two states, some twenty orders of magnitude down from the gross thermodynamic entropy of the system. Why do I use the words “can be” and not the simpler “is”? Because question 2 comes in. If there is no statistical process by which the particles can thermalize (mix) over timescales relevant to our interest, then the entropy difference has no meaning. And if we apply the test in question 1 to the pre-mixed and post-mixed piles, the procedure provides no opportunity for random rearrangements, so no change in system entropy will manifest itself in an observable way.

**Some Examples**

A narrative has developed around this theme that we take in low-entropy energy and emit a high-entropy wake of waste. That life displays marvelous order—permitted by continuous feeding of this low-entropy energy—while death and decay represent higher-entropy end states. That we extract low-entropy concentrations of materials (ores) from the ground, then disperse the contents around the world in a higher-entropy arrangement. The Second Law warns that there is no going back: at least not without a substantial infusion of energy.

**LOW ENTROPY ENERGY?**

Consider a Joule of energy leaving the 5800 K surface of the Sun, carrying away entropy (1 J)/(5800 K) ≈ 0.17 mJ/K according to *ΔS = ΔE/T*. In this way, the Sun’s total entropy actually *decreases* with time (internally, it consolidates micro-particles: hydrogen into helium; externally, it spews photons, neutrinos, and solar wind hither and yon). So the Sun is a prodigious *exporter* of entropy. Let’s say we catch this Joule of energy on Earth. When it is absorbed at a temperature of 300 K, we could say that we have deposited (1 J)/(300 K) ≈ 3.3 mJ/K of entropy. So that Joule of energy does not have a fixed entropic price tag associated with it: 0.17 mJ/K became 3.3 mJ/K. If we cleverly divert the energy into a useful purpose, rather than letting it thermalize (heat something up), the Second Law requires that we at least increase terrestrial entropy by 0.17 mJ/K to balance the books. We are therefore mandated to deposit at least 0.17/3.3, or 5% (50 mJ), of the energy into thermal absorption, leaving 0.95 J free to do useful work. The result is a 95% efficiency, which is the standard thermodynamic limit associated with operation between 5800 K and 300 K (see the related post on heat pumps).

**THE QUICK AND THE DEAD**

To be sure, total *system* entropy increased in the process of digesting food (e.g., via respired gases). But the measure of thermodynamic entropy for a “thing” is not a measure of its usefulness.

**MINING MATERIALS**

Thermodynamically, the concentrated block and the dispersed copper are the *same*. If we somehow contrived the test of adding energy to bring this globally distributed copper from 0 K to 300 K, the amount of energy required (performed quickly enough that we may ignore diffusion into surrounding media) would be the same for the block as for the distributed mass. Macroscopic configuration changes don’t contribute measurably to changes in thermodynamic entropy.

In fact, a refined product can even have *higher* thermodynamic entropy than a similar mass of raw material. So it’s not the purification, or ordering, that makes the entropy go down. It’s the thermodynamic properties with respect to how readily energy is absorbed and distributed into microstates.

**Interpretation**

Most of the order and disorder we care about in these narratives is configurational; such arrangements do not represent *thermodynamic* entropy differences, and therefore do not fall under the jurisdiction of the Second Law.

- Energy may be extracted when temperature differences exist (e.g., combustion chamber compared to ambient environment; solar surface temperature compared to Earth surface). Entropy measures of the energy itself are not meaningful.
- Net energy from fossil fuels may only be extracted once.
- Efficiencies are capped at 100%, and often are theoretically much lower as a consequence of the Second Law.

*I thank Eric Michelsen, Kim Griest, and George Fuller for sharing insights in a fascinating discussion about the nature of entropy. It is said that when a group of scientists discusses entropy, they’ll be talking nonsense inside of ten minutes. I think we managed to steer clear of this common peril. I also learned from these links.*