There is no wealth but life. –John Ruskin

Have you ever considered the question: what is life? If we are aiming for a new economic system that will preserve and enhance life, rather than the current system, which more often than not seems to destroy and degrade life, perhaps we should consider what life is and how it is made possible. I recall learning about “living things” in high school biology classes, but always found the definitions of these “living things” to be somewhat vague. Let me try a physicist’s definition then, which might feel unfamiliar at first. A living thing is a kind of low-entropy-maintenance machine: a configuration of differentiated parts that succeeds in performing complex, interdependent functions for a prolonged period of time.

Having used the word "entropy" in the previous sentence, I should try to explain what it is. All living and non-living things (and hence all human economies, whether or not economists pay attention to the fact!) obey the laws of thermodynamics. The second law, in particular, introduces the concept of entropy and the idea that the entropy of an isolated system — one that exchanges neither energy nor matter with its surroundings — must either remain constant or increase, but never fall. Entropy is a measure of how "special" a particular arrangement of parts is — the lower the entropy, the more "special" the arrangement. Life is "special."

To illustrate this concept of "specialness," imagine first a set of red and blue gas molecules, fifty of each say, bouncing around in a room. Which is more likely: (A) that all 50 red molecules will be in one half of the room and all 50 blue in the other half, or (B) that some roughly even mixture of reds and blues will be present in both halves? Scenario B is the less "special" and more likely one, but why? The answer is that there are many ways of arranging the molecules to have "some roughly even mixture" of red and blue — a great many pairs of molecules can be swapped between the halves without making a difference. With the perfect red and blue split, however, if any molecule is swapped with a partner in the other half of the room, then each half gets "contaminated" with one molecule of the "wrong" color — such a swap does make a difference. Hence what we see tends to be an equal mixture of each color, simply because there are vastly many more ways of producing an equal mixture.
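We can put numbers on this by counting arrangements directly. The short sketch below (my own illustration, using the 50-red/50-blue molecule counts from the example above) compares the number of ways of realising the perfectly separated state with the number of ways of realising an even 25/25 mixture in each half:

```python
from math import comb

# Perfectly separated: all 50 reds in the left half, all 50 blues in
# the right -- exactly one way to pick which reds and blues go left.
separated_ways = comb(50, 50) * comb(50, 0)  # = 1

# Even mixture: choose 25 of the 50 reds and 25 of the 50 blues for
# the left half; the remaining molecules go right.
mixed_ways = comb(50, 25) ** 2

print(separated_ways)        # 1
print(f"{mixed_ways:.2g}")   # about 1.6e+28
```

Roughly sixteen billion billion billion mixed arrangements to one separated arrangement — which is why the mixture is what we actually see.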

Now I can state the notion of entropy precisely — the entropy of such a set of molecules is a number that is large when there are many ways of swapping pairs of molecules and getting the same overall state, and small when there are few ways of swapping them and getting the same overall state. Explicitly, an entropy S is given by Boltzmann’s entropy law:

S = k log W

Here k = 1.38 × 10⁻²³ joules/kelvin is Boltzmann's constant, and W is the number of ways of swapping the components of a state (say red and blue molecules) without making an overall difference to that state. The notation log W means "the natural logarithm of W" — the power you have to raise Euler's number (e ≈ 2.718) to in order to get W. For example, if W is equal to e then log W is equal to 1, because e to the power 1 is e.
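As an illustration of mine (not from the text), Boltzmann's formula can be evaluated directly for the red-and-blue mixture example, reusing the 50/50 molecule counts from earlier:

```python
import math

k = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

def boltzmann_entropy(W):
    """S = k log W, with log the natural logarithm."""
    return k * math.log(W)

# Sanity check from the text: if W = e, then log W = 1, so S = k.
assert math.isclose(boltzmann_entropy(math.e), k)

# Even 25/25 mixture of 50 red and 50 blue molecules:
W_mixed = math.comb(50, 25) ** 2
print(boltzmann_entropy(W_mixed))  # about 9e-22 J/K
```

The tiny size of k is why entropy differences only become noticeable when W is astronomically large — as it always is for everyday numbers of molecules.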

Boltzmann’s tomb, with his famous entropy law above the bust

That little equation of Boltzmann’s explains a huge number of phenomena. For example, why do hot things tend to get colder and cold things hotter? Easy — bring a hot thing and a cold thing into contact and it’s like the red and blue molecules all over again — there are many, many more ways for hot molecules and cold ones to get mixed together equally than for them to stay separated into a hot part and a cold part. So the temperature equalizes.

Another example: why do balls bounce lower and lower, but never start bouncing higher and higher? Easy — when a ball hits the floor, its molecules are moving more, on average, than the floor's. During each bounce, there are many more ways of sharing out this motion randomly amongst the ball and floor molecules than there are of keeping all the faster molecules in the ball and all the slower ones in the floor. So this sharing out is what happens, and the ball eventually stops bouncing. The opposite case — a ball spontaneously bouncing higher and higher as the floor cools — never happens in practice because it is so unlikely. That's how you can tell a film is being played backwards: everything that happens is so unlikely that it is never seen to happen in practice. These examples demonstrate the second law of thermodynamics: the total entropy always increases and never decreases, simply because a decrease is so incredibly unlikely.
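The energy-sharing argument can be sketched as a toy simulation (my own illustration, a caricature rather than a physical model of a real bounce): give the "ball" molecules more energy than the "floor" molecules, let randomly chosen pairs share their combined energy at random, and watch the averages converge:

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

ball = [10.0] * 1000   # ball molecules start with more energy
floor_ = [1.0] * 1000  # floor molecules start with less

for _ in range(200_000):
    # Pick one molecule from each side and redistribute their
    # combined energy in a random proportion.
    i, j = random.randrange(1000), random.randrange(1000)
    total = ball[i] + floor_[j]
    split = random.random()
    ball[i], floor_[j] = split * total, (1 - split) * total

avg_ball = sum(ball) / len(ball)
avg_floor = sum(floor_) / len(floor_)
# Both averages end up near the common value (10 + 1) / 2 = 5.5.
# The reverse -- energy spontaneously re-concentrating in the ball --
# is never observed, just as the text says.
print(avg_ball, avg_floor)
```

Nothing in the update rule prefers one direction; the equal sharing emerges purely because there are overwhelmingly more ways to be mixed than separated.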

What about life and entropy? A living thing has a very low entropy compared to its surroundings, because there are few ways of swapping its constituent parts without changing its overall state. For example, swapping molecules between your heart and brain certainly wouldn't leave your state unchanged — it would kill you! In fact, coming into thermodynamic equilibrium with your surroundings is also known as being dead!

Next question: how is life able to maintain this low-entropy state, in apparent defiance of the second law? Well, life is part of the Earth-sun system. We can regard this as an isolated system to a very good approximation — a vast ocean of space separates it from other systems. But the Earth alone (plus moon, of course!) is not an isolated system. The sun — a nuclear fusion reactor — provides the Earth with a constant input of low-entropy "organized" energy in the form of relatively few high-energy photons (particles of light); the Earth later sheds the same energy as a much larger number of low-energy infrared photons, exporting entropy to space in the process. Plants use this energy to make food which animals (including humans) eat, keeping the low-entropy-maintenance machinery of life running.

The Earth-sun (plus moon) system, of which the human economy is a sub-system
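One way to make the entropy bookkeeping concrete is a back-of-the-envelope calculation (an illustration of mine, using ballpark temperature figures that do not appear in the text): each joule of sunlight arrives carrying entropy set by the Sun's roughly 5800 K surface, and leaves as infrared radiation set by Earth's roughly 288 K surface:

```python
# Rough entropy budget for one joule of solar energy.
# Temperatures are ballpark figures, not precise values.
T_sun = 5800.0    # K, Sun's photosphere
T_earth = 288.0   # K, Earth's mean surface temperature

E = 1.0  # one joule, absorbed from sunlight and later re-radiated

s_in = E / T_sun      # entropy delivered with the incoming sunlight
s_out = E / T_earth   # entropy carried away by Earth's infrared glow

print(f"in:  {s_in:.5f} J/K")   # about 0.00017 J/K
print(f"out: {s_out:.5f} J/K")  # about 0.00347 J/K
```

Per joule, the Earth exports roughly twenty times more entropy than it imports — that surplus is the "room" the second law grants for building and maintaining low-entropy structures like life on the surface.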

Save for a few ocean-vent ecosystems, this low-entropy input from the sun makes all life on Earth possible, and hence all human economies (again, whether or not economists pay attention to the fact!). When we humans burn, in a geological eye-blink, reserves of oil and coal laid down over millions of years, we are liberating the low-entropy energy captured from ancient sunlight and buried deep underground.

The second law of thermodynamics has profound implications for our economic systems. A constant stream of low-entropy energy from the sun is required to maintain life’s organized state. Without this “entropy gradient” the machinery of life would soon wind down, like the bouncing balls or mixing molecules did. So in order to prolong life on Earth, we should try to use this vital low-entropy input as efficiently as possible, to recycle it through all sectors of the economy. We should certainly not waste it and assume that we will be able to increase our use of it more and more and more, forever.

Unfortunately, most mainstream economists don’t seem to have heard of the second law of thermodynamics. Perhaps this isn’t really their fault, since it’s not in their textbooks. But it should be. It governs all life and all systems on Earth, including the economy. As our leaders in business and government race to implement misguided economic models that are not founded upon the laws of thermodynamics, and as nation after nation refuses to question the pursuit of never-ending economic growth, we draw closer to a fate that will end in tears for the human race. I worry that the tears have already begun falling.

David A. Jones is a PhD student in theoretical physics at Southampton University in the UK. He writes frequently for the Positive Money blog.