In the beginning, it was all so simple. Rub two sticks together, get a fire. Stick a pipe in the ground, get some oil. Trade a cow, get a llama. Simple systems require only straightforward applications of engineering, with little need to examine precisely how individual components might interact. But as our global production system has evolved, so too has the level of complexity amongst the various components. Our society, based on ever-advancing technology of all kinds, has become a seething morass of indecipherable interactions between mind, body, finance, and resources.

What was once a world of isolated simple systems is now what we (so creatively) call a complex system. Complex systems don’t have straightforward relations between cause and effect (input and output) because of the sheer number of interactions within the system. As such, complex systems fail in complex ways.

Despite his recent unenlightened comments with respect to global warming, NASA’s Administrator Dr. Mike Griffin is an extraordinarily brilliant man. Earlier this year he delivered a profound speech at Purdue University on the nature of complex systems and their failure modes.

As Griffin says, failure of complex systems comes about not because the systems fail to accomplish their nominal purpose, but as a result of unintended consequences of the interactions of the component parts. For example, an oil well system could be composed of numerous straight-bore well pipes, each of which is a “simple” system. The unintended consequences become apparent as neighboring wells depress the oil pay zone in each other’s vicinity, or as fixing that problem with water injection depletes aquifers, or as unrestricted oil extraction irreparably damages an oil field.

The impending disasters we face are all a direct result of similar unintended consequences. Global warming isn’t occurring because industrial machines failed to produce; the industrial infrastructure failed us because the complex interactions with the atmosphere were not taken into account (or were ignored). Peak Oil is not an economic disaster because the markets failed to drive the economy or the oil companies failed to produce crude; our economy faces collapse due to the lack of design engineering at the interface between the economy and its engine.

As seen in the recent analysis of phosphorus production by the Energy Bulletin, there are many inputs to the global production engine, the collapse of which I’ve termed the Global Resource Crunch. Some resources, such as oil, phosphorus, and gold, face 1st order threats to their production rates — that is, their extraction rate is limited by Hubbert-type geologic constraints as the finite stock of the substance is drawn down. More subtle — yet equally critical — are the 2nd order threats to commodities like salt, crops, and coal. A 2nd order threat means that a commodity’s extraction rate can be limited by the production rate of some other product — think oil-powered machinery for salt extraction (and pretty much every other type of extraction), or phosphorus for crop production.
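To make the 1st/2nd order distinction concrete, here is a toy sketch in Python. Every name and number in it (the URR, peak year, curve width, and the oil cost of salt extraction) is a hypothetical illustration, not fitted to real data: oil follows a Hubbert-style logistic depletion curve, while a second commodity’s output is capped not by its own reserves but by the oil available to run its machinery.

```python
import math

def hubbert_production(t, urr=2000.0, peak_year=2010.0, width=15.0):
    """Hubbert-type production curve: annual output of a finite resource
    rises, peaks, and declines as cumulative extraction approaches the
    ultimately recoverable resource (URR). This is dQ/dt for the logistic
    curve Q(t) = urr / (1 + exp(-(t - peak_year) / width))."""
    x = math.exp(-(t - peak_year) / width)
    return (urr / width) * x / (1.0 + x) ** 2

def salt_production(t, demand=10.0, oil_per_unit=0.5):
    """2nd order threat: salt extraction is limited not by salt reserves
    (assumed effectively infinite here) but by the oil available to power
    the extraction machinery."""
    oil_available = hubbert_production(t)
    return min(demand, oil_available / oil_per_unit)

for year in (2010, 2040, 2070):
    print(year, round(hubbert_production(year), 1), round(salt_production(year), 1))
```

Well past the oil peak, the cap binds: salt output falls even though not a grain of salt reserve has been exhausted. That is a 2nd order crunch.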

What’s worse is that the 1st order crunches can feed 2nd order effects back upon themselves, such as how increasing oil prices make oil extraction more costly. What the mainstream media cannot possibly grasp is something that the Peak Oil community knows innately, even if we have difficulty articulating it: the global production system is every bit as complex, chaotic, and fragile as the global climate system.

In global warming we speak of feedback loops and tipping points, mechanisms that result in rapid and/or uncontrollable changes in climate. The same mechanisms are at work in the Global Resource Crunch. One critically important tipping point with which we’re all concerned is Peak Oil. As the flow of oil into the global production system declines, it will trigger almost innumerable feedbacks and tipping points.
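The vocabulary of feedbacks and tipping points has a simple mathematical skeleton. The sketch below is a standard toy model, not a model of climate or oil (the equation and every number in it are illustrative): a state variable sits in the lower of two stable equilibria of dx/dt = c + x − x³, and when the slowly ramped forcing c passes a critical value near 0.385, that equilibrium vanishes and the state jumps abruptly to the upper branch.

```python
def simulate(c_max=0.8, steps=8000, dt=0.01):
    """Integrate dx/dt = c + x - x**3 while slowly ramping the forcing c.
    For small c there are two stable equilibria; past c ~ 0.385 the lower
    one vanishes and the state must jump to the upper branch."""
    x = -1.0                       # start in the lower stable state
    trajectory = []
    for i in range(steps):
        c = c_max * i / steps      # slow ramp of the external forcing
        x += dt * (c + x - x**3)   # forward-Euler integration step
        trajectory.append((c, x))
    return trajectory

traj = simulate()
tip = next(c for c, x in traj if x > 0)   # forcing level at the jump
print(f"state tips to the upper branch near c = {tip:.2f}")
```

Nothing about the forcing changes discontinuously, yet the state does — that abrupt, one-way jump is what “tipping point” means.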

As Griffin points out, for complex, tightly coupled systems, failure is inevitable. This statement is proved so often, and so dramatically, that it should be published as a universal law for all humans to consider every day. To clarify, every complex system — whether a drilling system, climate system, or bureaucracy — is destined to fail or to shift states dramatically. This is particularly true when inputs are rapidly increased or decreased, as with the after-effects of Peak Oil.

The point of this discussion is this: no isolated model of the economy, climate, or resource production can hope to completely capture the workings of any of these complex systems. Not only are these systems profoundly non-linear (and unstable) but they all interact with one another forming a much larger (and much more unstable) mega-system. That is not to say that we shouldn’t try to model complex systems, just that we must be prepared for rapid, uncontrollable collapse of the system in ways that a model cannot possibly predict.

As a controls engineer, I find this a terrifyingly intractable problem. The most we can hope to achieve is applying slight perturbations to these systems in the hope of nudging them toward more stable states. The Federal Reserve has been attempting this controls game in recent days with little nudges of cash infusion or interest rate bumps. Each move introduces another element of uncertainty and instability. The farther ahead they attempt to predict the consequences of their actions, the murkier the picture becomes, and the worse they make the inevitable collapse. What those in the Fed aren’t willing to recognize or admit is that such a complex economic system is required to fail by its very nature — and the more complex they make the system, the more dramatic the failure. This isn’t a trivial statement — the same effect is seen on a smaller scale in aircraft control systems (as Griffin describes).
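Why does the forecast get murkier the farther out you look? The simplest chaotic system makes the point. The sketch below uses the logistic map, a textbook toy model (it stands in for any sensitive nonlinear system, not for the Fed or any real economy): two forecasts that begin with a one-part-per-billion disagreement decorrelate completely within a few dozen steps.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# two "forecasts" of the same system, differing only by a tiny
# measurement error in the initial condition
a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-9)

for step in (0, 10, 20, 30, 40):
    print(step, abs(a[step] - b[step]))
```

At r = 4 the map’s Lyapunov exponent is ln 2, so the error roughly doubles every step: a billionth-part uncertainty swamps the forecast within about thirty steps, and no refinement of the model extends that horizon by much.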

Each level of complexity in a control system necessitates numerous careful safeguards and control checks to avoid system failure. So when failure does occur, it does so by bypassing or overwhelming the multitude of safety features designed to prevent small failures. Hence, increasingly complex systems fail in increasingly dramatic ways.
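That logic can be sketched as a toy filter (the thresholds and severity scale are arbitrary illustrative numbers): each safeguard layer catches failures below its threshold, so the only failures that survive every layer are the big ones. Adding layers makes failure rarer but makes the average surviving failure worse.

```python
def surviving_failures(failures, safeguard_thresholds):
    """Each safeguard layer catches any failure smaller than its
    threshold; only failures large enough to bypass every layer
    propagate to the full system."""
    cutoff = max(safeguard_thresholds, default=0)
    return [f for f in failures if f >= cutoff]

failures = list(range(1, 101))   # hypothetical failure severities 1..100
for layers in ([], [10], [10, 25], [10, 25, 60]):
    out = surviving_failures(failures, layers)
    mean = sum(out) / len(out)
    print(len(layers), "layers ->", len(out), "failures, mean severity", mean)
```

Fewer failures, but each one bigger: the safeguards trade frequency for severity, which is exactly why complex systems fail dramatically rather than often.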

We cannot know the exact path of failure, but we can make one solid prediction for the future: the system will fail.