The Cimmerian Hypothesis, Part Two: A Landscape of Hallucinations

July 23, 2015

NOTE: Images in this archived article have been removed.


Last week’s post covered a great deal of ground—not surprising, really, for an essay that started from a quotation from a Weird Tales story about Conan the Barbarian—and it may be useful to recap the core argument here. Civilizations—meaning here human societies that concentrate power, wealth, and population in urban centers—have a distinctive historical trajectory of rise and fall that isn’t shared by societies that lack urban centers. There are plenty of good reasons why this should be so, from the ecological costs of urbanization to the buildup of maintenance costs that drives catabolic collapse, but there’s also a cognitive dimension.

Look over the histories of fallen civilizations, and far more often than not, societies don’t have to be dragged down the slope of decline and fall. Rather, they go that way at a run, convinced that the road to ruin must inevitably lead them to heaven on earth. Arnold Toynbee, whose voluminous study of the rise and fall of civilizations has been one of the main sources for this blog since its inception, wrote at length about the way that the elite classes of falling civilizations lose the capacity to come up with new responses for new situations, or even to learn from their mistakes; thus they keep on trying to use the same failed policies over and over again until the whole system crashes to ruin. That’s an important factor, no question, but it’s not just the elites who seem to lose track of the real world as civilizations go sliding down toward history’s compost heap, it’s the masses as well.

Those of my readers who want to see a fine example of this sort of blindness to the obvious need only check the latest headlines. Within the next decade or so, for example, the entire southern half of Florida will become unfit for human habitation due to rising sea levels, driven by our dumping of greenhouse gases into an already overloaded atmosphere. Low-lying neighborhoods in Miami already flood with sea water whenever a high tide and a strong onshore wind hit at the same time; one more foot of sea level rise and salt water will pour over barriers into the remaining freshwater sources, turning southern Florida into a vast brackish swamp and forcing the evacuation of most of the millions who live there.

That’s only the most dramatic of a constellation of climatic catastrophes that are already tightening their grip on much of the United States. Out west, the rain forests of western Washington are burning in the wake of years of increasingly severe drought, California’s vast agricultural acreage is reverting to desert, and the entire city of Las Vegas will probably be out of water—as in, you turn on the tap and nothing but dust comes out—in less than a decade. As waterfalls cascade down the seaward faces of Antarctic and Greenland glaciers, leaking methane blows craters in the Siberian permafrost, and sea level rises at rates considerably faster than the worst case scenarios scientists were considering a few years ago, these threats are hardly abstract issues; is anyone in America taking them seriously enough to, say, take any concrete steps to stop using the atmosphere as a gaseous sewer, starting with their own personal behavior? Surely you jest.

No, the Republicans are still out there insisting at the top of their lungs that any scientific discovery that threatens their rich friends’ profits must be fraudulent, the Democrats are still out there proclaiming just as loudly that there must be some way to deal with anthropogenic climate change that won’t cost them their frequent-flyer miles, and nearly everyone outside the political sphere is making whatever noises they think will allow them to keep on pursuing exactly those lifestyle choices that are bringing on planetary catastrophe. Every possible excuse to insist that what’s already happening won’t happen gets instantly pounced on as one more justification for inertia—the claim currently being splashed around the media that the Sun might go through a cycle of slight cooling in the decades ahead is the latest example. (For the record, even if we get a grand solar minimum, its effects will be canceled out in short order by the impact of ongoing atmospheric pollution.)

Business as usual is very nearly the only option anybody is willing to discuss, even though the long-predicted climate catastrophes are already happening and the days of business as usual in any form are obviously numbered. The one alternative that gets air time, of course, is the popular fantasy of instant planetary dieoff, which gets plenty of attention because it’s just as effective an excuse for inaction as faith in business as usual. What next to nobody wants to talk about is the future that’s actually arriving exactly as predicted: a future in which low-lying coastal regions around the country and the world have to be abandoned to the rising seas, while the Southwest and large portions of the mountain west become more inhospitable than the eastern Sahara or Arabia’s Empty Quarter.

If the ice melt keeps accelerating at its present pace, we could be only a few decades from the point at which it’s Manhattan Island’s turn to be abandoned, because everything below ground level is permanently flooded with seawater and every winter storm sends waves rolling right across the island and flings driftwood logs against second-story windows. A few decades more, and waves will roll over the low-lying neighborhoods of Houston, Boston, Seattle, and Washington DC, while the ruined buildings that used to be New Orleans rise out of the still waters of a brackish estuary and the ruined buildings that used to be Las Vegas are half buried by the drifting sand. Take a moment to consider the economic consequences of that much infrastructure loss, that much destruction of built capital, that many people who somehow have to be evacuated and resettled, and think about what kind of body blow that will deliver to an industrial society that is already in bad shape for other reasons.

None of this had to happen. Half a century ago, policy makers and the public alike had already been presented with a tolerably clear outline of what was going to happen if we proceeded along the trajectory we were on, and those same warnings have been repeated with increasing force year by year, as the evidence to support them has mounted up implacably—and yet nearly all of us nodded and smiled and kept going. Nor has this changed in the least as the long-predicted catastrophes have begun to show up right on schedule. Quite the contrary: faced with a rising spiral of massive crises, people across the industrial world are, with majestic consistency, doing exactly those things that are guaranteed to make those crises worse.

So the question that needs to be asked, and if possible answered, is why civilizations—human societies that concentrate population, power, and wealth in urban centers—so reliably lose the capacity to learn from their mistakes and recognize that a failed policy has in fact failed. It’s also worth asking why they so reliably do this within a finite and predictable timespan: civilizations last on average around a millennium before they crash into a dark age, while uncivilized societies routinely go on for many times that period. Doubtless any number of factors drive civilizations to their messy ends, but I’d like to suggest a factor that, to my knowledge, hasn’t been discussed in this context before.

Let’s start with what may well seem like an irrelevancy. There’s been a great deal of discussion down through the years in environmental circles about the way that the survival and health of the human body depends on inputs from nonhuman nature. There’s been a much more modest amount of talk about the human psychological and emotional needs that can only be met through interaction with natural systems. One question I’ve never seen discussed, though, is whether the human intellect has needs that are only fulfilled by a natural environment.

As I consider that question, one obvious answer comes to mind: negative feedback.

The human intellect is the part of each of us that thinks, that tries to make sense of the universe of our experience. It does this by creating models. By “models” I don’t just mean those tightly formalized and quantified models we call scientific theories; a poem is also a model of part of the universe of human experience, so is a myth, so is a painting, and so is a vague hunch about how something will work out. When a twelve-year-old girl pulls the petals off a daisy while saying “he loves me, he loves me not,” she’s using a randomization technique to decide between two models of one small but, to her, very important portion of the universe, the emotional state of whatever boy she has in mind.
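
For readers who like to see that sort of thing spelled out, here is a minimal sketch of her randomization technique as a few lines of Python; the function name and the range of petal counts are illustrative assumptions rather than anything in the essay.

```python
import random

def he_loves_me(petal_count):
    # Starting on "he loves me" and alternating with each petal, an odd
    # count ends on "he loves me"; an even count ends on "he loves me not".
    return petal_count % 2 == 1

# Field daisies vary in petal count; this range is purely illustrative.
petals = random.randint(13, 34)
print("he loves me" if he_loves_me(petals) else "he loves me not")
```

The point, of course, is not the code but the structure: an essentially random input is being used to select between two rival models of the boy’s feelings.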

With any kind of model, it’s critical to remember Alfred Korzybski’s famous rule: “the map is not the territory.” A model, to put the same point another way, is a representation; it represents the way some part of the universe looks when viewed from the perspective of one or more members of our species of social primates, using the idiosyncratic and profoundly limited set of sensory equipment, neural processes, and cognitive frameworks we got handed by our evolutionary heritage. Painful though this may be to our collective egotism, it’s not unfair to say that human mental models are what you get when you take the universe and dumb it down to the point that our minds can more or less grasp it.

What keeps our models from becoming completely dysfunctional is the negative feedback we get from the universe. For the benefit of readers who didn’t get introduced to systems theory, I should probably take a moment to explain negative feedback. The classic example is the common household thermostat, which senses the temperature of the air inside the house and activates a switch accordingly. If the air temperature is below a certain threshold, the thermostat turns the heat on and warms things up; if the air temperature rises above a different, slightly higher threshold, the thermostat turns the heat off and lets the house cool down.

In a sense, a thermostat embodies a very simple model of one very specific part of the universe, the temperature inside the house. Like all models, this one includes a set of implicit definitions and a set of value judgments. The definitions are the two thresholds, the one that turns the furnace on and the one that turns it off, and the value judgments label temperatures below the first threshold “too cold” and those above the second “too hot.” Like every human model, the thermostat model is unabashedly anthropocentric—“too cold” by the thermostat’s standard would be uncomfortably warm for a polar bear, for example—and selects out certain factors of interest to human beings from a galaxy of other things we don’t happen to want to take into consideration.
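
Since the thermostat is the textbook case of negative feedback, it may be worth writing the two-threshold rule out explicitly. What follows is a minimal sketch under assumed values, not the logic of any particular device; the thresholds, names, and toy simulation are illustrative assumptions.

```python
TOO_COLD = 18.0  # below this threshold, the furnace comes on (degrees C; illustrative)
TOO_HOT = 20.0   # above this slightly higher threshold, it goes off

def update_furnace(temperature, furnace_on):
    """One pass of the thermostat's negative-feedback rule."""
    if temperature < TOO_COLD:
        return True    # too cold: warm things up
    if temperature > TOO_HOT:
        return False   # too hot: let the house cool down
    return furnace_on  # between the thresholds, leave the furnace as it is

# A toy simulation: the furnace warms the house, the weather cools it.
temperature, furnace_on = 15.0, False
for hour in range(12):
    furnace_on = update_furnace(temperature, furnace_on)
    temperature += 1.0 if furnace_on else -0.5
    print(f"hour {hour:2d}: {temperature:4.1f} C, furnace {'on' if furnace_on else 'off'}")
```

The feedback is “negative” because the system’s response pushes the temperature back toward the band between the two thresholds rather than amplifying the deviation.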

The models used by the human intellect to make sense of the universe are usually less simple than the one that guides a thermostat—there are unfortunately exceptions—but they work according to the same principle. They contain definitions, which may be implicit or explicit: the girl plucking petals from the daisy may not have an explicit definition of love in mind when she says “he loves me,” but there’s some set of beliefs and expectations about what those words imply underlying the model. They also contain value judgments: if she’s attracted to the boy in question, “he loves me” has a positive value and “he loves me not” has a negative one.

Notice, though, that there’s a further dimension to the model, which is its interaction with the observed behavior of the thing it’s supposed to model. Plucking petals from a daisy, all things considered, is not a very good predictor of the emotional states of twelve-year-old boys; predictions made on the basis of that method are very often disproved by other sources of evidence, which is why few girls much older than twelve rely on it as an information source. Modern western science has formalized and quantified that sort of reality testing, but it’s something that most people do at least occasionally. It’s when they stop doing so that we get the inability to recognize failure that helps to drive, among many other things, the fall of civilizations.

Individual facets of experienced reality thus provide negative feedback to individual models. The whole structure of experienced reality, though, is capable of providing negative feedback on another level—when it challenges the accuracy of the entire mental process of modeling.

Nature is very good at providing negative feedback of that kind. Here’s a human conceptual model that draws a strict line between mammals, on the one hand, and birds and reptiles, on the other. A little more than two centuries ago, it was as precise as any division in science: mammals have fur and don’t lay eggs, reptiles and birds don’t have fur and do lay eggs. Then some Australian settler met a platypus, which has fur and lays eggs. Scientists back in Britain flatly refused to take it seriously until preserved specimens finally made it there by ship. Plenty of platypus egg was splashed across plenty of distinguished scientific faces, and definitions had to be changed to make room for another category of mammals and the evolutionary history necessary to explain it.

Here’s another human conceptual model, the one that divides trees into distinct species. Most trees in most temperate woodlands, though, actually have a mix of genetics from closely related species. There are few pure red oaks; what you have instead are mostly-red, partly-red, and slightly-red oaks. Go from the northern to the southern end of a species’ distribution, or from wet to dry regions, and the variations within the species are quite often more extreme than those that separate trees that have been assigned to different species. Here’s still another human conceptual model, the one that divides trees from shrubs—plenty of species can grow either way, and the list goes on.

The human mind likes straight lines, definite boundaries, precise verbal definitions. Nature doesn’t. People who spend most of their time dealing with undomesticated natural phenomena, accordingly, have to get used to the fact that nature is under no obligation to make the kind of sense the human mind prefers. I’d suggest that this is why so many of the cultures our society calls “primitive”—that is, those that have simple material technologies and interact directly with nature much of the time—so often rely on nonlogical methods of thought: those our culture labels “mythological,” “magical,” or—I love this term—“prescientific.” (That the “prescientific” will almost certainly turn out to be the postscientific as well is one of the lessons of history that modern industrial society is trying its level best to ignore.) Nature as we experience it isn’t simple, neat, linear, and logical, and so it makes sense that the ways of thinking best suited to dealing with nature directly aren’t simple, neat, linear, and logical either.

With this in mind, let’s return to the distinction discussed in last week’s post. I noted there that a city is a human settlement from which the direct, unmediated presence of nature has been removed as completely as the available technology permits. What replaces natural phenomena in an urban setting, though, is as important as what isn’t allowed there. Nearly everything that surrounds you in a city was put there deliberately by human beings; it is the product of conscious human thinking, and it follows the habits of human thought just outlined. Compare a walk down a city street to a walk through a forest or a shortgrass prairie: in the city street, much more of what you see is simple, neat, linear, and logical. A city is an environment reshaped to reflect the habits and preferences of the human mind.

I suspect there may be a straightforwardly neurological factor in all this. The human brain, so much larger in proportion to body weight than the brains of most of our primate relatives, evolved because having a larger brain provided some survival advantage to those hominins who had it, in competition with those who didn’t. It’s probably a safe assumption that processing information inputs from the natural world played a very large role in these advantages, and this would imply, in turn, that the human brain is primarily adapted for perceiving things in natural environments—not, say, for building cities, creating technologies, and making the other common products of civilization.

Thus some significant part of the brain has to be redirected away from the things that it’s adapted to do, in order to make civilizations possible. I’d like to propose that the simplified, rationalized, radically information-poor environment of the city plays a crucial role in this. (Information-poor? Of course; the amount of information that comes cascading through the five keen senses of an alert hunter-gatherer standing in an African forest is vastly greater than what a city-dweller gets from the blank walls and the monotonous sounds and scents of an urban environment.) Children raised in an environment that lacks the constant cascade of information natural environments provide, and taught to redirect their mental powers toward such other activities as reading and mathematics, grow up with cognitive habits and, in all probability, neurological arrangements focused toward the activities of civilization and away from the things to which the human brain is adapted by evolution.

One source of supporting evidence for this admittedly speculative proposal is the worldwide insistence on the part of city-dwellers that people who live in isolated rural communities, far outside the cultural ambit of urban life, are just plain stupid. What that means in practice, of course, is that people from isolated rural communities aren’t used to using their brains for the particular purposes that city people value. These allegedly “stupid” countryfolk are by and large extraordinarily adept at the skills they need to survive and thrive in their own environments. They may be able to listen to the wind and know exactly where on the far side of the hill a deer waits to be shot for dinner, glance at a stream and tell which riffle the trout have chosen for a hiding place, watch the clouds pile up and read from them how many days they’ve got to get the hay in before the rains come and rot it in the fields—all of which tasks require sophisticated information processing, the kind of processing that human brains evolved doing.

Notice, though, how the urban environment relates to the human habit of mental modeling. Everything in a city was a mental model before it became a building, a street, an item of furniture, or what have you. Chairs look like chairs, houses like houses, and so on; it’s so rare for human-made items to break out of the habitual models of our species and the particular culture that built them that when this happens, it’s a source of endless comment. Where a natural environment constantly challenges human conceptual models, an urban environment reinforces them, producing a feedback loop that’s probably responsible for most of the achievements of civilization.

I suggest, though, that the same feedback loop may also play a very large role in the self-destruction of civilizations. People raised in urban environments come to treat their mental models as realities, more real than the often-unruly facts on the ground, because everything they encounter in their immediate environments reinforces those models. As the models become more elaborate and the cities become more completely insulated from the complexities of nature, the inhabitants of a civilization move deeper and deeper into a landscape of hallucinations—not least because as many of those hallucinations get built in brick and stone, or glass and steel, as the available technology permits. As a civilization approaches its end, the divergence between the world as it exists and the mental models that define the world for the civilization’s inmates becomes total, and its decisions and actions become lethally detached from reality—with consequences that we’ll discuss in next week’s post.

Image credit: "Campus-SKOLKOVO" by Moscow School of Management SKOLKOVO (www.skolkovo.ru), David Adjaye – http://www.skolkovo.ru. Licensed under CC BY-SA 3.0 via Wikimedia Commons.

John Michael Greer

John Michael Greer is a widely read author and blogger whose work focuses on the overlaps between ecology, spirituality, and the future of industrial society. He served twelve years as Grand Archdruid of the Ancient Order of Druids in America, and currently heads the Druidical Order of the Golden Dawn.

Tags: civilization, nature deficit disorder