Against Cultural Senility

May 26, 2016

NOTE: Images in this archived article have been removed.


For the connoisseur of sociopolitical absurdity, the last few weeks’ worth of news cycles very nearly defines the phrase “target-rich environment.” I note, for example, that arch-neoconservative Robert Kagan—a co-founder of the Project for the New American Century and principal architect of this nation’s idiotically bloodthirsty Middle East policies, a man who never met a body bag he didn’t like—has jumped party lines to endorse Hillary Clinton’s presidential ambitions.

Under other conditions I’d wonder if Kagan had decided to sandbag Clinton’s hopes, using a lethal dose of deadpan satire to point out that her policy stances are indistinguishable from those of George W. Bush: you know, the guy that so many Democrats denounced as evil incarnate just eight short years ago. Unfortunately, nothing so clever seems to be in the works. Kagan seems to be quite sincere in his adulation for Clinton. What’s more, his wife Victoria Nuland, a Hillary Clinton protégée in the State Department and a major player in the Obama administration’s pursuit of Cold War brinksmanship against Russia, is now being rumored as Clinton’s most likely pick for Secretary of State.

For unintended satire, that one’s hard to beat. Still, I’d say it has been outdone by another recent story, which noted that the students at Brown University, one of this nation’s Ivy League universities, are upset. Turns out they’re so busy protesting for social justice these days that they don’t have enough time to keep up with their classwork, and yet their professors are still expecting papers to be turned in on time—a demand that strikes the students as grossly unfair. A savage parody from some right-wing website? Nope; the story appeared in the Brown University student paper earlier this month.

To be fair to the students, they’re not the only ones who have redefined the purpose of a university education in a way that, for the sake of politeness, we’ll call “quirky.” Radical faculty members, who encourage this reenactment of their vanished youth as a political equivalent of Münchausen syndrome by proxy, are doing much the same thing. Then, of course, you’ve got corporations who think that universities are places where prospective employees go to pay for their own job training, university bureaucrats who burble marketing-firm sewage about offering students the “university experience,” and so on through an entire galaxy of self-regarding and self-important cant. The one thing that finds no place among all these competing redefinitions is, predictably enough, learning.

I’ve mentioned before on this blog the need to devise new opportunities for learning, and in particular a new structure for adult education that isn’t subservient to the increasingly blatant political and financial interests of the academic industry. More broadly, the concept of learning has been a core theme of this blog since it began—partly because modern industrial society’s stunning inability to learn the lessons of repeated failure looms so large in public life today, partly because learning ways to make sense of the world and practical skills for dealing with the converging crises of our time ranks high on the to-do list for anyone who takes the future seriously. I think, therefore, that it’s time to move that discussion to center stage, and talk about learning and education in the context of the Long Descent.

We could start that discussion in many different places, but the whinefest under way at Brown just now makes as good a springboard as any. We can, I think, presume that universities don’t exist for the sake of giving privileged youth a place to play at changing the world, before they settle down to a lifetime of propping up the status quo in corporate and government careers. Nor do they exist for any of the other dubious purposes mentioned above. What, then, is a university for?

That’s best approached by looking at the other two legs of the institutional tripod that once supported American education. In the long-gone days when the United States still had an educational system that worked, that system sorted itself out into three broad categories of schools: public schools, trade schools, and universities. Public schools existed for the purpose of providing the basic intellectual skills that would allow young people to participate in society as productive citizens. Trade schools existed for the purpose of teaching the technical skills that would allow graduates to find steady work in the skilled trades. In the trade school category, we can also include medical schools and the few law schools that existed then—most lawyers got their legal training through apprenticeship until well into the twentieth century—and other institutions meant to turn out trained professionals, such as divinity schools.

Then there were the universities. The grand old American habit of highfalutin’ obfuscation that used to double the length of commencement addresses and Congressional speeches alike makes it a bit difficult to tease out, from the rhetoric of the day, the intended purpose of a university education, but attending to what was actually taught there in the late nineteenth and very early twentieth centuries makes the point tolerably clear: universities existed to launch students into a full-on, face-first encounter with that foreign country we call the past. That’s why the university curriculum back then focused on such subjects as history, classics, literature, and the like—and why the word “literature” in an academic setting generally excluded anything written within living memory.

This was of course exactly the thing the educational revolutions of our time targeted and, for the most part, destroyed. Under the banner of “relevance,” reformers across the American academic scene in the 1960s and 1970s pushed for the replacement of the traditional curriculum with something more up-to-date, modern, progressive—in a word, fashionable. Alongside the great crusade for relevance came the proliferation of new departments and degree programs. Thereafter, what was left of the old curriculum was assailed by proponents of various flavors of postmodernism, and after that came what’s known in the academic biz as “critical theory”—that is, ideologies of condemnation and exclusion that focus on race, gender, and other markers of privilege and disprivilege in society.

All of these changes, among their other impacts, had the effect of distancing students from the collision with the past that was central to the older approach to university education. The crusade for relevance and the mass production of new departments and degree programs did this in a straightforward fashion, by redirecting attention from the past to the present—it’s not accidental that the great majority of the new departments and degree programs focused on one or another aspect of modernity, or that by “relevant” the educational radicals of the Sixties generally meant “written within our lifetimes.” The other two movements just named did the same thing, though, albeit in a somewhat subtler way.

The common theme shared by the various movements lumped together as “postmodernism” was the imposition of a thick layer of interpretive theory between the student and the text. The postmodernists liked to claim that their apparatus of theory enabled them to leap nimbly into and out of texts from every place and time while understanding them all, but that was precisely what the theory didn’t do. Instead, if you’ll excuse the metaphor, it functioned as a sort of intellectual condom, meant to prevent students from conceiving any unexpected ideas as a result of their intercourse with the past. Those of my readers who encountered the sort of scholarly publication that resulted will recall any number of “conversations with the text” written along these lines, which sedulously kept the text from getting a word in edgewise, while quoting Derrida et al. at dreary length in every second or third paragraph.

If postmodernism claimed to engage in a conversation with the text, though, critical theory—still the rage in many American universities these days—subjects it to a fair equivalent of the Spanish Inquisition: one by one, texts are hauled before a tribunal, tortured with an assortment of critical instruments until they confess, suffer condemnation for their purported errors, and are then dragged off by a yelling mob to be burnt at the stake. The erasure of the past here has two aspects. On the one hand, critical-theory proponents are fond of insisting that students should never be required to read any text that has been so condemned; on the other, one very effective way of learning nothing from the past is to be too busy preening oneself over one’s moral superiority to one’s ancestors to learn from anything they might have had to say.

Popular though these moves were in the academic industry, I’d like to suggest that they were disastrously misguided at best, and have played a large role in helping to generate a widespread and seriously destructive condition in our collective life. I’ll give a suggestive name to that condition a little later on. First, I want to talk about why the suppression of the past is as problematic as it is.

Johann Wolfgang von Goethe liked to point out that a person who knows only one language doesn’t actually know any languages at all. He was quite right, too. Only when you learn a second language do you begin to discover how many things you thought were true about the universe are merely artifacts of the grammatical and semantic structure of your first language. Where that language is vague, so are your thoughts; where that language runs several distinct meanings together in a single word, so do you; where that language imposes arbitrary structures on the complexities of experience—why, unless you have some experience with another way of assembling the world into linguistic patterns, it’s a safe bet that you’ll do the same thing even when you’re not talking or even thinking in words.

Here’s an example. People who only speak English tend to think in terms of linear cause-and-effect relationships. Listen to Americans try to understand anything, and you’ll see that habit in full flower. If something happens, they want to know what one thing caused it, and what one thing will result from it. In the real world, it almost never happens that just one cause sets just one process in motion and has just one effect; in the real world, wildly complex, tangled chains of interaction go into even the simplest event, and spin out from there to infinity—but that’s not the way Americans like to think.

Why? Because the normal sentence structure in English has a subject—someone who causes an action—followed by a verb—the action of the subject—and then usually by an object—the thing on which the action has an effect. That’s our usual grammar, and so that’s the usual pattern of our thoughts.

There are, as it happens, plenty of languages that don’t have the same structure. In modern Welsh, for example, most sentences begin with a form of the verb “to be.” Where an English speaker would say “The children are playing in the yard,” a Welsh speaker would say “Mae’r plant yn chwarae yn yr ardd,” literally “It is the children at play in the yard.” That is, most English sentences imply a cause-and-effect relationship (the cause “children” produces the effect “playing”), while most Welsh sentences imply a complex condition of being (the current state of things includes the phenomena “children” in the condition of “playing”). If you know both languages well enough to think in both, you won’t default to either option—and you won’t necessarily be stuck with just those two options, either, because once you get used to switching from one to another, you can easily conceive of other alternatives.

What’s true of language, I’d like to suggest, is also true—and may in fact be even more true—of the ideas and preconceptions of an era: if you only know one, you don’t actually know one at all. Just as the person who knows only one language remains trapped in the grammatical and semantic habits of that language, the person who has only encountered the thought of one era remains trapped in the presuppositions, habitual notions, and unexamined assumptions of that era.

I’ve used the word “trapped,” but that choice of phrasing misstates one very important aspect of the phenomenon: the condition that results is very comfortable. Most of the big questions have easy answers, and those that are still open—well, everyone’s secure in the knowledge that once those are solved, by some linear extrapolation of the current methods of inquiry, the answers will by definition fit easily into the framework that’s already been established for them. Debates about what’s right and wrong, what’s true and false, what’s sane and stark staring crazy all take place within the limits of a universally accepted structure of ideas that are all the more powerful because nobody discusses them and most people don’t even consciously notice that they’re there.

The supposed openness to innovation and diversity that’s said to characterize modern industrial society does precisely nothing to counteract that effect. The vagaries of intellectual and cultural trends, and the antics of dissident subcultures in art, religion, and politics, all take place within the narrow limits of a conventional wisdom which, again, is not so much believed as tacitly assumed. Watch any avant-garde movement closely, and it’s not hard to notice that its idea of rebelling against the status quo amounts to taking the conventional wisdom just a little further than anyone else has gotten around to going recently—and when that loses its charm, you can bet that in a generation or so, some new movement will come along and do it all over again, and convince themselves that they’re being revolutionary in doing something their parents, grandparents, and great-grandparents did in their day.

Thus, for example, public masturbation as a form of performance art has been invented at intervals of thirty to forty years since the late nineteenth century. It’s happened so far, that I know of, in the 1890s, the 1920s, the 1950s, and the 1980s, and we can probably expect a new round any time now. Each of the self-proclaimed cutting-edge artistic movements that went in for this not especially interesting habit framed it as a revolutionary act, using whatever kind of grandiose rhetoric was popular just then; and then the crowds got bored, and three decades later the next generation was at it again.

The history of the flying car, which has been invented at regular intervals since the 1920s, follows exactly the same rhythm, and displays exactly the same total subservience to the conventional wisdom of modern industrial culture. (A case could probably be made that there’s no shortage of masturbatory features in our collective obsession with flying cars, but that’s a discussion for another time.) For the purposes of our present discussion, the flying car is a particularly useful example, because it points to the chief problem with unthinking subservience to the predigested thought of an era: people in that condition lose the ability to learn from their mistakes.

There’s a galaxy of good reasons why we don’t have flying cars, after all. One of the most important is that the engineering demands of aircraft design and automobile design are almost exactly opposed to one another—the lighter an airplane is, the better it flies, while a car needs a fair amount of weight to have good traction; aircraft engines need to be optimized for speed, while car engines need to be optimized for torque, and so on through a whole series of contrasts. A flying car is thus by definition going to be mediocre both as a car and as a plane, and due to the added complexities needed to switch from one mode of travel to the other, it’s going to cost so much that for the same price you can get a good car and a good plane, with enough left over to pay hangar rental for quite some time.

None of this is particularly hard to figure out. What’s more, it’s been demonstrated over and over again by the flying cars that have been invented, patented, and tested repeatedly down through the years. That being the case, why do audiences at TED Talks still clap frantically when someone tells them that they can expect flying cars on the market any day now? Because the presuppositions of modern industrial society deny the existence of limits and inescapable tradeoffs, and when the lessons of failure point up the reality of these things, those lessons remain unlearnt.

I wish that all the consequences of subservience to unnoticed presuppositions were that harmless. Take any of the rising spiral of crises that are building up around modern industrial society these days; in every single case, the reason that the obviously necessary steps aren’t being taken is that the conventional wisdom of our time forbids thinking about those steps, and the reason that the lessons of repeated failure aren’t being learned is that the conventional wisdom of our time denies that any such failures can happen. We live in an era of cultural senility, in which the vast majority of people stare blankly at an unwelcome future and keep on doing all the things that are bringing that future on.

The erasure of the past from the curriculum of American universities is far from the only factor that’s brought about that catastrophic reality, but I suspect its role in that process has been significant. The era of cultural senility came in when the generation of the Sixties, the generation that insisted on excising the past from its university education, hit its thirties and rose into positions of influence, and it’s gotten steadily worse since that time. The inability of our society to learn from its mistakes or question its preconceptions has thus become a massive political fact—and a massive political liability.

None of the consequences of that inability are particularly original. It so happens, for example, that a little less than 2500 years ago, influential voices in another rich and powerful democratic society embraced the same policies that Robert Kagan and his fellow neoconservatives have been promoting in our time. The backers of this Project for a New Athenian Century believed that these policies would confirm Athens’ hegemony over the ancient Greek world; what happened instead was a nightmare of imperial overstretch, war, and economic and political collapse, from which Athens, and Greece as a whole, never recovered. You can read all about it in the writings of Thucydides, one of the supposedly irrelevant authors that most educated people read before the 1960s and next to nobody reads today.

That’s an obvious benefit of reading Thucydides. Less obvious and even more important is the subtler insight that you can get from Thucydides, or for that matter from any long-dead author. Thucydides was not a modern politically correct American liberal, or for that matter a modern patriotically correct American neoconservative. His basic assumptions about the world differ drastically from those of any modern reader, and those assumptions will jar, over and over again, against the very different notions that form the automatic substructure of thought in the modern mind.

If Thucydides doesn’t offend you, in fact, you’re probably not paying attention—but that’s precisely the point. If you exercise the very modest amount of intellectual courage that’s needed to get past being offended, and try to understand why the world looked the way it did when seen through Thucydides’ eyes, your knowledge of your own preconceptions and your ability to make sense of the world when it doesn’t happen to fit those preconceptions will both expand. Both those gains are well worth having as our society hurtles down its current trajectory toward an unwelcome future.

**********
Homework Assignment #1

Since this series of posts is on education, yes, there’s going to be homework. Your assignment for the next two weeks consists of choosing a book-length work of fiction that (a) you haven’t previously read, and (b) was written before 1900, and reading it. It can be anything that fits these capacious limits: Little Women, The Epic of Gilgamesh, The Scarlet Letter, The Tale of Genji, or something else entirely—take your pick. Whatever book you choose, read it cover to cover, and pay attention to the places where the author’s assumptions about the world differ from yours. Don’t pass judgment on the differences; just notice them, and think about what it would have been like to see the world the way the author did.

Photo credit: By Foto: Wienwiki / Walter Maderbacher, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=31664867

John Michael Greer

John Michael Greer is a widely read author and blogger whose work focuses on the overlaps between ecology, spirituality, and the future of industrial society. He served twelve years as Grand Archdruid of the Ancient Order of Druids in America, and currently heads the Druidical Order of the Golden Dawn.

Tags: Education