The old chestnut about living in interesting times may not actually be a Chinese curse, as today’s urban folklore claims, but it certainly comes to mind when glancing back over the smoldering wreckage of the past week. In the wake of a political crisis here in America that left both sides looking more than ever like cranky six-year-olds, a long-overdue downgrade of America’s unpayable debt, and yet another round of fiscal crisis in the Eurozone, stock and commodity markets around the globe roared into a power dive from which, as I write this, they show no sign of recovering any time soon.
In England, meanwhile, one of those incidents Americans learned to dread in the long hot summers of the Sixties—a traffic stop in a poor minority neighborhood, a black man shot dead by police under dubious circumstances—has triggered four nights of looting and rioting, as mobs in London and elsewhere organized via text messages and social media, brushed aside an ineffectual police presence, plundered shops and torched police stations, and ripped gaping holes in their nation’s already shredding social fabric. It seems that “Tottenham” is how the English pronounce “Watts,” except that the fire this time is being spread rather more efficiently with the aid of BlackBerrys and flashmobs.
Government officials denounced the riots as “mindless thuggery,” but it’s considerably more than that. As one looter cited in the media said, “this is my banker’s bonus”—the response of the bottom of the social pyramid, that is, to a culture of nearly limitless corruption further up. It bears remembering that the risings earlier this year in Tunisia, Egypt, and elsewhere began with exactly this sort of inchoate explosion of rage against governments that responded to economic crisis by tightening the screws on the poor; it was only when the riots showed the weakness of the existing order that more organized and ambitious movements took shape amid the chaos. It’s thus not outside the bounds of possibility, if the British government keeps on managing the situation as hamhandedly as it’s done so far, that the much-ballyhooed Arab Spring may be followed by an English Summer—and just possibly thereafter by a European Autumn.
One way or another, this is what history looks like as it’s happening. Those of my readers who have been following along for a year or two, and have made at least a decent fraction of the preparations I’ve suggested, are probably as well prepared for the unfolding mess as anyone is likely to be. Those who have just joined the conversation, or were putting aside preparations for some later date—well, once the rubble stops bouncing and the smoke clears, you’ll have the chance to assess what possibilities are still open and what you have the resources to accomplish. In the meantime, I want to continue the sequence of posts already under way, and discuss another of the things that’s going to have to be salvaged as the current system grinds awkwardly to a halt.
The theme of this week’s discussion, I’m sorry to say, is another issue split down the middle by the nearly Gnostic dualisms that bedevil contemporary American society. Just as Democrats and Republicans denounce each other in incandescent fury, and fundamentalist atheists compete with fundamentalist Christians in some sort of Olympics of ideological intolerance, the issues surrounding health care in America these days have morphed unhelpfully into a bitter opposition between the partisans of mainstream medicine and the proponents of alternative healing. The radicals on both sides dismiss the other side as a bunch of murderous quacks, while even those with more moderate views tend to regard the other end of the spectrum through a haze of suspicion tinged with bad experiences and limited knowledge.
I stay out of such debates as often as I can, but this one hasn’t given me that choice. Ironically, that’s because I’ve experienced both sides of the issue. On the one hand, I’m alive today because of modern medicine. At the age of seven, I came down with a serious case of scarlet fever. That’s a disease that used to kill children quite regularly, and in a premodern setting, it almost certainly would have killed me. As it was, I spent two weeks flat on my back, and pulled through mostly because of horse doctor’s doses of penicillin, administered first with syringes that to my seven-year-old eyes looked better suited for young elephants, and thereafter in oral form, made palatable with an imitation banana flavoring I can still call instantly to mind.
Then there’s the other side of the balance. My wife has lifelong birth defects in her legs and feet, because her mother’s obstetrician prescribed a drug that was contraindicated for pregnant women because it causes abnormalities in fetal limb development. My only child died at birth because my wife’s obstetrician did exactly the same thing, this time with a drug that was well known to cause fatal lung abnormalities. Several years later we found out by way of a media exposé that the latter doctor had done the same thing to quite a few other women, leaving a string of dead babies in his wake. The response of the medical board, once the media exposure forced them to do something, was quite standard; they administered a mild reprimand. If this reminds you of the Vatican’s handling of pedophile priests, well, let’s just say the comparison has occurred to me as well.
Deaths directly caused by American health care are appallingly common. A widely cited 2000 study by public health specialist Dr. Barbara Starfield presented evidence that bad medical care kills more Americans every year than anything but heart disease and cancer, with adverse drug effects and nosocomial (hospital- and clinic-spread) infections the most common culprits. A more comprehensive study prepared outside the medical mainstream, but based entirely on data from peer-reviewed medical journals, argued that the actual rate was much higher—higher, in fact, than any other single cause. That’s part of what makes the controversies over American health care so challenging; mainstream medical care saves a lot of lives in America, but because of the pressures of the profit motive, and the extent to which institutional barriers protect incompetent practitioners and dangerous and ineffective remedies, it also costs a lot of lives.
Even so, if I could find a competent, affordable general practitioner to give me annual checkups and help me deal with the ordinary health issues middle-aged men tend to encounter, I’d be happy to do so. The catch here is that little word “affordable.” Along with those birth defects, my wife has celiac disease, a couple of food allergies, and a family history with some chronic health problems in it; for that matter, my own family history is by no means squeaky clean. Since we’re both self-employed, health insurance would cost us substantially more than our mortgage—money we simply don’t have. Like a large and growing fraction of Americans, therefore, we’ve turned to alternative medicine for our health care.
The more dogmatic end of the mainstream medical industry tends to dismiss all alternative healing methods as ineffective by definition. That’s self-serving nonsense; the core alternative healing modalities, after all, are precisely the methods of health care that were known and practiced in the late 19th century, before today’s chemical and surgical medicine came on the scene, and they embody decades or centuries of careful study of health and illness. There are things that alternative health methods can’t treat as effectively as the current mainstream, of course, but the reverse is also true.
Still, behind the rhetoric of the medical industry lies a fact worth noting: alternative medical methods are almost all much less intensive than today’s chemical and surgical medicine. The best way to grasp the difference is to compare it to other differences between life in the late 19th century and life today—say, the difference between walking and driving a car. Like alternative medicine, walking is much slower, it requires more personal effort, and there are destinations that, realistically speaking, are out of its reach; on the other hand, it has fewer negative side effects, costs a lot less, and dramatically cuts your risk of ending up buttered across the grill of a semi because somebody else made a mistake.
Those differences mean that you can’t use alternative medicine the way you use the mainstream kind. If I neglect a winter cold, for example, I tend to end up with bacterial bronchitis. A physician nowadays can treat that with a simple prescription of antibiotics, and unless the bacterium happens to be resistant—an issue I’ll be discussing in more detail in a bit—that’s all there is to it. If you’re using herbs, on the other hand, handling bacterial bronchitis is a more complex matter. There are very effective herbal treatments, and if you know them, you know exactly what you’re getting and what the effects will be. On the other hand, you can’t simply pop a pill and go on with your day; you have to combine the herbal infusions with rest and steam inhalation, and pay attention to your symptoms so you can treat for fever or other complications if they arise. You very quickly learn, also, that if you don’t want the bronchitis at all, you can’t simply ignore the first signs of an oncoming cold; you have to notice it and treat it.
Here’s another example. I practice t’ai chi, and one of the reasons is that it’s been documented via controlled studies to be effective preventive medicine for many of the chronic health problems Americans tend to get as they get old. You can treat those same problems with drugs, to be sure, if you’re willing to risk the side effects, but again, you can’t just pop a t’ai chi pill and plop yourself back down on the sofa. You’ve got to put in at least fifteen minutes of practice a day, every day, to get any serious health benefits out of it. (I do more like forty-five minutes a day, but then I’m not just practicing it for health.) It takes time and effort, and if you’ve spent a lifetime damaging your health and turn to t’ai chi when you’re already seriously ill, it’s unlikely to do the trick.
All these points are relevant to the core project of this blog, in turn, because there’s another difference between alternative health care and the medical mainstream. All the core alternative modalities were developed before the age of cheap abundant fossil fuel energy, and require very little in the way of energy and raw material inputs. Conventional chemical and surgical medicine is another thing entirely. It’s wholly a creation of the age of petroleum; without modern transport and communications networks, gargantuan supply chains for everything from bandages through exotic pharmaceuticals to spare parts for lab equipment, a robust electrical supply, and many other products derived from or powered by cheap fossil fuels, the modern American medical system would grind to a halt.
In the age of peak oil, that level of dependency is not a survival trait, and it’s made worse by two other trends. The first, mentioned earlier in this post, is the accelerating spread of antibiotic resistance in microbes. The penicillin that saved my life in 1969 almost certainly wouldn’t cure a case of scarlet fever today; decades of antibiotic overuse created a textbook case of evolution in action, putting ferocious selection pressure on microbes in the direction of resistance. The resulting chemical arms race is one that the microbes are winning, as efforts by the pharmaceutical industry to find new antibiotics faster than microbes can adapt to them fall further and further behind. Epidemiologists are seriously discussing the possibility that within a few decades, mortality rates from bacterial diseases may return to 19th-century levels, when they were the leading cause of death.
The second trend is economic. The United States has built an extraordinarily costly and elaborate health care system, far and away the most expensive in the world, on the twin pillars of government subsidies and employer-paid health benefits. As we lurch further into what Paul Kennedy called “imperial overstretch”—the terminal phase of hegemony, when the costs of empire outweigh the benefits but the hegemonic power can’t or won’t draw back from its foreign entanglements—the government subsidies are going away, while health benefits on the job are being gutted by rising unemployment rates and the frantic efforts of the nation’s rentier class to maintain its standards of living at the expense of the middle classes and the poor.
Requiring people who can’t afford health insurance at today’s exorbitant rates to pay for it anyway under penalty of law—the centerpiece of Obama’s health care “reform”—was a desperation move in this latter struggle, and one that risks a prodigious political backlash. If Obama’s legislation takes effect as written in 2014, and millions of struggling American families find themselves caught between paying a couple of thousand a month or more for health insurance they can’t afford, or paying heavy fines they can’t afford either, it’s probably a safe bet that the US will elect a Tea Party president in 2016 and repeal that—along with much else. Whether that happens or not, it’s clear at this point that the United States can no longer afford the extraordinarily costly health care system it’s got, and the question at this point is simply what will replace it.
In the best of all possible worlds, the existing medical system would come to terms with the bleak limits closing in around it, and begin building a framework that could provide basic health care at a reasonable price to the poor and working classes. It actually wouldn’t be that difficult, but it would require the medical industry to remove at least some of the barriers that restrict medical practice to a small number of very highly paid professionals, and to accept significant declines in quarterly profits, doctors’ salaries, and the like. Maybe that could happen, but so far there doesn’t seem to be any sign of a movement in that direction. Instead, health care costs continue to rise as the economy stalls, moving us deeper into a situation where elaborate and expensive health care is available to a steadily narrowing circle of the well-to-do, while everyone outside the circle has to make do with what they can afford—which, more and more often, amounts to the 19th-century medicine provided by alternative health care.
Thus I’m not especially worried about the survival of alternative healing. Despite the fulminations of authority figures and the occasional FDA witch hunt, the alternative healing scene is alive and well, and its reliance on medicines and techniques that were viable before the age of cheap abundant fossil fuels means that it will be well equipped to deal with conditions after cheap energy of any kind is a thing of the past. No, what concerns me is the legacy of today’s mainstream medicine—the medicine that saved my life at age seven, and continues, despite its difficulties and dysfunctions, to heal cases that the best doctors in the world a century and a quarter ago had to give up as hopeless.
Even if a movement of the sort I’ve suggested above were to take place, a great deal of that would be lost or, at best, filed away for better times. The most advanced medical procedures at present require inputs that a deindustrial society simply isn’t going to be able to provide. Still, there’s quite a bit that could be saved, if those who have access to the techniques in question were to grasp the necessity of saving them. As it stands, the only people who can salvage those things are the physicians who are legally authorized to use them; the rest of us can at best get a working grasp of sanitation and sterile procedure, the sort of wilderness-centered first aid training that assumes that a paramedic won’t be there in ten minutes, and the sort of home nursing skills that the Red Cross used to teach in the 1950s and 1960s—you can still find the Red Cross Home Nursing Manual in the used book market, and it’s well worth getting a copy and studying it.
Other than that, it’s up to the physicians and the various institutions they staff and advise. If they step up to the plate, the deindustrial future will have the raw materials from which to evolve ways of healing that combine the best of mainstream and alternative methods. If they don’t, well, maybe enough written material will survive to enable the healers of the future to laboriously rediscover and reinvent some of today’s medical knowledge a few centuries down the road. While the decision is being made, those of us who don’t have a voice in it have our own decisions to make: if we have the money and are willing to accept one set of risks, to make use of today’s chemical and surgical medicine while it’s still around; if we have the interest and are willing to accept another set of risks, to make use of one or more methods of alternative medicine; or if neither option seems workable or desirable, to come to terms with a reality that all of us are eventually going to have to accept anyway, which is that life and health are fragile transitory things, and that despite drugs and surgeries on the one hand, or herbs and healing practices on the other, the guy with the scythe is going to settle the matter sooner or later with the one answer every human being gets at last.