The singularity: The fantasy and its effect
This Thanksgiving I was discussing the idea of technological progress with my father. I asked him if he had ever heard the term "singularity." He recognized the word had something to do with physics, but did not know any meaning that related to our discussion of technology. Then, he went on to describe a view of technology that seemed strikingly similar to that espoused by believers in the so-called "technological singularity," a speed-up in the rate of technological change so immense that it would constitute a third revolution in human history alongside the agricultural and industrial revolutions.
But his explanation had a twist. He thought it very likely that this technological progress would result in the destruction of human civilization and perhaps all life on the planet within a century. Alas, he didn't see any way to stop it.
The idea that the advance of technology is speeding up is not a new one. And, the idea that technological progress may actually be putting us on a path to destruction precisely because we don't know when enough is enough has a long history as well. But perhaps the most pernicious idea of the three my father mentioned is that nothing can be done to stop it.
I was struck by how deeply the idea of inevitable, unstoppable, rapid technological progress had become ingrained in the culture. If my father--who reads a lot, but is not particularly versed in things scientific or technological--could describe this idea and its possible consequences in such great detail, then it must indeed have made its way into the minds of nearly every thinking and perhaps many nonthinking persons.
The effect of this idea has been threefold. First, the vast majority of people regard technological progress as an unalloyed blessing. Of course, they are, in part, confusing the availability of cheap energy to run the technology with the technology itself. Without cheap energy, much of that technology would not be available to the masses. And, we would not have been able to build the necessary infrastructure or put the necessary number of people to work developing so many new technologies.
Second, many people are also discounting the ill effects. If someone had told you at the beginning of the 20th century that the automobile would become ubiquitous in American life, that it would lead to tens of thousands of fatalities and countless injuries each year, that it would be a major cause of urban decline, that it would make our country dangerously dependent on foreign oil imported from the most politically unstable parts of the world, and that it would be a large contributor to climate change, would you not have joined a campaign to ban it? Yet, even today most people are largely blind to or at least have little concern about these clearly deleterious effects.
Third, faith in technology turns most people into citizen-couch potatoes. Since technology will fix everything, we'll put the technologists in charge and then sit back and wait for the miracles to arrive.
The persistence and depth of this conviction results not from the available evidence, but rather from a pseudo-religious belief in the innate goodness of technological progress. Ray Kurzweil, the high priest of the singularity idea, tells us in his tome, "The Singularity Is Near," that humans have become joined to machines in their cultural evolution. So far, this is not news. Human ecologist William Catton Jr. made the same point in his 1980 book, "Overshoot," where he refers to human beings as homo colossus, a man-tool hybrid of extraordinary destructive power.
But Kurzweil goes on to say that evolution creates better solutions to the problems of survival, and that technological evolution as part of overall evolution inevitably makes humans more fit for survival. This, he says, is the necessary progression of the universe. That's not exactly what the original evolutionist, Charles Darwin, thought. Changes in living organisms are due to random mutations that are just that, changes. They do not have a purpose per se. The natural world simply sorts through these experiments (including presumably any human technological inventions), keeping the ones which make animals and plants more fit and discarding the ones that don't. Since this sorting takes place over many generations and sometimes many millennia, there is no good way to tell ahead of time what will work and what won't.
So, Kurzweil's faith that our technology will make us more fit for survival in the universe is, in reality, a religious view, not a scientific one. To be fair, Kurzweil does acknowledge many potential dangers from new technologies such as genetic research, nanotechnology and robotics. But he believes we can mitigate or eliminate those dangers with proper regulation.
The main problem with this worldview is that nature is almost entirely absent from it. In this view nature is something which we seek to understand in order to manipulate it for our benefit and for the benefit of other creatures when we deem it necessary. And, nature is something we can fix when we have to. Witness the many ideas for geoengineering the climate, including giant mirrors in space to reflect a portion of the sunlight that would otherwise fall on the Earth and a proposal to seed the ocean with iron to increase algae growth, algae that will ultimately die and fall to the ocean floor, thereby sequestering carbon.
First, the natural world is so complex that environmental education giant David Orr believes we will never solve the knowledge problem. For everything we learn about the natural world and how to manipulate it, we create an equal and consequential void of ignorance concerning the effects of our actions. When it came to chlorofluorocarbons--a set of chemicals used in refrigerators and spray cans--we almost found out too late that they were eating a hole in the ozone layer. Given our countless industrial and technological processes, we simply cannot know all their effects on our biosphere.
Second, those effects might be so severe that they could wipe out human civilization. Bill Joy, formerly the chief scientist for Sun Microsystems, wrote a widely read article for Wired back in 2000 about just such possibilities. The article, entitled "Why the Future Doesn't Need Us," details the possibilities for the dissemination of designer viruses with the power to kill selectively, self-replicating nanobots that devour the world, and robotic intelligence too great for us to understand or control. These problems may seem like something out of science fiction, but at least the designer viruses and the self-replicating nanobots are in principle possible. Robotic intelligence that mimics and outpaces human intelligence is still just a dream. And, many debate whether such a thing is even possible. But if it were to come to pass, it would have enormous consequences, not all of them salutary for the human race or the biosphere.
Finally, there is the perception that technological progress is speeding up. But is it? After one hundred years, we are still dependent on the internal combustion engine for almost all of our land and much of our sea transportation. We were promised miracle cures for genetic diseases a decade ago, but they haven't arrived. After half a century of research, we expected fusion reactors to be in place. But the latest international project promises to bring us commercial fusion power only by the mid-21st century. In truth, it is not altogether clear that we will ever be able to master fusion energy. Our main fuels by far remain fossil fuels, 86 percent by energy content. And, these fuels are heading toward depletion faster than anyone anticipated as the world economy and population grow, and as more and more people want access to high-energy lifestyles.
In reality, technology sometimes progresses in fits and starts, and sometimes not at all. Joseph Tainter, author of "The Collapse of Complex Societies," suggests that we may have reached an era of diminishing returns for technology and for the complexity it fosters. Complexity, Tainter explains, can increase the power and reach of a civilization. But increasing complexity will also eventually have diminishing and even negative returns to a society, thereby endangering its very cohesiveness. He cites the Roman and Mayan civilizations as examples.
An aura of inevitability surrounds the idea of technological progress. And, that aura implies meaningful progress for human society as well. But is that aura in reality merely a paralyzing agent that prevents careful examination of technology and its claims for the future? Humans have, in fact, stopped, slowed or restricted technology on a few occasions. Whether wisely or not, the nuclear power industry was essentially stopped in its tracks after the accident at the Three Mile Island reactor in Pennsylvania in 1979. The public wanted other solutions.
We should want other solutions now, too. Technology enthusiasts claim that new, as-yet-uncreated technologies will keep human society overflowing with the cheap energy it needs for the energy-hungry technological wonderworld of the future. And yet, despite all our new technology, oil discoveries continue to fall. Geology is remorseless and doesn't yield to mere faith in technology. The development of alternatives is lagging far behind our need for quick replacements. The effects of climate change are visiting us sooner than even the most pessimistic estimates had predicted.
The singularitarians tell us, "Just wait! The great breathtaking exponential acceleration of technological progress is about to begin and will play out over the next few decades." The new technologies that will emerge will solve the problems of energy supply, clean water, hunger, and even climate change. And, they will also lead to much greater longevity and far better human health.
But as the world hurtles toward peak oil, catastrophic climate change, widespread water shortages and further vast destruction of the biosphere, can we afford to wait for the singularity to arrive? Or do we need to be pragmatic and start addressing these issues now as well as we can, not just with our technology, but with a plan to change the very way in which we live to make our presence more consonant with the limits of the Earth?