This article is part 4 of Chapter 4 of Richard Heinberg’s new book The End of Growth, which is set for publication by New Society Publishers in August 2011. The chapter explores the potential of innovation, substitution, and efficiency to maintain economic growth.
It is a truism in most people’s minds that the most important driver of economic growth is new technology. Important innovations, from the railroad and the telegraph up through the satellite and the cell phone, have generated fortunes while creating markets and jobs. It may seem downright cynical to suggest that we won’t see more of the same, leading to an abundant, technotopian future in which humanity has colonized space and all our needs are taken care of by obedient robots. But once again, there may be limits.
The idea that technology will continue to improve dramatically is often supported by reference to Moore’s law. Over the past three decades, the number of transistors that can be placed inexpensively on an integrated circuit has doubled approximately every two years. Computer processing speed, memory capacity, and the number of pixels per dollar in digital cameras have all followed the same trajectory. This “law” is actually better thought of as a trend—but it is a trend that has continued for over a generation and is not expected to stop until 2015 or later. According to technology boosters, if the same innovative acumen that has led to Moore’s law were applied to solving our energy, water, climate, and food problems, those problems would disappear in short order.
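To make the compounding behind Moore’s law concrete: a doubling every two years, sustained over three decades, multiplies transistor counts by 2^15, or roughly 33,000-fold. A two-line sketch (the function name is my own illustration, not anything from the text):

```python
def moores_law_factor(years, doubling_period=2):
    """Cumulative growth factor if capacity doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(30))  # 2**15 = 32768, i.e. ~33,000-fold over three decades
```

The same exponential, run over energy or transport metrics, is precisely what the “technology boosters” mentioned above are implicitly assuming.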
My first computer was an early laptop, circa 1986. It cost $1600 and had no hard disk—just two floppy drives—plus a small non-backlit, black-and-white LCD screen. It boasted 640 KB of internal RAM and a processing speed of 9.54 MHz. I thought it was wonderful! Its capabilities dwarfed those of the Moon-landing Apollo 11 spacecraft’s on-board computer, developed by NASA over a decade earlier.
My most recent laptop cost $1200 (that’s a lot to pay these days, but it’s a deluxe model), has a 250 gigabyte hard drive (holding about 200,000 times as much data as you could cram onto an old 3.5 inch floppy disk), 4 Gigs of internal memory, and a processing speed of 2.4 gigahertz. Its color LCD screen is stunning, and it does all sorts of things I could never have dreamed of doing with my first computer: It has a built-in camera so I can take still or moving pictures, it has sound, it plays movies—and, of course, it connects to the Internet! Many of those features are now standard even on machines selling for $300.
From 1986 to today, in just 25 years, the typical consumer-grade personal computer has increased in performance thousands of times over while dropping in price—noticeably so if inflation is factored in.
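For readers who want to check the arithmetic, the ratios implied by the two machines described above work out roughly as follows (a back-of-envelope sketch; binary units are assumed for memory, and 1.44 MB for the capacity of a 3.5-inch floppy):

```python
# Rough performance ratios between the 1986 laptop and the 2011 model
ram_1986 = 640 * 1024        # 640 KB in bytes
ram_2011 = 4 * 1024**3       # 4 GB in bytes
clock_1986 = 9.54e6          # 9.54 MHz in Hz
clock_2011 = 2.4e9           # 2.4 GHz in Hz
floppy = 1.44e6              # 3.5-inch floppy capacity in bytes
disk_2011 = 250e9            # 250 GB hard drive in bytes

print(ram_2011 / ram_1986)     # ~6,554x the memory
print(clock_2011 / clock_1986) # ~252x the clock speed
print(disk_2011 / floppy)      # ~174,000x the storage ("about 200,000 times")
```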
So why hasn’t the same thing happened with energy, transportation, and food production during this period? If it had, by now a new car would cost $750 and get 2000 miles to the gallon. But of course that’s not the case. Is the problem simply that engineers in non-computer industries are lazy?
Of course not. It’s because microprocessors are a special case. Moving electrons takes a lot less energy than moving tons of steel or grain. Making a two-ton automobile requires a heap of resources, no matter how you arrange and rearrange them. In many of the technologies that are critically important in our lives, recent decades have seen only minor improvements—and many or most of those have come about through the application of computer technology.
Take the field of ground transportation (the example I’m about to use is also relevant to the energy efficiency and substitution discussions earlier in this chapter). We could make getting to and from stores and offices far more efficient by installing personal rapid transit (PRT) systems in every city in the world. PRT consists of small, automated vehicles operating on a network of specially built guideways (a pilot system has been built at Heathrow Airport in London). The energy-per-passenger-mile efficiency of PRT promises to be much greater than that of personal automobiles, even electric ones, and greater even than that of trolleys, streetcars, buses, subways, and other widely deployed forms of public transit. According to some estimates, a PRT system should attain an energy efficiency of 839 BTU per passenger mile (0.55 MJ per passenger km), as compared to the average of 3,496 BTU per passenger mile for automobiles and 4,329 BTU for personal trucks.
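Those figures are easy to cross-check. A short Python sketch using standard conversion factors (1 MJ ≈ 947.8 BTU, 1 mi ≈ 1.609 km) reproduces the 839 BTU figure and shows that the average automobile uses roughly four times as much energy per passenger mile:

```python
MJ_TO_BTU = 947.817    # 1 megajoule expressed in British thermal units
KM_PER_MILE = 1.60934  # kilometers per statute mile

prt = 0.55 * MJ_TO_BTU * KM_PER_MILE  # PRT: 0.55 MJ/passenger-km -> BTU/passenger-mile
car = 3496.0                          # average automobile, BTU per passenger mile

print(round(prt))  # 839
print(car / prt)   # cars use roughly 4x the energy per passenger mile
```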
By the time we have shifted all local human transport to PRT, we may be approaching the limits of what is possible to achieve in terms of motorized, relatively high-speed transport energy efficiency. But to do this we will need massive investment, policy support, and the development of consumer demand. PRT may be an excellent idea, but its implementation is moving at a glacial pace—there’s nothing “rapid” about it.
Far from already having implemented the most efficient transit systems imaginable, we find ourselves today even more dependent on cars and trucks than we were a half-century ago. Moreover, the typical automobile of 2011 is essentially similar to one from 1960: both are mostly made from steel, glass, aluminum, and rubber; both run on gasoline; both have similar basic parts (engine, transmission, gas tank, wheels, seats, body panels, etc.). Granted, today’s car is more energy-efficient and sophisticated—largely because of the incorporation of computerized controls over its various systems. Much the same could be said for modern aircraft, as well as for the electricity grid system, water treatment and delivery systems, farming operations, and heating and cooling systems. Each of these is essentially a computer-assisted, somewhat more efficient version of what was already common two generations ago.
True, the field of home entertainment has seen some amazing technical advances over the past five decades—digital audio and video; the use of lasers to read from and record on CDs and DVDs; flat-screen, HD, and now 3D television; and the move from physical recorded media to distribution of MP3 and other digital recording formats over the Internet. Yet when it comes to how we get our food, water, and power, and how we transport ourselves and our goods, relatively little has changed in any truly fundamental way.
The nearly miraculous developments in semiconductor technologies that have revolutionized computing, communications, and home entertainment during the past few decades have led us to think we’re making much more “progress” than we really are, and that more potential for development in some fields exists than really does. The slowest-moving areas of technology are, understandably, the ones that involve massive infrastructure that is expensive to build and replace. But these are the technologies on which the functioning of our civilization depends.
In fact, rather than showing evidence of great technological advance, our basic energy, water, and transport infrastructure shows signs of senescence, and of vulnerability to Murphy’s law—the maxim that anything that can go wrong, will go wrong. In city after city, water and sewer pipes are aging and need replacement. The same is true of our electricity grids, natural gas pipes, roads, bridges, dams, airport runways, and railroads.
I live in Sonoma County, California, where officials declared last year that 90 percent of county roads will be allowed to deteriorate and gradually return to gravel, simply because there’s no money in the budget to pay for continued repairs. Perhaps someone who lives on one of these Sonoma County roads will mail-order the latest MacBook Air (a shining aluminum-clad example of Moore’s law) for delivery by UPS—only to be disappointed by the long wait because a delivery truck has broken its axle in a pothole (a dusty example of Murphy’s law).
According to Ken Kirk, executive director of the National Association of Clean Water Agencies, more than 1,000 aging water and sewer systems around the U.S. need urgent upgrades. “Urgent” in this instance means that if infrastructure projects aren’t undertaken now, the ability of many cities to supply drinking water in the years ahead will be threatened. The cost of renovating all these systems is likely to amount to between $500 billion and $1 trillion.
The failure of innovation and new investment to keep up with the decay of existing infrastructure is exemplified also in the fact that the world’s Global Positioning System (GPS) is headed for disaster. Last year, the U.S. Government Accountability Office (GAO) published a report noting that GPS satellites are wearing down and that, if no new investments are made, the accuracy of the positioning system will gradually decline. At some point during the next few decades, the whole system may crash. GPS happens to be one of the glowing highlights of recent technological progress. We depend on it not just for piloting Lincoln Navigators across the suburbs, but for guiding tractors through giant cornfields; for mapping, construction, and surveying; for scientific research; for moving troops in battle; and for dispatching emergency response vehicles to their appointed emergencies. How could we have allowed such an important piece of infrastructure to become so vulnerable?
There is one more reason to be skeptical about the capability of technological innovation across a broad range of fields to maintain economic growth, and though I have saved it to the end it is by no means a minor point. As verified in the research of the late Professor Vernon W. Ruttan of the University of Minnesota in his book Is War Necessary for Economic Growth?: Military Procurement and Technology Development, many large-scale technological developments of the past century depended on government support during early stages of research and development (computers, satellites, the Internet) or build-out of infrastructure (highways, airports, and railroads). Ruttan studied six important technologies (the American mass production system, the airplane, space exploration and satellites, computer technology, the Internet, and nuclear power) and found that strategic, large-scale, military-related investments across decades on the part of government significantly helped speed up their development. Ruttan concluded that nuclear technology could not have been developed at all in the absence of large-scale and long-term government investments.
If, in the years ahead, government remains hamstrung by overwhelming levels of debt and declining tax revenues, investment that might lead to major technological innovation and infrastructure build-out is likely to be highly constrained. Which is to say, it probably won’t happen—absent a wartime mobilization of virtually the entire economy.
We’re counting on Moore’s law while setting the stage for Murphy’s.
1. For a discussion of limits to Moore’s law see Brooke Crothers, “Moore’s Law Limit Hit by 2014?” CNET News, posted June 16, 2009.
2. “Current and projected rates of innovation might not be sufficient to improve or even maintain living standards in the face of still rapidly growing population, global warming, and other challenges of the 21st century,” according to Canadian economist James Brander. Barrie McKenna, “Has Innovation Hit a Brick Wall?” The Globe and Mail, December 26, 2010.
3. Martin Lowson, “A New Approach to Sustainable Transport Systems,” presented at the 13th World Clean Air and Environmental Congress, London, August 22-27, 2004; The conversion is: 0.55 MJ = 521.6 BTU; 1.609 km = 1 mi; therefore, 521.6 x 1.609 = 839; U.S. Department of Energy, “Transportation Energy Data Book,” 2007.
4. Rick Jervis, “Pipes, Pumps Trouble Big Easy,” USA Today, posted December 16, 2010.
5. U.S. Government Accountability Office, Global Positioning System: Significant Challenges in Sustaining and Upgrading Widely Used Capabilities, Report No. GAO-09-670T, May 7, 2009.
6. Vernon W. Ruttan, Is War Necessary for Economic Growth?: Military Procurement and Technology Development (New York: Oxford University Press, 2006).