The greatest technological achievement of the last century, according to the National Academy of Engineering, was neither the internet nor the airplane, the artificial heart nor the satellite, the refrigerator nor the assembly line, but that which enabled them all: the electrical grid. There is no small irony in this, for contrary to what one might expect, the electrical grid was not meticulously planned and executed. Rather, it was cobbled together somewhat haphazardly as utility companies discovered the benefits and efficiencies to be gained by interconnecting their electrical systems, and over decades it grew into a nationwide network. The electrical grid's development, then, was evolutionary, not revolutionary.
Evolution by its very nature is a never-ending series of experiments, some fostering advancement, others impeding it. Interconnection is the prime example of the former, while deregulation has proven to be an evolutionary dead end, one consequence of which has been a paradox: the very mechanism that has enabled so much of the technological innovation of the last generation has itself employed comparatively little of it. Where computers, remote sensors, advanced modeling and myriad electronic devices have transformed every other major industry, the electrical grid remains effectively 'dumb.' Now that deregulation's failure has been widely recognized, however, regulatory uncertainties are being resolved, long-ignored upgrades expedited, and changes that reflect the needs of the marketplace implemented. Together, these developments are building momentum toward the industry's next evolutionary leap: an electronically enabled electric grid delivering digital-quality power.
This shift away from analog mechanical operation will have the most profound effect on the electricity industry in its history. From an investment point of view, the beauty lies in the simplicity of the premise. If the US wants to maintain its economic competitiveness as well as its standard of living, both of which are directly correlated with electrical energy consumption,1 it must make the changes needed to adapt to the demands of the Digital Age. And soon: the US has fallen to seventh place in world rankings of countries positioned to participate in and benefit from information and communication technologies.2
In its most basic form, the transition from analog to digital-quality electricity is a matter of reliability, and realizing the necessary level of service will take considerable time and involve staggering sums of money, with industry estimates running as high as two trillion dollars over the next two decades.3 To take advantage of the opportunities associated with the build-out, the Emerging Trends Report (ETR) has put together a broad-spectrum 'best of breed' approach, which includes sector-specific benchmarks that must be met for a sector to remain viable. This is of paramount importance in an industry with more than $800 billion of assets, for the trick to investing in the electrical grid lies in differentiating between the sectors to invest in now, the sectors to monitor for the breakthroughs necessary to make them viable in the years ahead, and the sectors to disregard entirely as pie-in-the-sky hype that will only reward those promoting them.
The Digital Age
The following chart identifies the de facto dawn of the Digital Age. With the crossing of two innocuous lines, 1997 became the watershed year in which the longstanding correlation between economic growth and electricity use ‘uncoupled’ and began to reflect the guiding principle of the Digital Age: increased output from reduced electrical consumption.4
Source: EIA Annual Review 2004
Increasingly permeating all aspects of our daily lives, the icon of the Digital Age is the microprocessor. Perhaps giving new meaning to the word ubiquitous, the proliferation of microprocessors has been nothing short of stunning. From virtually zero thirty years ago, the more than 12 billion microprocessors in the US today outnumber computers by 30:1 and people by 40:1,5 and are found in everything from digitally managed assembly lines to automobiles, household appliances and toy dolls.
Over the last few years, the internet and broadband access have subtly shifted the emphasis from processing speed to storage capacity as myriad digital devices access and share digital music, photographs, video and unimaginable amounts of data. Once the back office orphan, the humble server now congregates in vast populations that enable data retrieval and transference on a planetary scale; in the US alone these server farms now constitute 4% of total electrical demand.6 Electrical consumption has become such an issue for server farms that the likes of Google and Microsoft are now siting server farms on the basis of access to cheap, abundant electricity.7
But in order to operate properly, a microprocessor requires a supply of electricity that is significantly different from the analog, or continuously varying, supply of alternating current that is delivered by the existing electrical grid. Microprocessors must have what is known as ‘digital-quality’ power: a continuous source of electricity free of signal variation. To resolve the conflict between what is available and what is needed, the nearly universal practice has been to employ a rectifier to convert the alternating current delivered by the electrical grid into direct current for use by the digital device. This practice adds yet another layer of waste to an already profligate system. The single largest benefit of the evolution to the “Smart Grid” (see below) and digital-quality power will be the improved efficiency and reliability of the system as a whole.
Consider the waste inherent in the electrical system bringing the image of this report to your computer screen. Roughly two-thirds of the energy produced to power your computer was lost as waste heat in the centralized generation of electricity; it was simply vented into the atmosphere. Of the remaining third, line losses during transmission and distribution dissipated another roughly seven percent in bringing the electricity to your wall outlet. And finally, half of that energy was lost as waste heat in converting the 110-volt alternating current to the 12-volt direct current your computer (and countless other digital devices) needs in order to operate free of even the minor power fluctuations that can adversely affect digital circuitry. So in this example, for every 100 watts of electricity generated, only about 16 actually get used. Imagine the waste attendant on the server farms of the five largest search engines in the US, which combined are estimated to continuously operate more than 2 million servers.8
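The chain of losses described above can be verified with simple arithmetic. The sketch below uses the report's own round estimates (two-thirds lost at the plant, seven percent in transmission and distribution, half in AC-to-DC conversion), not measured values:

```python
# Back-of-envelope check of the end-to-end efficiency chain.
# All percentages are the report's rough estimates.
generated = 100.0                                    # watts generated at the plant
after_generation = generated * (1 / 3)               # ~2/3 lost as waste heat in generation
after_transmission = after_generation * (1 - 0.07)   # ~7% line losses in T&D
delivered_to_device = after_transmission * 0.5       # ~half lost converting AC to DC

print(round(delivered_to_device, 1))  # → 15.5, i.e. roughly 16 watts of every 100
```

Stated differently, the end-to-end efficiency is about 1/3 × 0.93 × 0.5 ≈ 15.5%, which matches the "about 16 of 100 watts" figure in the text.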
That is not to say the Digital Age will entail a shift from alternating current to direct current. Rather, the shift will be from an inefficient, slow, outdated mechanical switching to a focused, faster, more intelligent system employing electronic devices to improve monitoring capabilities and load capacities while smoothing fluctuations and increasing the reliability of the existing alternating current.
Below is a rough comparison of the existing grid and what the future holds:
| Existing Grid | Smart Grid |
| --- | --- |
| One-way communications (if any) | Two-way communications |
| Built for centralized generation | Accommodates distributed generation |
| Radial topology | Network topology |
| Few sensors | Monitors and sensors throughout |
| Manual restoration | Semi-automated restoration and, eventually, self-healing |
| Somewhat prone to failures and blackouts | Adaptive protection and islanding |
| Check equipment manually | Monitor equipment remotely |
| Limited control over power flows | Pervasive control systems |
| Limited price information | Full price information |
| Few customer choices | Many customer choices |
Source: Center for Smart Energy: “Smart Grid of the Future.”9
Today digital-quality power demand in the US constitutes roughly 10% of total electrical demand; by 2020, the Electric Power Research Institute (EPRI) projects it will range between 30 and 50%, depending on the industry's ability to meet accelerating growth.10
Effecting sweeping changes in a mechanism as vast and complicated as the electrical grid to meet this new usage is rather like changing course in a fully laden supertanker. The inertia behind the century-long accumulation of more than 10,000 power plants, 12,000 transmission and distribution substations, and more than 1.3 million miles of electrical lines, 160,000 miles of which are high-voltage transmission cable (230 kilovolts or higher),11 means any change will be slow and ponderous, but unstoppable once underway. Such decisions simply cannot be undertaken lightly or implemented quickly. Further complicating the shift, many of the changes that must be made are not yet technologically feasible.
And before building the Smart Grid, the electric industry must surmount three obstacles of increasing difficulty and complexity, the most serious of which is a result of the Law of Unintended Consequences.
Run to Failure
The first problem concerns the longstanding relationship between the American consumer and analog electrical power. Long profligate with it, consumers have been willing to accept a lower degree of reliability in exchange for cheap, abundant electricity, and they have yet to be convinced it is in their interest to finance, via higher rates, the transition to digital-quality power.
As with water and sewage treatment, electricity is something noticed only in its absence. With various crises and scandals in the industry, and with electrical rates rising as a result of increased fuel costs,12 the failure of deregulation to deliver on its promise of lower rates is fresh in consumers' minds, so opposition to further increases to fund the shift to digital power is understandable.
This makes the challenge facing the North American electrical industry one of helping rate-payers recognize that the fundamental shift in emphasis from the quantity of electricity produced to the quality of electricity delivered will be to their benefit by eventually lowering rates while improving reliability. Working in their favor is the strong growth of convenience- and productivity-enhancing digital technology. From 2001 to 2005, residential information technology use increased more than 250%,13 and consumers are now more aware than ever that devices equipped with microprocessors do not automatically resume operation after a power outage or spike but may well be irreparably damaged. Consequently, the continued growth of microprocessor-enabled devices in itself is serving to highlight the way the existing system is not meeting the new demands of the Digital Age and will in time effect the desired changes in consumer attitudes.
The second obstacle is simply the condition of the electrical grid itself. The range of equipment nearing or beyond its projected service life is staggering: 70% of America's roughly 160,000 miles of high-voltage transmission lines are 25 years old or older, as are 70% of the more than 63,000 transformers; further, 60% of the nearly 200,000 circuit breakers are at least 30 years old.14 Electro-mechanical analog switches are still the norm system-wide, which comes as a bit of a surprise considering the same kind of switch was discontinued from use in television sets more than twenty years ago.15 Keeping such increasingly obsolete equipment operating, not to mention finding spare parts, has become such a problem that investor-owned utilities (IOUs) rank service reliability and the condition of electric infrastructure as their top two concerns, with an aging workforce third.16
Clearly, the US cannot hope to maintain a leadership role in technological innovation while hobbled by such an antiquated electrical grid. But even the effort to establish new industry standards for increased efficiency and reliability, which must be settled before the upgrade cycle can begin, has been so mired in bureaucratic and political wrangling that fifteen states finally had to sue the Department of Energy to compel release of the first new standard since President Bush took office. Demonstrative of this bickering, the compromise that eventually emerged as the new standard for transformer performance is, even by the government's own evaluation, inferior to other proposals, but it will show a return on investment in a shorter period of time.17
This is the kind of thinking that has led directly to the most difficult problem needing to be resolved: repairing the damage wrought by the deregulation of the wholesale electric energy market. Fifteen years of deregulation have contributed significantly to the systematic neglect outlined in the preceding paragraphs. Ironically, though deregulation coincided with the dawn of the Digital Age, it effectively sacrificed reliability on the altar of profits.
Managing the electrical grid on the basis of short-term profit (and often substantial bonuses for corporate officers) rather than the long-term common good has decimated the industry. Standard operating procedure became running equipment until it failed. Employment has been cut by 27%,18 and roughly 40% of the remaining skilled workforce will be eligible to retire within about four years.19 Budgets for maintenance and tree-trimming have been slashed, as have those for training programs, even though tree contact causes one out of every six minutes lost to power outages20 and triggered the massive 2003 blackout in the northeast. Entire system planning departments have been dissolved, and US research and development lags that of most developed countries by a considerable margin.21
Deregulation inadvertently twisted the same interconnection between adjoining systems that once fostered cooperation toward the common good of ensuring reliability into a mechanism for transferring electricity cross-country in a parody of free market competition. It simultaneously overloaded the transmission system grossly and discouraged investment in the expansion needed to alleviate the very congestion it was causing.
Historically, 95% of power outages have been transmission-related.22 Due in large part to regulatory uncertainties regarding the return on investment for new transmission lines, capital expenditures have plummeted 30% since 1990 even as electrical demand has increased 25%.23 Amortization, or depreciation, rates have exceeded construction expenditures every year since 1995.24
Mandating that vertically integrated utilities facilitate the wholesale bulk transfer of electricity between multi-regional markets by third parties has resulted in the system being used daily in ways it was simply not designed to function. For example, bulk power transactions on the Tennessee Valley Authority system exploded from less than 20,000 in 1996 to more than 250,000 by the end of 2001.25
Increased bulk power transactions have led to a substantial drop in capacity margin, which is essentially the reserve of electrical power available to meet changes in demand at any given time. From a range of 30-40% in the 1980s, capacity margin had fallen to less than 16% by 2005,26 which leaves little room either for growth or for maneuvering in times of crisis. Overloading the transmission system has become the norm, as has an increase in the number of delivery constrictions resulting in monetary loss, known as Transmission Loading Relief Events:
Transmission Loading Relief Events, 1997-2004
[Chart: annual totals, one bar per year from 1997 through 2004]
Source: North American Reliability Council (NERC)
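Capacity margin, as discussed above, is simply the fraction of installed capacity held in reserve beyond peak demand. A minimal sketch of the calculation follows; the 1,000 MW and 840 MW figures are hypothetical, chosen only to reproduce the roughly 16% margin cited for 2005:

```python
def capacity_margin(installed_capacity_mw: float, peak_demand_mw: float) -> float:
    """Fraction of installed capacity held in reserve at peak demand."""
    return (installed_capacity_mw - peak_demand_mw) / installed_capacity_mw

# Hypothetical system: 1,000 MW installed, 840 MW peak demand
margin = capacity_margin(1000.0, 840.0)
print(f"{margin:.0%}")  # → 16%
```

By this measure, a 1980s-era system at a 35% margin could lose a large plant at peak and still serve load; at 16%, far less can go wrong before demand must be curtailed.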
Nowhere is the failure to meet the demands of digital-quality power more evident than in the explosion of so-called reliability costs. Electrical outage costs, which historically averaged about $20 billion per year, have been running closer to $100 billion for the last few years,27 and adding in newly significant momentary-interruption costs, which force systems to reboot and work forces to stand idle, brings the total loss to more than $130 billion annually.28
As a result of all this, the US electrical industry now ranks among the worst in the developed world in terms of service reliability. The issue has become so acute that one out of every six dollars invested in generation and delivery equipment nationwide now goes to emergency backup power.29 More startling was Allied Business Report's claim in 2000 that lack of confidence in service reliability had driven so many companies to generate their own power that self-generation accounted for as much as 10% of total US generation, a figure set to grow 15% annually.30 The sentiment is increasingly shared by US households: in 2006, 51% of those polled were planning to purchase a backup generator within two years, and 47% were very interested in having base-load capacity of their own to ensure reliability of supply.31
The cause and extent of such dissatisfaction are easily seen in the following chart, which shows the amount of time lost each year to system outages not related to weather events:
Source: Apt, J. et al: Feb, 200632 Chart: FRPitt/ETR
It is worth noting that where a resident of Japan loses power roughly once every twenty years, Americans lose power once every nine months.33
Perhaps most troubling of all, deregulation with fixed retail pricing shifted risk to investors, where previously, under rate-of-return regulation, it was borne by ratepayers.34 This shift, combined with questionable management practices such as those described above, has effectively run many previously sound companies into the ground. It is reflected in the median bond rating of investor-owned utilities, which was "A" before deregulation but had fallen three grades to "BBB" by 2005. Credit ratings for independent power producers and energy traders are considerably worse. Taken together, these drops in industry credit ratings translate into higher financing costs going forward, which will certainly be passed on to consumers.
The following chart of credit-quality does not present the picture of a healthy industry well-positioned to respond to what will undoubtedly be the largest concerted restructuring in its history.
Sources: Brattle Group;35 S&P ratings as reported by Compustat. The sample consists of 121 companies based on Compustat's GICS codes for utilities and multi-utilities.
The upshot of all of this is that deregulation didn’t deliver lower prices, just lower reliability.
Putting finance and marketing personnel in charge of a mechanism traditionally operated by engineers has been akin to letting a bunch of hooligans (think Enron) take the family car for a joyride: the surprise is not that the vehicle broke down but that it did not break down sooner, which in itself is testimony to the build quality of the grid.
…and on that sound foundation a new grid will rise
The build-out of an improved electrical grid and the transition to delivery of digital-quality power are a foregone conclusion; there is simply no going back. Everyday modern conveniences, as well as our continued ability to compete globally, dictate that it is not a question of whether the build-out will occur but simply of its rate of development. But make no mistake: the value of electrical service to users in the US is on the order of one hundred times the prices paid,36 and this relationship only stands to improve in the Digital Age, which means that, barring utter economic collapse, there is little chance of these changes not being implemented.
The forces for change are gathering momentum. In a pragmatic work-around, equipment is being designed to surpass whatever standards the DOE finally endorses. Research, especially into open standards and so-called plug-and-play flexibility, has been accelerating. And because deregulation can never be publicly recognized as a failure, new legislation asserting the right of eminent domain is attempting an end run around consumer advocacy groups, NIMBY sentiment and environmental opposition by providing the power to designate National Transmission Corridors that will speed transmission relief to the most congested areas, namely the northeast metropolitan areas and southern California.37 More will follow, thereby providing job security for a generation of lawyers.
But it also means electricity rates nationwide, which have long been considered a bargain, are going to increase substantially in the years ahead. The rate increases will initially be used to resolve transmission and decrepit-equipment problems. Subsequent increases will reflect the gradual but widespread adoption of power electronics to upgrade the intelligence, reliability and capacity of the grid. And by 2015 the generation capacity now under construction will be coming online, ensuring higher rates decades into the future.
Far from being depressed by the prospect of increased rates though, long term investors should rejoice, for this process will literally take decades and involve at least two overlapping, if not simultaneous, iterations before nearing the next evolutionary plateau. As EPRI puts it, “The effort is not a centralized, top-down makeover, but rather a distributed, bottom-up transformation created by individual companies adding advanced capabilities piece by piece on the existing grid.”38
And that leads us to what tomorrow’s grid will look like and how to profit from the build-out– without being duped by the boondoggles.
In the following sections for subscribers, the Emerging Trends Report continues its assessment of the electrical industry with sections detailing:
- the next two iterations of the electrical grid– and what may never come to pass;
- developments in the search for the electric industry’s Holy Grail, and what finding it will mean;
- where to invest today, what to monitor for investing tomorrow, and what to stay away from entirely;
- our investment approach and stock recommendations;
- and our substantial Sources/Further Reading section.
To purchase either this 30-page report or an annual subscription to our service, please visit our website at www.emergingtrendsreport.com.