The 2003 Northeast Blackout: Five Years Later

Tougher regulatory measures are in place, but we're still a long way from a "smart" power grid

On August 14, 2003, shortly after 2 P.M. Eastern Daylight Time, a high-voltage power line in northern Ohio brushed against some overgrown trees and shut down—a fault, as it's known in the power industry. The line had softened under the heat of the high current coursing through it. Normally, the problem would have tripped an alarm in the control room of FirstEnergy Corporation, an Ohio-based utility company, but the alarm system failed.

Over the next hour and a half, as system operators tried to understand what was happening, three other lines sagged into trees and switched off, forcing other power lines to shoulder an extra burden. Overtaxed, they cut out by 4:05 P.M., tripping a cascade of failures throughout southeastern Canada and eight northeastern states.

All told, 50 million people lost power for up to two days in the biggest blackout in North American history. The event contributed to at least 11 deaths and cost an estimated $6 billion.

So, five years later, are we still at risk for a massive blackout?

In February 2004, after a three-month investigation, the U.S.–Canada Power System Outage Task Force concluded that a combination of human error and equipment failures had caused the blackout. The group's final report made a sweeping set of 46 recommendations to reduce the risk of future widespread blackouts. First on the list was making industry reliability standards mandatory and legally enforceable.

Prior to the blackout, the North American Electric Reliability Council (NERC) set voluntary standards. In the wake of the blackout report, Congress passed the Energy Policy Act of 2005, which expanded the role of the Federal Energy Regulatory Commission (FERC) by requiring it to solicit, approve and enforce new reliability standards from NERC, now the North American Electric Reliability Corporation.

FERC has so far approved 96 new reliability standards.* These cover the three Ts—"trees, training and tools"—identified by the blackout task force but are not limited to them, says Joseph McClelland, director of FERC's Office of Electric Reliability, which was established last September. Standard PER-003, for example, requires that operating personnel have at least the minimum training needed to recognize and deal with critical events in the grid; standard FAC-003 makes it mandatory to keep trees clear of transmission lines; standard TOP-002-1 requires that grid operating systems be able to survive a power line fault or any other single failure, no matter how severe. FERC can impose fines of up to a million dollars a day for an infraction, depending on its flagrancy and the risk incurred.

If the standards have reduced the number of blackouts, the evidence has yet to bear it out. A study of NERC blackout data by researchers at Carnegie Mellon University in Pittsburgh found that the frequency of blackouts affecting more than 50,000 people held fairly constant at about 12 per year from 1984 to 2006. Co-author Paul Hines, now an assistant professor of engineering at the University of Vermont in Burlington, says current statistics indicate that a blackout on the scale of 2003 will occur about once every 25 years.

He says many researchers believe that cascading blackouts may be inherent in the grid's complexity, but he still sees room for improvement. "I think we can definitely make it less frequent than once every 25 years."

The U.S. power grid consists of three loosely connected parts, referred to as interconnections: eastern, western and Texas. Within each, high-voltage power lines transmit electricity from generating sources such as coal or hydroelectric plants to local utilities that distribute power to homes and businesses, where lights, refrigerators, computers and myriad other "loads" tap that energy.

Because electricity in power lines cannot be stored, generation and load have to match up at all times or the grid enters blackout territory. That can result from a lack of generating capacity—the cause of the 2000 California blackouts—or because of one or more faults, as in the 2003 blackout. The interconnectedness of the grid makes it easier to compensate for local variations in load and generation but it also gives blackouts a wider channel over which to spread.
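
As a rough sketch of that balancing act, the short Python snippet below compares total generation against total load and flags a mismatch. It is an illustration only, not any utility's real control software, and every name, number and tolerance in it is invented.

# Illustrative sketch only: a toy balance check, not any utility's actual control software.
# All names, numbers and the tolerance below are hypothetical.

def check_balance(generation_mw, load_mw, tolerance_mw=50.0):
    """Compare total generation against total load and flag any mismatch."""
    imbalance = generation_mw - load_mw
    if abs(imbalance) <= tolerance_mw:
        return "balanced"
    if imbalance < 0:
        # Load exceeds generation: bring more generation online or shed load,
        # otherwise frequency sags and protective equipment starts tripping.
        return f"deficit of {-imbalance:.0f} MW: raise generation or shed load"
    # Generation exceeds load: back generators down before frequency rises.
    return f"surplus of {imbalance:.0f} MW: reduce generation"

print(check_balance(generation_mw=31900, load_mw=32400))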

Transmission system operators scattered across some 300 control centers nationwide monitor voltage and current data from SCADA (supervisory control and data acquisition) systems placed at transformers, generators and other critical points. Power engineers watch that data for signs of trouble and, ideally, communicate with one another to stay abreast of important changes.

One of the realizations since 2003 is that "you can't just look at your system. You've got to look at how your system affects your neighbors and vice versa," says Arshad Mansoor, vice president of power delivery and utilization with the Electric Power Research Institute of Palo Alto, Calif.

Until recently, there was no one place to view information from across the grid. McClelland says FERC is working with industry and other government agencies to pull data into a prototype coast-to-coast real-time monitoring system at its Washington, D.C., headquarters. "We have put the system together and it is functional," he says, although "some parts are better than others": FERC has full coverage of the western U.S. and good information from the Southeast, he says, but data from Texas and other areas is still spotty.

Gathering the data is only the beginning. The holy grail is a smart grid capable of monitoring and repairing itself, similar to the way air traffic control systems are used to coordinate aircraft routes. Mansoor says that dream is still a good 20 years away because it depends on better data, a reliable communications network and computer programs capable of making decisions based on the data.

One promising tool for collecting better data is called a phasor measurement unit (PMU), which measures voltage and current on power lines and uses GPS (global positioning system) connections to time-stamp its data down to the microsecond. That level of resolution across a network of PMUs could reveal an important electrical property of power lines called phase, which tells whether power generators are rotating in sync with respect to one another, Hines says.

When a blackout approaches, the difference in phase between generators is believed to grow rapidly. "A lot of people have conjectured that if we could have seen that the [phase] distance between generators was increasing [on August 14, 2003], we could have prevented the blackout," Hines says.
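
To make the idea concrete, here is a toy Python calculation using made-up phase-angle readings rather than real PMU data; it shows how a widening angle separation between two buses could be spotted from time-synchronized measurements.

# Hypothetical phase-angle readings (in degrees) from two PMUs at the same
# GPS-synchronized instants; invented numbers, not data from any real grid.
bus_a_deg = [10.0, 10.5, 11.2, 12.4, 14.5, 18.0]
bus_b_deg = [8.0, 8.2, 8.3, 8.5, 8.6, 8.7]

# Angle separation between the two buses at each time step.
separation = [a - b for a, b in zip(bus_a_deg, bus_b_deg)]

# A crude warning rule for this sketch: flag when the separation jumps between samples.
for t in range(1, len(separation)):
    growth = separation[t] - separation[t - 1]
    if growth > 1.5:  # arbitrary threshold
        print(f"step {t}: separation {separation[t]:.1f} degrees and widening fast")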

There are currently about 100 PMUs installed in the eastern interconnection, up from zero in 2003, as part of the North American SynchroPhasor Initiative based at the Pacific Northwest National Laboratory in Richland, Wash. "We still need a couple of hundred more [PMUs] to get full coverage," Mansoor says, but he adds that they are already helping local utilities diagnose the causes of blackouts much faster than they could before.

Another challenge for keeping the grid balanced is the growing demand for electricity—increasing load, in other words—as consumers buy more computers, air conditioners and rechargeable handhelds. The U.S. Department of Energy's Energy Information Administration projects a load growth of 1.05 percent a year from now until 2030, which means transmission capacity will have to keep pace.
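
Compounding that projection gives a sense of the scale: 1.05 percent a year from 2008 through 2030 works out to roughly a 26 percent increase, as the back-of-the-envelope Python calculation below shows.

# Compounding the projected 1.05 percent annual load growth from 2008 to 2030.
rate = 0.0105
years = 2030 - 2008
growth = (1 + rate) ** years - 1
print(f"Load grows by about {growth * 100:.0f} percent over {years} years")
# Prints roughly 26 percent, the increase transmission capacity would have to carry.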

The main obstacle to building new transmission lines is siting, better known as the "not in my backyard" effect: Nobody wants power lines near them. One potential way of getting around that is so-called smart metering—hourly readouts of electricity usage that allow utilities to offer price discounts on power during off-peak times. Pilot smart-metering programs are under way in Idaho, California and other states.
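
The Python sketch below illustrates the basic idea behind time-of-use pricing with hourly meter readings; the peak window, rates and usage figures are invented for illustration and are not drawn from any actual pilot program.

# Toy time-of-use billing with hourly smart-meter readings.
# The rates, peak window and usage figures are invented for illustration.
PEAK_HOURS = range(14, 20)   # 2 p.m. to 8 p.m.
PEAK_RATE = 0.18             # dollars per kilowatt-hour
OFF_PEAK_RATE = 0.09         # discounted off-peak rate

usage_kwh = {hour: 1.2 for hour in range(24)}   # a flat 1.2 kWh every hour
usage_kwh[17] = 3.5                             # air conditioner running at 5 p.m.

bill = sum(kwh * (PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE)
           for hour, kwh in usage_kwh.items())
print(f"Day's bill: ${bill:.2f}")
# Shifting the 5 p.m. load to an off-peak hour would lower the bill,
# which is the incentive smart metering is meant to create.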

Mansoor notes that advanced metering tools might become more useful as intermittent power sources spread. Wind power, for example, stops and starts with the breeze, which means system operators would have to adjust the load to compensate. Although wind energy accounts for 19.5 gigawatts of generating capacity in the U.S., or less than 2 percent of total power generation, it represented 35 percent of new generating capacity installed in 2007, up from 5 percent in 2003.

An alternative to conventional power lines in dense urban areas is power cables based on high-temperature superconductor (HTS) technology. When chilled to –321 degrees Fahrenheit (77 kelvins, or –196 degrees Celsius), the composite material yttrium barium copper oxide carries current with almost zero resistance, so HTS power cables can be made smaller than the copper kind while moving the same amount of power.

In a concept called the secure supergrid, HTS cables would bolster existing transmission lines and would resist the stresses that can cause blackouts, because the lines shut down when the current spikes (reflecting the "almost" in an HTS cable's "almost zero resistance"). Some researchers have proposed combining an HTS supergrid with a coast-to-coast hydrogen pipeline to supply fuel cells for cars and homes.

The Long Island Power Authority switched on a $50-million, 69-kilovolt HTS system in April to supply power to up to 300,000 homes. Consolidated Edison Company of New York and the U.S. Department of Homeland Security have commissioned cables for a $40-million supergrid system in downtown Manhattan known as Project Hydra, scheduled for operation in 2010.

None of these tools would guarantee the extinction of large blackouts. When researchers study very complex systems, whether they be power grids or sandpiles, they often find a simple relationship: the frequency of catastrophes falls off only slowly as their size grows, so very large events, such as blackouts or avalanches, remain surprisingly common. "If you look at all the steps that have been taken since 2003, I think overall the risk is less today than it was in 2003," Mansoor says. "But the risk is always there."
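
A toy simulation makes that point about complex systems concrete: drawing event sizes from a heavy-tailed (power-law) distribution, the kind of pattern blackout statistics appear to follow, shows that very large events turn up far more often than a bell curve would suggest. The Python snippet below uses purely synthetic numbers, not NERC data.

import random

random.seed(1)

# Synthetic, heavy-tailed ("power law") event sizes standing in for blackout
# sizes; purely made-up numbers, not NERC statistics.
sizes = [random.paretovariate(1.2) for _ in range(100000)]

for threshold in (10, 100, 1000):
    count = sum(size > threshold for size in sizes)
    print(f"events more than {threshold} times the minimum size: {count}")
# The counts fall off far more slowly than they would under a bell curve,
# which is why very large events keep showing up.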

*Correction (8/14/08): This article originally stated that FERC has approved 83 new reliability standards; that number refers to the first standards to take effect on June 18, 2007.