Nuclear Power: What Went Wrong?
WHEN NUCLEAR POWER WAS FIRST PROPOSED IN THE 1940S, it seemed like a gift from heaven: a cheap, clean, and inexhaustible source of electricity. In the decades since, scientific research and political developments have brought new advantages to the fore: Nuclear power generates no greenhouse gases, and its fuel does not come from countries of doubtful stability. Yet despite all these selling points, no new American nuclear plant has been ordered in nearly 25 years. In the popular imagination, nuclear power is associated with a host of problems, from releases of radioactivity and the threat of meltdowns to the question of waste disposal. Fear of terrorist bombing is the latest source of anxiety.
Did it have to happen this way? Even if early proponents sometimes overstated their case, one can imagine nuclear power providing a much larger share of America’s electricity needs today, as it does in France (see sidebar “How the French Do It,” page 52). The technology has its flaws, to be sure, but the French have shown that they are not insurmountable. Why did nuclear power fail to achieve its potential in America?
One important reason is that the technology was pushed too quickly. The outlook was very bright on Labor Day 1954 in Shippingport, Pennsylvania (near Pittsburgh). President Dwight D. Eisenhower started a bulldozer that broke ground for the nation’s first commercial nuclear power plant. It was an exercise in beating swords into plowshares, for not only had nuclear technology grown out of the wartime Manhattan Project, but the reactor at Shippingport would be of a type that Westinghouse was building for the Navy’s nuclear submarines. In a speech, Lewis Strauss, chairman of the Atomic Energy Commission (AEC), spoke of a day when electricity from the atom would be—in a phrase that was later quoted with irony—“too cheap to meter.” As with today’s Internet services, customers would pay a low fixed monthly fee and use as many kilowatts as they wished.
The road to Shippingport began in Idaho, where an experimental reactor had produced 100 kilowatts of electricity in 1951. In 1954 Congress endorsed Eisenhower’s Atoms for Peace program by allowing private companies to own nuclear reactors. The AEC launched a Power Reactor Demonstration Program (PRDP), which offered financial assistance and research-and-development support to such firms. Westinghouse and General Electric, which were building reactors for the Navy, announced that they were ready to construct civilian versions as well. During 1955 and 1956 the AEC broadly declassified the technology for such power reactors.
Utility executives responded with caution. They were cranking out their megawatts quite happily using hydroelectric dams and coal-fired boilers, and they had little desire to embrace this new and arcane technical realm. However, they also felt the hot breath of Washington. Barely 20 years earlier, President Franklin D. Roosevelt had rocked the electrical industry by setting up the Tennessee Valley Authority, a federally owned utility that sought to demonstrate low costs and effective management. It was easy to imagine a nuclear TVA, a new federal initiative that might limit future opportunities for growth in the private sector. This prospect grew stronger when the AEC built a 5-MW demonstration reactor at its Argonne National Laboratory, near Chicago.
The pioneer among private firms was the Duquesne Light Company, based in Pittsburgh. That famously sooty town had recently enacted smoke-abatement restrictions and other pollution controls, and nuclear power looked like a way around the new rules. Even before the PRDP was announced, Duquesne made its commitment. With the AEC paying most of the bill, the company purchased a modified naval reactor. It was rated at 60 MW, enough in those days for a city of 100,000 people, and it entered service at Shippingport in 1957.
THE PRDP NEXT PERSUADED THREE OTHER UTILITIES TO build their own reactors. These included the 160-MW Yankee Rowe plant, built in western Massachusetts by a consortium of New England power companies; it launched a move toward larger sizes. Other utilities worked outside the PRDP, believing that only the discipline that comes with paying one’s own bills could reveal whether nuclear power would be a paying proposition. The unsubsidized pioneers included Chicago’s Commonwealth Edison, with its 180-MW Dresden plant, and New York’s Consolidated Edison, with its 163-MW Indian Point facility.
Early results were mixed. As expected, the cost of nuclear fuel proved to be very low. A few tons would last a year. This contrasted with the continuing demands of a coal-burning plant for endless trainloads of fuel. (The same applied to plants running on oil and natural gas, though the great majority of fossil fuel plants were coal-fired, as they still are today.) In service, the nuclear reactors were dependable and largely trouble-free. They even accommodated operation at modestly increased power levels. However, in a sign of things to come, several of these installations sustained large cost overruns during construction. This was bad, for they were built with borrowed money. Meanwhile, coal companies were cutting the price of their product by shipping it in dedicated “unit trains” to reduce transportation costs.
Even so, nuclear power was about to break through, and much of the reason involved a dramatic surge in demand for electricity. In only 20 years, from 1955 to 1975, the nation’s installed electrical generating capacity would increase more than fourfold, from 102,000 MW to 480,000. The utility industry had been a staid and conservative establishment, regulated by state boards and relying on coal and hydro technologies that dated from the turn of the century. Rapid growth now made it a go-go field, like aerospace and electronics. This transformation encouraged boldness, as executives turned to coal and nuclear plants of unprecedented size.
Still, with commercial reactors overrunning their budgets amid a flood of cheap coal, the AEC had to offer additional incentives. It did so in 1962 with a new round of PRDP funding. In response, the Connecticut Yankee utility group announced plans for a 490-MW installation near Hartford, and the Los Angeles Department of Water and Power placed orders for two more, at Malibu and San Onofre. The Malibu project lay close to an earthquake fault and in time was abandoned, but San Onofre was completed.
This activity broke a seven-year dry spell for the nuclear industry, during which the only orders for plants of substantial size had come from overseas. Coal was more expensive in other countries, while atomic power benefited from their governments’ support. Even so, it was clear that most domestic utilities would order nukes only with financial sweeteners. The AEC was doing its part with the new PRDP, and General Electric saw that it could help the infant industry as well. This corporate behemoth, which had vastly greater financial strength than any utility, could offer what amounted to loss leaders. It would design and sell plants at a fixed price, accept the prospect of overruns, lose a great deal of money—and still position itself to take the lead in nuclear energy if widespread demand materialized.
In December 1963 Jersey Central Power and Light ordered a 650-MW plant at Oyster Creek from GE at a fixed price of $66 million. If GE’s Oyster Creek estimates were valid, nuclear power might be ready to compete with coal. But other utilities still held back. Some of them obtained bids from reactor manufacturers merely to provide leverage in negotiating with coal suppliers. During 1965, though, coal executives hardened their position. They stopped cutting their rates, for sales were expanding rapidly—another consequence of the nation’s soaring demand for kilowatts.
Coal faced problems of its own. The plants of the day burned high-sulfur varieties that sent copious quantities of sulfur oxides up the smokestack. It was possible to remove these pollutants with stack-gas scrubbers, but they were costly. And Americans were beginning to call for action in the fight for cleaner air. If scrubbers were made mandatory (which finally happened in 1977), they would bring a marked increase in the cost of a coal-fired plant. Indeed, such a plant with scrubbers might be more expensive than a nuke.
The upshot was that nuclear energy made its breakthrough. During 1966 and 1967 some 50 nukes were placed on order; they represented close to half the new capacity being purchased by U.S. utilities, and their combined power ratings came to a little more than 41,000 MW—as much as the entire nation had had in service at the start of World War II. Yet the industry’s burgeoning prosperity would eventually cause its downfall, for under the press of surging consumer demand, reactor design moved well ahead of operating experience.
Typical unit sizes jumped from 500 to 800 MW, at a time when the largest plant already in service was Dresden, in Illinois, which had been upgraded to 200 MW. The first big reactor, the 436-MW installation at San Onofre, did not reach full power until December 1967, and by then units approaching 1,100 MW were on order. James Jasper, an environmentalist writer, notes that “at no time from 1962 to 1972 were there any plants in operation as large as the smallest of those being ordered.”
To be sure, nuclear reactors of all sizes use the same basic principles in their operation. But the massive core of a 1,000-megawatt reactor proved to introduce difficult and costly issues of safety. By contrast, coal-plant designs could be changed virtually at will, because such plants were not capable of releasing massive amounts of radiation.
The big new reactors brought with them a particular difficulty related to their size: emergency core cooling. This became the make-or-break issue for nuclear power. Despite lurid public fears, a reactor could not explode like an atomic bomb, and fires, floods, earthquakes, and releases of radioactivity all were manageable risks. Core cooling, however, would turn out to be the industry’s greatest and most persistent safety concern.
Even after being shut down, reactors continue to generate heat. This “afterheat” is unavoidable; it comes from the intense but short-lived radioactivity of newly formed elements produced by the fissioning of uranium. In small, early reactors, afterheat had caused few difficulties. It melted the uranium-filled core of a test reactor during a 1955 experiment, but this brought no permanent loss. Scientists fabricated a new core, lowered it into position, and put the reactor back in use.
BUT THE 1,000-MEGAWATT TYPES HAD CORES THAT weighed several hundred tons, and the meltdown of such a core threatened far more serious consequences. The solution lay in the emergency core cooling system (ECCS). This was an array of piping, pumps, and valves that could continue to inject water into an overheating reactor. Big reactors had several such systems to provide backup in case of failure. They were built to keep the water flowing for days, while the afterheat gradually died down. The new ECCS designs had not been fully tested, but this did not ring alarm bells at first. Any problems, it was thought, could be fixed as they arose.
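To convey the scale of the problem, here is a minimal sketch (not from the article) of how afterheat declines, using the Way-Wigner rule of thumb for decay heat. The prior operating time and the assumed 3,000-MW thermal rating of a roughly 1,000-megawatt (electric) plant are illustrative assumptions.

# A rough sketch of how reactor afterheat dies down after shutdown, using the
# Way-Wigner approximation. The operating time and thermal rating below are
# illustrative assumptions, not figures from the article.

def decay_heat_fraction(t_after_shutdown_s, t_operating_s=1.0e7):
    """Approximate fraction of full thermal power produced t seconds after shutdown."""
    t, T = t_after_shutdown_s, t_operating_s
    return 0.0622 * (t ** -0.2 - (t + T) ** -0.2)

full_power_mw_thermal = 3000.0  # assumed thermal rating of a ~1,000-MW(e) plant
for label, seconds in [("1 minute", 60), ("1 hour", 3_600),
                       ("1 day", 86_400), ("1 week", 604_800)]:
    frac = decay_heat_fraction(seconds)
    print(f"{label:>8}: {100 * frac:4.1f}% of full power, "
          f"about {frac * full_power_mw_thermal:4.0f} MW of heat")

Under these assumptions the core is still shedding several megawatts of heat a full week after shutdown, which is why the emergency systems had to keep water flowing for days rather than minutes.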
The growing popularity of air-conditioning brought a new surge in demand for electricity from 1970 through 1974. Utilities ordered 126 new reactors, net of cancellations, with a total capacity of 145,000 MW. During 1974 the industry reached a peak, with 223 units in service, on order, or under construction. The first 1,000-megawatt plants entered service during 1973 and 1974, though in the beginning they were held below that level to ensure safety.
Early experience suggested that while nukes cost more to build than coal-fired units, they saved money on fuel and so delivered their power at a lower running cost per kilowatt-hour. But the nagging question of ECCS design persisted. As early as 1966 the AEC had identified emergency core cooling as its highest safety priority. It announced plans to build a reactor that would serve as an ECCS test-bed, evaluating such cooling systems with a real overheating core. That program encountered delays, but other studies went ahead. In 1971 calculations suggested that a hot core might produce steam at a pressure high enough to keep the ECCS water from entering the hottest areas. The water would simply bypass the high-pressure region, allowing a meltdown to proceed. Here was an important warning that reactor design was outrunning the designers’ understanding of the safety issues involved.
While the AEC had the legal authority to ensure safety by issuing regulations, its commissioners were a can-do group that tended to dismiss any suggestion that reactors were unsafe. One reason the ECCS test-bed reactor was unavailable was that some of its funding had been diverted to more glamorous projects; the agency preferred to work on new and advanced reactor designs instead of refining old ones. The regulatory staff remained small through the 1960s. In 1969, three years after the issue had come to the forefront, the AEC had only three people evaluating ECCS designs for the new reactors being ordered. In 1971 Milton Shaw, the AEC’s director of reactor safety, told a reporter for Science that “at the drop of a hat I can spell out fifteen areas where we could do more research in reactor safety. Drop two hats and I’ll spell out thirty areas. There’s virtually no limit on the work we can do.”
As concern rose and the antinuclear movement grew, the AEC responded by quadrupling the size of its regulatory staff. Emergency cooling was one reason; another was the sheer number of new reactors on order, many of which had unique features. This commitment to regulation gained further strength in 1975, as the AEC split into two new agencies. Since its founding in 1946, the commission had worked both to promote the growth of nuclear power and to regulate it, but this dual role posed a clear conflict of interest. Under the new arrangement, safety went to the Nuclear Regulatory Commission (NRC) and reactor programs went to the Energy Research and Development Administration (ERDA).
The year 1974 was the last good one for the industry, with 20 new plants, net of cancellations, placed on order. The previous fall had brought the first energy crisis, as oil-producing nations in the Middle East cut the flow of petroleum and boosted its price. This caused a burst of inflation along with a recession, sharply slowing the growth in demand for electric power. To reduce their capital outlays, utilities canceled a number of orders for both coal and nuclear plants and stretched out the construction schedules of others. The high cost of building nuclear plants made them particularly attractive candidates for cancellation.
Coal-fired plants had also grown more expensive, but nukes took an even greater cost increase because of the new regulations. The federal agencies were required by law to pursue safety without regard to cost. This led them repeatedly to insist on design changes in units already under construction and to require the retrofitting of new equipment. Directives were often as thick as telephone books, with compliance being mandatory. They came forth in a steady stream; one firm counted 145 such regulations issued during the 1970s. Builders of new plants thus found themselves facing costly delays as they struggled to hit a moving regulatory target.
The ECCS remained at the top of the agenda. In 1975 some 90 percent of all money for research on power reactors was committed to this topic, while other safety questions, such as seismic stability, received attention too. The NRC adopted stringent standards for pipes, valves, and the integrity of welds. It also showed concern about reactor domes or containment structures, which provided a last line of defense in case of a major accident. If an ECCS proved inadequate, water might reach the superhot core and flash into steam. If the dome was not strong enough, the resulting explosion might blow it open, producing a plume of gases that could spread radioactivity over a wide area. When the NRC ruled in 1975 that one of GE’s containment structures was defective, it took the company six years to get a modified design approved.
WHILE THE TIME required to build a coal-fired plant held steady at around 5 years, the time for a nuke went from 5.5 to 8.5 years between 1967 and 1974. This brought substantially greater expense for interest and provided more time for inflation to take its toll. For nuclear power plants ordered during 1974, interest plus inflation came to nearly half the total average cost of $730 million.
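The arithmetic behind that statement can be sketched in a few lines, assuming an even spending profile, a $400 million base construction cost, and a combined 14 percent annual rate of interest plus inflation; these are illustrative assumptions, not figures from the article.

# Illustrative only: the even quarterly spending profile, the $400 million
# base cost, and the 14% combined rate of interest plus inflation are
# assumptions chosen for this sketch, not figures from the article.

def financed_cost(base_cost_millions, years, annual_rate=0.14):
    """Total cost when spending is spread evenly over the schedule and each
    outlay accrues compound interest/inflation until the plant is finished."""
    quarters = int(round(years * 4))
    outlay = base_cost_millions / quarters
    q_rate = (1 + annual_rate) ** 0.25 - 1
    total = 0.0
    for q in range(quarters):
        quarters_accruing = quarters - q - 0.5  # mid-quarter spending convention
        total += outlay * (1 + q_rate) ** quarters_accruing
    return total

base = 400.0  # assumed base construction cost, in millions of dollars
for years in (5.5, 8.5):
    total = financed_cost(base, years)
    print(f"{years} years: about ${total:.0f} million in all, of which "
          f"${total - base:.0f} million is interest plus inflation")

With these assumed numbers, stretching the schedule from 5.5 to 8.5 years pushes the financed total from roughly $590 million to roughly $735 million, with interest and inflation accounting for nearly half of the larger figure, broadly consistent with the average cost cited above.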
The nuclear industry received a tidbit of good news in 1978, as work with the ECCS test-bed reactor reached completion. In all cases the ECCS performed better than expected. It held temperatures in the core below calculated estimates and thereby eased concerns about the adequacy both of the emergency cooling systems themselves and of the computer models used in their design.
That same year brought a surge in new power-plant orders—but not for nuclear plants. Construction and regulatory costs had increased in tandem with public concern, making most utilities decide that nukes simply weren’t worth the trouble. The nation’s power companies ordered 32 large generating units during 1978, and only 2 of them were nuclear. These came from Chicago’s Commonwealth Edison, builder of the original Dresden plant, which held a strong ongoing commitment to atomic power. (To this day Illinois has more nuclear plants than any other state.) Even so, the contracts with Westinghouse specified that Commonwealth Edison could withdraw on payment of a negligible cancellation fee, which it did in 1988.
That was the state of the nuclear industry at 4:00 A.M. on March 28, 1979. At that moment, maintenance workers were cleaning sludge from a small pipe at the Three Mile Island nuclear plant near Harrisburg, Pennsylvania. Inadvertently, they blocked the flow of water in the main system. The reactor and its steam turbines shut down automatically as loudspeakers broadcast a warning: “Turbine trip! Reactor trip!”
The reactor was full of water, and the afterheat flashed some of it into steam, producing a pressure surge that popped a relief valve. The valve opened properly, the steam escaped, and three emergency pumps started up to restore the flow of water. At that point all should have been well.
However, water from those pumps had to pass through two valves to reach the reactor core, and someone had left those valves closed. The relief valve, though, remained open. Liquid water began to flow out of the reactor vessel at the rate of 220 gallons per minute. The operators were not aware that this valve was open. One instrument gave a faulty reading, indicating that the core was still full of water. Technicians responded by shutting down other emergency pumps.
Now the reactor lacked both its normal and its emergency sources of cooling. Its temperature continued to rise. Warning horns blared in the control room, as many as 50 at once. The station manager arrived three hours later and declared a general emergency. By then the afterheat had begun to melt the core.
Growing problems with instruments made it difficult both to learn the state of the core and to learn whether corrective measures were having the desired effects. It took 16 hours before the operators succeeded in restoring a flow of cooling water. They re-established an appropriate water level in the core, and the next day a spokesman declared that the emergency had ended: “There was nothing that was catastrophic or unplanned for.” Actually, there was plenty. The partial meltdown had promoted chemical reactions in which steam oxidized the hot zirconium tubes that held the nuclear fuel, releasing a substantial amount of hydrogen at high pressure. This gas collected at the top of the vessel that held the crippled nuclear reactor. No one had anticipated this. It raised the danger that air might mix with the hydrogen, forming an explosive combination. If it detonated, it could rip open the containment structure and blast huge quantities of radioactivity into the surrounding area.
LOCAL RESIDENTS WERE STILL DIGESTING THE SPOKESMAN’S reassurances when Gov. Richard Thornburgh responded to this new and serious hazard. He advised pregnant women and young children to leave the area, and 140,000 people did just that. Far from being under control, the damaged plant now seemed to threaten the entire Harrisburg area.
No one died at Three Mile Island; in fact, no one was injured. Technicians managed to bring the hydrogen under control by circulating hot water to dissolve it. The accident nonetheless spread fear across the nation and won the attention of President Jimmy Carter, who had studied nuclear engineering at the U.S. Naval Academy. He set up a review panel headed by John Kemeny, the president of Dartmouth College.
The committee issued its report in October 1979, and its conclusions were scathing. It directed particular fire against the Nuclear Regulatory Commission, asserting that far from overregulating the industry, it had been asleep at the switch. The report declared that “the evidence suggests that the NRC sometimes erred on the side of the industry’s convenience rather than carrying out its primary mission of assuring safety.” It warned of what it saw as a widespread attitude of complacency toward nuclear power plants and called for “fundamental changes” in the “organization, procedures, and practices—and above all—in the attitudes” of both the NRC and the industry.
Kemeny’s group found that panels in the Three Mile Island control room contained hundreds of indicator lights, which followed no fixed rule when signaling an abnormal condition; depending on the light, it might be red, green, amber, or white. Important gauges were hidden from view on the backs of consoles, and the most significant displays were scattered rather than grouped where they all could be read at once.
In fact, this sort of design deficiency had caused the accident. It started with a relief valve that remained open, spilling water. For hours operators thought it was closed, but its indicator light showed merely that it had been commanded to close—not that it actually had closed. There was no way to double-check the valve’s status.
Other flaws existed in the reactor itself. A key valve could not be opened by remote control but had to be handled manually. Main pumps shut down because they were incapacitated by steam from the overheating core; these pumps had never been tested for such a condition. The buildup of hydrogen accelerated the overheating by blocking the circulation of cooling water, and the system lacked gas valves that could have allowed the hydrogen to escape. Important pipes ran uphill, preventing trapped water from flowing to the core.
Facing these design deficiencies and Kemeny’s calls for greater regulation, the NRC responded with a burst of additional directives. With this, and amid greatly increased public hostility toward nuclear power, the market for new plants collapsed totally. The nuclear industry had remained busy even after 1974, for although utilities canceled 27 orders during the subsequent three years, they placed 11 new ones. As late as 1977 there were 4 new orders. But the Commonwealth Edison quasi-orders of 1978 were the industry’s last.
In 1980, amid a rush of post-Three Mile Island regulations, the projected time to build a nuclear plant increased to 12 years. That same year, with inflation in double digits and the prime rate exceeding 21 percent, the industry had a backlog of 130 plants already ordered or under construction. The new regulatory climate brought sharp increases in the cost of labor and equipment, which more than doubled between 1979 and 1983, over and above the effects of inflation. The cost of labor took a particularly steep upward swing because projects now required more workers: engineers, craft workers, field supervisors, and quality-control specialists.
Coal-fired plants could be built to the construction standards of a chemical plant or an oil refinery, but nukes needed much more. Conventional good practice in welding, as on a natural gas pipeline, was no longer adequate. Instead, welds had to be meticulously certified using X-rays or ultrasound, and if the proofs of quality were not in order, the work might have to be ripped out and redone. Pipes, valves, pumps, and other components faced their own high (and costly) standards for quality and reliability.
As utilities learned just how hazardous nuclear power could be to their financial health, many of them threw in the towel. They canceled 58 plants between 1979 and 1985, including 28 already under construction. Every plant ordered after 1974 ended up being dropped. But plants from earlier orders kept entering service, with 18 new units coming on line from 1983 to 1985. Five years later the final active projects limped to completion, bringing the total to 108 units in 1990. By then the oldest and smallest plants were obsolete. The Duquesne plant at Shippingport, dating from 1957, shut down in 1982, and others of its era followed. The number in service has dropped modestly since, falling to 103 in 2001. This is less than half the 223 that were in prospect at the industry’s zenith in 1974.
What went wrong? The industry’s brief nuclear summer during the 1960s and 1970s was based on a promise of cost-effectiveness that proved to be illusory. This resulted from too rapid growth with too little attention to safety. The ECCS issue should have been resolved earlier; the AEC’s test-bed reactor might have finished its job considerably sooner than 1978. Yet the test data, when available, did not bring a return to the days of low-cost nuclear enthusiasm. The trouble ran deeper.
Three Mile Island did not destroy a healthy industry. Rather, it accelerated trends in regulation, and the attendant costs of compliance, that had already laid the industry low. It did not vindicate the antinuclear movement by proving that nuclear power was unacceptably dangerous; if it had, there would not be more than 100 nuclear plants in operation today. Rather, it disclosed flaws in design, particularly in the critical yet often overlooked area of presenting information to operators. These flaws allowed what should have been a minor and easily handled problem to spiral into an accident that seemed to threaten the entire Harrisburg area. The Soviet Union’s 1986 Chernobyl meltdown and the growing concern over the disposal of nuclear waste were additional nails in nuclear power’s coffin.
Still, for an industry that died more than 20 years ago, nuclear power shows remarkable vitality, holding steady at 18 to 20 percent of U.S. power generation. Three trends are likely to shape its destiny in the decades to come. The first of these is a concentration of plant ownership within fewer but stronger companies. These big corporations, such as Duke Energy, Exelon, and Entergy, have raised standards by employing large professional staffs, pooling experience with a number of units, and applying the best practices widely.
RISING STANDARDS HAVE PROMOTED ANOTHER TREND, whereby nuclear reactors are pumping out more power than ever before. Since Three Mile Island about 50 new units have come on stream, and they have been quite large. More important, the entire fleet has steadily improved its time in use. The reactors of 1980 typically were down for more than five months of the year, often for replacement of failed equipment. The plants of 2000 set a record by being down for an average of only six weeks, exclusive of shutdowns for regularly scheduled maintenance and refueling. In consequence, the nuclear fleet turned out 769 billion kWh—three times as much as it had produced 20 years earlier, and more nuclear energy than any other country generates.
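A back-of-the-envelope check shows what that output implies about how hard the fleet was running; the installed capacity used below is an assumption, and only the 769 billion kWh figure comes from the text.

# Rough check of fleet utilization. Only the 769 billion kWh output comes from
# the text; the installed capacity is an assumed round number for illustration.

installed_capacity_mw = 98_000      # assumed U.S. nuclear capacity around 2000
hours_per_year = 8_760
output_kwh = 769e9                  # fleet output cited in the text

max_possible_kwh = installed_capacity_mw * 1_000 * hours_per_year
capacity_factor = output_kwh / max_possible_kwh
equivalent_downtime_weeks = 52 * (1 - capacity_factor)

print(f"Implied capacity factor: {capacity_factor:.0%}")
print(f"Equivalent downtime: about {equivalent_downtime_weeks:.0f} weeks per year")

An implied capacity factor of roughly 90 percent, under these assumptions, is what allows total output to keep climbing even as the number of reactors slowly declines.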
Today’s plants are also likely to stay in service for many decades into the future. They were originally licensed for 40 years of operation. However, the Calvert Cliffs and Oconee stations, with a total of five reactors in Maryland and South Carolina, have recently had their licenses renewed for an additional 20 years. Richard Meserve, chairman of the NRC, expects that the vast majority of today’s plants will also receive extensions.
Some optimists even envision new orders, now that the price of a plant has declined from its mid-1980s peak. Today’s nuclear power is a mature technology that has settled its safety and regulatory issues. Nevertheless, its cost disadvantage relative to coal is even greater than it was in 1978. The Economist wrote recently that instead of being too cheap to meter, nuclear power could become “too costly to matter.” Moreover, a cautious American public still remembers Three Mile Island. Thus, while the number of plants in service is holding steady and their cumulative output continues to climb, a half-century of effort has failed to disclose a means whereby nuclear power can be both adequately safe and cost-competitive with coal, and barring a tax on carbon emissions or some other radical reordering of priorities, that is unlikely to change.
History holds examples of successful technologies that were developed at an artificially fast pace. Early nuclear power enthusiasts could point to the Manhattan Project, as well as to the triumphs of electronics, jet propulsion, and the missile and space programs. Nuclear submarines were also developed very rapidly, in part because their reactors were small enough to avoid problems of emergency core cooling. The pattern in each case was to wait for a crucial breakthrough and then go full speed ahead. It worked for those technologies, but with commercial nuclear power, it turned out to be exactly the wrong approach.