HALF A CENTURY ago, nuclear power was on track to out-compete fossil fuels around the globe, which would have reduced the price of electricity, the amount of harmful air pollution, and the greenhouse gas emissions associated with climate change. Then new construction slowed dramatically, as did research into safer and more efficient nuclear reactors.
According to Australian National University researcher Peter Lang, the '60s and '70s saw a transition "from rapidly falling costs and accelerating deployment to rapidly rising costs and stalled deployment." Had the initial trajectory continued, he writes in the journal Energies, nuclear-generated electricity would now cost around 10 percent of what it does today. In a counterfactual scenario in which nuclear uptake kept increasing from 1976, Lang calculates that by 2015 it would have replaced all coal-burning and three-quarters of gas-fired electric power generation. Over the past 30 years, nuclear could thus have substituted for 186,000 terawatt-hours of fossil-fuel electricity production, avoiding up to 174 gigatons of carbon dioxide emissions and 9.5 million air pollution deaths. Cumulative global carbon dioxide emissions would be about 18 percent lower, and annual global carbon dioxide emissions would be one-third less.
THE OYSTER CREEK Nuclear Generating Station in New Jersey opened in 1969. It cost $594 million (in 2017 dollars) and took four years to build. America's newest nuclear plant, at Watts Bar in Tennessee, opened in 2016. It cost $7 billion and took more than 10 years to complete.
What happened? Anti-nuclear activism and regulation.
The 1971 D.C. Circuit Court case...