A Small Cost Will Avoid a Catastrophe

How much hotter will human-caused emissions of carbon dioxide make the planet this century? Jim Manzi is correct that “The United Nations Intergovernmental Panel on Climate Change (IPCC) represents the largest existing global effort to answer such technical questions.” But he is not correct that under the current “IPCC consensus forecast” for business-as-usual (BAU) emissions trends, “global temperatures will rise by about 3°C by the year 2100.”

As I explained in a recent Nature online article, the latest IPCC report finds that, absent a sharp reversal of BAU trends, we are headed toward atmospheric levels of carbon dioxide far exceeding 1,000 parts per million by 2100. The IPCC’s “best estimate” for the temperature increase is 5.5°C (10°F), which means that over much of the inland United States, temperatures would be about 15°F higher. At that level of warming, the world faces multiple miseries, including:

  1. Sea level rise of 80 feet to 250 feet at a rate of 6 inches a decade (or more).
  2. Desertification of one third of the planet and drought over half of the planet, plus the loss of all inland glaciers.
  3. More than 70% of all species going extinct, plus extreme ocean acidification.

Is it 100% certain that 1,000 ppm would result in the three major impacts above? Of course not. Climate scientists don’t spend a lot of time studying the impacts of 1,000 ppm, in part because they can’t believe humanity would be so self-destructive as to ignore their increasingly dire warnings and fail to stabilize at well below 550 ppm. And yet the IPCC itself warned last year that “as global average temperature increase exceeds about 3.5°C [relative to 1980-99], model projections suggest significant extinctions (40-70% of species assessed) around the globe.”

Such certainty is not possible for a climate transition that is completely unprecedented in the history of the human species. That said, these impacts are more likely to be underestimates than overestimates. The catastrophes we can’t foresee may be just as serious, given that, for instance, no one foresaw that at a mere 385 ppm, warming would help spur an infestation that is wiping out essentially every major pine tree in British Columbia.

The traditional economic modeling work by Yale’s William Nordhaus cited by Manzi is quite irrelevant in the face of 1,000 ppm warming. In fact, even a 3% chance of a warming this great is enough to render useless all traditional cost-benefit analyses that argue for delay or only modest action, as Harvard economist Martin Weitzman has shown. Yet, absent immediate and strong action, the chances of such warming and such effects are not small, they are large — greater than 50%.

These impacts seem especially likely in a 1,000 ppm world given that the climate appears to be changing much faster than the IPCC had projected. The Greenland and Antarctic ice sheets already appear to be shrinking “100 years ahead of schedule,” as Penn State climatologist Richard Alley put it in March 2006. Indeed, a number of peer-reviewed articles have appeared in the scientific literature in the past 18 months supporting the real possibility of a 6-inch-a-decade sea level rise by century’s end. I don’t know a single climate expert who thinks any significant amount of ice would survive a 1,000 ppm world.

As for desertification, “The unexpectedly rapid expansion of the tropical belt constitutes yet another signal that climate change is occurring sooner than expected,” noted one climate researcher in December. As a recent study led by NOAA noted, “A poleward expansion of the tropics is likely to bring even drier conditions to” the U.S. Southwest, Mexico, Australia, and parts of Africa and South America. Last year, Science (subs. req’d) published research that “predicted a permanent drought by 2050 throughout the Southwest”: levels of aridity comparable to the 1930s Dust Bowl would stretch from Kansas to California. And they were only looking at a 720 ppm case! The Dust Bowl was a sustained decrease of about 15% in soil moisture (which “is calculated by subtracting evaporation from precipitation”).

Even the UK Hadley Center scenario of one-third of the planet becoming desert by 2100 is based on only 850 ppm. Princeton has done an analysis, “Century-scale change in water availability: CO2-quadrupling experiment,” which is to say roughly 1,100 ppm (four times the preindustrial level of about 280 ppm). The grim result: Most of the South and Southwest ultimately sees a 20% to 50% (!) decline in soil moisture.

How fast could we hit 1,000 ppm? The Hadley Center, the U.K.’s official center for climate change research, has one of the few models that incorporates many of the major carbon cycle feedbacks. In a 2003 Geophysical Research Letters (subs. req’d) paper, “Strong Carbon Cycle Feedbacks in a Climate Model with Interactive CO2 and Sulphate Aerosols,” the Hadley Center found that the world would hit 1,000 ppm in 2100 even in a scenario that, absent those feedbacks, would only have hit 700 ppm in 2100. I would note that the Hadley Center, though more inclusive of carbon cycle feedbacks than most other models, still does not model any feedback from the melting of the tundra, even though that is probably the most serious of the amplifying feedbacks.

Clearly, anywhere near 1,000 ppm would be ruinous to the nation and the world, creating unimaginable suffering and misery for billions and billions of people for centuries to come. No one who believes in science and cares about humanity can possibly believe it is rational or moral to come anywhere near 1,000 ppm or the tipping points in the carbon cycle.

Where exactly are those tipping points? The Hadley work suggests they lie well below 700 ppm. The tipping point most climate scientists I know worry about is the point at which we start to lose a substantial fraction of the tundra’s carbon to the atmosphere — substantial being 0.1% per year! After all, the tundra contains nearly 1,000 billion metric tons of carbon (some 3,600 billion metric tons of carbon dioxide). That exceeds all the carbon currently in the atmosphere. And much of it is expected to be released in the form of methane, which is 20 times more potent a greenhouse gas than carbon dioxide.
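To put those numbers in perspective, here is a back-of-the-envelope check. The 44/12 carbon-to-CO2 mass ratio is standard chemistry; the figure of roughly 2.1 billion metric tons of carbon per ppm of atmospheric CO2 is a conversion factor I am assuming for illustration, not a number from the studies cited above.

```python
# Back-of-the-envelope check on the tundra carbon numbers (all figures approximate).

TUNDRA_CARBON_GT = 1_000   # billion metric tons of carbon (GtC), from the text
C_TO_CO2 = 44.0 / 12.0     # mass ratio of CO2 to carbon (standard chemistry)
GTC_PER_PPM = 2.1          # assumed conversion: ~2.1 GtC per ppm of atmospheric CO2
CURRENT_PPM = 385          # atmospheric CO2 level cited in the essay

tundra_co2 = TUNDRA_CARBON_GT * C_TO_CO2      # ~3,700 GtCO2 ("some 3,600" in the text)
annual_loss_c = TUNDRA_CARBON_GT * 0.001      # a 0.1% per year loss -> ~1 GtC/yr
annual_loss_co2 = annual_loss_c * C_TO_CO2    # ~3.7 GtCO2/yr
atmospheric_c = CURRENT_PPM * GTC_PER_PPM     # ~800 GtC currently in the air

print(f"Tundra carbon expressed as CO2: ~{tundra_co2:,.0f} GtCO2")
print(f"A 0.1%/yr loss: ~{annual_loss_c:.1f} GtC/yr (~{annual_loss_co2:.1f} GtCO2/yr)")
print(f"Carbon currently in the atmosphere: ~{atmospheric_c:,.0f} GtC")
```

In other words, even a 0.1% annual loss would add on the order of a gigaton of carbon to the air every year, a meaningful fraction of what fossil-fuel burning currently emits, on top of everything we emit ourselves.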

What is the point of no return for the tundra? A major 2005 study (subs. req’d) led by NCAR climate researcher David Lawrence found that virtually the entire top 11 feet of permafrost around the globe could disappear by the end of this century.

Using the first “fully interactive climate system model” applied to study permafrost, the researchers found that if we tried to stabilize CO2 concentrations in the air at 550 ppm, permafrost would plummet from over 4 million square miles today to 1.5 million. If concentrations hit 690 ppm, permafrost would shrink to just 800,000 square miles.

[Figure: NCAR projection of shrinking permafrost area under rising CO2 concentrations]

This suggests that we had better stay well below 550 ppm if we are going to avoid destruction of the tundra. As Hadley found, this event will be followed closely by the catastrophic “loss of soil carbon due to enhanced soil respiration as the climate warms and dieback of the Amazon forest [begins] due to regional rainfall reduction.”

And that means current CO2 levels are already too high. And that means, contrary to what Manzi said, immediate action is required. So our choice is really to stay below 450 ppm or risk self-destruction. That’s why climate scientists are so damn desperate these days. That’s why a non-alarmist guy like Rajendra Pachauri — specifically chosen as IPCC chair in 2002 after the Bush administration waged a successful campaign to have him replace the outspoken Dr. Robert Watson — said in November:

If there’s no action before 2012, that’s too late. What we do in the next two to three years will determine our future. This is the defining moment.

That’s why more than 200 scientists took the remarkable step of issuing a plea at the United Nations climate change conference in Bali. Global greenhouse gas emissions, they declared, “must peak and decline in the next 10 to 15 years, so there is no time to lose.” The AP headline on the statement was “Scientists Beg for Climate Action.”

The good news is that a host of independent analyses, including from the IPCC itself, make clear that the cost of keeping carbon dioxide concentrations at or below 450 ppm is very low indeed. Let’s just look at what the IPCC said in its most recent assessment, since, as Manzi says, it “represents the largest existing global effort to answer such technical questions.” In its definitive scientific synthesis report from November 2007, the IPCC and its member governments — including the Bush administration — agree that action is very affordable:

In 2050, global average macro-economic costs for mitigation towards stabilisation between 710 and 445 ppm CO2-eq… [correspond] to slowing average annual global GDP growth by less than 0.12 percentage points.
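To see what a 0.12-percentage-point drag on growth means in practice, here is a rough compounding calculation. The 2.5% baseline growth rate and the 2008-2050 horizon are my illustrative assumptions, not numbers from the IPCC report.

```python
# Rough illustration of what shaving <0.12 percentage points off annual GDP growth
# implies by 2050. Baseline growth rate and time horizon are assumed for illustration.

BASELINE_GROWTH = 0.025   # assumed 2.5% average annual global GDP growth
MITIGATION_DRAG = 0.0012  # 0.12 percentage points, the IPCC's upper bound
YEARS = 2050 - 2008       # assumed horizon of 42 years

gdp_no_action = (1 + BASELINE_GROWTH) ** YEARS                      # relative to today = 1.0
gdp_with_action = (1 + BASELINE_GROWTH - MITIGATION_DRAG) ** YEARS

shortfall = 1 - gdp_with_action / gdp_no_action
print(f"GDP multiple by 2050 without mitigation: {gdp_no_action:.2f}x today")
print(f"GDP multiple by 2050 with mitigation:    {gdp_with_action:.2f}x today")
print(f"Cumulative cost: about {shortfall:.1%} of 2050 GDP")
```

Under those illustrative assumptions, the world of 2050 ends up roughly 5% less wealthy than it otherwise would be, while still being well over two and a half times richer than it is today.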

How can the world’s leading governments and scientific experts agree that we can avoid catastrophe for such a small cost? Because that’s what the scientific and economic literature — and real-world experience — says:

Both bottom-up and top-down studies indicate that there is high agreement and much evidence of substantial economic potential for the mitigation of global GHG emissions over the coming decades that could offset the projected growth of global emissions or reduce emissions below current levels.

In fact, the bottom-up studies — the ones that look technology by technology, which I have always believed are more credible — have even better news:

Bottom-up studies suggest that mitigation opportunities with net negative costs have the potential to reduce emissions by around 6 GtCO2-eq/yr in 2030.
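A rough sense of scale for that 6 GtCO2-eq/yr figure, assuming global energy-related CO2 emissions of roughly 30 GtCO2 per year today (my ballpark assumption, not a number from the quoted passage):

```python
# Rough scale check: how big is 6 GtCO2-eq/yr of negative-cost mitigation potential?
# The ~30 GtCO2/yr baseline for current emissions is an assumption for illustration.

NEGATIVE_COST_POTENTIAL = 6.0     # GtCO2-eq per year in 2030 (from the IPCC quote)
ASSUMED_CURRENT_EMISSIONS = 30.0  # GtCO2 per year, assumed current global CO2 emissions

share = NEGATIVE_COST_POTENTIAL / ASSUMED_CURRENT_EMISSIONS
print(f"Negative-cost potential as a share of assumed current emissions: {share:.0%}")
```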

That is, a 20% reduction in global emissions might be possible in a quarter century with net economic benefits. But don’t we need new technologies? Of course, but we don’t need — and can’t afford — to sit on our hands when we have so many cost-effective existing technologies:

There is high agreement and much evidence that all stabilisation levels assessed can be achieved by deployment of a portfolio of technologies that are either currently available or expected to be commercialised in coming decades, assuming appropriate and effective incentives are in place for their development, acquisition, deployment and diffusion and addressing related barriers.

The bottom line is that if we want to avoid destroying life on Earth as we know it today, we need to do two things at once: aggressively deploy existing technology (with a variety of government policies, including carbon prices and government standards) and aggressively finish developing and commercializing key technologies and systems that are in the pipeline. Anyone who argues for just doing the latter is disputing a very broad scientific and economic understanding.

Also from this issue

Lead Essay

  • The prospect of potentially catastrophic global warming forces us to make decisions under extreme uncertainty. Yet, Jim Manzi writes, “Despite the rhetoric, the best available estimate of the damage we face from unconstrained global warming is not ‘global destruction,’ but is instead costs on the order of 3 percent of global GDP in a much wealthier world well over a hundred years from now.” Manzi explores how best to evaluate the costs of greenhouse gas abatement on the present-day economy when compared to the long-term benefits of avoiding global warming. He concludes that there are very few benefits from these steps.

Response Essays

  • American Progress Senior Fellow Joseph Romm argues that atmospheric CO2 has already reached an unacceptable level, and that urgent action is needed in the next few years. Fortunately, this action need not involve prohibitive costs. Indeed, many possible options for greenhouse gas abatement will result in economic benefits.

    These changes are desperately needed, too, before global warming reaches a tipping point beyond which the carbon sequestered in permafrost is also released into the atmosphere, aggravating the problem. Should we fail to act, widespread desertification, massive species extinction, and other catastrophic events are predicted, even by authorities whom Jim Manzi also accepts.

  • Indur Goklany argues, in response to Jim Manzi and Joseph Romm, that solving the likely problems resulting from global warming will be both cheaper and more effective than any global response aimed at stabilizing or changing the climate itself. Harm reduction will also pay important dividends regardless of the degree of global warming, since it will include the development of new treatments for diseases, better flood protection, improved crops, and general economic advancements for the developing world. When taken together, these factors will help us to face any global warming scenario effectively, and they will also offer even larger benefits outside any considerations of climate.

  • Michael Shellenberger and Ted Nordhaus describe what they see as a significant political realignment: Both left and right, they claim, are converging on a state-sponsored and technology-based solution to global warming, one that will emphasize clean energy and/or carbon sequestration technologies. They argue that the debate about climate modeling is largely irrelevant and/or unproductive, because these technologies are generally agreed to be important in their own right and to have positive economic effects regardless of the degree of severity of global warming. They call on policymakers to embrace a large-scale, state-funded effort to achieve these breakthrough technologies and argue that state sponsorship for technological advancement is, historically speaking, the engine of much progress and innovation. This, they argue, is a reason to embrace the same approach with regard to global warming.