About this Issue

Global warming conjures up images of flooded cities, hurricanes, devastated coastlines, and (as on our banner this month) rather desperate polar bears. While virtually no one doubts the reality of climate change, assessing its extent and crafting a prudent and proportional response raise problems of their own.

Which, if any, of the many climate change estimates are accurate? How bad will the damage be? Which approaches offer the best value in terms of protecting property and natural resources, while generating the fewest risks and side effects of their own? In short: How much would we or should we pay today for a future without global warming?

The problem grows more difficult when we realize that proposed global warming solutions have often been victims of the domestic political process — rightly or wrongly — or else have been unacceptable to developing nations. Understandably, these nations may value the social and economic advancement of their own citizens above whatever environmental damage may or may not result from global warming. At least for now, the only path out of poverty may still be paved with coal.

To discuss the way forward on this complex and truly global issue, we have invited Jim Manzi, statistician and Chief Executive Officer of Applied Predictive Technologies, whose proposals to conservatives on the issue have generated significant discussion. In response to his essay, we have invited environmental expert and frequent Cato Institute author Indur Goklany; climate scientist Joseph J. Romm, a Senior Fellow at the Center for American Progress; and Michael Shellenberger and Ted Nordhaus, the co-founders of The Breakthrough Institute, a think tank whose mission includes encouraging an “equitable and accelerated transition to the clean energy economy.”


Lead Essay

Keeping Our Cool: What to Do about Global Warming

The danger of potentially catastrophic global warming is an almost paradigmatic case of decisionmaking under conditions of extreme uncertainty. Of course, this is just another way of saying that many of the intellectual sinews of libertarianism are central to thinking through this problem.

Let’s start at the beginning.

Carbon dioxide (CO2) is a greenhouse gas, simply meaning that it absorbs and redirects infrared radiation but not shorter-wavelength radiation. All else equal, the more CO2 molecules in the atmosphere, the hotter it gets. How much hotter is a complicated question and has been the subject of intense scientific inquiry over the past several decades.

The United Nations Intergovernmental Panel on Climate Change (IPCC) represents the largest existing global effort to answer such technical questions. The current IPCC consensus forecast is that, under fairly reasonable assumptions for world population and economic growth, global temperatures will rise by about 3°C by the year 2100. Also according to the IPCC, a 4°C increase in temperatures would cause total estimated economic losses of 1–5 percent of global GDP. By implication, if we were at 3°C of warming at the end of this century, we would be well into the 22nd century before we reached a 4°C rise, with its associated level of cost.

This is the central problem for advocates of rapid, aggressive emissions reductions. Despite the rhetoric, the best available estimate of the damage we face from unconstrained global warming is not “global destruction,” but is instead costs on the order of 3 percent of global GDP in a much wealthier world well over a hundred years from now.

It should not, therefore, be surprising that formal efforts to weigh the near-term costs of emissions abatement against the long-term benefits from avoided global warming show few net benefits, even in theory. According to the modeling group led by William Nordhaus, a Yale professor widely considered to be the world’s leading expert on this kind of assessment, an optimally designed and implemented global carbon tax would provide an expected net benefit of around $3 trillion, or about 0.2 percent of the present value of global GDP over the next several centuries. While not everything that matters can be measured by money, this certainly provides a different perspective than the “Earth in the balance” rhetoric would suggest.
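For scale, the base of that 0.2 percent figure is the discounted value of centuries of world output (my arithmetic, using the essay's two numbers):

```latex
\[
\frac{\$3\ \text{trillion}}{0.002} = \$1{,}500\ \text{trillion of present-value global GDP}
\]
```

This is why a sum that sounds enormous in isolation registers as little more than a rounding error within the model.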

Now, in absolute terms, $3 trillion is normally thought of as an amount of money that’s worth pursuing. So why shouldn’t we implement such a tax?

To understand why, let’s move from the world of academic model-building to the real world of geostrategic competition and domestic politics. To realize this gain of $3 trillion, we would have to agree to, and enforce, a global, harmonized tax on all significant uses of carbon and other greenhouse gases in any material form. This would require the agreement of — just to take a few examples — the Parliament of India, the Brazilian National Congress, the Chinese Politburo, Vladimir Putin, John Dingell, and the U.S. ethanol lobby. Each of these entities and individuals has been known to elevate narrow, sectarian interests above the comprehensive good of all mankind through all time, to put it mildly. Not to mention the fact that the governments of China and India, the most important emitting nations of the 21st century, continue to reiterate in formal, public statements that they have no intention of sacrificing economic development in order to reduce emissions.

All this aside, let’s imagine we actually could negotiate such a binding agreement. Isn’t it possible that all the side deals required to get this done would create enough economic drag to more than offset the benefit of 0.2 percent of the present value of global output? Our track record in closing and implementing such deals as the Kyoto Protocol, or even the current round of WTO negotiations — which, remember, is supposed to make the signatories richer — shouldn’t inspire much confidence that the theoretical net benefits will outweigh the costs created by the agreement. Indeed, today we are not even considering an actual U.S. carbon tax, which is preferred by almost all academic economists for this purpose, but instead a cap-and-trade system (i.e., emissions rationing), because it is more politically palatable to hide the costs to consumers this way. Yet even the staggering list of side deals, offsets, special auctions, and so forth that were added to the Lieberman–Warner cap-and-trade bill was not enough to build a winning congressional coalition within this inefficient framework.

Further, even if we got to an agreement de jure, we would then have to enforce a set of global laws for hundreds of years that would run directly contrary to the narrow self-interest of most people currently alive on the planet. How likely do you think a rural Chinese official would be to enforce the rules on a local coal-fired power plant? These bottom-up pressures would likely render such an agreement a dead letter, or at least make it in effect a tax applicable only to the law-abiding developed countries that represent an ever-shrinking share of global carbon emissions.

In summary, then, the best available models indicate that 1) global warming is a problem that is expected to have only a limited impact on the world economy, and 2) it is economically rational to reduce this impact only slightly through global carbon taxes. Further, practical knowledge of the world indicates that 1) such a global carbon-tax regime would be very unlikely ever to be implemented, and 2) even if it were implemented, the theoretical benefits it might create would almost certainly be more than offset by the economic drag such a regime would produce. Other than that, it sounds like a great idea.

One objection that has been raised against this analysis is that the discount rate embedded in these calculations is immoral because we shouldn’t privilege our welfare at the expense of future generations. The most prominent document arguing for this approach is the “Stern Review,” produced by the British government in 2006. It is cited frequently as demonstrating that the world should begin immediate, aggressive emissions abatement. But consider the practical realism of the Stern discount rate assumption. Imagine a scenario in which global warming would lead to zero costs between now and the year 2200, at which point global economic growth would be permanently reduced by 0.1 percent — in other words, that economic output starting in 2200 would be 99.9 percent of what it would have been had there been no global warming. Under this scenario, how much should we be willing to pay today as a lump sum to avoid this cost? Using the assumptions of the Stern Review, we would pay about $30 trillion, or about half of the world’s entire annual economic output.
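To see how the discount rate drives this result, here is a minimal sketch of the present-value arithmetic. All parameter values (world GDP, growth rate, the two discount rates, the horizon) are illustrative assumptions of mine, not figures from the essay or the Stern Review; the point is only how sensitive the answer is to the rate.

```python
# Present value today of losing a fixed fraction of world GDP every year,
# starting far in the future. All parameters are illustrative assumptions.

def pv_of_future_loss(gdp0, growth, discount, start_yr, end_yr, loss_frac):
    """PV of losing loss_frac of world GDP each year in [start_yr, end_yr)."""
    return sum(
        loss_frac * gdp0 * (1 + growth) ** t / (1 + discount) ** t
        for t in range(start_yr, end_yr)
    )

GDP0 = 50e12     # assumed current world GDP, ~$50 trillion
GROWTH = 0.013   # assumed long-run real growth rate
LOSS = 0.001     # 0.1% of output lost each year, starting ~192 years out (2200)

# A Stern-like near-zero effective rate versus a conventional rate:
for r in (0.014, 0.05):
    pv = pv_of_future_loss(GDP0, GROWTH, r, 192, 992, LOSS)
    print(f"discount rate {r:.1%}: PV ≈ ${pv / 1e12:,.1f} trillion")
```

On these assumptions, the near-zero rate prices the remote 0.1 percent loss in the tens of trillions of dollars, while the conventional rate prices it at nearly nothing — the whole dispute in one number.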

Thanks, but no thanks. Basically, you cannot find a discount rate low enough to justify giving up a lot of wealth today in return for a few percent of wealth many decades from now without also committing yourself to other results that no practical person would accept.

A second, much more serious, objection is that our forecasts for warming impacts might be wrong, and global warming could turn out to be substantially worse than these models predict. Climate models are, at a minimum, non-validated. Predicting the cost impact of various potential warming scenarios requires us to concatenate these climate predictions with economic models that predict the cost impact of these predicted temperature changes on the economy in the 21st, 22nd, and 23rd centuries. It is hubris to imagine that these can guarantee accuracy, and impossible to validate such a claim in any event.

Now, climate and economics modelers aren’t idiots, so it’s not like this hasn’t occurred to them. Competent modelers don’t assume only the most likely case, but build probability distributions for levels of warming and associated economic impacts (e.g., there is a 5 percent chance of 4.5°C warming, a 10 percent chance of 4.0°C warming, and so on). The economic calculations that comprise, for example, the analysis by William Nordhaus that I referenced earlier are executed in just this manner. So, the possibility of “worse than expected” impacts really means, more precisely, “worse than our current estimated probability distribution.” That is, we are concerned here with the inherently unquantifiable possibility that our probability distribution itself is wrong.
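For concreteness, this is all a probability-weighted expectation amounts to; the warming levels, probabilities, and damage figures below are made-up placeholders, not Nordhaus’s numbers.

```python
# Collapsing an assumed probability distribution over warming outcomes into
# one risk-adjusted damage estimate. All numbers are placeholders.

scenarios = [  # (warming in °C, assumed probability, assumed damage as % of GDP)
    (2.0, 0.25, 0.5),
    (3.0, 0.40, 1.5),
    (4.0, 0.20, 3.0),
    (4.5, 0.10, 5.0),
    (6.0, 0.05, 10.0),
]
assert abs(sum(p for _, p, _ in scenarios) - 1.0) < 1e-9  # probabilities sum to 1

expected = sum(p * damage for _, p, damage in scenarios)
print(f"risk-adjusted expected damage: {expected:.2f}% of GDP")
# "Worse than expected" therefore means the distribution itself is wrong,
# a possibility the expectation, by construction, cannot quantify.
```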

So, once we clear away the underbrush, we can see that the case for a carbon tax or a cap-and-trade emissions rationing system is really that it would be a hedge against the risk that actual damages from warming would be much, much worse than current risk-adjusted projections indicate. Would it be a smart investment?

To evaluate this, start with the observation that the primary purpose of such a tax or rationing system is not to encourage conservation per se, but rather to induce the development of new technologies that can de-link economic growth from damaging accumulations of atmospheric carbon dioxide. Increasing the price or scarcity of carbon would cause some direct reduction in fossil-fuel consumption (e.g., biking to work instead of driving), and get more people to use some pre-existing technologies (e.g., efficient light bulbs), but these effects would be limited. Hairshirts are not enough. We would have to develop new technologies that use energy more efficiently, emit less carbon per unit of energy, remove carbon from the atmosphere, and/or reduce the harm done by carbon dioxide. The real costs of a program to address global warming are crucially dependent on how much time and money it would take to develop and diffuse these technologies, plus the incremental costs per unit of energy (if any) they would impose once deployed.

This explains why carbon tax or rationing advocates pay lip service to the naïve idea that the developing world will impose large carbon taxes if we just “lead by example.” Of course, under the reasonable assumption that the relevant technologies will be developed primarily in the United States, Europe, Japan, Canada, and Australia, it doesn’t really matter that much whether the developing world puts a price on carbon or not. The global crusade is a smokescreen. The goal is to create an artificial scarcity of carbon in the developed world.

If you made such a tax high enough, or rationing stringent enough, this would probably “work,” in that it would spur new technological development; but it would be insanely expensive. It’s hard to predict accurately the costs of such a program, but even the kinds of programs currently under debate would certainly start at hundreds of billions of dollars in the United States alone, and the carbon prices they imply would very likely still not be high enough to incentivize the creation of the desired technologies.

The only real argument for rapid, aggressive emissions abatement boils down to the point that you can’t prove a negative. If it turns out that even the outer edge of the probability distribution of our predictions for global-warming impacts is enormously conservative, and disaster looms if we don’t change our ways radically and this instant, then we really should start shutting down power plants and confiscating cars tomorrow morning. We have no good evidence that such a disaster scenario is imminent, but nobody can conceivably prove it to be impossible. Once you get past the table-pounding, any rationale for rapid emissions abatement that confronts the facts in evidence is really a more or less sophisticated restatement of the precautionary principle: the somewhat grandiosely named idea that the downside possibilities are so bad that we should pay almost any price to avoid almost any chance of their occurrence.

But to force massive change in the economy based on such a fear is to get lost in the hothouse world of single-issue advocates, and become myopic about risk. We face lots of other unquantifiable threats of at least comparable realism and severity. A regional nuclear war in Central Asia, a global pandemic triggered by a modified version of HIV, or a rogue state weaponizing genetic engineering technology all come immediately to mind. Any of these could kill hundreds of millions of people. Scare stories are meant to be frightening, but we shouldn’t become paralyzed by them.

In the face of massive uncertainty on multiple fronts the best strategy is almost always to hedge your bets and keep your options open. Wealth and technology are raw materials for options. The loss of economic and technological development that would be required to eliminate literally all theorized climate change risk would cripple our ability to deal with virtually every other foreseeable and unforeseeable risk, not to mention our ability to lead productive and interesting lives in the meantime. The precautionary principle is a bottomless well of anxieties, but our resources are finite — it’s possible to buy so much flood insurance that you can’t afford fire insurance.

So if there is a real, though unquantifiably small, possibility of catastrophic climate change, and if we would ideally want some technological hedges as insurance against this unlikely scenario, and if raising the price of carbon to induce private economic actors to develop these technologies would be a far more expensive means of accomplishing this than is advisable, then what, if anything, should we do about the danger?

One obvious approach is to have the government fund technology research directly. The danger here, of course, is that we end up back in the failed game of industrial policy. Such dangers are not merely theoretical. The federal government was the key sponsor of, for example, the shale oil and large-scale wind turbine debacles in response to the energy crisis thirty years ago. Setting the right scope for such a program and managing the funding process carefully would each be essential to prevent it from becoming corporate welfare.

We should limit government investments to those topics that meet specific criteria. They should be related to detecting or ameliorating the effects of global warming, should serve a public rather than a private need, and should provide no obvious potential source of profit to investors if successful. Important examples include improved global climate prediction capability, visionary biotechnology to capture and recycle carbon dioxide emissions, or geo-engineering projects to change the albedo of the earth’s surface or atmosphere. On the other hand, most technologies that would contribute to the ongoing long-run transition of the economy away from fossil fuels, like more efficient fuel cells for autos or lower-cost solar power sources, need no government funding, since there is ample profit motive to develop them. As evidence, massive amounts of venture funding and large-company internal capital allocations are flowing to these opportunities right now. Government attempts to direct such development would almost certainly destroy value through political allocation of resources.

The agency for funding any government-sponsored research should be explicitly modeled on the Defense Advanced Research Projects Agency (DARPA). The character of such an agency would be a very high-IQ staff with wide flexibility in providing small grants. In addition, this program should have a heavy emphasis on large prizes for accomplishing measurable and audacious goals. As an example, the British entrepreneur Richard Branson has offered a $25 million prize to anyone who demonstrates a device that removes carbon from the atmosphere — what if the U.S. government upped the ante to $1 billion and pledged to make any resulting technology freely available to the world? That would hold the potential of solving any global warming problem that might develop, at a one-time cost of less than 0.01% of U.S. GDP.
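As a rough check on that figure, assuming U.S. GDP of roughly $14 trillion at the time (my assumption, not the essay’s):

```latex
\[
\frac{\$1\ \text{billion}}{\$14\ \text{trillion}} \approx 0.007\% < 0.01\%\ \text{of U.S. GDP}
\]
```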

The incremental cost of this research program could be single-digit billions per year, hopefully with partially offsetting spin-off benefits. DARPA’s total annual budget is about $3 billion, and unlike Al Gore it really did invent the Internet (original name: ARPANET). In fact, it’s important that the honeypot be kept small enough, and be doled out in small enough increments, that it’s not worth it for either Congress or Fortune 100 companies to try to direct the spending politically.

Of course, it would still be a government program, and therefore rife with inefficiencies. But consider that its costs would be on the order of 1/100th of the costs of imposing a large U.S. carbon tax. It could be massively inefficient and we would still be far better off in actually developing the long-lead-time technologies that we would want if faced with a currently unanticipated emergency.

Hedging against the risk to future generations of potential unanticipated impacts from global warming is a legitimate job for the U.S. government. Ideally, it would be tackled by the governments of the small number of countries with a sophisticated technology development capability acting in some kind of coordinated fashion. A massive carbon tax, a cap-and-trade rationing system, and the attempt to use the government to control the evolution of the energy sector of the economy are all billed as prudent reactions to this risk, but each is the opposite: an impractical, panicky reaction unworthy of a serious government.

Response Essays

A Small Cost Will Avoid a Catastrophe

How much hotter will human-caused emissions of carbon dioxide make the planet this century? Jim Manzi is correct that “The United Nations Intergovernmental Panel on Climate Change (IPCC) represents the largest existing global effort to answer such technical questions.” But he is not correct that under the current “IPCC consensus forecast” for business-as-usual emissions (BAU) trends, “global temperatures will rise by about 3°C by the year 2100.”

As I explained in a recent Nature online article, the latest IPCC report finds that, absent a sharp reversal of BAU trends, we are headed toward atmospheric levels of carbon dioxide far exceeding 1,000 parts per million by 2100. IPCC’s “best estimate” for temperature increase is 5.5°C (10°F), which means that over much of the inland United States, temperatures would be about 15°F higher. At such warming, the world faces multiple miseries, including:

  1. Sea level rise of 80 feet to 250 feet at a rate of 6 inches a decade (or more).
  2. Desertification of one third of the planet and drought over half of the planet, plus the loss of all inland glaciers.
  3. More than 70% of all species going extinct, plus extreme ocean acidification.

Is it 100% certain that 1,000 ppm would result in the three major impacts above? Of course not. Climate scientists don’t spend a lot of time studying the impacts of 1,000 ppm, in part because they can’t believe humanity would be so self-destructive as to ignore their increasingly dire warnings and fail to stabilize at well below 550 ppm. And yet the IPCC itself warned last year that as global average temperature increase exceeds about 3.5°C [relative to 1980-99], model projections suggest significant extinctions (40-70% of species assessed) around the globe.

Such certainty is not possible for a climate transition that is completely unprecedented in the history of the human species. That said, the impacts are, if anything, more likely to be worse than projected than better. The catastrophes we can’t foresee may be just as serious, given that, for instance, no one foresaw that at a mere 385 ppm, warming would help spur an infestation that is wiping out essentially every major pine tree in British Columbia.

The traditional economic modeling work by Yale’s William Nordhaus cited by Manzi is quite irrelevant in the face of 1,000 ppm warming. In fact, even a 3% chance of a warming this great is enough to render useless all traditional cost-benefit analyses that argue for delay or only modest action, as Harvard economist Martin Weitzman has shown. Yet, absent immediate and strong action, the chances of such warming and such effects are not small, they are large — greater than 50%.

These impacts seem especially likely in a 1,000 ppm world given that the climate appears to be changing much faster than the IPCC had projected. The Greenland and Antarctic ice sheets already appear to be shrinking “100 years ahead of schedule” as Penn State climatologist Richard Alley put it in March 2006. Indeed, a number of peer-reviewed articles have appeared in the scientific literature in the past 18 months supporting the real possibility of a 6-inch-a-decade sea level rise by century’s end. I don’t know a single climate expert who thinks any significant amount of ice would survive a 1,000 ppm world.

As for desertification, “The unexpectedly rapid expansion of the tropical belt constitutes yet another signal that climate change is occurring sooner than expected,” noted one climate researcher in December. As a recent study led by NOAA noted, “A poleward expansion of the tropics is likely to bring even drier conditions to” the U.S. Southwest, Mexico, Australia, and parts of Africa and South America. Last year, Science (subs. req’d) published research that “predicted a permanent drought by 2050 throughout the Southwest” — levels of aridity comparable to the 1930s Dust Bowl would stretch from Kansas to California. And they were only looking at a 720 ppm case! The Dust Bowl was a sustained decrease in soil moisture of about 15% (calculated by subtracting evaporation from precipitation).

Even the one-third desertification of the planet by 2100 in the scenario from the UK’s Hadley Center is based on only 850 ppm (in 2100). Princeton has done an analysis on “Century-scale change in water availability: CO2-quadrupling experiment,” which is to say 1,100 ppm. The grim result: Most of the South and Southwest ultimately sees a 20% to 50% (!) decline in soil moisture.

How fast can we hit 1,000 ppm? The Hadley Center, the U.K.’s official center for climate change research, has one of the few models that incorporate many of the major carbon cycle feedbacks. In a 2003 Geophysical Research Letters (subs. req’d) paper, “Strong Carbon Cycle Feedbacks in a Climate Model with Interactive CO2 and Sulphate Aerosols,” the Hadley Center found that the world would hit 1,000 ppm in 2100 even in a scenario that, absent those feedbacks, would have hit only 700 ppm in 2100. I would note that the Hadley Center model, though more inclusive of carbon cycle feedbacks than most others, still does not include any feedback from the melting of the tundra, even though that is probably the most serious of the amplifying feedbacks.

Clearly, anywhere near 1,000 ppm would be ruinous to the nation and the world, creating unimaginable suffering and misery for billions and billions of people for centuries to come. No one who believes in science and cares about humanity can possibly believe it is rational or moral to come anywhere near 1,000 ppm or the tipping points in the carbon cycle.

Where exactly are those tipping points? The Hadley work suggests it is well below 700 ppm. The tipping point most climate scientists I know worry about is the point at which we start to lose a substantial fraction of the tundra’s carbon to the atmosphere — substantial being 0.1% per year! After all, the tundra contains nearly 1,000 billion metric tons of carbon (some 3,600 billion metric tons of carbon dioxide). That exceeds all the carbon currently in the atmosphere. And much of it is expected to be released in the form of methane, which is 20 times more potent a greenhouse gas than carbon dioxide.
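The carbon-to-carbon-dioxide conversion here is just the standard molecular-weight ratio of CO2 (44) to C (12):

```latex
\[
1{,}000\ \text{Gt C} \times \frac{44}{12} \approx 3{,}670\ \text{Gt CO}_2
\]
```

which rounds to the “some 3,600” figure given above.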

What is the point of no return for the tundra? A major 2005 study (subs. req’d) led by NCAR climate researcher David Lawrence, found that virtually the entire top 11 feet of permafrost around the globe could disappear by the end of this century.

Using the first “fully interactive climate system model” applied to study permafrost, the researchers found that if we tried to stabilize CO2 concentrations in the air at 550 ppm, permafrost would plummet from over 4 million square miles today to 1.5 million. If concentrations hit 690 ppm, permafrost would shrink to just 800,000 square miles.


This suggests that we had better stay well below 550 ppm if we are going to avoid destruction of the tundra. As Hadley found, this event will be followed closely by the catastrophic “loss of soil carbon due to enhanced soil respiration as the climate warms and dieback of the Amazon forest [begins] due to regional rainfall reduction.”

And that means current CO2 levels are already too high. And that means, contrary to what Manzi said, immediate action is required. So our choice is really to stay below 450 ppm or risk self-destruction. That’s why climate scientists are so damn desperate these days. That’s why a non-alarmist guy like Rajendra Pachauri — specifically chosen as IPCC chair in 2002 after the Bush administration waged a successful campaign to have him replace the outspoken Dr. Robert Watson — said in November:

If there’s no action before 2012, that’s too late. What we do in the next two to three years will determine our future. This is the defining moment.

That’s why more than 200 scientists took the remarkable step of issuing a plea at the United Nations climate change conference in Bali. Global greenhouse gas emissions, they declared, “must peak and decline in the next 10 to 15 years, so there is no time to lose.” The AP headline on the statement was “Scientists Beg for Climate Action.”

The good news is that a host of independent analyses, including from the IPCC itself, make clear that the cost of keeping carbon dioxide concentrations at or below 450 ppm is very low indeed. Let’s just look at what the IPCC said in their most recent assessment, since, as Manzi says, it “represents the largest existing global effort to answer such technical questions.” In its definitive scientific synthesis report from November 2007, the IPCC and its member governments — including the Bush administration — agree that action is very affordable:

In 2050, global average macro-economic costs for mitigation towards stabilisation between 710 and 445 ppm CO2-eq… corresponds to slowing average annual global GDP growth by less than 0.12 percentage points.
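To translate that quoted growth slowdown into a level effect (my arithmetic, not the IPCC’s), compound a 0.12-percentage-point drag from now to 2050:

```latex
\[
(1 - 0.0012)^{42} \approx e^{-0.05} \approx 0.95
\]
```

That is, a worst case of roughly 5 percent lower global GDP in 2050 than it would otherwise be, in line with the range the IPCC pairs with this figure.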

How can the world’s leading governments and scientific experts agree that we can avoid catastrophe for such a small cost? Because that’s what the scientific and economic literature — and real-world experience — says:

Both bottom-up and top-down studies indicate that there is high agreement and much evidence of substantial economic potential for the mitigation of global GHG emissions over the coming decades that could offset the projected growth of global emissions or reduce emissions below current levels.

In fact, the bottom-up studies — the ones that look technology by technology, which I have always believed are more credible — have even better news:

Bottom-up studies suggest that mitigation opportunities with net negative costs have the potential to reduce emissions by around 6 GtCO2-eq/yr in 2030.

That is, a 20% reduction in global emissions might be possible in a quarter century with net economic benefits. But don’t we need new technologies? Of course, but we don’t need — and can’t afford — to sit on our hands when we have so many cost-effective existing technologies:

There is high agreement and much evidence that all stabilisation levels assessed can be achieved by deployment of a portfolio of technologies that are either currently available or expected to be commercialised in coming decades, assuming appropriate and effective incentives are in place for their development, acquisition, deployment and diffusion and addressing related barriers.

The bottom line is that if we want to avoid destroying life on Earth as we know it today, we need to do two things at once: aggressively deploy existing technology (with a variety of government policies, including carbon prices and government standards) and aggressively finish developing and commercializing key technologies and systems that are in the pipeline. Anyone who argues for doing just the latter is disputing a very broad scientific and economic understanding.

Reducing Vulnerability to Climate-Sensitive Risks is the Best Insurance Policy

Jim Manzi, noting that global warming will have relatively modest impacts on global economic product, argues for a DARPA-like government-supported program to expand technological options (e.g., developing backstop options such as geoengineering or “visionary biotechnology”) as a relatively cheap hedge in case the impacts of warming turn out to be worse than generally expected. However, Joseph Romm argues that “immediate action is required” and that the costs of staving off the risk of “self destruction” would be modest.

I will assume that Romm is right that something should be done about climate change beyond Manzi’s research approach, although I don’t share Romm’s faith in the results of climate modeling and in their implications for action. First, there are unanswered questions regarding the quality and integrity of the surface temperature data that must necessarily be used to develop, calibrate, verify, and validate climate models (see Watts 2007, Watts 2008, Hale et al. 2006 (subscription required), Pielke et al. 2007a (pdf), Pielke et al. 2007b (subscription required), and McKitrick and Michaels 2007 (pdf)). Second, it’s unknown how much confidence, if any, should be placed not only in climate models (Koutsoyiannis et al. 2008) but also in the biophysical (Botkin et al. 2007 (pdf)) and socioeconomic models (Goklany 2007 (pdf)) that use the outputs of these climate models to estimate future impacts. Typically, they aren’t verified and validated at the appropriate geographical scales with observational data beyond that used to develop the models themselves. Third, projections extending beyond a few decades are inherently suspect (see Goklany 2008a).

While Romm’s approach essentially focuses on climate change to the exclusion of other problems, I’ll take a broader perspective. First, I’ll examine whether climate change is the most important problem facing the world today and in the foreseeable future. Next I’ll address how we can most effectively reduce damages from climate change while also advancing human and environmental well-being.

My analysis uses results of studies by scientists who are in good standing with the IPCC. Specifically, it uses mortality estimates from the World Health Organization (WHO) and estimates of the global impacts of climate change from the British-government sponsored “Fast Track Assessments” (FTAs) which were extensively referenced in the Stern Review and the IPCC’s latest assessment. In fact, many FTA authors were major contributors to the IPCC’s latest assessment. Cost estimates are taken from the IPCC and the United Nations Millennium Project. Details on the analysis can be found in my paper “What to Do about Climate Change” (Goklany, 2008b).

With respect to the first issue, the WHO’s analysis indicates that climate change is responsible for less than 0.3 percent of the present-day global health and mortality burden (Goklany 2008c). In fact, a dozen other environmental, food, and nutrition-related risk factors contribute more to the global death toll than climate change. For example, hunger’s annual contribution is over twenty times larger, unsafe water’s is ten times larger, and malaria’s is six times larger. With respect to ecological factors, habitat conversion continues to be the single largest demonstrated global threat to species and biodiversity. Thus climate change is not the most important problem facing today’s population.

With respect to the foreseeable future, which optimistically may extend to 2085 (if then), the FTAs indicate that under the IPCC’s warmest (A1FI) scenario, which the Hadley Center’s HadCM3 model projects will increase average global temperature by 4°C between 1990 and 2085, climate change will contribute about 10 percent of the cumulative death toll from hunger, malaria — a surrogate for vector-borne diseases in general — and flooding (figure 1).

Figure 1: Cumulative mortality in 2085 under various IPCC scenarios. The IPCC’s labels for these scenarios (A1FI, A2, B2, B1) and the corresponding Hadley Center estimate of average global temperature increase between 1990 and 2085 are shown under each bar.

Surprisingly, climate change would reduce the net global population at risk of water stress (Goklany 2008a).

Regarding environmental well-being, the FTA results indicate that in 2100, under the same scenario, despite a population increase, cropland could decline from 11.6 percent of global land area in the base year (1990) to less than half that (5.0 percent) (Goklany 2008d). That is, climate change may well relieve today’s largest threat to species and biodiversity! Also, non-climate-change-related factors will dominate the global loss of coastal wetlands between 1990 and 2085.

Clearly, other problems outrank climate change now and through the foreseeable future, whether the concern is human well-being or global ecology. And thus these other environmental and public health problems should take precedence over climate change.

Let’s now examine the most effective method of reducing damages from climate change while also advancing well-being.

Since climate change contributes 10 percent of the global mortality from hunger, malaria, and coastal flooding in 2085 (under the warmest scenario), rolling back climate change to its 1990 level (i.e., “maximum mitigation”) would at most reduce mortality from these three risk factors by 10 percent. Thus, even if we eliminate climate change, annual mortality would vary from 2 million to 6 million in 2085, depending on the IPCC scenario employed. The Kyoto Protocol, on the other hand, would reduce climate change by less than 10 percent in 2085-2100. Hence, as a first approximation, had the United States participated and if all nations had met their obligations fully, the protocol would have reduced mortality by less than 1 percent (i.e., 10 percent of 10 percent) in 2085.

By contrast, if we focus our efforts on reducing societies’ vulnerabilities to hunger, malaria, and coastal flooding today, we would reduce not only the 10 percent of the problem due to climate change, but also the remainder of the problem, the 90 percent due to other factors, in the 2085 worst-case scenario. For example, with respect to malaria, worldwide vulnerability could be reduced through the development of a malaria vaccine, more effective insecticides, or improved therapies. Such measures would target the total malaria problem and reduce its toll regardless of whether it was caused by climate change or other factors.

Reduction of malaria vulnerability is an example of what I call “focused adaptation,” and this approach can be generalized to other climate-sensitive risks that non-climate-change-related factors also contribute to. Under focused adaptation, we would focus our efforts on enhancing resilience and reducing vulnerability to climate-sensitive problems that are urgent today and that could be exacerbated by future climate change. These include problems such as malaria and other vector-borne diseases, hunger, water shortages, threats to biodiversity, extreme weather events, and so forth. Significantly, focused adaptation would also begin to reduce current problems in short order (e.g., the annual death toll due to malaria, hunger, and flooding of 4 million), whereas any significant benefits from climate change mitigation would be delayed by decades because of the inertia of the climate system.

Significantly, the technologies and practices needed to deal with these problems today will be the basis for dealing with the same problems in the future whether they are caused by climate change or other factors.

Thus, focused adaptation would target all of the mortality due to hunger, malaria, and coastal flooding from now through the future, whereas maximal climate mitigation would target 0.3 percent (at most) of these problems today, rising to 10 percent in 2085.

The FTA results also indicate that mitigation could actually increase both the net global population at risk of water stress (Goklany 2008a), and habitat loss (Goklany 2008d). This illustrates a major, but often-ignored, drawback of mitigation, namely, that it reduces all impacts of climate change, whether good or bad, while adaptation allows us to be selective.

So through the foreseeable future, the potential benefits of focused adaptation far outweigh those from even maximum mitigation. But what about costs?

The Kyoto Protocol, despite its minimal effectiveness, is estimated to cost around $165 billion annually (in 2010-2015). Although the cost of maximum mitigation has never been estimated, suffice it to say that it would cost orders of magnitude more. I will assume a lower bound of $165 billion per year. As will become evident, the precise costs of mitigation don’t matter for this analysis because of the enormous mismatch between the cost-to-benefit ratios of the adaptive approach versus mitigation.

Regarding the costs of focused adaptation, the UN Millennium Project and the IPCC’s latest assessment indicate that by 2015, malaria could be reduced by 75 percent for $3 billion per year, hunger by 50 percent for $12-15 billion per year (e.g., through development of crops that would grow better in poor climatic or soil conditions such as drought, water-logging, and high salinity), and vulnerability to coastal flooding significantly reduced for $2-10 billion per year (e.g., through building and strengthening coastal defenses, insurance reform, and improving early warning systems).

Combining these estimates, and allowing for population increases, I estimate that focused adaptation could reduce mortality due to hunger, malaria, and flooding by 64 percent in 2085, at a cost of $34 billion annually, compared to a 10 percent reduction under maximum mitigation at an annual cost well above $165 billion, or simply a 1 percent reduction under the Kyoto Protocol at an annual cost of $165 billion.
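A back-of-the-envelope check on this comparison, using only the round numbers just given (arithmetic on the essay’s own figures, not a reproduction of the underlying analysis):

```python
# Percentage points of 2085 mortality reduction bought per $billion per year,
# using the round figures given in the essay.

approaches = {
    "focused adaptation": (0.64, 34),    # 64% reduction, $34B/yr
    "maximum mitigation": (0.10, 165),   # 10% reduction, >$165B/yr (lower bound)
    "Kyoto Protocol":     (0.01, 165),   # ~1% reduction, $165B/yr
}

for name, (cut, cost_bn) in approaches.items():
    print(f"{name:18}: {100 * cut / cost_bn:.2f} points per $billion/yr")
# On these figures, focused adaptation is roughly 30 times more cost-effective
# than even the lower-bound cost of maximum mitigation.
```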

There is another, broader approach to reducing vulnerability to climate change. As Manzi notes, wealth and technology are the raw materials for developing future options. Note also that developing countries are most at risk from global warming not because they will experience greater climate change, but because they lack the adaptive capacity to cope with its impacts. Hence, we could also address climate change by enhancing that adaptive capacity through broad development — economic development, human capital formation, and the propensity for technological change — which, of course, is the point of sustainable economic development.

Moreover, since the determinants of adaptive and mitigative capacity largely are the same, enhancing the former also should boost the latter. Perhaps more importantly, advancing economic development and human capital formation also would advance society’s ability to cope with all manner of threats, whether climate-related or not.

The costs and benefits of sustainable economic development can be estimated from work done on the UN’s Millennium Development Goals (MDGs), which were devised to promote sustainable development in the developing world. The benefits associated with these goals — halving global poverty, hunger, and the lack of access to safe water and sanitation; reducing child and maternal mortality by 66 percent or more; providing universal primary education; and reversing growth in malaria, HIV/AIDS, and other major diseases — would exceed the benefits flowing from the deepest mitigation and even focused adaptation. Yet, according to the UN Millennium Project, the additional annual cost to the richest countries of attaining the MDGs by 2015 is about $165 billion annually. That is approximately the same cost as that of the ineffectual — but expensive — Kyoto Protocol.

The following table summarizes the costs and benefits of the four mitigation and adaptation approaches discussed above. Note that red letters and negative numbers indicate a deteriorating situation:

Table 1: Costs and benefits of various mitigation and adaptation approaches.

The table shows that through the foreseeable future, vulnerability reduction will provide far greater benefits than even the deepest mitigation, and at a lower cost. And these conclusions hold regardless of the choice of discount rate, or fanciful scenarios beyond the foreseeable future.

Some have argued for mitigation as an insurance policy. Although mitigation (and R&D to expand mitigation options) makes sense so long as its implementation is neither mandatory nor subsidized, reducing vulnerability to current climate-sensitive problems and enhancing adaptive capacity is a far superior insurance policy. It will, unlike mitigation, pay handsome dividends now and in the future, whether or not the climate changes, or in whichever direction it does change. It will reduce risks faster, more effectively, more surely, and by a greater amount. No less important, it would provide the world the wherewithal to deal with a much wider array of future problems, whether they are related to climate or not. In short, vulnerability reduction allows us to solve the urgent problems facing today’s generations and improve their well-being while providing the best hedge for future generations as well.

The New Climate Center: How Technology Could Create a Political Breakthrough

For 20 years, liberals and conservatives have been locked in a debate about the relative seriousness of climate change. Conservatives have either denied that it was happening or played down its significance, while liberals and environmentalists have tended to see it as ecological apocalypse meriting either extreme personal sacrifice or a supposed cost-free regulatory fix.

That debate is now undergoing a major shift. Conservatives like Jim Manzi, Newt Gingrich, and others recognize that humans are affecting the climate and that something should be done about it. Liberals and environmentalists, like Joe Romm and most recently Al Gore, are beginning to recognize the political futility of peddling sacrifice, and have started emphasizing the need to make clean energy cheap. To be sure, both camps are still far apart in their view of global warming, with Romm seeing it as a future hell on earth and Manzi viewing it as little more than a rounding error. But if we fixate on these radically divergent views of the problem we risk missing some signs of agreement over what should be done about it.

The Model Muddle

Liberals and conservatives both rely on highly complex climate and economic models to inform their views of what should be done. The problem is not so much that the models are inaccurate as that they must, by their nature, produce a wide range of possible future scenarios. Models thus offer very little certainty upon which to base our actions. Will global warming result in so little damage that it is not worth investing any amount of money in cleaner energy sources? Or will it undermine the basis of human life on earth, which would merit extreme investments and personal sacrifice? Change a single decimal point on one of the hundreds of interrelated ecological or economic inputs — faster-than-expected emissions from China, melting tundra, diminished albedo, slower rates of deforestation, faster economic growth — and voilà! you’ve constructed a radically different world.

One of the largest uncertainties is also the one that will have the largest impact on our ability to deal with the problem: technological innovation. Could solar panels, if we bought enough of them, one day become cheaper than coal? Could new air capture machines suck carbon dioxide out of the atmosphere and store it underground so cheaply as to obviate the need to slow emissions?

When modelers try to predict the effects of innovation on prices, they tend to rely on learning curves. Learning curves allow us to predict, for example, how many solar panels would need to be produced for solar power to become as cheap as coal. But while there is incremental technological progress, there is also non-incremental progress, which can create whole new learning curves or disrupt the old ones. Something like this occurred when, in the early 1990s, the Danish government started deploying massive wind turbines offshore, which produced energy far more cheaply than the older, smaller ones. Non-incremental progress (“breakthroughs”) and sharp price declines may occur with printed (“nano”) solar panels, air (CO2) capture devices, and new battery technologies — though probably only if we subsidize their mass manufacture and deployment.
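A minimal sketch of that learning-curve logic follows. The starting cost, parity threshold, and learning rate are illustrative assumptions of mine; real estimates vary widely, which is precisely the modeling difficulty at issue.

```python
# Experience curve: each doubling of cumulative production cuts unit cost by
# a fixed fraction (the "learning rate"). All parameter values are assumed.
import math

def cost_at(cumulative, c0, n0, learning_rate):
    """Unit cost after `cumulative` production, given cost c0 at production n0."""
    b = -math.log2(1 - learning_rate)      # curve exponent
    return c0 * (cumulative / n0) ** (-b)

C0, N0 = 4.00, 1.0   # assumed: $4.00/W at 1 GW cumulative solar production
PARITY = 1.00        # assumed: ~$1.00/W needed to compete with coal
LR = 0.20            # assumed: 20% cost decline per doubling

doublings = math.log(PARITY / C0) / math.log(1 - LR)   # ~6.2 doublings
needed = N0 * 2 ** doublings                           # ~74 GW cumulative
print(f"~{doublings:.1f} doublings, ~{needed:.0f} GW cumulative; "
      f"cost there ≈ ${cost_at(needed, C0, N0, LR):.2f}/W")
```

Nudge the assumed learning rate from 20 percent down to 10 percent and the required cumulative production balloons roughly a hundredfold, which is why this input is so hard to model.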

As a consequence, the most important factor for determining the cost of doing something about climate change is the one that is most difficult, and arguably impossible, to model. The result is that most models fail to account for the myriad benefits that come from investments in technology and tend to cast the transition to clean energy in zero-sum terms: Either we spend more on energy or we spend more money on preventing malaria and hunger. It is for this reason that Indur Goklany argues for money invested in adaptation instead of mitigation.

None of this is to say that modeling is pointless. Good models reveal myriad factors affecting the future. The best are transparent about their limited ability to predict technological change. But it is safe to say that conservatives and liberals could continue having the same argument for another twenty years and neither side could credibly claim victory.

Why We Can Disagree to Agree

Strange as it sounds, reasonable liberals and conservatives can disagree about the seriousness of the problem and still agree on solutions. But first they need to embrace two principles. The first is the Principle of Climate Uncertainty: Whatever action we take must be robust to ecological and economic uncertainty. Said differently, the policies we implement should be justifiable whether climate change results in floods, droughts, and food shortages, or merely the need to turn up the air conditioner.

The second is the Principle of Climate Politics. People — whether Americans, Europeans, Chinese, or Brazilians — want to do something about global warming; they just don’t want to pay much more for energy to do it. Thus, the world will only reduce emissions to the extent that clean alternatives to fossil fuels are cheap and available, or that there are cheap ways to capture and store emissions, or both.

If we can agree on these two principles, what then? First, we should make clean energy, and the capture and storage of emissions, cheap. Second, we should adapt to a warmer world.

Because voters and policymakers vote for policies, not predictions, they can support policies for a wide range of reasons that don’t require seeing the world in the same way. Subsidies for electric cars and trucks, transmission lines, R&D for new battery technologies, air capture, and a National Energy Education Act can be supported for ecological, national security, or economic reasons. Electric cars and trucks will reduce our dependence on foreign oil and could revitalize Detroit. Air capture technologies can also be used to make synthetic fuels. New battery technologies for storing solar and wind power, or powering electric cars, could also be used to increase the productivity of cell phones, laptops, and other electronic devices. Clean energy will reduce the medical costs associated with the premature death and disease caused by air pollution. And American workers and taxpayers would benefit from American leadership in the new energy industries, currently dominated by European and Asian firms.

Meanwhile, adaptation to a warmer world can be justified as intelligent planning (e.g., not building cities below sea level), as disaster preparedness (e.g., protecting vulnerable areas from drought, floods, and hurricanes), and as economic development (e.g., creating a secure built environment). But first we must stop viewing adaptation and mitigation as an either/or choice, which Goklany does for economic reasons, seeing mitigation as a zero-sum game, and which Romm does for political ones, seeing adaptation as a strategy to downplay the seriousness of the challenge. But economically, many kinds of adaptation — such as urban planning and building design — could accelerate our transition to a more energy-efficient economy, and thus mitigate global warming. And politically, adaptation has the potential to empower Americans to do something about a problem that many view as too big and difficult to solve.

The Central Role of Technology Policy

Government has an obvious role to play in enabling the emergence of new industries. The government subsidized the creation of canals, telegraph lines, railroads, the electrical grid, highways, and the Internet. Government invests in R&D when private firms don’t, often to great effect, as in the case of Xerox’s PARC lab, which created the windowing computer interface, the mouse, and Ethernet technologies. Government funds the retraining of the workforce, as it did with the G.I. Bill and the National Defense Education Act. And when there is a promising but high-risk new technology, like computer microchips or offshore wind turbines, governments subsidize deployment and commercialization.

All of that feels very complicated to many on both left and right who wish we could implement some simple policy silver bullet, like a price on carbon or clean energy mandates on utilities. But no government will set a price on carbon high enough to make clean energy cost-competitive. Europe has a $40-per-ton price on carbon dioxide, about five times the optimal level recommended by the model Manzi favors, and yet the continent is still set to build 50 new coal plants over the next five years. Nor will governments create or enforce clean energy mandates on utilities that dramatically raise energy prices, which is what Romm’s plan would do. And new regulations or a price on carbon won’t create the requisite enabling infrastructure: the R&D labs, the scholarships, or the new grid.

Both Manzi and Romm say they support government investments. The question for everyone, including Congress, is: How much is needed, and how should it be spent? Consider that the latest version of cap and trade would have raised $177 billion every year and cost the average American family several thousand dollars per year in higher gasoline and electricity costs. It is inconceivable that legislation raising energy prices that much will ever pass Congress — witness Democrats’ about-face in response to the Republican push for offshore oil drilling — even under a Democratic president and Congress.

A better public investment would be in the range of $30 to $80 billion annually, which would cost the average American less than a dollar a day. Even if the annual investment were on the low side, say $30 billion, it could be bonded against to pay for the large up-front capital investments, such as in a new grid. The revenue could come either from auctioning pollution permits, a “wires fee” on electricity use, a modest tax on carbon, or a mix of all three.
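A quick check on the dollar-a-day claim, assuming a U.S. population of roughly 300 million (my assumption): even at the top of the proposed range,

```latex
\[
\frac{\$80\ \text{billion/yr}}{300\ \text{million people}} \approx \$267\ \text{per person per year} \approx \$0.73\ \text{per day}
\]
```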

Manzi argues that government investment should go to “detecting or ameliorating the effects of global warming, should serve a public rather than a private need, and should provide no obvious potential source of profit to investors if successful.” But Manzi never says why. Given his own argument about the uncertainty of global warming’s effects, wouldn’t a better approach be to make investments that make sense no matter how serious global warming turns out to be? And while government contracts to private firms to do things like build transmission lines or conduct R&D should no doubt be transparent and competitive, why shouldn’t firms and investors be able to make a reasonable profit while doing so?

Manzi argues that investments should be overseen by an agency “explicitly modeled on the Defense Advanced Research Projects Agency (DARPA). The character of such an agency would be a very high-IQ staff with wide flexibility in providing small grants.” There is much to admire about DARPA, which, as Manzi points out, invented the Internet. But there are limits to the model. DARPA invents military applications that, while expensive, are nowhere near as capital-intensive as deploying advanced power plants, which can cost as much as $5 billion each, more than the U.S. government spends on energy R&D annually. Because many of the innovations and breakthroughs in energy (offshore wind turbines, concentrated solar power, carbon capture and storage) occur through real-world deployment, not lab R&D, what’s required is not just a DARPA for energy but also investment in the enabling infrastructure and outright deployment in the real world.

But Manzi makes a good point that government investments to make clean energy cheap shouldn’t be doled out like so much pork to every congressional district in America. While a DARPA for energy R&D might be a good model, spending on deployment and infrastructure should be overseen by a diverse group of experts, perhaps modeled after the military base-closing commission, whose recommendations could receive only an up-or-down vote from Congress (no amendments, no earmarks). That model protected the integrity of the decisionmaking process and protected members whose districts lost a base, since voters could not easily blame them for the outcome. This is important because not every state has the same potential for supporting particular energy sources or technologies.

To his credit, Romm has recently acknowledged the limits of cap and trade policies both in Europe and the U.S. But Romm never grapples with why cap and trade cannot achieve large emissions reductions, which is that voters and policymakers will not raise the price of fossil fuels high enough to make clean energy cost competitive. Romm thus proposes deployment mandates in addition to cap and trade, but he never says how such mandates would help the cap meet the Principle of Climate Politics.

After cap and trade was defeated in July, Romm fulminated against Republican obstructionism but ignored the reality that many Democrats opposed the bill fearing the impact it would have on the energy prices of their constituents. Romm behaves as though constantly emphasizing the threat posed by global warming will be sufficient to tip the political scales. It’s a position that depends on ignoring the overwhelming evidence that voters care more about energy prices than climate. Romm should stop seeing the call for government investment in technology innovation to make clean energy cheap as a nefarious delaying tactic and start seeing it as the basis for a new political center on climate.

Manzi, for his part, should address the capital-intensive nature of the energy sector, and the important role governments play in creating the enabling infrastructure for new industries. Manzi’s small-is-beautiful view of government leads him to turn DARPA into a fetish — a magical agency capable of generating wondrous technologies on the cheap. What Manzi doesn’t mention is that the Pentagon’s total R&D budget for 2008 was $78 billion; DARPA’s $3 billion is only a small part of it. The big breakthroughs in energy technology occur in real-world deployment, not in labs, which is why it was the Danes, who deployed offshore turbines, and not the Americans, stuck in their workshops, who came to dominate the global wind turbine market.

The right model for energy modernization is not the Manhattan or Apollo projects, nor DARPA, but rather the full suite of postwar investments in the Interstate Highway System, the G.I. Bill, microchips, personal computers, aerospace, the Internet, and the National Defense Education Act. If Manzi thinks it is not the government’s role to make large investments to enable the emergence of new industries, then he should explain how America could have become such a rich nation without having invested in the railroads, the highways, the electrical grid, the Internet, microchips, the computer sciences, and the biosciences.

The New Political Center on Climate

Both Romm and Manzi deserve credit for helping to move the debate away from a narrow focus on global warming and toward a debate over solutions. Now liberals and conservatives need to come together around an agenda grounded in the Principle of Climate Uncertainty and the Principle of Climate Politics. Congress may already be moving in this direction. Before the vote on cap and trade, ten Democratic Senators — the Technology Ten — sent an open letter to Senate Majority Leader Harry Reid saying that they supported action on climate but that it needed to be focused on containing energy costs and on technology. Then, just last week, another group of ten Senators introduced compromise legislation on oil drilling, calling for expanded drilling and tens of billions in new government investments to make clean energy cheap and electrify the transportation sector. All of this has the makings of a new political center and, we hope, a political breakthrough.

The Conversation

Reply to Joseph Romm

It appears to me that Joseph Romm makes four core arguments in response to my essay:

1. Manzi is wrong on the facts about IPCC projections for expected warming.

2. The consequences of the actual amount of warming that we expect to experience are horrifying, and are likely to include 80 to 250 feet of sea level rise and a dust bowl from Kansas to California.

3. Martin Weitzman has shown that traditional cost-benefit analysis is not appropriate for the case of global warming.

4. The costs of aggressive emissions abatement are very small.

I believe that each of these arguments is incorrect, and I’ll try to explain why for each one in turn.

1. Manzi is wrong on the facts about IPCC projections for expected warming

In my essay, I said that:

The current IPCC consensus forecast is that, under fairly reasonable assumptions for world population and economic growth, global temperatures will rise by about 3°C by the year 2100.

Here is my evidence in support of this assertion.

The IPCC takes the sensible position that establishing a “business-as-usual” (BAU) baseline for making long-term global climate projections is a pretty tricky endeavor, because it means figuring out how the population and economy of the entire world are going to develop over the next century or more. Therefore, rather than a single business-as-usual baseline, they have established a wide range of potential scenarios for how the world might develop. They have declared six of these scenarios to be “marker scenarios” that collectively provide a representative spread of realistic possibilities. Each of these scenarios assumes no implementation of the Kyoto Protocol or other climate initiatives. The Working Group I Summary for Policymakers of the most recent IPCC Assessment Report (“WG1 SPM”) provides best estimates for the expected warming through about 2100 for each of these six scenarios (see Table SPM.3 on page 13). These best estimates range from a low of 1.8°C for Scenario B1 to a high of 4.0°C for Scenario A1FI. Each scenario is considered equally plausible by the IPCC. The straight average of the six scenarios is 2.8°C. One of these six is Scenario A1B. This scenario is characterized by “very rapid economic growth, global population that peaks in mid-century and declines thereafter…the rapid introduction of new technologies, …and the assumption that similar improvement rates apply to all energy supply and end-use technologies”. It is often used in both IPCC and external documents as an informal mid-range case. Scenario A1B also has a best estimate for warming by about 2100 of 2.8°C.
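As a quick check on this arithmetic, here is a minimal sketch, using the Table SPM.3 best-estimate values cited above:

```python
# Best-estimate warming through ~2100 (deg C) for the six SRES marker
# scenarios, per IPCC AR4 WG1 SPM, Table SPM.3.
best_estimates = {"B1": 1.8, "A1T": 2.4, "B2": 2.4,
                  "A1B": 2.8, "A2": 3.4, "A1FI": 4.0}

average = sum(best_estimates.values()) / len(best_estimates)
print(f"{average:.1f}")  # -> 2.8, matching both the straight average
                         #    and the A1B best estimate cited above
```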

Mr. Romm says that my statement is “not correct”, and then goes on to provide what he believes to be a more accurate representation of IPCC projections for expected levels of warming by 2100:

…the latest IPCC report finds that, absent a sharp reversal of BAU trends, we are headed toward atmospheric levels of carbon dioxide far exceeding 1,000 parts per million by 2100. IPCC’s “best estimate” for temperature increase is 5.5°C (10°F)…

Note that this asserted projection is much more severe than what I presented (5.5°C vs. 3°C), and unlike my assertion is based on a BAU trend. He supports this assertion with a general link to this same WG1 SPM document that I have referenced and discussed. I am pretty familiar with this document, and I don’t know of a single use of the term “business-as-usual” within it. Further, to my knowledge every relevant use of the term “best estimate” in the document is either in the table of scenario projections that I referenced, or in Figure SPM.5 (which is a graphical representation of these same projections), or in the footnotes and explanatory text that surround these exhibits. All of these uses of the term support the values for projected warming that I have provided. Further, to my knowledge there is no marker scenario presented in this document or anywhere in the current IPCC Assessment Report under which the best estimate for warming by 2100 is 5.5°C. I do not understand the basis for Mr. Romm’s assertions.

I stand by my original statement as an accurate reflection of current IPCC consensus forecasts for expected global warming through 2100, though obviously I remain open to contradiction by relevant facts of which I am currently unaware.

2. The consequences of the actual amount of warming that we expect to experience are horrifying, and are likely to include 80 to 250 feet of sea level rise and a dust bowl from Kansas to California

Mr. Romm next provides a litany of very bad things that he predicts are likely to occur if we do not implement a set of policies to aggressively restrict emissions.

Consider Mr. Romm’s first prediction:

  • Sea level rise of 80 feet to 250 feet at a rate of 6 inches a decade (or more).

    A 250 foot rise in sea level would be very, very bad.  It conjures up the image of massive tidal waves consuming major coastal cities.  But this forecast is problematic for at least a couple of reasons.

    First, I don’t understand its internal logic.  At 6 inches per decade, it would take 1,600 years to get to 80 feet, and 5,000 years to get to 250 feet.  I assume we are not trying to consider effects in the years 3600 AD to 7000 AD.

    Second, while Mr. Romm doesn’t provide attribution for this forecast, it isn’t anything like any formal prediction I’ve ever seen from the IPCC.  The IPCC is appropriately reserved about making predictions for sea level rise.  They don’t provide best estimates by scenario, but just broad ranges.  They also caution that the high end of any range is not an absolute maximum.  The spread of IPCC projections for sea level rise through about 2100 across all six marker scenarios ranges from a low of 0.18 meters to a high of 0.59 meters.  (See WG1 SPM, Table SPM.3)  Crudely, these average about 0.4 meters, or about 15 inches total by about 2100.  That’s quite a ways from 80 to 250 feet.
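    To make the internal-logic problem concrete, here is a minimal sketch of the arithmetic, my own illustration using only the figures cited above:

```python
# Years needed to reach a given sea level rise at 6 inches per decade.
def years_to_reach(feet, inches_per_decade=6.0):
    return feet * 12 / inches_per_decade * 10

print(years_to_reach(80))    # -> 1600.0 years (roughly the year 3600)
print(years_to_reach(250))   # -> 5000.0 years (roughly the year 7000)

# Spread of IPCC AR4 projections through ~2100 (WG1 SPM, Table SPM.3):
low_m, high_m = 0.18, 0.59
print(round((low_m + high_m) / 2 / 0.0254))   # -> about 15 inches
```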

    Consider Mr. Romm’s second prediction: widespread desertification.

    Mr. Romm links to a blog post about a news report about a single climate simulation model study to support this.  (Read the comments section on the page to which Mr. Romm links).  The IPCC certainly does make predictions that large amounts of land, especially in Latin America (See WG2 Chapter 13, pages 583 and 597) and some other equatorial areas, will be subject to “desertification processes,” and this would be a serious problem, but I have not seen a claim as dramatic as Mr. Romm’s proceed from the IPCC process.  Consider this in the context of Mr. Romm’s closely related prediction, also based on a blog citation of another single study using climate simulation models, for aridity impacts for the U.S.:

    …”a permanent drought by 2050 throughout the Southwest” – levels of aridity comparable to the 1930s Dust Bowl would stretch from Kansas to California. And they were only looking at a 720 ppm case!

    This is a pretty bleak picture of what, for example, the U.S. will look like: a 21st century army of Joads escaping an interior apocalypse, only to arrive at the coasts to find the cities flooded under hundreds of feet of water.

    In contrast, here is the complete list of “examples of projected impacts” that the IPCC provides to illustrate the effects of climate change for the continent of North America (See Synthesis Report, Summary for Policymakers, page 11):

    • Warming in western mountains is projected to cause decreased snowpack, more winter flooding and reduced summer flows, exacerbating competition for over-allocated water resources.
    • In the early decades of the century, moderate climate change is projected to increase aggregate yields of rain-fed agriculture by 5 to 20%, but with important variability among regions.  Major challenges are projected for crops that are near the warm end of their suitable range or which depend on highly utilized water resources.
    • Cities that currently experience heat waves are expected to be further challenged by an increased number, intensity and duration of heat waves during the course of the century, with potential for adverse health impacts.
    • Coastal communities and habitats will be increasingly stressed by climate change impacts interacting with development and pollution.

    While serious, this doesn’t sound quite so dramatic.

    And so on for several others of Mr. Romm’s predictions. The negative consequences of a several-degree increase in global temperatures would almost certainly be quite significant. In addition, these consequences would likely be felt disproportionately by poor people in developing countries, especially those near the equator. Exaggerating these effects, however, is not helpful. Remember that when you add up all the specific impacts of the type that we have been reviewing here, the total economic costs at 4°C of warming — which we would expect to reach sometime into the 22nd century — are estimated by the IPCC to be about 1–5% of global GDP (see WG2 SPM, page 17).

    More centrally, we should not forget that there is a trade-off involved if we have to give up economic growth to avoid some of these problems.  This can sound pretty dry, but consider the untold stories on the other side of the equation that would result from many millions of people being less able to escape poverty, or at a minimum, escaping poverty more slowly.  What about all of the additional children who will die from dysentery between 2030 and 2070 because their communities couldn’t afford to put in improved sanitation and drainage systems that would have been installed had economic growth not been reduced as a result of carbon rationing?  What about all of the elderly patients who had earlier and more painful deaths because of the new hospital wing that didn’t get built in 2020? Such a list could go on almost indefinitely.  Indur Goklany’s response to my essay provides an excellent review of many of these consequences.  Certainly, it appears that the major developing countries of the world are voting with their actions about where they stand on this trade-off.

    We shouldn’t let the almost cinematic rhetoric of climate change disaster stories lead us to abandon rational consideration of costs and benefits.

    3. Martin Weitzman has shown that traditional cost-benefit analysis is not appropriate for the case of global warming

    Or is it rational to abandon cost-benefit analysis in this case?  Mr. Romm says so.  He argues that this has been “shown” (note, not “argued” or “asserted”, but “shown”):

    The traditional economic modeling work by Yale’s William Nordhaus cited by Manzi is quite irrelevant in the face of 1,000 ppm warming. In fact, even a 3% chance of a warming this great is enough to render useless all traditional cost-benefit analyses that argue for delay or only modest action, as Harvard economist Martin Weitzman has shown.

    Professor Weitzman’s reasoning on this topic is subtle and technically ingenious.  In my opinion, it is the most rigorous existing argument for a carbon tax.  Addressing it in detail is beyond the scope of these comments, but I have previously responded to a slightly earlier version of it in a long online article.  Alternatively, you can watch a video of Professor Weitzman presenting his paper, and then my response to it in the exact same room a few months later.  I encourage anybody who is serious about the climate change debate to understand Weitzman’s logic in detail.

    In very short form (recognizing that I will write somewhat loosely for purposes of brevity in this setting), Weitzman’s central claim is that the probability distribution of potential losses from global warming is “fat-tailed”, or includes high enough odds of very large amounts of warming (20°C or more) to justify taking expensive action now to avoid these low probability / high severity risks.

    The big problem with this argument, of course, is that the IPCC has already developed probability distributions for potential warming that include no measurable probability for warming anywhere near this level for any marker scenario.  See, for example, WG1 SPM Figure SPM.6.  Even the scale on these charts — never mind actual predictions for the high end of the probability distributions — doesn’t go past 8°C.  That is, the best available estimates for these probability distributions are not fat-tailed in the sense that Weitzman means it.

    Now, one can responsibly question the probability distributions developed by the IPCC.  A modest version of this is simply to recognize that we are not certain that they are correct.  But this is just a sophisticated restatement of one predicate of the Precautionary Principle.  That is, as I put it in my essay, we must logically accept the possibility that “even the outer edge of the probability distribution of our predictions for global-warming impacts is enormously conservative.”

    Of course, this is true in principle for all probability distributions, and therefore for all risks.  As Weitzman himself clearly recognizes, in order for him to distinguish climate change dangers from other dangers in this manner, he must therefore show not only that it is possible that the true probability distribution of potential levels of warming is actually much worse than believed by the IPCC, but that a reasonable observer should accept it as likely that this is the case.  In order to do this, he is forced to do his own armchair climate science, and argue (as he does explicitly in the paper) that he has developed a probability distribution for expected levels of warming superior to the ones the world climate-modeling community has developed and published.  As noted above, this probability distribution is radically more aggressive than anything you will find in any IPCC Assessment Report.  I don’t think that it is credible to accept Professor Weitzman’s climate science in place of the IPCC’s.

    4. The costs of aggressive emissions abatement are very small

    Mr. Romm says of atmospheric concentration of carbon dioxide that:

    …our choice is really to stay below 450 ppm or risk self-destruction.

    He goes on to say:

     The good news is that a host of independent analyses, including from the IPCC itself, make clear that the cost of keeping carbon dioxide concentrations at or below 450 ppm is very low indeed.

    He supports this by quoting the IPCC as saying:

    In 2050, global average macro-economic costs for mitigation towards stabilisation between 710 and 445 ppm CO2-eq… corresponds to slowing average annual global GDP growth by less than 0.12 percentage points. [Bold in original]

    This is somewhat problematic as support for Mr. Romm’s assertions, for at least a couple of reasons.

    First, as quoted, this isn’t the cost to “stay below 450 ppm,” but is the cost to end up somewhere between 710 and 445 ppm.  A crude average is 578 ppm.  The current atmospheric concentration of carbon dioxide is estimated to be about 385 ppm.  According to Mr. Romm, the world must therefore limit emissions sufficiently to prevent an increase of more than 65 ppm in this concentration index, or else “risk self-destruction.”  But this cost level would allow concentration to rise 193 ppm, or about three times this amount.  Presumably, it would cost significantly more to remain at or below the 450 ppm target that Mr. Romm has put forward.

    Second, the impact of differences in growth rates can be very counter-intuitive when compounded over many years.  Consider a grossly simplified illustration of a society with constant population that has an average income of $10,000 per person today.  If their economy grows at 2% per year for a century, average income will rise to a little over $72,000 after 100 years.  Now suppose that they predicted that a by-product of this growth would be to create climate damage that by the end of a century would cause them to lose 3% of GDP by the 100th year.  In this case, projected average income in year 100 would be reduced to about $70,000.  Now, suppose some smart technologists offered them the option to eliminate all climate damage in return for slowing GDP growth by only 0.12 percentage points every year over the century.  Should they take this deal?  If they did, GDP would grow by 1.88% each year, and average income in year 100 would be a little over $64,000.  Given these two alternatives, they would be much better off in economic terms by growing the economy at 2% and just taking the pain of the climate damage.
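    For readers who want to verify the compounding, here is a minimal sketch of the arithmetic in this illustration:

```python
income = 10_000  # average income today, constant population

# Grow at 2% per year for a century, then absorb a 3%-of-GDP climate loss.
unabated = income * 1.02 ** 100
with_damage = unabated * (1 - 0.03)

# Alternative: give up 0.12 percentage points of growth to avoid the damage.
abated = income * 1.0188 ** 100

print(round(unabated))     # -> ~72,446 (a little over $72,000)
print(round(with_damage))  # -> ~70,273 (about $70,000)
print(round(abated))       # -> ~64,400 (a little over $64,000)
```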

    This example brings us all the way back to the start of my essay, and the central problem for advocates of aggressive emissions abatement: despite the rhetoric, the projected damages from global warming just don’t appear to justify the costs of the proposed remedy.

    Reply To Indur Goklany

    Indur Goklany and I seem to be in a state of violent agreement.

    I view his response essay as, in large part, an outstanding quantitative review of the human implications of the trade-offs between carbon dioxide emissions and economic growth. Goklany proceeds from the premise, which I share, that the primary reason we care about climate change in the first place is its potential impact on human flourishing. He makes the point, forcefully and with careful analysis, that giving up economic growth doesn’t only mean slightly smaller SUVs, but reduced lifespans, nutrition, housing improvements, and so on. When denominated in such indicators, all responsible projections indicate that we should expect to be made worse off by coercive policies to force immediate, aggressive abatement of carbon dioxide emissions.

    I will leave the (in my view logically separable) argument about how much the developed world should spend to reduce malaria, starvation and other global ills to another time. Mr. Goklany argues that at whatever level we choose to address these, forcing reduction of carbon dioxide emissions is not a smart way to go about it. I agree.

    What Manzi Ignored and Why It Matters

    It is a bit tricky to respond to Manzi’s comments on my essay, because he chose to selectively ignore the evidence I presented and thus never rebutted my central points. So I will elaborate here on the discussion he ignored, since it goes to the heart of why he is wrong, why Goklany is wrong, and why Shellenberger and Nordhaus are wrong.

    Manzi’s central point on warming is that “The current IPCC consensus forecast is that, under fairly reasonable assumptions for world population and economic growth, global temperatures will rise by about 3°C by the year 2100.”

    That is simply an incorrect statement, as I explained in my original post. Manzi is under the serious misimpression that IPCC scenarios like B1 or A1FI, either separately or averaged together, represent a consensus or business-as-usual forecast, when in fact most of those scenarios assume the kind of aggressive action to deploy clean energy technologies that he does not support. Moreover, since 2000 the growth rate of actual global carbon emissions has exceeded even the IPCC’s most extreme A1FI scenario — so again, it makes little sense to average all of the IPCC’s scenarios to get business as usual, or even a case with “fairly reasonable assumptions for world population and economic growth.”

    Manzi might try reading Pielke et al.’s “Dangerous Assumptions,” in Nature — an analysis that I don’t entirely agree with — to understand where he went wrong. Manzi then writes:

    Mr. Romm says that my statement is “not correct,” and then goes on to provide what he believes to be a more accurate representation of IPCC projections for expected levels of warming by 2100:

    “…the latest IPCC report finds that, absent a sharp reversal of BAU trends, we are headed toward atmospheric levels of carbon dioxide far exceeding 1,000 parts per million by 2100. IPCC’s “best estimate” for temperature increase is 5.5°C (10°F)…”

    Note that this asserted projection is much more severe than what I presented (5.5°C vs. 3°C), and unlike my assertion is based on a BAU trend. He supports this assertion with a general link to this same WG1 SPM document that I have referenced and discussed. I am pretty familiar with this document… .

    No, I do not support this assertion with a general link to this same WG1 SPM document. It is a bit unusual for Manzi to try this kind of selective editing in an online debate where everybody can see what he did. Let me reprint what I actually wrote:

    As I explained in a recent Nature online article, the latest IPCC report finds that, absent a sharp reversal of BAU trends, we are headed toward atmospheric levels of carbon dioxide far exceeding 1,000 parts per million by 2100. IPCC’s “best estimate” for temperature increase is 5.5°C (10°F), which means that over much of the inland United States, temperatures would be about 15°F higher.

    It is my recent Nature online article that offers the explanation. Nature online asked me to write it in reply to the Pielke et al. piece and it explains the remarkable implications of the latest IPCC reports that most people, including Manzi, missed.

    Buried on page 16 of the Working Group 1 Summary for Policymakers is perhaps the most alarming yet under-reported paragraph in the entire document, which I summarize in the Nature article.

    According to the IPCC’s Fourth Assessment Report in 2007, model studies based on our current understanding of climate-carbon-cycle feedbacks suggest that stabilizing carbon dioxide levels at 450 ppm could require holding cumulative emissions over the twenty-first century to only about 490 gigatonnes of carbon (GtC), which equates to less than 5 GtC per year.

    Similarly, stabilizing atmospheric carbon dioxide levels at 1,000 ppm would require cumulative emissions this century of only about 1,100 GtC. In other words, if annual emissions average 11 GtC this century, we risk the real, terrifying prospect of seeing 1,000 ppm carbon dioxide in the atmosphere and a ‘best estimate’ warming of a staggering 5.5°C by the end of the century.

    Carbon emissions from the global consumption of fossil fuels are currently above 8 GtC per year and rising faster than the most pessimistic economic model considered by the IPCC. Yet even if the high price of energy from fossil fuels and power plants combines with regional climate initiatives to slow the current rate of growth somewhat, we will probably hit 11 gigatonnes of carbon emissions per year by 2020.

    The first question to ask is — what is the IPCC’s ‘best estimate’ warming for 1,000 ppm of CO2? Here I used the wrong link in my essay. This does not come from the WG1 Summary for Policymakers. The right link is the WG1 Technical Summary. On page 66 of the Technical Summary, a table lists the “best estimate” warming for different atmospheric levels of carbon dioxide equivalent, which is obviously going to be higher than the atmospheric level of carbon dioxide alone. In any reasonable business-as-usual case where there is no greenhouse gas constraint, then you would expect steady rises in the levels of other greenhouse gas emissions.

    The best estimate warming for 1,000 ppm of CO2eq is 5.5°C and the best estimate for 1,200 ppm of CO2eq is 6.3°C. Given how much effort will be required to merely stabilize atmospheric concentrations of CO2 alone at 1,000 ppm (discussed below), I think it is very safe to say that total business-as-usual warming is at least 6.3°C and that 5.5°C is conservative.

    How fast could this happen? Again, climate scientists don’t spend a lot of time studying this nightmare scenario because they can’t imagine humanity would be so stupid as to let it happen. But as I cited in my original essay, one very credible model suggests we could hit 1,000 ppm of CO2 in 2100 with a total warming from preindustrial levels in 2100 of about 5.5°C.

    Manzi writes,

    Further, to my knowledge there is no marker scenario presented in this document or anywhere in the current IPCC Assessment Report under which the best estimate for warming by 2100 is 5.5°C. I do not understand the basis for Mr. Romm’s assertions.

    Well, there may not be a “marker scenario,” but anyone who reads the full IPCC report can clearly see that the report raises the very serious prospect that we may hit 1,000 ppm of CO2, and asserts that this would probably warm the planet 5.5°C.

    The second question to ask is: What would be required merely to keep global emissions frozen at 11 GtC through most of the rest of the century starting in 2020? As I explain in the Nature piece:

    I use the “stabilization wedges” approach put forward by Robert Socolow and Stephen Pacala of Princeton University to answer that question qualitatively. As Socolow and Pacala explain “A wedge represents an activity that reduces emissions to the atmosphere that starts at zero today and increases linearly until it accounts for 1 GtC/year of reduced carbon emissions in 50 years.” So the planet would need 11 wedges to keep emissions flat at 11 GtC per year from 2020 to 2070.

    I then offer one possible list of 11 wedges:

    • Concentrated solar thermal electric: 1,600 gigawatts peak power
    • Nuclear: 700 new gigawatt-sized plants (plus 300 replacement plants)
    • Coal: 800 gigawatt-sized plants with all the carbon captured and permanently sequestered
    • Solar photovoltaics: 3,000 gigawatts peak power
    • Efficient buildings: savings totaling 5 million gigawatt-hours
    • Efficient industry: savings totaling 5 million gigawatt-hours, including co-generation and heat recovery
    • Wind power: 1 million large wind turbines (2 megawatts peak power)
    • Vehicle efficiency: all cars 60 miles per U.S. gallon
    • Wind for vehicles: 2,000 gigawatts wind, with most cars plug-in hybrid-electric vehicles or pure electric vehicles
    • Cellulosic biofuels: using up to one-sixth of the world’s cropland
    • Forestry: end all tropical deforestation
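    As a rough check on the wedge arithmetic, here is a back-of-the-envelope sketch of my own, using the Socolow-Pacala linear-ramp definition quoted above:

```python
# A wedge ramps linearly from 0 to 1 GtC/yr of avoided emissions over 50
# years, so each wedge avoids a triangle of carbon: 0.5 * 50 * 1 = 25 GtC.
avoided_per_wedge = 0.5 * 50 * 1.0
print(11 * avoided_per_wedge)   # -> 275.0 GtC avoided by 11 wedges, 2020-2070

# Even held flat at 11 GtC/yr, a century of emissions still accumulates
# 11 * 100 = 1,100 GtC, the cumulative total associated above with
# roughly 1,000 ppm of CO2.
print(11 * 100)                 # -> 1100
```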

    Each of these wedges represents a staggering amount of effort by both the public and private sectors. For instance, one wedge of coal with carbon capture and storage represents a flow of carbon dioxide into the ground equal to the current flow of oil out of the ground. It would require, by itself, recreating the equivalent of the planet’s entire oil delivery infrastructure over the course of five decades.

    Obviously this is not business as usual. Indeed, it is very safe to say this won’t occur without a very aggressive collective effort by the nations of the world, an effort that the Manzis and Goklanys of the world do not support. But here is the really scary part:

    Achieving all 11 wedges would still keep us on a path towards atmospheric levels of 1,000 ppm of CO2eq by 2100, some 5.5°C total warming from preindustrial levels and a variety of catastrophic impacts, including the extinction of most species, the desertification of one-third of the planet, and a return to temperatures not seen since the Earth was ice-free and sea levels were 250 feet higher (points I will return to in Part 2).

    What must America and the rest of the world do to avert this catastrophe? Obviously we must do much more than deploy 11 wedges from 2020 to 2070. As I noted in Nature:

    If we are to have confidence in our ability to stabilize carbon dioxide levels below 450 ppm, emissions must average less than 5 GtC per year over the century. This means accelerating the deployment of the 11 wedges so they begin to take effect in 2015 and are completely operational in much less time than originally modeled by Socolow and Pacala, say in 25 years. As a result, in 2040 global emissions would be at about 4 GtC per year. We would then have six decades to cut emissions in half again (or by more if the science deems it necessary), which would require an equally impressive effort.

    Needless to say, that requires government action of a scale that Manzi and Goklany oppose, and that the wishful thinking of Shellenberger and Nordhaus is exceedingly unlikely to achieve.

    Seven Points in Response to Romm

    1. Romm’s faith in model results is not warranted, as noted in my previous post.

    2. Romm’s argument is that in recent years annual emissions have been higher than assumed in the IPCC’s worst case (A1FI) scenario, and that we are on track to get to 1,000 ppm in 2100. However, instead of looking at emissions, it might make more sense to look at real measurements of atmospheric CO2 concentrations, for two reasons. First, CO2 concentrations are more directly related to the greenhouse effect. Second, using measured atmospheric CO2 concentrations short-circuits two layers of modeling that are themselves major sources of uncertainty: first estimating global emissions, and then estimating the atmospheric CO2 concentrations (based on complex models of the global carbon cycle).

    3. What do recent CO2 measurements tell us? If we assume that CO2 concentration (not emissions) will grow (compounded) from 2007 to 2100 at the same annual rate as it did between 2000 and 2007, then CO2 will increase to 636 ppm in 2100 (calculated using data from NOAA; a sketch of the calculation follows this list). Also, assuming that the other greenhouse gases effectively increase equivalent CO2 by 20 percent (per the ratio calculated for 2005 in the IPCC’s Fourth Assessment Report; see p. 204 [pdf]), we get a CO2-equivalent concentration of 763 ppm in 2100. So, empirical information suggests a much lower future CO2 concentration than Romm fears.

    4. In any case, the impact estimates that I presented in my previous post were based on the worst case (A1FI) scenario, which, according to the HadCM3 model, would increase CO2 concentrations to 810 ppm in 2085 and 970 ppm in 2100, and cause a 4°C increase in average global temperatures between 1990 and 2085. Despite that, and despite the fact that the impacts were themselves overestimated [pdf], the impacts analyses did not indicate the world would come to an end because of a 4°C increase in global temperature by 2085. Not only that, but the impacts of climate change were overshadowed by other factors (e.g., hunger, malaria, unsafe water, and other poverty-related problems). If climate change terrifies Romm, why is he not petrified by these other problems?

    5. Romm is so focused on climate change that he overlooks other problems that are not only much worse, and whose existence is indisputable, but which are also more amenable to solution. This, of course, is a very human failing: Lots of people believe that the problems they work on are more important than others.

    6. Romm assumes that a stabilization level of 450 ppm for CO2 is the correct target. What is the basis for this claim? What precise impacts would be avoided if we hit 450 vs. 1,000 ppm? How does he know that the costs of getting to this level would be justified?

    7. An earlier fast-track assessment (FTA) sponsored by the UK Government, and undertaken by essentially the same group of scientists that produced the analysis that I reported on in my previous post, compared the impacts of climate change for three cases: an unconstrained (business-as-usual, BAU) emissions case, CO2 stabilization at 750 ppm, and stabilization at 550 ppm. Surprisingly, for several categories of impacts (e.g., hunger, malaria, water stress, forested area), stabilization at 550 ppm made matters worse compared to stabilization at 750 ppm! For example, stabilization at 750 ppm reduced the total population at risk (PAR) for malaria in 2085 by a greater amount (1.3% below the BAU case) than stabilization at 550 ppm did (0.4% below BAU). (See Goklany, IM (2005), “A Climate Policy for the Short and Medium Term: Stabilization or Adaptation?” [pdf] Energy & Environment 16: 667–680.) Hence, there is no guarantee that stabilizing CO2 at 450 ppm would optimize human or environmental well-being. For all we know, stabilizing at 750 ppm may be more optimal. Before advocating policies to get us to 450 ppm, at a minimum it ought to be shown that this is a goal worth pursuing, even absent consideration of opportunity costs. Unfortunately, Romm does neither.
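    To make the extrapolation in point 3 concrete, here is a minimal sketch, using approximate NOAA Mauna Loa annual means; the exact inputs behind the figures above may differ slightly:

```python
# Approximate NOAA Mauna Loa annual mean CO2 concentrations (ppm).
c2000, c2007 = 369.5, 383.8

# Compound annual growth rate of concentration, 2000-2007.
rate = (c2007 / c2000) ** (1 / 7) - 1
print(f"{rate:.2%}")                      # -> about 0.54% per year

# Extrapolate the same rate forward from 2007 to 2100.
c2100 = c2007 * (1 + rate) ** (2100 - 2007)
print(round(c2100))                       # -> ~636 ppm CO2

# Add ~20% for the other greenhouse gases (AR4's 2005 ratio).
print(round(c2100 * 1.2))                 # -> ~763 ppm CO2-equivalent
```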

    Surely You’re Joking, Mr. Romm

    I wrote an essay addressing the challenge of global warming. Joseph Romm subsequently raised four major objections in a response essay. I then reviewed why I believe that each of these four objections is misplaced. Mr. Romm has now responded by reiterating the first of these four objections. To review the bidding on this point, I had written in my original essay that:

    The current IPCC consensus forecast is that, under fairly reasonable assumptions for world population and economic growth, global temperatures will rise by about 3°C by the year 2100.

    Mr. Romm disputed this, saying that it was “not correct,” and instead that:

    …the latest IPCC report finds that, absent a sharp reversal of BAU trends, we are headed toward atmospheric levels of carbon dioxide far exceeding 1,000 parts per million by 2100. IPCC’s “best estimate” for temperature increase is 5.5°C (10°F)…

    I replied by, among other things, citing and explaining the specific document and table (WG1 SPM, Table SPM.3) that supports my assertion (the table seemingly straightforwardly labeled “Projected global average surface warming and sea level rise at the end of the 21st century”), and making the point that neither Mr. Romm’s assertion nor its material equivalent can be found in the document that he links to with the words “best estimate for temperature increase is 5.5°C (10°F).”

    Mr. Romm has now replied to this by correcting his link, and repeating his assertion that I am in error on this point. I strongly suspect that this dispute is, increasingly, of interest only to the two of us. I am tempted to leave it to the reader to make a decision between these viewpoints upon the evidence presented, but I think this is too critical a point to just let drop. Therefore, I’ll focus on what appears to me to be the core of Mr. Romm’s latest reply.

    In effect, Mr. Romm has presented his own forecast for warming and claimed that it is the IPCC’s best estimate for warming in a business-as-usual case.

    Here is Mr. Romm’s restatement of his argument that he is presenting the viewpoint of the IPCC:

    Buried on page 16 of the Working Group 1 Summary for Policymakers is perhaps the most alarming yet under-reported paragraph in the entire document, …

    if annual emissions average 11 GtC this century, we risk the real, terrifying prospect of seeing 1,000 ppm carbon dioxide in the atmosphere ….

    Carbon emissions from the global consumption of fossil fuels are currently above 8 GtC per year and rising faster than the most pessimistic economic model considered by the IPCC. Yet even if the high price of energy from fossil fuels and power plants combines with regional climate initiatives to slow the current rate of growth somewhat, we will probably hit 11 gigatonnes of carbon emissions per year by 2020.

    The first question to ask is — what is the IPCC’s ‘best estimate’ warming for 1000 ppm of CO2? Here I used the wrong link in my essay. This does not come from the WG1 Summary for Policymakers. The right link is the WG1 Technical Summary… . In any reasonable business-as-usual case where there is no greenhouse gas constraint, then you would expect steady rises in the levels of other greenhouse gas emissions.

    The best estimate warming for 1000 ppm of CO2eq is 5.5°C and the best estimate for 1,200 ppm of CO2eq is 6.3°C. Given how much effort will be required to merely stabilize atmospheric concentrations of CO2 alone at 1,000 ppm (discussed below), I think it is very safe to say that total business-as-usual warming is at least 6.3°C and that 5.5°C is conservative.

    How fast could this happen? Again, climate scientists don’t spend a lot of time studying this nightmare scenario because they can’t imagine humanity would be so stupid as to let it happen. But as I cited in my original essay, one very credible model suggests we could hit 1,000 ppm of CO2 in 2100 with a total warming from preindustrial levels in 2100 of about 5.5°C.

    Let me see if I’ve got this straight.

    Mr. Romm does not use any of the emissions scenarios that are the starting point for IPCC climate projections, but instead has developed a forecast for carbon emissions (which are “rising faster than the most pessimistic economic model considered by the IPCC”). Under Mr. Romm’s forecast, “we will probably hit 11 gigatonnes of carbon emissions per year by 2020.” He then further develops his emissions forecast with the assumption that “[i]n any reasonable business-as-usual case where there is no greenhouse gas constraint, then you would expect steady rises in the levels of other greenhouse gas emissions.” He asserts that this forecast is a “business as usual” baseline, which is a concept that the IPCC does not use.

    Next, in order to convert his emissions forecast into a forecast for warming, he cites an exhibit (Table TS.5, page 66) in an IPCC technical summary that relates atmospheric carbon dioxide concentration to equilibrium temperature change versus pre-industrial temperature, without mentioning the crucial fact that this is an equilibrium projection. In this context, “equilibrium” means roughly where temperature will eventually settle down in the long run, which can be a very long time. (See WG1 Glossary, page 945, for the distinction between “Equilibrium and transient climate experiments.”) So when Mr. Romm says that “[t]he best estimate warming for 1,000 ppm of CO2eq is 5.5°C,” this means that (1) such a concentration should eventually lead to such a temperature change, and (2) this temperature change is versus pre-industrial temperature, not versus today’s temperature. This is very far from the same thing as projected temperature change between today and 2100. Presumably recognizing that the world scientific community uses global climate models of some complexity to translate emissions scenarios into projections for warming by specific dates, he refers to “one very credible model” that “suggests we could” hit total warming from pre-industrial levels of about 5.5°C by 2100.

    In other words, Mr. Romm has used external sources to develop his own emissions business as usual baseline, and then used other external sources to create his own climate forecast based on this emissions forecast. I have great respect for Mr. Romm’s technical capabilities, and his forecast for warming may or may not turn out to be correct, but it is very far from being an IPCC forecast. If the information “buried” in the Summary for Policymakers in combination with the referenced table in the technical summary demonstrates Mr. Romm’s point, this appears to have escaped the authors and editors of the IPCC Assessment Report.

    As a separate point, Mr. Romm says that:

    Manzi is under the serious misimpression that IPCC scenarios like B1 or A1FI, either separately or averaged together, represent a consensus or business-as-usual forecast, when in fact most of those scenarios assume the kind of aggressive action to deploy clean energy technologies that he does not support.

    Here is what the IPCC says about this (WG1 SPM, page 18):

    The SRES scenarios do not include additional climate initiatives, which means that no scenarios are included that explicitly assume implementation of the United Nations Framework Convention on Climate Change or the emissions targets of the Kyoto Protocol.

    Or see WG1 Chapter 10, Table 10.26, page 803, which defines all marker scenarios as “non-mitigation” scenarios.

    Or see the IPCC Special Report on Emissions Scenarios, Section 4.4.7 for detailed assumptions for technological change by scenario.

    In my essay, I did not render an opinion on “aggressive action to deploy clean energy technologies” per se. Any reasonable scenario for global development over the next century is likely to project technological change, which as we have seen in the past century would likely include changes in energy creation and consumption technologies, as well as the propagation of the kind of normal environmental policies that we have seen in the developed world in the past century, such as control of sulfur pollution. The specific actions that I opposed in my essay were: (1) a cap-and-trade system for carbon emissions, (2) a carbon tax, and (3) $5 trillion of U.S. government spending between now and 2100 on technology designed to reduce greenhouse gas emissions. None of these are assumed in any IPCC marker scenario.

    Goklany Okay with 250-Foot Sea-Level Rise

    Before I get to the classic disingenuous conservative argument that is the cornerstone of Goklany’s essay, I must first deal with one of the most astonishing statements ever seen in a serious climate debate. Goklany says:

    … there is no guarantee that stabilizing CO2 at 450 ppm would optimize human or environmental well-being. For all we know, stabilizing at 750 ppm may be more optimal.

    Wow. The IPCC’s Fourth Assessment says, “As global average temperature increase exceeds about 3.5°C [relative to 1980 to 1999], model projections suggest significant extinctions (40–70% of species assessed) around the globe.” But really, who needs half the current species anyway? After all, Goklany says we should ignore the model projections of the hundreds of leading climate scientists in the world, and the conclusions that every member government signed off on word for word, and instead trust him and a handful of other non-experts that everything will be hunky-dory.

    In fact, we know three things with very high certainty:

    1. Modern human civilization has developed and thrived in the last 10,000 years during a very narrow window of temperature and CO2 concentrations, a very small fluctuation around 280 ppm.

    2. Stabilizing at 750 ppm, even if that were possible, would bring the Earth to temperatures last seen when the planet was virtually ice-free, ultimately leading to sea levels 80 meters higher and the loss of all inland glaciers.

    3. A 750-ppm world may be “optimal” in some alternative universe, but would ultimately lead to several billion environmental refugees here on Earth and unimaginable misery for the rest of humanity.

    I dare say that Goklany is one of the few climate analysts in the world who has ever posited that 750 ppm of CO2 might be “more optimal” for human or environmental well-being — given that we have been very close to one-third that level for 10,000 years.

    Here is a figure from Bob Correll, former head of the Arctic Climate Impact Assessment:

    [Figure not reproduced: temperature history and projections, discussed below.]

    So I think we do in fact know what the “optimum” temperature window is for modern human civilization — and we are about to find out the unpleasant consequences of breaking through that window.

    In the figure, the IPCC (2007) forecast of 2°C to 3°C warming by 2100 is based on stabilizing atmospheric concentrations of CO2 around 550 ppm (a doubling from pre-industrial levels of 280), up from 385 today. The “band of uncertainty” involves the uncertainty about the climate sensitivity to a doubling of CO2 (absent the slow feedbacks). But the longer-term climate sensitivity to a doubling is probably much higher (see here).

    In any case, the IPCC makes clear that 750 ppm would result in total warming from preindustrial levels of 4°C by 2100. Now, as an important aside, it is quite doubtful one could actually stabilize at 750 ppm, since work by the National Center for Atmospheric Research and the Hadley Center suggests that carbon cycle feedbacks, like the defrosting of the tundra or the die-back of the Amazon rain forest, would release greenhouse gas emissions that would take the planet to much higher levels. This was a key point that I made in my first post, which neither Manzi nor Goklany refuted.

    The last time the earth was 4°C warmer, it was essentially ice-free. Being ice-free would have two rather “nonoptimal” outcomes for humans and the other surviving species. First, long before the two great ice sheets, Greenland and Antarctica, disappeared, we would lose all of the inland glaciers that currently provide most of the water for about a billion people. I’m guessing those folks would probably see that as nonoptimal.

    As for sea-level rise, I defy Manzi or Goklany to find a single serious climate expert who thinks the ice sheets would survive 4°C warming. I will address this point in a separate response to Manzi, but Goklany makes the classic mistake of thinking that stabilizing concentrations at 750 ppm means the climate shifts to a new, static state.

    In fact, the latest scientific research suggests that what would almost certainly happen at those concentrations of CO2 is that, in the second half of this century, sea levels would start rising several inches a decade, which would quickly become 6 to 12 inches a decade. Ultimately, we would probably hit 20 inches a decade, the rate at which sea levels rose during the last deglaciation. (The rate of warming Goklany thinks could be a new optimum for the planet is comparable to the rate of warming that we had at the end of the last Ice Age.) This rate of sea level rise would continue for centuries, although it would probably be marked periodically by large surges in sea levels as various large chunks of the ice sheets disintegrated.

    The point is not whether 450 ppm would optimize human or environmental well-being. The point is to stay as close as is possible to the conditions under which modern civilization developed.

    Goklany asks what the basis is for the claim that 450 ppm — or a maximum of 2°C warming, or a 50% cut in global greenhouse gas emissions by midcentury — is the “correct target” for humanity. The answer is the entire body of work of the IPCC, along with the statement of the National Academies of Sciences of the leading countries in the world, the American Geophysical Union, the Bali Declaration from more than 200 of the world’s leading climate scientists, and on and on.

    Let me end by addressing Goklany’s core point: the classic disingenuous conservative argument that there are better things to spend money on than climate mitigation. This argument assumes 1) that there isn’t enough money to do everything whose benefits exceed its costs, and 2) that conservatives would actually support things like government-funded efforts to deal with “hunger, malaria, unsafe water, and other poverty-related problems.”

    I worry a lot about those problems and think we should be spending a lot more money on them. I have no doubt we would be spending a lot more money on poverty-related problems if it weren’t for conservatives and libertarians in Congress and the White House who consistently oppose such efforts and cut funding for such programs.

    Goklany’s position that there are better things to spend our money on than mitigating climate change is almost identical to the position of Danish statistician Bjørn Lomborg based on the Copenhagen Consensus project. But as one of the authors of the Copenhagen Consensus Project’s principal climate paper, Gary Yohe, very recently wrote, “I can say with certainty that Lomborg is misrepresenting our findings thanks to a highly selective memory”:

    The IPCC’s message is clear: climate change is real, compelling and urgent — and we need a concerted, comprehensive and immediate effort to confront it….

    Lomborg claims that our “bottom line is that benefits from global warming right now outweigh the costs” and that “[g]lobal warming will continue to be a net benefit until about 2070.” This is a deliberate distortion of our conclusions.

    We did find that climate change will result in some benefits for developed countries, but only for modest climate change (up to global temperature increases of 2°C — not the 4 degrees that Lomborg is discussing in his piece). But developed countries are relatively prepared to handle climate change’s effects — they tend to be in colder areas, and they have the infrastructure to mitigate severe depletion of resources like fresh water and arable land.

    That is precisely why our analysis concluded — and Lomborg ignores — that climate change will cause immediate losses for developing countries and the planet’s most vulnerable, millions of whom are already facing challenges that climate change will exacerbate.

    So again, contrary to what Goklany asserts, if we can limit total warming to 2°C — the basis of the 450 ppm target — we could avert the worst outcomes.

    As I made clear in my first post, failing to stabilize near or below 450 ppm risks crossing carbon cycle tipping points that will take us to 750 to 1,000 ppm and warming of 4°C to 6°C. Notwithstanding a few papers that Goklany has dug up, the overwhelming majority of climate scientists recognize such warming as an unmitigated disaster for humanity — and indeed for all species — for generations to come.

    Invest in America

    Four years ago we argued in “The Death of Environmentalism” that greens didn’t need to win the debate over the relative seriousness of global warming in order to enact policies capable of dealing with it. At the time, that claim was viewed as paradoxical and even heretical. But this exchange has provided further evidence for it. While Joe Romm, Jim Manzi, and Indur Goklany cannot agree on even the basic facts of climate change, by focusing on solutions we have been able to identify at least seven principles that we share with Manzi:

    1. Energy and climate policy should be robust to uncertainty, not premised on certainty.
    2. We cannot price, or regulate, our way to a clean energy economy.
    3. Government investment to mitigate global warming is needed.
    4. Many small and large technological breakthroughs are needed to make clean energy cost-competitive with fossil fuels.
    5. Government investment in R&D should aim to bring down the real, unsubsidized price of clean energy.
    6. Making clean energy cheap will require, in Manzi’s words, “complex coordination of many actors, responding to changing circumstances and trade-offs, and decades of effort.”
    7. Advancing the effort to make clean energy cheap will require ongoing technical analysis of specific projects so that public money is well spent.

    Before moving to points of disagreement, it is worth pausing to consider that these principles may open up possibilities for bipartisan legislation on energy and climate if we can avoid old ideological quagmires.

    We see the following as points of disagreement between our position and Manzi’s:

    1. Manzi opposes any carbon tax and any cap-and-trade policy, while we support a modest carbon or electricity tax to fund investment in technology, and/or cap-and-trade as long as it: a) dedicates 100 percent of the revenue raised from auctioning permits to developing and deploying clean energy technology; and b) contains cost-containment measures to prevent the carbon price (and thus the price of energy) from rising so high as to either significantly slow economic growth or trigger a public backlash.
    2. Manzi opposes, while we support, government investment in the deployment of promising clean energy technologies, from wind and solar to carbon capture and storage, as part of a coordinated strategy to make clean energy cheap.
    3. We support an annual public investment in the range of $50 billion, whereas Manzi supports annual investment in the range of $5 billion.

    While the $45 billion difference between Manzi’s position and ours is indeed large, it is also worth recognizing that it is a relatively small amount per capita: roughly $150 per American per year.

    Why do we believe the government should make large investments in deploying clean energy technologies? There are several reasons. Technology breakthroughs in energy often come through deployment, not R&D in laboratories. We cited the Danish experience subsidizing the deployment of offshore wind turbines. Moreover, we pointed out that the capital-intensive nature of the energy sector, where new power plants can cost up to $5 billion, requires larger investments than, say, pharmaceutical or computer R&D efforts. And we pointed to the need for new transmission lines, which alone would require a public investment of $60 billion if we are to tap America’s wind assets and make wind power roughly 20 percent of the power generation sector.

    Rather than addressing these arguments, Manzi re-asserts an over-generalization: “political allocation of economic resources is almost never effective in overcoming long-term challenges.” The statement turns a blind eye to the history of economic development in the United States. In our essay we explicitly challenged Manzi to explain how America could have become so wealthy had it not invested in the railroads, the highways, the electrical grid, the Internet, microchips, the computer sciences, and the biosciences. Manzi never answered because there is no answer.

    It is nonsensical to speak of markets and governments operating separately. The state is what guarantees and protects private property. There is no possibility of extricating government from market activity, nor should we want to try, as government is crucial to regulating fair play in markets through laws and the courts. This is certainly the case with energy, which is crucial to national security and economic development and has thus always been heavily regulated. Attempts to deregulate the power sector often meet with disaster, as in California, where even “deregulation” is really regulation by a different name.

    The question is thus not whether but how the state should be involved. We think the public has a strong interest in making clean energy cheap and helping the United States become a global leader in the new clean energy industries, and that for this to happen we must make the muscular kinds of investments we made in the past in promising sectors from railroads and highways to computers and pharmaceuticals.

    We applaud Manzi’s call for ongoing dialogue informed by a more technical analysis, and we thank the editors of Cato Unbound for creating a forum where these important issues can be freely debated.

    Only Inaction Is Costly

    I think there is no point in further debunking the myth that unrestricted greenhouse gas emissions lead to tolerable impacts (or even a more optimal world). Even the incredibly centrist Brookings Institution now recognizes that:

    The Earth is on a trajectory to warm more than 4.5 degrees Fahrenheit by around mid-century. Exceeding that threshold could trigger a series of phenomena: Arable land will turn into desert, higher sea levels will flood coastal areas….

    Those who have any doubts about what we risk on our current emissions path should read my full discussion, “Is 450 ppm politically possible? Part 0: The alternative is humanity’s self-destruction.”

    I think the more important question to address at the end of our discussion is the myth that action is costly. The IPCC’s Fourth Assessment Report, Working Group III, Summary for Policymakers concludes that stabilizing at 445-535 parts per million atmospheric concentration of CO2-equivalent would reduce average annual GDP growth rates by less than 0.12% per year. And that conclusion was signed off on by every member government, including China, Saudi Arabia, and the Bush administration. Note that this is CO2-equivalent.

    That means stabilizing at 350-440 ppm of CO2 would reduce annual GDP growth by less than 0.12% per year (the sketch below shows what that drag compounds to). Moreover, the report finds:

    … in all analyzed world regions near-term health co-benefits from reduced air pollution as a result of actions to reduce GHG emissions can be substantial and may offset a substantial fraction of mitigation costs (high agreement, much evidence).

    Including co-benefits other than health, such as increased energy security, and increased agricultural production and reduced pressure on natural ecosystems, due to decreased tropospheric ozone concentrations, would further enhance cost savings.

    Integrating air pollution abatement and climate change mitigation policies offers potentially large cost reductions compared to treating those policies in isolation.

    And let’s not even count the enormous benefits of avoiding widespread desertification and catastrophic sea level rise.
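
    To see what a drag of less than 0.12 percentage points on annual growth amounts to by mid-century, here is a simple illustrative calculation; the 2.5% baseline growth rate and the 2008-2050 horizon are illustrative assumptions of mine, not figures from the IPCC report:

    ```python
    # How a <0.12%/yr reduction in GDP growth compounds over decades.
    # ASSUMPTIONS: the 2.5% baseline growth rate and the 2008-2050
    # horizon are illustrative choices, not IPCC figures.

    baseline_growth = 0.025
    mitigation_drag = 0.0012      # the IPCC's upper bound on the growth cost
    years = 2050 - 2008

    gdp_baseline = (1 + baseline_growth) ** years
    gdp_mitigated = (1 + baseline_growth - mitigation_drag) ** years

    shortfall = 1 - gdp_mitigated / gdp_baseline
    print(f"GDP level shortfall in 2050: {shortfall:.1%}")  # roughly 5%
    ```

    Even at the IPCC's upper bound, the mitigated economy of 2050 is only about five percent smaller than the unmitigated one, before counting any of the co-benefits listed above.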

    Two other major recent studies have found an even lower cost of action. First, the McKinsey Global Institute did a comprehensive cost curve for global greenhouse gas reduction measures (reprinted below), which came to the stunning conclusion that the measures needed to stabilize emissions at 450 ppm CO2 have a net cost near zero. A new analysis, “The carbon productivity challenge: curbing climate change and sustaining economic growth,” has its own stunning conclusion:

    In fact, depending on how new low-carbon infrastructure is financed, the transition to a low-carbon economy may increase annual GDP growth in many countries.

    The new analysis explains that “at a global, macroeconomic level, the costs of transitioning to a low-carbon economy are not, in an economic ‘welfare’ sense, all that daunting — even with currently known technologies.” Indeed, 70% of the total 2030 emissions reduction potential (below $60 a ton of CO2 equivalent) is “not dependent on new technology.”

    The final reality is perhaps the most important:

    The macroeconomic costs of this carbon revolution are likely to be manageable, being in the order of 0.6–1.4 percent of global GDP by 2030. To put this figure in perspective, if one were to view this spending as a form of insurance against potential damage due to climate change, it might be relevant to compare it to global spending on insurance, which was 3.3 percent of GDP in 2005. Borrowing could potentially finance many of the costs, thereby effectively limiting the impact on near-term GDP growth. In fact, depending on how new low-carbon infrastructure is financed, the transition to a low-carbon economy may increase annual GDP growth in many countries.

    I am reprinting MGI’s cost curve here:

    [Figure: McKinsey Global Institute cost curve of global greenhouse gas abatement measures.]

    The report notes that “we have been fairly conservative in our assumptions about technological progress in these projections.” For instance, the analysis appears to ignore the enormous potential of concentrated solar thermal electricity entirely (see “Concentrated solar thermal power — a core climate solution”).

    Finally, the normally conservative International Energy Agency (IEA) also makes clear the cost of action is low in its recent report, “Energy Technology Perspectives, 2008” (Exec. Sum. here). In all the scenarios the IEA considers,

    … the estimated total undiscounted fuel cost savings for coal, oil and gas over the period to 2050 are greater than the additional investment required (valuing these fuels at Baseline prices). If we discount at 3%, fuel savings exceed additional investment needs in the ACT Map scenario [in which CO2 emissions in 2050 only return to 2005 levels].

    In the BLUE Map (i.e. 450 ppm) scenario, where CO2 emissions in 2050 go to half of 2005 levels, we get “oil demand actually 27% less than today in 2050.”
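
    The IEA’s discounting point can be made concrete with a toy net-present-value comparison; all the dollar figures below are invented for illustration, and only the 3% discount rate comes from the IEA quote above:

    ```python
    # Toy NPV comparison of an upfront clean-energy investment against a
    # stream of future fuel savings, discounted at the IEA's 3% rate.
    # ASSUMPTIONS: the $100 investment and $4/yr savings are made-up
    # numbers; only the 3% rate comes from the IEA quote above.

    discount_rate = 0.03
    horizon = 42                  # e.g. 2008 through 2050

    investment = 100.0            # hypothetical upfront cost
    annual_savings = 4.0          # hypothetical yearly fuel savings

    npv_savings = sum(annual_savings / (1 + discount_rate) ** t
                      for t in range(1, horizon + 1))
    print(f"Discounted savings: {npv_savings:.0f} vs. investment: {investment:.0f}")
    # Undiscounted, the savings total 168; at 3% they shrink to about 95,
    # which is why the choice of discount rate decides whether fuel
    # savings "exceed" the additional investment.
    ```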

    The report warns of the high cost of inaction:

    Unsustainable pressure on natural resources and on the environment is inevitable if energy demand is not de-coupled from economic growth and fossil fuel demand reduced.

    The situation is getting worse… . [T]oday’s best estimates under our “business as usual” baseline scenario foreshadow a 70% increase in oil demand by 2050 and a 130% rise in CO2 emissions…. a rise in CO2 emissions of such magnitude could raise global average temperatures by 6°C (eventual stabilisation level), perhaps more. The consequences would be significant change in all aspects of life and irreversible change in the natural environment.

    Those who are not aggressively pushing for sharply and rapidly reversing our emissions trend, with a goal of stabilizing at 450 ppm CO2 or less, are simply inviting the self-destruction of modern civilization.

    The Earth Is Okay with a 400-Foot Sea-Level Rise

    Whether or not Goklany is OK with a 250-foot sea level rise (SLR) — thank you, Mr. Romm, for putting words in my mouth! — the earth is OK despite a 400-foot rise over the roughly 18,000 years since the last ice age. This translates into an average SLR of about 22 feet per 1,000 years. The peak rate of SLR was undoubtedly much greater. So one must ask: What were the consequences of such a rapid rate of rise, and what do they tell us about the resilience of the rest of nature?
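
    The average-rate figure is simple division; a minimal sketch, using only the two numbers just mentioned:

    ```python
    # Average rate of post-glacial sea-level rise implied by the text:
    # a 400-foot rise spread over ~18,000 years since the last ice age.

    total_rise_ft = 400
    years_elapsed = 18_000

    rate_per_millennium = total_rise_ft / (years_elapsed / 1_000)
    print(f"Average SLR: {rate_per_millennium:.0f} feet per 1,000 years")  # ~22
    ```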

    Mr. Romm says sea level could rise by 250 feet. But instead of being terrified by climate change (to use his word), let’s try some rational risk analysis.  What is this estimate based upon? Over what period of time is this rise supposed to take place? It makes a difference whether it’s decades, centuries, or millennia. He also confuses a possible geological catastrophe (such as melting of ice sheets) with a real socioeconomic catastrophe.  But a geological catastrophe does not necessarily imply a socioeconomic catastrophe, unless society is immobilized.

    This is what the IPCC’s WG I SPM (p. 17) says about the Greenland Ice Sheet, “If a negative surface mass balance were sustained for millennia, that would lead to virtually complete elimination of the Greenland Ice Sheet and a resulting contribution to sea level rise of about 7 m.” [Emphasis added.] Presumably the same applies to other ice sheets.

    But where is the showing that a negative surface mass balance will, in fact, be sustained for millennia? Even if climate models were perfect — and we know they are not, which is why they are called “models” — one would have to question the validity of any such exercise. What socioeconomic scenario is it based upon? What is assumed regarding the sum total of fossil fuels available to humanity? How long are we assumed to stay primarily reliant on fossil fuels, considering that cheaper renewables are supposedly around the corner, if not here already? How likely is it that such scenarios can be forecast with any confidence beyond a few years, let alone millennia?

    Second, if it’s millennia, as the IPCC says, or even centuries, that gives us ample time to adjust, albeit at substantial socioeconomic cost. But that cost need not be prohibitive or dangerous to life and limb if: (1) the total amount of SLR and, perhaps more importantly, the rate of SLR can be forecast with some confidence; (2) the rate of SLR is slow relative to how fast populations can strengthen coastal defenses and/or relocate; and (3) there are no insurmountable barriers to migration.

    Consider, for example, that Lowe, et al. [in Avoiding Dangerous Climate Change, H.J. Schellnhuber et al. (eds), Cambridge University Press, Cambridge, 2006, pp. 32-33], based on a “pessimistic, but plausible, scenario in which atmospheric carbon dioxide concentrations were stabilised at four times pre-industrial levels,” estimated that a collapse of the Greenland Ice Sheet would over the next 1,000 years raise sea level by 2.3 meters (with a peak rate of 0.5 meters per century). If one were to arbitrarily double that to account for potential melting of the West Antarctic Ice Sheet, that would mean a SLR of ~5 meters in 1,000 years, with a peak rate (assuming the peaks coincide) of 1 meter per century.
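
    The doubling arithmetic, spelled out; a minimal sketch using only the Lowe et al. figures quoted above:

    ```python
    # Doubling Lowe et al.'s Greenland estimate to allow for a possible
    # West Antarctic Ice Sheet contribution, as described in the text.

    greenland_rise_m = 2.3              # over 1,000 years (Lowe et al.)
    greenland_peak_m_per_century = 0.5

    combined_rise_m = 2 * greenland_rise_m            # ~4.6, i.e. ~5 m
    combined_peak = 2 * greenland_peak_m_per_century  # 1 m/century, if
                                                      # the peaks coincide
    print(f"Combined SLR: ~{combined_rise_m:.1f} m over 1,000 years; "
          f"peak ~{combined_peak:.0f} m per century")
    ```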

    Human beings, unless terrified into immobility, can certainly get out of the way of such a sea level rise, especially if they have centuries or a millennium to do so.  In fact, if they can get early warning of such an event they could probably get out of the way in a matter of decades, if not years.

    Can a relocation of such a magnitude be accomplished?

    Consider that the global population increased from 2.5 billion in 1950 to 6.7 billion this year. Among other things, this meant creating the infrastructure for an extra 4.2 billion people in the intervening 58 years (as well as improving the infrastructure for the 2.5 billion counted in the baseline, many of whom barely had any infrastructure in 1950). These improvements occurred at a time when everyone was significantly poorer. Therefore, while relocation will be costly, in theory today’s — and, more to the point, tomorrow’s — wealthier world ought to be able to relocate billions of people to higher ground over the next few centuries, if need be. In fact, once a decision is made to relocate, the cost differential of relocating, say, 10 meters higher rather than a meter higher is probably marginal.  [Oh, I almost forgot — if a restructuring of the U.S. energy infrastructure can create 5 million jobs, imagine how many more jobs such a relocation would create.  Perhaps we should count such a massive effort as a benefit rather than a cost!]
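
    As a rough rate, the historical precedent works out as follows; the division is mine, but both population figures come from the paragraph above:

    ```python
    # Rough rate of historical infrastructure expansion implied by the
    # population figures above (1950-2008).

    pop_1950 = 2.5e9
    pop_2008 = 6.7e9
    years = 2008 - 1950           # 58 years

    added = pop_2008 - pop_1950   # 4.2 billion additional people
    rate_per_year = added / years
    print(f"Infrastructure built for ~{rate_per_year / 1e6:.0f} million "
          f"people per year")     # ~72 million/year, by a poorer world
    ```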

    What about the rest of nature?

    As noted, the sea level has risen 400 feet in the past 18,000 years, and the rest of nature doesn’t seem any the worse for wear.

    So on sea level rise, I’d recommend rational risk analysis to Mr. Romm, because we have nothing to fear but fear itself.

    Second, Romm makes assertions and rhetorical flourishes but doesn’t furnish any proof showing that 450 ppm is necessarily better than 750 ppm.  He says that

    Notwithstanding a few papers that Goklany has dug up, the overwhelming majority of climate scientists recognize such warming as an unmitigated disaster for humanity — and indeed for all species — for generations to come.

    Romm doesn’t have even a few papers to support either his explicit claim that 450 ppm will be better than 750 ppm or his implicit claim that climate change is more urgent than other potential problems. In fact the only studies that have looked at this issue — the Copenhagen Consensus and my own work — show that climate change is, for the foreseeable future, not even a close second in terms of either impacts on human well-being or policies to advance human well-being.  Even if unmitigated climate change is an unmitigated disaster, the analyses that exist indicate that there are other disasters that are larger in scope, more imminent than climate change, and more solvable. As always, we should address first things first, and that does not include mitigation that extends beyond “no regrets.” (For more on this, see my recent paper for Cato.)

    While on this topic, I reproduce below a couple of tables that indicate that malaria and hunger could be worse under 550 ppm than 750 ppm out to the 2080s, assuming one has any confidence in the results out to that date. I also provide the sources and the authors’ affiliations, highlighting the names of authors who were drafters of the IPCC WG II’s latest Summary for Policymakers, so that Mr. Romm recognizes that they are not “skeptics,” are in good standing with the IPCC, and are associated with institutions that Mr. Romm should find acceptable.

    [Table not reproduced: risk of hunger under alternative CO2 stabilization scenarios; see source below.]

    Source: Martin Parry [1], Cynthia Rosenzweig [2], and Matthew Livermore [3] (2005). “Climate change, global food supply and risk of hunger.” Philosophical Transactions of the Royal Society B 360: 2125-2138.

    [1] Hadley Centre, UK Meteorological Office, Fitzroy Road, Exeter EX1 3PB, UK; [2] Goddard Institute for Space Studies, 2880 Broadway, New York NY 10025, USA; [3] Climatic Research Unit, University of East Anglia, Norwich NR4 7TJ, UK.

    [Table not reproduced: impacts of CO2 stabilisation, including malaria risk; see source below.]

    Source: N. W. Arnell [1], M. G. R. Cannell [2], M. Hulme [3], R. S. Kovats [4], J. F. B. Mitchell [5], R. J. Nicholls [6], M. L. Parry [7], M. T. J. Livermore [8], and A. White [9] (2002). “The consequences of CO2 stabilisation for the impacts of climate change.” Climatic Change 53: 413-446.

    [1] Department of Geography, University of Southampton, U.K.; [2] NERC Centre for Ecology and Hydrology, Edinburgh, U.K.; [3] Tyndall Centre for Climate Change Research, University of East Anglia, U.K.; [4] Centre on Globalisation, Environmental Change, and Health, London School of Hygiene and Tropical Medicine, U.K.; [5] Hadley Centre for Climate Prediction and Research, U.K.; [6] Flood Hazard Research Centre, Middlesex University, U.K.; [7] Jackson Environment Institute, University of East Anglia, U.K.; [8] Climatic Research Unit, University of East Anglia, U.K.; [9] Department of Mathematics, Heriot-Watt University, U.K.

    (For those who are convinced that nothing is more important than climate change, look at the “no climate change” and “unmitigated” columns, and note that impacts of unmitigated climate change are overshadowed by the impacts of other factors.)

    So if 550 ppm isn’t necessarily better than 750 ppm, how can we be confident 450 will be better than 550, or 750?  Mr. Romm, show us your analysis that indicates 750 ppm is indeed worse than 450 ppm.  Or is this a matter of faith?

    Third, let’s also look at the specific IPCC quote that Mr. Romm furnishes us with: “As global average temperature increase exceeds about 3.5°C [relative to 1980 to 1999], model projections suggest significant extinctions (40-70% of species assessed) around the globe.” Note that it doesn’t say “predictions” but “model projections.”  That itself speaks volumes. Also note that it doesn’t say “prove” but “suggest.” So, Mr. Romm, tell us: how good are these model projections? What are the uncertainties associated with the models? Have the model results been verified and validated in the real world using “out of sample” data? And how did they perform?

    In fact, a close reading of the IPCC’s WG II Chapter 4 (on ecosystems) indicates that model projections are fraught with uncertainties, particularly if one considers that: (a) biophysical models use as inputs uncertain results from climate models, which become even more uncertain at the relatively small geographical scales needed to model effects on vegetation, habitat, and species; and (b) biophysical models themselves are incomplete and imperfect. See also Botkin et al. 2007 [pdf].

    Fourth, Mr. Romm provides us with a figure which purports to show us the “optimum” temperature window.  But this figure doesn’t show us anything of the sort.  Even assuming that the figure is valid — it has no error bars, or any real provenance other than it comes from a source that Romm trusts — all it shows is that temperature may have stayed within a relatively narrow range.  But we don’t know that this range is optimal for humanity.  In fact, combining our knowledge of human history with this figure suggests that humanity is better off when things are warmer!  Things were good during the warmth of the Holocene Optimum when Mesopotamia flourished, they were good during the Medieval Warm Period when Vikings inhabited Greenland, and things have never been better than they are today, even though today is warmer than the norm. (The last point is documented in my book, The Improving State of the World: Why We’re Living Longer, Healthier, More Comfortable Lives on a Cleaner Planet.)  Now, although correlation isn’t causation, what in this figure gives Romm the notion that some additional warmth will necessarily be worse?

    In fact, the Yohe paper that Romm cites suggests that additional warming of up to 2°C may on the whole be a net benefit to humanity. Moreover, like other studies, it seems not to fully consider increases in adaptive capacity and secular technological change, consideration of which would reduce future damages from climate change, raise the temperature threshold beyond which climate change results in net global losses, and reduce the benefit-cost ratio for mitigation.

    Romm’s Factually Challenged Smear of Conservatives

    Joseph Romm asserts:

    Goklany’s core point: the classic disingenuous conservative argument that there are better things to spend money on than climate mitigation. This argument assumes 1) there isn’t enough money to do everything whose benefits exceed their cost and 2) conservatives would actually support things like government-funded efforts to deal with “hunger, malaria, unsafe water, and other poverty related problems.”

    I worry a lot about those problems and think we should be spending a lot more money on them. I have no doubt we would be spending a lot more money on poverty-related problems if it weren’t for conservatives and libertarians in Congress and the White House who consistently oppose such efforts and cut funding for such programs.

    Mr. Romm labels me a conservative — well, if being concerned about human well-being is conservative, I plead guilty, although I’m probably worse: I’m more libertarian than conservative.  No matter, let’s move to more substantive matters.

    First, no one is against spending on actions if benefits exceed costs. Read my original post, Mr. Romm.  It says, “… mitigation (and R&D to expand mitigation options) makes sense so long as its implementation is neither mandatory nor subsidized… ” What this means in plain English is that mitigation makes sense so long as benefits exceed costs (because if they didn’t, one wouldn’t need mandates or subsidies).  In fact, my conclusions are entirely founded on the principle that benefits should exceed costs. I don’t assume that there are better things to spend money on than aggressive mitigation; I prove it in my original post. The problem for Mr. Romm is that most economic analyses of climate change show that benefits don’t exceed costs for aggressive mitigation (see Manzi’s comments, for instance).  And it seems that even the Yohe et al. study he cites also recommends relatively modest mitigation (coupled with adaptation and R&D).

    Second, Mr. Romm smears conservatives with the factually challenged implication that conservatives don’t actually support efforts to deal with hunger, malaria, unsafe water, and other poverty-related problems, and that conservatives and libertarians “consistently oppose such efforts and cut funding for such programs.”

    What are the facts?

    This is what the conservative Washington Post said in an editorial titled “Spreading Hope,” on 17 August 2008:

    IN HIS 2003 State of the Union address, President Bush surprised many when he proposed to take the fight against AIDS to Africa. At the time, slowing the spread of the disease seemed quixotic, particularly on a continent where only about 50,000 of the 30 million infected people received antiretroviral treatment. But Mr. Bush’s proposed “work of mercy beyond all current international efforts” has had a profound impact. After five years and $15 billion, 1.7 million people are receiving treatment. Encouraged by the progress, countries in sub-Saharan Africa have spent more of their own money to combat HIV-AIDS. The disease still ravages millions of Africans, but it is no longer an automatic death sentence.

    On July 30, Mr. Bush signed into law a bill that would triple funding for programs that fight HIV-AIDS, malaria and tuberculosis. The bill also repealed a ban on HIV-positive visitors and immigrants to the United States. Unfortunately, key congressional subcommittees have approved funding at levels below those set in the bill.

    And here’s a report from the conservative New York Times:

    Under Mr. Bush, aid to Africa has risen to more than twice the level of any previous administration and more than triple that achieved during the Clinton administration, according to an analysis by the Center for Global Development, a nonprofit research group in Washington.

    The New York Times could also have added that since the U.S. no longer precludes use of its funds for procuring DDT — the most cost-effective malarial control in many situations — these funds also go farther.

    One of the secrets about malaria and hunger is that while many have been obsessing over climate change, supposedly in part because it might contribute to hunger and malaria (among other things), conservatives and libertarians have been in the forefront of battling these very problems worldwide. They played a big part in ensuring that environmental groups did not use the Stockholm Convention to preclude the use of DDT to combat public health problems, and in getting the World Health Organization to actively support its use for malaria control. Similarly, conservatives and libertarians are pushing to remove the stigma on biotechnology and genetically modified crops that retards progress on feeding the world more fully and more nutritiously with fewer chemicals and pesticides and less disturbance of soil (which, in fact, would reduce greenhouse gas emissions). Moreover, such crops can be the basis for adapting agriculture to climate change, which is necessary if we are to contain hunger in the future. (See The Improving State of the World: Why We’re Living Longer, Healthier, More Comfortable Lives on a Cleaner Planet, Chapter 9.)

    Also, I find curious Mr. Romm’s suggestion that programs to deal with hunger, malaria, and other poverty-related problems should have to be “government-funded.” Why is it necessary that these efforts be government-funded?  In fact, the most successful poverty reduction programs have involved not government funding, but government loosening its grip on the economy, that is, less government intrusion. Mainly because of economic growth, the proportion of the developing world’s population living in absolute poverty (that is, subsisting on a dollar a day), declined from 52% in 1981 to 26% in 2005. (See this World Bank report.)

    China is the prime example. Despite population growth, there were 600 million fewer Chinese living in absolute poverty (based on consumption) in 2005 compared to 1981, largely due to double-digit economic growth year after year, unleashed by the partial liberalization of its economy starting in 1979. And, yes, its economic growth was fueled by fossil fuels. By contrast, despite spending over $2 trillion in five decades, aid programs have much less to show in terms of poverty reduction — or its ancillary benefits, e.g., reductions in hunger and disease, better health care and education, and greater adaptive capacity to deal with climate change and natural disasters — than does fossil fuel-powered economic development.
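
    To illustrate what sustained double-digit growth compounds to over the period just cited, a hypothetical sketch; the flat 10% rate is a stand-in for “double-digit growth,” not China’s actual year-by-year path:

    ```python
    # What "double-digit economic growth year after year" compounds to.
    # ASSUMPTION: a flat 10% annual rate stands in for "double-digit";
    # China's actual year-by-year growth path is not given in the text.

    growth_rate = 0.10
    years = 2005 - 1981           # the 24-year window cited above

    multiple = (1 + growth_rate) ** years
    print(f"Economy grows roughly {multiple:.0f}-fold over {years} years")
    # ~10x at 10%/yr: the scale of expansion behind the poverty decline.
    ```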

    It does no good to “worry a lot about these problems and think we should be spending a lot more money on them” so long as one insists on grand but ineffective gestures, such as aggressive mitigation and government-funded programs, when cheaper, quicker, and more effective remedies are available and can be implemented immediately even as we prepare for the future.

    Common Ground

    I’d like to thank Cato Unbound for convening such an extended exchange of views on how to deal with climate change, and for giving me a platform for expressing my views on this subject.  I’d also like to thank Joseph Romm, Indur Goklany, Michael Shellenberger, and Ted Nordhaus for their extensive efforts in considering and responding to my essay and subsequent comments.  It’s always inspiring to me to see people who’ve devoted so much time, work, and intellect to analyzing hard problems.

    Mr. Romm and I in particular have disagreed quite directly about the likely impacts of carbon dioxide emissions, and I’ll just refer interested readers to the series of detailed exchanges between us, and ask them to draw their own conclusions.  Rather than use this closing comment to respond to the last round of replies by each contributor (it seems kind of unfair that I would get both “first ups” and “last licks,” as we used to say in Little League), I’d like to try to establish what I think is common ground between us.  I think that vigorous but respectful and fact-based disagreement is almost always a precondition for practical progress on complicated issues, but that ultimately some consensus needs to be achieved to get anything done.

    It seems to me that all contributors believe that anthropogenic global warming is real and poses a serious risk.  We all agree that an R&D program of the type that I have proposed is a component of a solution, and I hope that we all can get behind this idea.  I think that we would all support adaptation to weather problems that may arise as a wise investment of resources.  Most adaptation measures have the advantage that, in comparison with R&D or mitigation efforts, they can be executed in fairly short order and only in response to problems as they become manifest, and hence would likely have very attractive cost-benefit ratios.  Finally, I think that we would all agree that the ongoing efforts to analyze physical and economic trade-offs involved in various proposals through the IPCC and similar bodies are valuable and should be supported.   (In fact, I would like to see such processes incorporate case-by-case analyses of the kinds of incremental R&D/technology-deployment ideas that Messrs. Shellenberger and Nordhaus have proposed).  Improved science, along with increased structure and rigor in the debate of its implications, should enable further progress.