The most comprehensive modeling yet carried out of how much hotter the Earth's climate is likely to get in this century shows that, without rapid and massive action, the problem will be roughly twice as severe as estimated six years ago - and could be even worse than that.
The study uses the MIT Integrated Global Systems Model, a detailed computer simulation of global economic activity and climate processes that has been developed and refined by the Joint Program on the Science and Policy of Global Change since the early 1990s. The new research involved 400 runs of the model, each using slight variations in input parameters selected so that every run has about an equal probability of being correct, based on present observations and knowledge. Other research groups have estimated the probabilities of various outcomes based on variations in the physical response of the climate system itself. But the MIT model is the only one that also interactively includes detailed treatment of possible changes in human activities - such as the degree of economic growth, with its associated energy use, in different countries.
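To make the shape of such an ensemble experiment concrete, here is a minimal Python sketch of the general pattern: many runs, each with slightly different, roughly equally plausible inputs. It is emphatically not the IGSM; the toy_climate_model function, the parameter names and the sampling ranges are all hypothetical placeholders.

```python
import random

def toy_climate_model(climate_sensitivity, ocean_uptake, emissions_growth):
    """Return a made-up warming-by-2100 figure (degrees C) for one run."""
    # Crude placeholder relationship, for illustration only; the real model
    # couples detailed economic and earth-system components.
    return climate_sensitivity * emissions_growth / ocean_uptake

def run_ensemble(n_runs=400, seed=0):
    """Run the toy model n_runs times with randomly varied inputs."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_runs):
        # Each uncertain input is drawn from a plausible (hypothetical) range,
        # so no single draw is strongly favored over another.
        params = {
            "climate_sensitivity": rng.uniform(2.0, 4.5),  # deg C per CO2 doubling
            "ocean_uptake": rng.uniform(0.8, 1.6),         # relative heat/CO2 uptake
            "emissions_growth": rng.uniform(1.0, 2.0),     # relative emissions path
        }
        outcomes.append(toy_climate_model(**params))
    return outcomes

if __name__ == "__main__":
    warming = run_ensemble()
    print(f"{len(warming)} runs; first outcome: {warming[0]:.2f} deg C")
```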
Study co-author Ronald Prinn, co-director of the Joint Program and director of MIT's Center for Global Change Science, says that when it comes to global warming it is important "to base our opinions and policies on the peer-reviewed science." And in the peer-reviewed literature, the MIT model, unlike any other, looks in great detail at the effects of economic activity coupled with the effects of atmospheric, oceanic and biological systems. "In that sense, our work is unique," he says.
The new projections, published this month in the American Meteorological Society's Journal of Climate, indicate a median projected surface warming of 5.2 degrees Celsius by 2100, with a 90% probability range of 3.5 to 7.4 degrees. That compares with a median projected increase of just 2.4 degrees in the 2003 study. The difference is caused by several factors rather than any single big change. Among these are improved economic modeling and newer economic data showing less chance of low emissions than had been projected in the earlier scenarios. Other changes include accounting for the past masking of underlying warming by the cooling induced by 20th century volcanoes, and for emissions of soot, which can add to the warming effect. In addition, measurements of deep ocean temperature rises, which enable estimates of how fast heat and carbon dioxide are removed from the atmosphere and transferred to the ocean depths, imply lower transfer rates than previously estimated.
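As a sketch of how a median and a 90% range can be read off such an ensemble of outcomes, the snippet below uses a simple nearest-rank percentile rule, an assumption for illustration rather than the paper's exact procedure, and feeds it synthetic numbers rather than the study's actual run results.

```python
import random
import statistics

def summarize_ensemble(outcomes, lower_pct=5.0, upper_pct=95.0):
    """Median and central 90% range of an ensemble of warming outcomes."""
    ordered = sorted(outcomes)
    n = len(ordered)
    # Nearest-rank percentiles; adequate for a 400-member ensemble.
    lo = ordered[min(n - 1, max(0, round(n * lower_pct / 100) - 1))]
    hi = ordered[min(n - 1, max(0, round(n * upper_pct / 100) - 1))]
    return statistics.median(ordered), (lo, hi)

if __name__ == "__main__":
    # Synthetic stand-in outcomes (degrees C), NOT the study's runs.
    rng = random.Random(1)
    fake_outcomes = [rng.gauss(5.2, 1.2) for _ in range(400)]
    median, (low, high) = summarize_ensemble(fake_outcomes)
    print(f"median {median:.1f} C, 90% range {low:.1f} to {high:.1f} C")
```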
Prinn says these and a variety of other changes based on new measurements and new analyses changed the odds on what could be expected in this century in the "no policy" scenarios - that is, where there are no policies in place that specifically induce reductions in greenhouse gas emissions. The changes "unfortunately largely summed up all in the same direction," he says. "Overall, they stacked up so they caused more projected global warming."
While the outcomes in the "no policy" projections now look much worse than before, there is less change from previous work in the projected outcomes if strong policies are put in place now to drastically curb greenhouse gas emissions. Without action, "there is significantly more risk than we previously estimated," Prinn says. "This increases the urgency for significant policy action."
To illustrate the range of probabilities revealed by the 400 simulations, Prinn and the team produced a "roulette wheel" that reflects the latest relative odds of various levels of temperature rise. The wheel provides a very graphic representation of just how serious the potential climate impacts are.
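In spirit, such a wheel is a binned histogram of the ensemble: the fraction of runs landing in each temperature band sets the size of that band's slice. The sketch below shows only that binning step, with hypothetical band edges and synthetic outcomes; the actual wheel's categories and odds come from the study itself.

```python
import random

def bin_odds(outcomes, edges):
    """Fraction of ensemble runs falling in each temperature band."""
    bands = list(zip(edges[:-1], edges[1:]))
    total = len(outcomes)
    # Runs outside the outermost edges are simply not counted here.
    return {
        f"{lo}-{hi} C": sum(1 for x in outcomes if lo <= x < hi) / total
        for lo, hi in bands
    }

if __name__ == "__main__":
    # Synthetic outcomes and hypothetical band edges, for illustration only.
    rng = random.Random(2)
    outcomes = [rng.gauss(5.2, 1.2) for _ in range(400)]
    for band, share in bin_odds(outcomes, edges=[3, 4, 5, 6, 7, 8]).items():
        print(f"{band}: {share:.0%}")
```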
"There's no way the world can or should take these risks," Prinn says. And the odds indicated by this modeling may actually understate the problem, because the model does not fully incorporate other positive feedbacks that can occur, for example, if increased temperatures caused a large-scale melting of permafrost in arctic regions and subsequent release of large quantities of methane, a very potent greenhouse gas. Including that feedback "is just going to make it worse," Prinn says.
The lead author of the paper describing the new projections is Andrei Sokolov, research scientist in the Joint Program. Other authors, besides Sokolov and Prinn, include Peter H. Stone, Chris E. Forest, Sergey Paltsev, Adam Schlosser, Stephanie Dutkiewicz, John Reilly, Marcus Sarofim, Chien Wang and Henry D. Jacoby, all of the MIT Joint Program on the Science and Policy of Global Change, as well as Mort Webster of MIT's Engineering Systems Division and D. Kicklighter, B. Felzer and J. Melillo of the Marine Biological Laboratory at Woods Hole.
Prinn stresses that the computer models are built to match the known conditions, processes and past history of the relevant human and natural systems, and the researchers are therefore dependent on the accuracy of this current knowledge. Beyond this, "we do the research, and let the results fall where they may," he says. Since there are so many uncertainties, especially with regard to what human beings will choose to do and how large the climate response will be, "we don't pretend we can do it accurately. Instead, we do these 400 runs and look at the spread of the odds."
Because vehicles last for years, and buildings and power plants last for decades, it is essential to start making major changes through adoption of significant national and international policies as soon as possible, Prinn says. "The least-cost option to lower the risk is to start now and steadily transform the global energy system over the coming decades to low or zero greenhouse gas-emitting technologies."
This work was supported in part by grants from the Office of Science of the U.S. Dept. of Energy, and by the industrial and foundation sponsors of the MIT Joint Program on the Science and Policy of Global Change.
A version of this article appeared in MIT Tech Talk on May 20, 2009.