The World’s Hottest Super-Models

How scientists are using supercomputers to model climate change.

March 22, 2007
David Erickson (left) shows colleague José Hernandez a climate model map. [CREDIT: OAK RIDGE NATIONAL LABORATORY]

Just south of Tokyo, in Yokohama, a hangar-sized building hums and whirs as its occupants work endlessly, day and night. They are busy with the daunting task of projecting the next 300 years of Earth’s climate. But not a single human being occupies this colossal, buzzing space. It is instead the home of a supercomputer and its many giant processors: climate scientists’ very own version of a cloudy crystal ball.

A sobering projection of Earth’s future, released by the Intergovernmental Panel on Climate Change in February, owes its bleakness to calculations processed by Yokohama’s supercomputer – called the Earth Simulator – along with 23 other such mammoth machines around the world.

Rising temperatures and sea levels, extreme heat waves and heavy precipitation are all part of the IPCC’s projections for the next century. The panel’s latest report, its fourth since it was established in 1988, is its most severe assessment yet of global warming’s likely effects.

To make projections about future climate, scientists use supercomputers to process models built from equations of physics and thermodynamics, covering everything from humidity in the atmosphere to sea ice formation. Groups from 11 countries generated the climate simulations behind the assessment’s projections, including its prediction that the planet’s average surface temperature is very likely to increase by between 3.2 and 7.2 degrees Fahrenheit by the mid to late 21st century.

Climate models aren’t perfect predictors, but they “are the only thing we’ve got,” said Gerald Meehl, a climate modeler at the National Center for Atmospheric Research in Boulder, Colo. and a lead author of the IPCC report. “They’re the only way we can come up with any kind of idea of what could happen.”

One way to understand how climate models work is to look at the way meteorologists produce a ten-day weather forecast.

To create a weather model used for the forecast, scientists divide the surface of the Earth into thousands of grid boxes using latitude and longitude lines, explained Jagadish Shukla, a climatologist at George Mason University in Virginia and one of the 800 authors of the IPCC report. Each box has approximately 30 vertical layers delineating the ocean, land mass and atmosphere. At every horizontal and vertical intersection of the grid, scientists write several equations that calculate how variables such as temperature, air velocity, humidity and pressure will change over a short period of time – often ten minutes. Once they have the data for ten minutes into the future, they use that data to calculate the next ten minutes, and the next ten, until they have data for ten days.

“Few people realize that millions of equations must be solved to provide a simple weather forecast,” Shukla said.
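The stepping procedure Shukla describes can be sketched in code. The toy below is purely illustrative, not any real model’s physics: it marches a single variable, temperature, forward in ten-minute steps on a very coarse latitude/longitude grid, where each box simply relaxes toward the average of its neighbors. The grid size, the update rule, and the relaxation factor are all assumptions made for the sketch.

```python
import numpy as np

N_LAT, N_LON = 18, 36        # 10-degree grid boxes (real models are far finer)
STEP_MINUTES = 10            # the short time increment the article describes
FORECAST_DAYS = 10

# Initial state: temperatures in degrees Celsius at every grid box.
temps = 15.0 + 10.0 * np.random.rand(N_LAT, N_LON)

def step(t):
    """One ten-minute update: each box drifts toward its neighbors' mean."""
    neighbors = (np.roll(t, 1, axis=0) + np.roll(t, -1, axis=0) +
                 np.roll(t, 1, axis=1) + np.roll(t, -1, axis=1)) / 4.0
    return t + 0.1 * (neighbors - t)

# Ten minutes at a time, reuse each result to compute the next, out to ten days.
n_steps = FORECAST_DAYS * 24 * 60 // STEP_MINUTES   # 1,440 steps
for _ in range(n_steps):
    temps = step(temps)
```

Even this cartoon version updates 648 grid boxes 1,440 times; a real model solves several coupled equations at every horizontal and vertical grid intersection, which is where Shukla’s “millions of equations” come from.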

While the level of complexity required to make weather forecasts is impressive, climate models require millions of additional calculations because they cover longer periods of time and include many more slowly changing components of the Earth, such as how much moisture is in the soil, dynamic ocean patterns such as El Niño and the amount of greenhouse gases in the atmosphere. To run a 100-year climate model, supercomputers may labor for an entire month.
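A back-of-envelope calculation, using illustrative numbers rather than figures from the article, shows why the climate case is so much heavier even before the extra slow-moving components are counted:

```python
STEP_MINUTES = 10
steps_per_day = 24 * 60 // STEP_MINUTES      # 144 ten-minute steps per day

forecast_steps = 10 * steps_per_day          # a ten-day weather forecast
climate_steps = 100 * 365 * steps_per_day    # a 100-year climate run

print(forecast_steps)                        # 1440
print(climate_steps)                         # 5256000
print(climate_steps // forecast_steps)       # 3650
```

At the same time step, a century-long run takes thousands of times as many updates as a ten-day forecast, and each update in a climate model also carries more variables.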

Yokohama’s Earth Simulator can run climate models with grid points spaced as close as 20 kilometers apart. When constructed in 2002, it was the world’s most powerful computer; faster and larger supercomputers now exist in the United States, but the Earth Simulator remains “the one center where most of its work is dedicated to climate research,” Meehl said. “No other place in the world is working at its level.”

To ensure that the climate simulations for the future are fairly precise, climatologists use data from the start of the 20th century and simulate that century’s climate. “Our hypothesis is if the model was good at projecting the climate in the last century, the model will be good for the next 100 years,” Shukla said.
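This hindcast check can be sketched as a simple skill test. Everything here is made up for illustration: the “observations,” the stand-in model output, and the error threshold are hypothetical, standing in for real 20th-century temperature records and simulation output.

```python
import numpy as np

years = np.arange(1900, 2001)

# Hypothetical "observed" 20th-century global mean temperature (deg C):
# a slow warming trend plus some decadal wiggle.
observed = 13.8 + 0.007 * (years - 1900) + 0.1 * np.sin(years / 5.0)

# Hypothetical model hindcast of the same period.
simulated = 13.8 + 0.0065 * (years - 1900)

# Score the hindcast: root-mean-square error against observations.
rmse = float(np.sqrt(np.mean((simulated - observed) ** 2)))

# Arbitrary illustrative threshold: trust the model's century-scale
# projections only if it reproduced the last century this closely.
trusted_for_projection = rmse < 0.2
```

A model that cannot reproduce the century already observed earns no trust for the century ahead; one that can, under Shukla’s hypothesis, does.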

“Climate model-based studies are more like ‘what if’ stories – for example, ‘what’s the climate likely to do if our carbon dioxide emissions change in such and such a way?’” explained Rich Wolfson, a professor of climate change at Middlebury College in Vermont. “Then they become forecasts of what will happen.”

Other Articles In Our Climate Change Series:

The Other Greenhouse Gas: An often ignored greenhouse gas makes predicting climate even more uncertain.

Polar Regions Lose Their Shine: Melting snow and ice allow global warming to gain more ground.

Taking the Climate’s Temperature: How scientists measure the sensitivity of our climate.

