
Calculating the Future

Climate researchers have no shortage of scientific issues on which to expend computer power. The biggest problem is choosing which one to tackle first.

If you’re using a computer to solve a scientific problem, it goes without saying that additional computer power will help you solve it faster and more accurately. Or will it? For the community of researchers who use vast computer models to simulate Earth’s climate in all its glorious intricacy, greater computational capacity is always welcome, but choosing where to apply that power can be contentious. Is it better to compute existing models in finer detail, or to make the models bigger by adding more scientific content? There’s no single best answer to that conundrum, and in practice the research community pursues as wide a variety of goals as it can, in the hope that a consensus will eventually emerge.

Today’s so-called General Circulation Models (GCMs) include interlinked components that attempt to capture the behavior of atmosphere, oceans, sea ice, and land surface in determining Earth’s climate. In computational terms, a GCM is essentially an enormous and intricately interlinked collection of ordinary and partial differential equations that calculate air and ocean currents and their associated heat flows; the absorption of the sun’s heat (which depends on cloud cover and the amount of snow and ice covering the planet’s surface, among other things); the radiation of heat from land and sea ice back into the atmosphere; humidity and precipitation; and a great deal more. Typically, these models cover the planet’s surface by calculating at grid points spaced approximately 100 kilometers apart, and divide the atmosphere, up to a height of some 15 kilometers, into perhaps 20 layers. From a global perspective, with Earth’s total surface area amounting to just more than a half-billion square kilometers, that’s a lot of grid points, but it takes no scientific expertise to understand that weather conditions can vary significantly across hundred-kilometer distances. As a result, many medium-scale phenomena in current GCMs cannot be calculated directly but must be dealt with by “parameterization,” meaning that important aspects of small-scale physics are in essence approximated and averaged over grid cells.
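
For a rough sense of what those numbers imply, the back-of-the-envelope count below (a sketch only; real model grids are laid out far less naively) simply multiplies the figures quoted above: Earth’s surface area, a 100-kilometer horizontal spacing, and roughly 20 vertical layers.

    # Back-of-the-envelope count of grid cells in a typical GCM configuration.
    # The surface area, 100 km spacing, and 20 layers come from the figures
    # quoted above; the uniform-grid layout itself is purely illustrative.
    EARTH_SURFACE_KM2 = 510e6  # just over half a billion square kilometers

    def gcm_cell_count(horizontal_spacing_km: float, vertical_layers: int) -> int:
        """Approximate cell count for a uniform global grid."""
        columns = EARTH_SURFACE_KM2 / horizontal_spacing_km ** 2
        return round(columns * vertical_layers)

    for spacing_km in (100, 50, 1):
        print(f"{spacing_km:>4} km spacing, 20 layers: ~{gcm_cell_count(spacing_km, 20):,} cells")

At 100-kilometer spacing that works out to roughly a million cells; halving the spacing quadruples the count, and a one-kilometer grid pushes it into the billions, which is why resolution is such an expensive place to spend computing power.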

An obvious use of greater computer power is to decrease the distance between grid points. That’s particularly valuable in ocean modeling, says Ben Kirtman of the Rosenstiel School of Marine and Atmospheric Science at the University of Miami, because calculating on a grid spacing of a few kilometers would directly capture important heat and current flows, without parameterization. Kirtman is working with a project recently funded by the National Science Foundation to apply petascale computing capacity (10¹⁵ floating point operations per second) to the analysis of ocean-atmosphere interactions. He cites the example of tropical instability waves in the eastern Pacific Ocean as a medium-scale marine phenomenon that climate scientists “originally thought the atmosphere didn’t care about.” Higher-resolution calculations show, however, that these instability waves, along with mesoscale ocean eddies measuring 10 kilometers or so across, profoundly influence not only how heat mixes both horizontally and vertically within the ocean, but also how heat is exchanged between ocean and atmosphere. The eastern Pacific Ocean-atmosphere interaction, Kirtman explains, in turn feeds into the year-to-year evolution of the well-known El Niño-Southern Oscillation, demonstrating that regional calculations on the kilometer scale are crucial to a better understanding of globally significant phenomena.


Atmospheric Challenges

The atmosphere presents more difficult problems. Different GCMs are often compared in terms of the average global temperature increase they predict for a doubling of atmospheric carbon dioxide. That figure ranges from approximately 1.5° C to 4.5° C, and much of the variation between models stems from the different ways they parameterize fine-scaled atmospheric features such as convection and cloud cover. Higher-resolution calculations will do much to clarify convective and turbulent flows, says Jerry Meehl of the Climate and Global Dynamics Division at the National Center for Atmospheric Research (NCAR) in Boulder, CO, but clouds are more complicated. Clouds reflect sunlight from above but trap heat rising from below, so their net effect on climate depends on details of cloud composition and structure that current models struggle to depict. Typically, models allocate some percentage of various cloud types to each grid cell, and allow some randomized overlap of cloud layers at different altitudes. But the biggest obstacle to more accurate modeling, says Meehl, has been a lack of detailed observations of the way clouds literally stack up in the atmosphere. In this case, increased computer power will only be useful if it is coupled to better physical data on cloud structure and properties that can be used to refine cloud simulations. “The cloud community now is as excited as I’ve ever seen them,” Meehl says, because satellites are beginning to provide just the type of detailed 3D data that modelers need.
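
How much the overlap assumption alone matters can be seen in a minimal sketch. The two rules below are textbook simplifications (random overlap and maximum overlap), not the parameterization of any particular model, and the layer fractions are invented for illustration.

    # Illustrative only: total cloud cover of one grid column under two
    # simplified overlap assumptions. Real schemes are more elaborate.
    from math import prod

    def total_cover_random(layer_fractions):
        """Random overlap: cloud layers are treated as statistically independent."""
        return 1.0 - prod(1.0 - f for f in layer_fractions)

    def total_cover_maximum(layer_fractions):
        """Maximum overlap: layers are assumed to stack directly above one another."""
        return max(layer_fractions)

    layers = [0.2, 0.3, 0.1]  # hypothetical cloud fractions at three altitudes
    print(f"random overlap:  {total_cover_random(layers):.2f}")   # about 0.50
    print(f"maximum overlap: {total_cover_maximum(layers):.2f}")  # 0.30

The same three layers shade anywhere from 30% to 50% of the column depending only on how they are assumed to line up, which is exactly the kind of uncertainty that better observations of cloud structure can narrow.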

The steadily increasing resolution of GCMs is blurring the already fuzzy distinction between weather and climate. Researchers are beginning to calculate models with 50-kilometer resolution over periods of decades, enabling them to see how climate change might affect the frequency and intensity of extreme storms or the statistics of droughts. Such information, rather than the more abstract concept of global average temperature, starkly conveys the tangible consequences of global warming.

In addition to using computing power to calculate on an ever-finer scale, climate researchers can always think of more science to put into their simulations. Historically, the growth of computational capacity allowed researchers to integrate previously separate models of ocean, atmosphere, sea ice, and land, and that trend continues on a number of fronts. At the moment, for example, atmospheric carbon dioxide concentration is applied to climate models as an external parameter, derived from the work of scientists who add up emissions from tailpipes and smokestacks and, taking into account the natural processes that absorb and release the gas, try to estimate how much CO2 will be in the atmosphere 10, 20, or more years from now. But this approach misses crucial feedbacks. Changing ocean temperatures affect how well the oceans hold dissolved CO2, while changes in the world’s vegetation cover, due to a warming climate, influence the amount of carbon that ends up in the atmosphere rather than being taken up by biomass. Climate modelers are beginning to integrate parts of this complex network of feedbacks into GCMs, so that ultimately they will be able to input human CO2 emissions directly into the models and allow the computer to figure out where it all ends up, and how that disposition changes in a changing climate.
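
The structural difference, prescribing CO2 as an input versus letting the model compute it, can be caricatured in a few lines. Every coefficient below is invented purely for illustration and has nothing to do with any GCM’s actual carbon-cycle scheme.

    # Toy illustration only: once uptake by ocean and land responds to warming,
    # atmospheric CO2 becomes an output of the simulation instead of an input.
    def airborne_fraction(warming_c: float) -> float:
        """Hypothetical share of emitted CO2 that stays in the air,
        creeping upward as warming weakens natural uptake."""
        return min(1.0, 0.45 + 0.03 * warming_c)

    co2_ppm = 390.0      # illustrative starting concentration
    warming_c = 0.0      # illustrative warming relative to the start of the run
    for year in range(50):
        emitted_ppm = 4.5  # made-up annual human emissions, in ppm-equivalent
        co2_ppm += emitted_ppm * airborne_fraction(warming_c)
        warming_c = 0.01 * (co2_ppm - 390.0)  # made-up linear temperature response

    print(f"after 50 years: roughly {co2_ppm:.0f} ppm once uptake responds to warming")

The specific numbers are meaningless; the point is that the loop closes, so emissions, rather than concentrations, become the quantity the user supplies.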

Climate researchers, then, are forced to make compromises when deciding what to do with more computing power. Modelers have consistently aimed to get five or 10 calculated climate years per day of computing time, says Meehl, and keeping to that figure sets a practical limit on the increase in resolution or scientific complexity that additional computing power will buy. To get climate change predictions right, Meehl says, new science must eventually be included, because inadequate treatment of various feedbacks is “in large part what contributes to the disagreements among models.” On the other hand, Kirtman says that he prefers to focus on refining what’s already in the models, because “we can’t even get the clouds right and until we do that we can’t usefully add in the other feedbacks.”
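
That throughput target translates directly into wall-clock time, as the quick arithmetic below shows (the 100-year run length is just an example).

    # Quick arithmetic on the throughput target quoted above: five to 10
    # simulated climate years per wall-clock day of computing.
    def wall_clock_days(simulated_years: float, years_per_day: float) -> float:
        return simulated_years / years_per_day

    for years_per_day in (5, 10):
        days = wall_clock_days(simulated_years=100, years_per_day=years_per_day)
        print(f"a 100-year run at {years_per_day} years/day takes ~{days:.0f} days of computing")

A century-long simulation therefore ties up the machine for roughly 10 to 20 days, which is why any added resolution or added science has to pay for itself within that budget.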


Programming Problems

Getting everything right is still years away. Achieving global one-kilometer resolution with current GCMs, while adding no new science, is a computational task of exascale proportions, requiring approximately 10¹⁸ floating point operations per second. Right now, climate modelers are beginning to grapple with petascale systems, built from tens of thousands of processors. But making good use of such a system is no easy matter, as the evolution of efficient programming techniques has not kept pace with the growth of computing power, says Dave Bader of the Program for Climate Model Diagnosis and Intercomparison at Lawrence Livermore National Laboratory.

What makes coding a GCM a particular challenge, Bader explains, is the huge amount of information that must be continually and rapidly transferred among the model’s numerous components. On top of that, climate models generate large amounts of output data, and getting those results out of the program and into displays that users can make sense of is also challenging.

When the limiting factor in running a climate model on a multiprocessor system is inefficient communication of information within the program, the share of processing power actually devoted to solving equations falls, and the model fails to take advantage of the machine’s raw capacity. “The programming model we use [now] is not viable anymore in the next couple of generations of computers,” says Bader. The handful of vendors in the supercomputer market (IBM, Cray, and a few others) no longer devote as much effort as they once did to developing languages and compilers that serve scientific users, he adds, so responsibility for such things falls increasingly on the shoulders of researchers at various laboratories working in partnership with the vendors.
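
An idealized way to see the problem (not a measurement of any climate code; the numbers are illustrative) is to treat each model time step as compute time plus communication time and watch the useful fraction shrink as communication grows.

    # Idealized illustration: the share of a time step spent actually solving
    # equations, for increasing communication overhead between model components.
    def useful_fraction(compute_time: float, communication_time: float) -> float:
        return compute_time / (compute_time + communication_time)

    compute = 1.0  # normalized compute work per model time step
    for comm in (0.1, 0.5, 1.0, 2.0):  # communication cost relative to compute
        share = useful_fraction(compute, comm)
        print(f"communication/compute = {comm:>3}: {share:.0%} of the machine doing science")

Communication costs also tend to grow as a model is spread across more processors, which is why the programming model, rather than the raw flop count, becomes the bottleneck Bader describes.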


The drive for programming efficiency has changed the way climate scientists work. Meehl says that when he started in the 1970s, scientists would “fiddle with Fortran code, submit it to the machine, and it just ran. We didn’t have to think about it a whole lot.” Now, though, the team behind NCAR’s Community Climate System Model (CCSM), a state-of-the-art GCM used by researchers across the U.S., includes a working group of software engineers dedicated to ensuring the code runs reliably and efficiently.

Still, there’s room for innovation. “I actually do a fair amount of my own code work, by the seat of my pants,” says Kirtman. His programming may be inefficient and prone to failure, but that’s not important while he’s developing it, Kirtman says. To get his new work on ocean-atmosphere interactions incorporated into the CCSM, he turns it over to software engineers who transform his pilot program into a robust piece of modeling software that any researcher can download and use. That’s something he couldn’t do himself, Kirtman says, and the net result is “I get a lot of feedback from people who are trying to apply my methods to their problem—that’s really powerful to me.”


Figures

UF1 Figure. A U.S. National Oceanic and Atmospheric Administration climate change model that couples a 25 km-resolution ocean model with a 100 km-resolution atmosphere model.

UF2 Figure. A climate change model of Earth with a quasiuniform but nonorthogonal quadrilateral grid.

