Architecture and Hardware News

Cosmic Simulations

With the help of supercomputers, scientists are now able to create models of large-scale astronomical events.
Figure. A visualization from the Bolshoi simulation depicting the evolution of gas density in the resimulated 007 cluster.

If you are going to build a synthetic universe in a computer and watch it evolve over billions of years, you are going to need a mighty powerful computer, one that will go where no computer has gone before. Last fall, astronomers with the University of California High-Performance AstroComputing Center announced they had successfully completed such a model, which they called the Bolshoi simulation. Bolshoi, which is Russian for “great” or “grand,” is an apt word choice. The simulation used six million CPU hours on the Pleiades supercomputer at the U.S. National Aeronautics and Space Administration’s (NASA’s) Ames Research Center; as of June 2012, Pleiades was rated the world’s 11th most powerful computer on the TOP500 list.

Today, there is hardly a field of science that has not been propelled into new territory by supercomputers. You find them in studies as diverse as genome analysis, climate modeling, and seismic wave propagation. In the last five years, supercomputer technology has advanced so significantly in speed and processing capacity that many researchers no longer refer to these powerful systems as supercomputers, but rather as HPCs, or high-performance computers. These rarefied machines are designed to process vast amounts of scientific data and run simulations based on that data.

The Pleiades is a marvel of computer technology. The system architecture consists of 112,896 cores housed in 185 refrigerator-sized racks. It can run at a theoretical peak performance of 1.34 petaflops and has a total memory of 191 terabytes. NASA uses the Pleiades for its most demanding modeling and simulation projects in aeronautics, but its applications obviously do not stop there. To produce the Bolshoi simulation, the Adaptive Refinement Tree (ART) algorithm was run on Pleiades. Rather than compute interactions between pairs of particles, the ART algorithm subdivides space into cubical cells and calculates interactions between cells. This allows for more efficient calculations of gravitational interactions among billions of mass particles, making it the perfect tool for recreating an unfolding cosmos.
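The cell idea can be made concrete with a short sketch. What follows is a minimal, purely illustrative Python example, not the ART code itself (ART adaptively refines its cells in dense regions and runs in parallel across thousands of cores): particles are binned onto a single uniform grid, each distant cell is treated as a point mass at its center of mass, and only the particles sharing a cell are summed directly. Every name and parameter below is an assumption made for illustration.

```python
# Illustrative cell-based gravity (not the ART code): bin particles onto a
# uniform grid of cubical cells, treat distant cells as point masses at their
# centers of mass, and do direct particle-by-particle sums only within a
# particle's own cell. Units and constants are arbitrary assumptions.
import numpy as np

G = 1.0  # gravitational constant in arbitrary code units (assumption)

def cell_index(pos, box_size, n_cells):
    """Map a 3D position to integer cell coordinates on a uniform grid."""
    return tuple(np.minimum((pos / box_size * n_cells).astype(int), n_cells - 1))

def build_cells(positions, masses, box_size, n_cells):
    """Accumulate total mass and center of mass for every occupied cell."""
    cells = {}
    for p, m in zip(positions, masses):
        idx = cell_index(p, box_size, n_cells)
        total, weighted = cells.get(idx, (0.0, np.zeros(3)))
        cells[idx] = (total + m, weighted + m * p)
    return {idx: (m, w / m) for idx, (m, w) in cells.items()}

def acceleration(p, own_idx, cells, positions, masses, box_size, n_cells):
    """Gravity on one particle: cell-level approximation for distant cells,
    direct summation over the particles sharing its own cell."""
    acc = np.zeros(3)
    for idx, (m_cell, com) in cells.items():
        if idx == own_idx:
            continue  # the home cell is handled particle by particle below
        d = com - p
        acc += G * m_cell * d / (np.linalg.norm(d) ** 3 + 1e-12)
    for q, m in zip(positions, masses):
        if cell_index(q, box_size, n_cells) == own_idx and not np.allclose(q, p):
            d = q - p
            acc += G * m * d / (np.linalg.norm(d) ** 3 + 1e-12)
    return acc

# Toy usage: 1,000 random particles in a unit box divided into 8x8x8 cells.
rng = np.random.default_rng(0)
positions = rng.random((1000, 3))
masses = np.ones(1000)
cells = build_cells(positions, masses, box_size=1.0, n_cells=8)
p0 = positions[0]
print(acceleration(p0, cell_index(p0, 1.0, 8), cells, positions, masses, 1.0, 8))
```

For N particles, this replaces the N-squared pairwise sum with a much smaller number of particle-cell interactions, which is, in miniature, the efficiency gain that makes billion-particle runs tractable.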

Bolshoi set out to recreate that unfolding cosmos, and more. It would model not just how the visible universe of stars, gas, and dust evolved, but also how the invisible majority of the universe, which is composed of dark matter, evolved. Dark matter is a crucial component of the simulation because, although it cannot yet be directly detected (it can only be inferred from its gravitational effects on normal matter), galaxies are thought to have formed within huge “cocoons” of dark matter, called dark matter halos.

Astronomers actually ran two Bolshoi simulations on Pleiades: the Bolshoi and the BigBolshoi. The Bolshoi computed the evolution of a volume of space measuring about one billion light-years on a side containing more than one million galaxies. The simulation begins about 24 million years after the big bang, which occurred 13.7 billion years ago, and follows the evolution of 8.6 billion dark-matter particles, each with an assigned mass of 200 million times that of the sun, to the present day. Logistically, the simulation required 13,824 cores and a cumulative 13 terabytes of RAM. In all, 600,000 files were saved, filling 100 terabytes of disk space. During the Bolshoi simulation’s run, 180 “snapshots” were made showing the evolutionary process at different times. These visualizations will allow astrophysicists to further analyze how dark matter halos, galaxies, and clusters of galaxies coalesced and evolved.
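A little back-of-the-envelope arithmetic, using only the figures quoted above, conveys the scale: the particle count and per-particle mass imply a total simulated dark-matter mass of roughly 1.7 × 10^18 solar masses, and the 600,000 snapshot files spread across 100 terabytes average out to roughly 170 megabytes apiece.

```python
# Back-of-the-envelope arithmetic from the figures quoted in the text.
n_particles = 8.6e9          # dark-matter particles in the Bolshoi run
particle_mass = 2.0e8        # solar masses assigned to each particle
total_mass = n_particles * particle_mass
print(f"total simulated dark-matter mass ~ {total_mass:.2e} solar masses")

n_files = 600_000            # snapshot files written
disk_used_tb = 100           # terabytes of disk space filled
avg_file_mb = disk_used_tb * 1e6 / n_files   # using 1 TB = 1e6 MB (decimal)
print(f"average file size ~ {avg_file_mb:.0f} MB")
```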

The BigBolshoi simulation was of a lower resolution, but covered a volume measuring four billion light-years on a side, 64 times the volume of the Bolshoi model. Its purpose was to predict the properties and distribution of galaxy clusters and superclusters throughout this volume of space.

Any simulation that strives to reflect real processes in nature requires a significant amount of observational input data and a well-founded theory that explains what is observed. In the former case, the Bolshoi simulation is based on precise measurements of the vestigial all-sky afterglow of the big bang, which astronomers call cosmic microwave background radiation. The measurements, made over several years using NASA’s Wilkinson Microwave Anisotropy Probe spacecraft, revealed that the background radiation is not perfectly uniform, but exhibits tiny variations, regions that are slightly more or less dense by one part in 100,000. These fluctuations correspond to non-uniformities in the otherwise uniform distribution of matter in the very early universe, and are essentially the seedlings from which emerged all of the galaxies observed in the universe today.


Just how the universe went from a nearly smooth initial state to one so full of complex structure has long puzzled cosmologists, but most agree the best explanation is the Lambda Cold Dark Matter theory, or ΛCDM. Once referred to as just the Cold Dark Matter theory, it has since been augmented to include dark energy (Λ), a mysterious force that counteracts gravity and has been invoked to explain the accelerating expansion of the universe. The theory makes specific predictions for how structure in the universe grows hierarchically as smaller objects merge into bigger ones. Because ΛCDM explains much of what is observed, the Bolshoi simulation drew upon this model as its theoretical framework.

The results of both simulations largely confirm cosmologists’ assumptions about the formation of large-scale structure, but one discrepancy needs to be addressed.

“Our analysis of the original Bolshoi simulation showed that dark matter halos that could host early galaxies are much less abundant than earlier estimates indicated,” says Joel Primack, director of the University of California High-Performance AstroComputing Center and coauthor of the paper announcing the results of the Bolshoi simulation. The previous estimates predicted the number of halos in the early universe should be 10 times greater than is seen in the Bolshoi simulation. The difference, says Primack, is significant. “It remains to be seen whether this is just observational incompleteness or a potentially serious problem for the standard Lambda Cold Dark Matter theory.”

Smaller Simulations

Two other recent computer simulations have also broken new astrophysical ground, but at smaller scales. One is the first realistic simulation, at galactic scales, of how the Milky Way was formed. The simulation, named Eris, shows the origin of the Milky Way beginning one million years after the big bang and traces its evolution to the present day. It was produced by a research group led by Lucio Mayer, an astrophysicist at the University of Zurich, and Piero Madau, an astronomer at the University of California, Santa Cruz.

Exactly how spiral galaxies like ours form has been the subject of contentious debate for decades (hence the simulation’s name, after Eris, the Greek goddess of strife and discord). Previous simulations produced galaxies that were either too small or too dense, lacked an extended disk of gas and stars, or had too many stars in the central region. Eris, however, achieved the proper balance. Once again, the Pleiades computer was brought to bear, consuming 1.4 million processor hours. Supporting simulations were performed on the Cray XT5 Monte Rosa computer at the Swiss National Supercomputing Center, operated by the Swiss Federal Institute of Technology in Zurich. (The Cray XT5 has since been upgraded to a 400-teraflop Cray XE6.) The simulation modeled the formation of a galaxy with 790 billion solar masses, composed of 18.6 million particles representing dark matter, gas, and stars. The results are another confirmation of the Cold Dark Matter theory.

“In this theory, galaxies are the outcome of the gravitational growth of tiny quantum density fluctuations present shortly after the big bang,” says Madau. “The ordinary matter that forms stars and planets has fallen into the gravitational wells created by large clumps of dark matter, giving rise to galaxies in the centers of dark matter halos.”
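A rough comparison of the two simulations’ mass resolutions, using only the figures quoted in this article (and ignoring that Eris actually assigns different masses to its dark matter, gas, and star particles), shows why Eris can resolve the inner workings of a single galaxy while Bolshoi maps the cosmic web around a million of them.

```python
# Back-of-the-envelope resolution comparison from figures quoted in the article.
# Eris in reality uses different particle masses for dark matter, gas, and stars;
# the average below is only illustrative.
bolshoi_particle_mass = 2.0e8       # solar masses per Bolshoi particle
eris_total_mass = 7.9e11            # solar masses in the simulated galaxy
eris_n_particles = 18.6e6           # particles in the Eris run
eris_avg_particle_mass = eris_total_mass / eris_n_particles
print(f"average Eris particle mass ~ {eris_avg_particle_mass:,.0f} solar masses")
print(f"a Bolshoi particle is ~{bolshoi_particle_mass / eris_avg_particle_mass:,.0f}x more massive")
```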

At even smaller, stellar scales, astronomers have employed HPCs to render a clearer picture of why a class of compact, dense objects called neutron stars often move through space at very high velocities, in some cases 1,000 kilometers per second or more. (Most stars have typical space velocities of a few tens of kilometers per second.) A clue may be found in how neutron stars are created. These objects, in which protons and electrons have been gravitationally compressed into neutrons, form from the collapsing cores of massive stars just before they explode as supernovae. Most astronomers have long been convinced the explosion itself somehow gives the neutron star its high-velocity “kick.” To explore this possibility, Adam Burrows, an astrophysicist at Princeton University, created a three-dimensional animation of conditions throughout the star during the explosion. He found that the explosion does not proceed symmetrically; instead, it rips through the star lopsidedly, and the hydrodynamic recoil from that asymmetric blast is more than sufficient to hurl the neutron star off into space.

“This is a straightforward consequence of momentum conservation when things aren’t spherical,” says Burrows. “A lot of exotic mechanisms have been proposed, but this simplest of origins seems quite natural.”
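The momentum argument is easy to illustrate with a toy calculation; the numbers below are assumptions chosen for illustration, not outputs of Burrows’ simulation. If a few percent of the momentum carried by roughly ten solar masses of ejecta moving at several thousand kilometers per second ends up directed to one side, balancing the books against a 1.4-solar-mass neutron star yields a recoil of about 1,000 kilometers per second.

```python
# Illustrative momentum-conservation estimate; all values are assumptions,
# not results from the Princeton simulation.
M_SUN = 1.989e30           # kg
ejecta_mass = 10 * M_SUN   # assumed supernova ejecta mass
ejecta_speed = 5.0e6       # m/s, assumed mean ejecta speed (~5,000 km/s)
asymmetry = 0.03           # assumed 3% net momentum imbalance in the ejecta
ns_mass = 1.4 * M_SUN      # typical neutron star mass

# Momentum conservation: net ejecta momentum = neutron star recoil momentum.
net_ejecta_momentum = asymmetry * ejecta_mass * ejecta_speed
kick_velocity = net_ejecta_momentum / ns_mass
print(f"kick velocity ~ {kick_velocity / 1e3:.0f} km/s")
```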

The simulation was run on the Cray XT5 Kraken supercomputer at Oak Ridge National Laboratory, then ranked the 21st most powerful computer in the world.

None of these studies would have been possible without HPCs. A typical personal computer, for instance, would have needed 570 years to perform the calculations behind the Eris simulation. Still, even as computer performance improves, Primack sees new challenges ahead. For example, if the observational and computational datasets expand exponentially while the speed of data transmission grows only arithmetically, methodologies for analysis will need to change. “Instead of bringing data to the desktop, we will increasingly have to bring our algorithms to the data,” he says.

Another challenge, says Primack, is the energy cost of computation. “The Department of Energy, which currently has the fastest U.S. supercomputers on the TOP500 list, is not willing to have its supercomputer centers use much more than the 10 megawatts that they currently do. It will be a huge challenge to go from 10 petaflops to one exaflop without increasing the energy consumption by more than a small factor. We don’t know how to do this yet.”
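Some simple arithmetic, assuming power scales linearly with performance at fixed efficiency (an assumption for illustration, not a DOE projection), shows the size of the gap: a 10-petaflop machine drawing 10 megawatts delivers about one gigaflop per watt, so an exaflop system at that efficiency would draw a full gigawatt, and holding it to the same 10-megawatt budget would require roughly a hundredfold improvement in flops per watt.

```python
# Rough energy-efficiency arithmetic; assumes power scales linearly with
# performance at fixed flops-per-watt (an illustrative assumption).
current_perf = 10e15      # flops (10 petaflops)
current_power = 10e6      # watts (10 megawatts)
target_perf = 1e18        # flops (one exaflop)

efficiency = current_perf / current_power     # flops per watt today
naive_power = target_perf / efficiency        # watts needed at that efficiency
required_gain = naive_power / current_power   # improvement to stay at 10 MW

print(f"today's efficiency: {efficiency / 1e9:.0f} gigaflop/watt")
print(f"an exaflop at that efficiency: {naive_power / 1e6:.0f} MW")
print(f"efficiency gain needed to stay at 10 MW: ~{required_gain:.0f}x")
```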

Despite such challenges, a new era of HPC is dawning. Sequoia, an IBM BlueGene/Q system at Lawrence Livermore National Laboratory, has attained 16 petaflops, and four other HPCs in the U.S. are expected to be in the 10-petaflops range soon. IBM’s Mira at Argonne National Laboratory just attained 10 petaflops, and the Blue Waters project is now online at the National Center for Supercomputing Applications in Urbana-Champaign, IL. Titan should become operational at Oak Ridge National Laboratory later this year, and Stampede at The University of Texas at Austin is expected to be up and running in January 2013. These and other HPCs promise to reveal new and transformative insights into the world and the universe, from the smallest scales to the largest. Their simulations will probe levels of complexity we can only imagine, taking us where no one has gone—or could possibly go—before.

Further Reading

Guedes, J., Callegari, S., Madau, P., and Mayer, L.
Forming realistic late-type spirals in a ΛCDM universe: The Eris simulation, The Astrophysical Journal 742, 2, Dec. 2011.

Nordhaus, J., Brandt, T. D., Burrows, A., and Almgren, A.
The hydrodynamic origin of neutron star kicks, http://arxiv.org/abs/1112.3342.

Prada, F., Klypin, A., Cuesta, A., Betancort-Rijo, J., and Primack, J.
Halo concentrations in the standard ΛCDM cosmology, http://arxiv.org/pdf/1104.5130.pdf.

Rantsiou, E., Burrows, A., Nordhaus, J., and Almgren, A.
Induced rotation in 3D simulations of core collapse supernovae: Implications for pulsar spins, The Astrophysical Journal 732, 1, May 2011.

University of California High-Performance AstroComputing Center
Bolshoi videos, http://hipacc.ucsc.edu/Bolshoi/Movies.html.

Figures

UF1 Figure. A visualization from the Bolshoi simulation depicting the evolution of gas density in the resimulated 007 cluster.
