
Modeling the Astronomical

Advances in computer technology have changed the way astronomers see and study the universe.
Gliese 667 C and a pair of stars that were discovered by the High Accuracy Radial velocity Planet Searcher (HARPS), the spectrograph on ESO's 3.6-meter telescope at the La Silla Observatory in Chile.

Astronomers once learned everything they knew about the universe from telescopes, spectroscopes, and other optical instruments. Today, one of the most important items in an astronomer’s toolbox is the computer. Galaxy and star formation, supernova explosions, even the origins of the universe—all can be modeled and manipulated to an incredible degree using computers, both powerful mainframes and desktop models. What follows are just a few examples of computer applications that are helping scientists paint bold, new pictures of the cosmos.

Smashing Galaxies

In the early years of the 20th century, astronomers believed that collisions between galaxies were so rare that few, if any, examples would ever be observed. Even in clusters, galaxies are small compared with the vast distances between them, so an encounter was deemed unlikely. This mindset quickly shifted, however, once more extensive photographic surveys of the sky revealed that some galaxies appeared to be peculiar amalgams of spiral-shaped and spheroidal galaxies. Most of these disturbed specimens could only be explained as the products of mergers with other galaxies.

Astronomer and mathematician Alar Toomre of the Massachusetts Institute of Technology and his brother Jüri, of the University of Colorado, Boulder, conducted some of the first computer simulations of galaxy interactions in the early 1970s. They designed numerical programs that computed the trajectories of a set of test particles, treating the galaxies as N point masses and considering only gravitational interactions. Such programs ran on state-of-the-art mainframes and required between five and 10 minutes to complete a collision scenario with only 350 particles. By the late 1980s, an IBM desktop computer equipped with compiled BASIC could perform the same feat.
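The restricted approach the Toomres pioneered can be sketched in a few lines: massless test particles respond only to the gravity of the galaxy cores. The Python sketch below is an illustration of the idea, not a reconstruction of their programs; the units, the fixed (rather than moving) cores, the softening term, and the simple kick-drift integrator are all simplifying assumptions.

```python
# Restricted N-body sketch: 350 massless test particles (the "stars")
# feel only the gravity of two point masses (the galaxy cores).
# All quantities are in made-up code units, purely for illustration.
import math

G = 1.0                           # gravitational constant in code units
CORES = [(0.0, 0.0, 1.0),         # (x, y, mass) of each galaxy core
         (8.0, 0.0, 0.5)]

def accel(x, y):
    """Gravitational pull of the (fixed) cores on a test particle."""
    ax = ay = 0.0
    for cx, cy, m in CORES:
        dx, dy = cx - x, cy - y
        r3 = (dx * dx + dy * dy + 1e-6) ** 1.5   # softened to avoid blow-ups
        ax += G * m * dx / r3
        ay += G * m * dy / r3
    return ax, ay

# A ring of 350 test particles (the article's count) on circular orbits
# around the first core; each entry is [x, y, vx, vy].
particles = []
for i in range(350):
    ang = 2 * math.pi * i / 350
    r = 2.0
    v = math.sqrt(G * CORES[0][2] / r)           # circular-orbit speed
    particles.append([r * math.cos(ang), r * math.sin(ang),
                      -v * math.sin(ang), v * math.cos(ang)])

# Kick-drift time stepping: update velocities, then positions.
dt = 0.01
for _ in range(1000):
    for p in particles:
        ax, ay = accel(p[0], p[1])
        p[2] += ax * dt
        p[3] += ay * dt
        p[0] += p[2] * dt
        p[1] += p[3] * dt
```

A modern laptop runs this in well under a second; the five-to-10-minute mainframe runtimes the Toomres faced give a sense of how far hardware has come.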

The last decade of the 20th century saw significantly more sophisticated computer models that could simulate galaxy interactions spanning a billion years. Initial results from these simulations revealed an unexpected finding: if two spiral galaxies of nearly equal mass passed each other slowly enough, they would essentially fall into each other and, over hundreds of millions of years, merge into a single, massive elliptical galaxy. With this insight, astronomers were able to forge an evolutionary link not only between galaxies of various forms, but also between galaxies at different epochs in the history of the universe.

The most powerful computers today—supercomputers—have revolutionized the field of galaxy interactions. In one recent example, astronomer Fabio Governato of the University of Washington, Seattle and colleagues used several U.S. supercomputing facilities to solve a long-standing mystery in galaxy formation theory: why most galaxies do not have more stars at their cores. This discrepancy is most pronounced in a class of diminutive galaxies called dwarf galaxies, the most common type of galaxy in the neighborhood of the Milky Way. Governato's computer simulations showed that supernova explosions in and around the core of a developing dwarf galaxy generate enormous winds that sweep huge amounts of gas away from the center, preventing millions of new stars from forming there.

Governato’s project consumed about one million CPU hours, the equivalent of 100 years on a single desktop computer. “This project would have not been possible just a few years ago,” he says, “and, of course, completely unthinkable in the 1980s.” The enormous growth in computing power, Governato adds, makes the early results obtained by the Toomre brothers using rudimentary computers “even more amazing as their results still hold true.”

Searching for Extrasolar Planets

Since 1995, astronomers have discovered more than 400 planets orbiting other stars. Because stars are roughly a million times brighter than the planets that orbit them, most of these worlds have been detected by indirect means. One of the most successful methods employs a highly sensitive spectrograph to detect minute periodic changes in a star's velocity toward and away from Earth, motions caused by the star's orbit around the system's center of mass as the planet circles it. These back-and-forth velocity shifts are tiny, amounting to as little as a few meters per second. Another search method looks for slight wobbles in the star's position on the sky, a motion also induced by an orbiting body; the extent of the wobble allows astronomers to determine the exoplanet's mass. The positional change is exceedingly fine, some 20 microarcseconds or so, which is about the diameter of a golf ball seen at the distance of the Moon. Measurements of such fine tolerance would be impossible without computers.
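A rough sense of how small both signals are follows from two relations implicit above: the star's reflex velocity is the planet's orbital velocity scaled by the planet-to-star mass ratio, and the star's angular wobble is the planet's orbital radius scaled by the same ratio and viewed from the system's distance. The Python sketch below works through an illustrative Jupiter-like planet around a Sun-like star 10 parsecs away; every number is an assumption chosen for illustration, not a value from the article.

```python
# Back-of-the-envelope signal sizes for the two indirect detection
# methods, using an assumed Sun-Jupiter analog seen from 10 parsecs.
import math

G        = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
M_SUN    = 1.989e30                  # solar mass, kg
M_JUP    = 1.898e27                  # Jupiter mass, kg
AU       = 1.496e11                  # astronomical unit, m
PC       = 3.086e16                  # parsec, m
MAS_PER_RAD = 180 / math.pi * 3600 * 1e3   # milliarcseconds per radian

m_star, m_planet = M_SUN, M_JUP      # illustrative Sun-Jupiter analog
a, d = 5.2 * AU, 10 * PC             # orbital radius; distance to the system

# Spectrograph (radial-velocity) method: the star circles the common
# center of mass at v_star = (m_planet / m_star) * v_planet.
v_planet = math.sqrt(G * m_star / a)        # planet's orbital speed
v_star   = v_planet * m_planet / m_star     # star's reflex speed
print(f"radial-velocity signal: {v_star:.1f} m/s")

# Astrometric (wobble) method: the star's path on the sky spans an
# angle of (m_planet / m_star) * a / d radians.
wobble_mas = (m_planet / m_star) * (a / d) * MAS_PER_RAD
print(f"astrometric wobble: {wobble_mas:.2f} milliarcseconds")
```

For this analog the reflex velocity comes out near 12 meters per second and the wobble near half a milliarcsecond, which is why both methods demand computer-intensive modeling to extract the signal from noise.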

“The positions of hundreds of lines in spectra are obtained by a truly computer-intensive process of modeling their shapes and noting changes from day to day,” says astronomer G. Fritz Benedict of the University of Texas at Austin, who led a team that made the first astrometrically determined mass of an extrasolar planet in 2002.

Computers are also important in helping astronomers understand the nature of exoplanets after they have been discovered. In February 2009, a team of astronomers associated with a planet-hunting space mission called COROT (for COnvection ROtation and planetary Transits), led by the French space agency CNES, announced that they had detected a planet only 1.7 times Earth's diameter. But the planet, called CoRoT-7b, is hardly Earthlike. It lies only 1.5 million miles from its host star and completes one orbit in a little over 20 hours. Observations indicate the planet is tidally locked, so that it keeps one hemisphere perpetually turned toward its sun. Being so close to its host star, the dayside temperature is some 2,600 K (about 4,220° F), hot enough to vaporize rocks.
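These figures hang together under Kepler's third law. The sketch below checks that a 1.5-million-mile orbit around a star of roughly 0.9 solar masses (an assumed host mass, not given in the article) implies a period close to the 20 hours cited, and that the quoted kelvin and Fahrenheit temperatures agree.

```python
# Sanity checks on CoRoT-7b's quoted orbit and temperature.
import math

G     = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30             # solar mass, kg

a = 1.5e6 * 1609.34          # orbital radius: 1.5 million miles, in meters
m_star = 0.9 * M_SUN         # assumed host-star mass (not from the article)

# Kepler's third law: T = 2*pi * sqrt(a^3 / (G * M))
period_s = 2 * math.pi * math.sqrt(a**3 / (G * m_star))
print(f"orbital period: {period_s / 3600:.1f} hours")    # roughly 19 hours

# Dayside temperature, kelvin to Fahrenheit
t_f = (2600 - 273.15) * 9 / 5 + 32
print(f"dayside temperature: {t_f:.0f} F")               # about 4220 F
```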

Astronomer Bruce Fegley, Jr. of Washington University and research assistant Laura Schaefer used a computer program called MAGMA to model CoRoT-7b’s atmospheric structure. MAGMA was originally developed to study the vaporization of silicates on Mercury’s surface, but has since been used, among other things, to model the vaporization of elements during a meteor’s fiery passage through Earth’s upper atmosphere.

Fegley and Schaefer’s results showed that CoRoT-7b’s atmosphere is virtually free of volatile compounds, such as water, nitrogen, and carbon dioxide, probably because they were boiled off long ago. But their models indicate plenty of compounds normally found in a terrestrial planet’s crust, such as potassium, silicon monoxide, and oxygen. They concluded that the atmosphere of CoRoT-7b has been altered by the vaporization of surface silicates into clouds. Moreover, these clouds may “rain out” silicates, effectively sandblasting the planet’s surface.

“Exoplanets are greatly expanding the planetary menagerie,” says Fegley. “They provide us with all sorts of oddball planets that one could only imagine in the past.”

Citizen Science

In 1999, the Space Sciences Laboratory at the University of California, Berkeley launched a program called SETI@home, which enabled thousands of home computers to sift through unprocessed data collected by radio telescopes in the search for artificial signals from other worlds. The program, which today comprises more than five million volunteers, essentially constitutes one of the world's largest supercomputers, and is an excellent example of how a network of home computers can be remarkably efficient by distributing the processing burden.
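The pattern behind SETI@home—carving one large dataset into independent work units and farming them out—is easy to sketch. In the toy Python version below, a thread pool stands in for the volunteer machines and a trivial peak-threshold check stands in for the real signal analysis; every name, number, and the synthetic "signal" are illustrative assumptions.

```python
# Toy volunteer-computing pattern: split raw data into independent work
# units, process them in parallel, and collect the flagged candidates.
from concurrent.futures import ThreadPoolExecutor

def analyze(work_unit):
    """Stand-in analysis: flag a chunk whose peak exceeds a threshold."""
    unit_id, samples = work_unit
    return unit_id, max(samples) > 5.0

# Synthetic "telescope data", carved into fixed-size work units.
raw = [0.1 * i % 7 for i in range(1000)]
units = [(i // 100, raw[i:i + 100]) for i in range(0, len(raw), 100)]

with ThreadPoolExecutor(max_workers=4) as pool:   # four "volunteers"
    results = list(pool.map(analyze, units))

candidates = [uid for uid, hit in results if hit]
print(f"{len(candidates)} of {len(units)} units flagged")
```

Because each unit is analyzed independently, the same code scales from four local threads to millions of machines scattered across the Internet, which is exactly what makes the volunteer model so efficient.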

Another ongoing citizen science program, Galaxy Zoo, uses tens of thousands of home computers, but in this case the processing power lies mostly in the human brain. Originally launched in 2007, Galaxy Zoo is the most massive galaxy classification project ever undertaken. Galaxies come in a mind-boggling variety of shapes and combinations, making classifying them extremely difficult. Galaxy Zoo was tasked with classifying the nearly one million galaxies swept up in the Sloan Digital Sky Survey, the most ambitious survey in the history of astronomy. Computers and the Internet have been instrumental in circulating images of galaxies to thousands of volunteers, but the success of Galaxy Zoo depends on the pattern recognition faculties particular to humans.

“Supercomputers, or even my battered old laptop, can beat large numbers of humans for sheer processing power,” says Chris Lintott, an astrophysicist at Oxford University. “The problem we were facing is one of pattern recognition, and for this task, we humans have a huge advantage.”

Computerized methods of galaxy classification have been applied to galaxy surveys before, including the Sloan Digital Sky Survey, but their classifications are correct only about 80% of the time, says Lintott. "That may not sound too bad, but the missing 20% are often those that are unusual and thus contain the most information," he says. As Galaxy Zoo team members have discovered, humans are adept at spotting unusual details. For example, volunteers noticed some small, round green objects, now known as "peas," in the background of some Galaxy Zoo images. "They turn out to be the sites of the most efficient star formation in the present-day universe, but it was only because our volunteers spotted them that we knew to study them," says Lintott.

A new incarnation of the project, Galaxy Zoo 2, is now under way and includes a supernova search adjunct. A veritable flood of data is expected. “In the long run, I think we’ll need a combination of man and machine,” says Lintott. “This sort of development is going to be necessary to cope with the next generation of sky surveys, which should produce data rates of up to 30 terabytes per night.”

Further Reading

Governato, F., Brook, C., Mayer, L., Brooks, A., Rhee, G., Wadsley, J., Jonsson, P., Willman, B., Stinson, G., Quinn, T., Madau, P.
Bulgeless dwarf galaxies and dark matter cores from supernova-driven outflows, Nature 463, 7278, Jan. 14, 2010.

Benedict, G.F., McArthur, B.E., Forveille, T., Delfosse, X., Nelan, E., Butler, R.P., Spiesman, W., Marcy, G., Goldman, B., Perrier, C., Jefferys, W.H., Mayor, M.
A mass for the extrasolar planet Gl 876b determined from Hubble Space Telescope fine guidance sensor 3 astrometry and high-precision radial velocities, The Astrophysical Journal 581, 2, Dec. 20, 2002.

Schaefer, L. and Fegley, B.
Chemistry of silicate atmospheres of evaporating super-earths, The Astrophysical Journal 703, 2, June 2, 2009.

Raddick, M. J., Bracey, G., Gay, P.L., Lintott, C.J., Murray, P., Schawinski, K., Szalay, A.S., Vandenberg, J.
Galaxy Zoo: exploring the motivations of citizen science volunteers, Astronomy Education Review 9, 1, Feb. 18, 2010.

Astronomy Education Review Web site: http://aer.aip.org/aer/

