U.S. National Laboratories (Argonne, Lawrence Livermore, and Oak Ridge) possess three of the world’s fastest exascale supercomputers (as of the November 2024 Top500 ranking), which are capable of performing one quintillion (a billion billion) operations per second. One of the highest priorities (among many) for these supercomputers is mapping and characterizing the 95% of the universe that is unseen but inferred to be there—namely, dark matter (~27%) and dark energy (~68%).
The remaining ~5% of the universe is all that the Standard Model of physics is known to describe, according to principal investigator Salman Habib of the U.S. Department of Energy (DoE)-sponsored Dark Sky Mining project at Argonne National Laboratory.
“Modeling large-scale phenomena on exascale supercomputers allows us to produce digital twins of the extragalactic sky. These cosmic maps contain not just the visible portions of galaxies and intergalactic gases, but also the structure of the invisible components of dark matter and dark energy,” said theoretical cosmology and astrophysics expert Gus Evrard of the University of Michigan, who is not a part of the Dark Sky Mining project. “By realizing many specific models covering dark-sector physics and galaxy formation dynamics,” Evrard said, “computational cosmologists are building the bridge necessary to perform sensitive inference tests from observational surveys of our extragalactic sky.”
So far, neither dark matter nor dark energy appears to emit or absorb light or any other electromagnetic radiation, according to Evrard, which makes them invisible to humans. Both, however, have been inferred from gravitational anomalies.
What is Dark Energy?
Dark energy traces back to Edwin Hubble's 1929 discovery that the universe is expanding, with distant galaxies receding faster the farther away they are. Both the Standard Model (of particles) and Einstein's theory of general relativity (of gravity) led cosmologists to expect gravity to slow that expansion over time. Instead, in 1998 the Supernova Cosmology Project at Lawrence Berkeley National Laboratory (LBNL) observed the expansion to be accelerating, shelving most competing explanations. Today, most cosmologists attribute the accelerating expansion to a force opposing gravity, pushing matter apart, that exists uniformly throughout space; theoretical cosmologist Michael Turner coined its name: dark energy.
The accelerated expansion of the universe has since been confirmed by the Dark Energy Survey (DES), conducted from 2013 to 2019 by more than 400 scientists worldwide. The collaboration used a specially built 570-megapixel digital camera mounted on the four-meter Blanco telescope at Cerro Tololo (altitude: 7,241 feet) in the Chilean Andes. A follow-up survey is now under way with the Dark Energy Spectroscopic Instrument (DESI), funded by the DoE and operated by LBNL. The new instrument has a bank of 5,000 robotically positioned optical fibers feeding wide-bandwidth spectrographs, retrofitted to the Mayall Telescope atop Kitt Peak (altitude: 6,883 feet) in the Quinlan Mountains of Arizona.
So far, DESI has confirmed the acceleration of the universe's expansion and supports the postulation that dark energy's density is constant; because that density stays fixed while space itself grows, the total amount of dark energy grows as the universe expands. Thus, instead of the expansion slowing under gravity after the initial impulse of the Big Bang, the growing total of dark energy accelerates the expansion further, according to DESI.
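The arithmetic behind that statement can be sketched in a few lines of Python (an illustrative toy, not project code): matter density dilutes with the cube of the cosmic scale factor, while a constant dark-energy density means the total dark energy grows in proportion to volume, so dark energy inevitably comes to dominate.

```python
# Illustrative sketch: why a constant dark-energy density comes to
# dominate an expanding universe. Densities are relative to today's
# critical density, using the commonly cited ~0.3 / ~0.7 split.
OMEGA_MATTER = 0.3   # matter density today
OMEGA_LAMBDA = 0.7   # dark-energy density (assumed constant)

def densities(a):
    """Return (matter, dark-energy) densities at cosmic scale factor a.

    Matter dilutes with volume (a**-3); the dark-energy density stays
    fixed, so its *total* energy grows in proportion to volume (a**3).
    """
    return OMEGA_MATTER * a**-3, OMEGA_LAMBDA

for a in (0.5, 1.0, 2.0, 4.0):
    m, de = densities(a)
    print(f"scale factor {a:>4}: matter {m:7.4f}  dark energy {de:.4f}")
```

At half today's size, matter dominates; once the universe doubles in size, the constant dark-energy density wins, and the gap only widens.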

One candidate explanation for dark energy within the Standard Model is vacuum energy, a quantum effect in which particles are spontaneously produced in the vacuum of space, a phenomenon called quantum fluctuation. Straightforward estimates of vacuum energy, however, famously disagree with the observed dark energy density by many orders of magnitude, and DESI project director Michael Levi recently speculated that “dark energy is evolving with time,” making simulations with exascale supercomputers all the more important in solving the mystery.
What is Dark Matter?
“Dark matter is one of the greatest mysteries in physics,” said William Detmold, professor of physics at the Massachusetts Institute of Technology (MIT) and a member of the Dark Sky Mining project. “I am spending a great deal of time on the calculations we need to do with exascale computers in order to understand what we can learn about the nature of dark matter if and when we do detect it in laboratory experiments.”
Like dark energy, dark matter is inferred; unlike dark energy, however, which appears evenly distributed throughout the vacuum of space, dark matter appears to clump around ordinary matter. That clumping was confirmed by the Cosmic Evolution Survey (CES), an astronomical survey designed to probe the formation and evolution of galaxies as a function of both cosmic time (redshift) and the local galaxy environment. Using the Hubble Space Telescope, the CES relied on Einstein's prediction (since confirmed) that light bends around massive objects. Dark matter is inferred to clump around the most massive objects in space because light bends around them more strongly than their visible mass alone can account for. By measuring this gravitational lensing effect around distant galaxies, the CES mapped the density of dark matter and inferred the large-scale structure of dark matter in the universe.
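As a rough illustration of how lensing implies unseen mass, the point-lens Einstein-radius relation θ_E = √((4GM/c²)·D_ls/(D_l·D_s)) can be inverted to recover the mass doing the lensing. The sketch below uses invented numbers and a simplified distance treatment (D_ls ≈ D_s − D_l, which is not exact at cosmological distances), purely to show the logic: when the lensing-implied mass dwarfs the visible mass, the remainder is attributed to dark matter.

```python
import math

# Toy illustration (all numbers invented): infer a lens mass from its
# Einstein radius and compare it with the visible mass.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
MPC = 3.086e22       # one megaparsec in meters
M_SUN = 1.989e30     # solar mass in kg

def lensing_mass(theta_e_arcsec, d_l_mpc, d_s_mpc):
    """Point-lens mass implied by an Einstein radius theta_E:
    theta_E = sqrt(4*G*M/c**2 * D_ls / (D_l * D_s)), solved for M."""
    theta = theta_e_arcsec * math.pi / (180 * 3600)  # arcsec -> radians
    d_l, d_s = d_l_mpc * MPC, d_s_mpc * MPC
    d_ls = d_s - d_l   # simplifying approximation, not exact cosmology
    return theta**2 * C**2 * d_l * d_s / (4 * G * d_ls)

# Hypothetical cluster lens: 30-arcsecond Einstein radius,
# lens at 1,000 Mpc, source at 3,000 Mpc.
m_lens = lensing_mass(30.0, 1000.0, 3000.0)
visible = 1e13 * M_SUN   # assumed visible (stellar + gas) mass
print(f"lensing mass ~{m_lens / M_SUN:.2e} solar masses")
print(f"visible mass ~{visible / M_SUN:.2e} solar masses")
print(f"lensing/visible mass ratio ~{m_lens / visible:.1f}")
```

With these made-up inputs the lensing mass comes out more than an order of magnitude above the visible mass, mirroring (in caricature) the kind of discrepancy real surveys attribute to dark matter.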
DES and CES datasets have been analyzed by the Hardware/Hybrid Accelerated Cosmology Code (HACC) project, using software that was a finalist for ACM's Gordon Bell Prize in computing.
Together with the growing DESI dataset, DES and CES comprise the largest databases from which the Dark Sky Mining project can work to further characterize the nature of dark energy and dark matter using exascale supercomputers.
“With exascale supercomputers we can use more detailed cosmological simulations to look more deeply into the astrophysical data that enables us to infer the existence of dark matter and dark energy,” said Habib. “Exascale supercomputers can be used to create hundreds, even thousands, of realistic simulations, then pick the ones that are closest to the observed data, refine those simulations even further, and repeat.”
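The simulate-compare-refine loop Habib describes can be caricatured in a few lines of Python. This is a toy sketch of the idea only (none of it is Dark Sky Mining code): a hypothetical one-parameter "simulator" is run many times, the runs whose output lands closest to an "observed" statistic are kept, and the parameter search range is narrowed around them before repeating.

```python
import random

# Toy version of the loop: simulate many candidate universes, keep the
# ones closest to the observed data, refine the search range, repeat.
random.seed(42)

OBSERVED = 0.31  # stand-in "observed" summary statistic (invented)

def simulate(param):
    """Toy simulator: a noisy summary statistic driven by one parameter."""
    return param + random.gauss(0.0, 0.01)

lo, hi = 0.0, 1.0  # initial search range for the parameter
for round_num in range(5):
    # Draw a batch of candidate parameters and "simulate" each one.
    candidates = [random.uniform(lo, hi) for _ in range(200)]
    # Rank candidates by how close their simulated output is to the data.
    ranked = sorted(candidates, key=lambda p: abs(simulate(p) - OBSERVED))
    best = ranked[:20]                 # keep the closest simulations
    lo, hi = min(best), max(best)      # refine the range and repeat
    print(f"round {round_num}: range ({lo:.4f}, {hi:.4f})")
```

Each pass shrinks the parameter range toward values whose simulated output matches the observation, a bare-bones analogue of refining thousands of cosmological simulations against survey data.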
The Dark Sky Mining project works to develop simulations of the universe so detailed that they can help laboratory physicists explore the observable manifestations of dark matter and dark energy. CERN's Large Hadron Collider, for instance, used its ATLAS detector to discover the Higgs boson (the particle associated with giving other particles their mass) and is also using ATLAS in attempts to detect candidate dark matter particles, such as the cold dark matter of the Lambda cold dark matter (Lambda CDM) model, postulated to have shaped galaxy formation and evolution after the Big Bang.
In addition to CERN, many other scientists are searching for dark matter and dark energy, potentially benefiting from open access to exascale supercomputers like Aurora (announced earlier this year). For instance, MIT's David Kaiser and Elba Alonso-Monsalve recently speculated that primordial black holes (first proposed by Stephen Hawking) could account for some, if not most, of the missing dark matter. Primordial black holes, as their name implies, could only have formed within the first second after the Big Bang, and Hawking hypothesized that some could reside at the centers of stars, including the Sun.
Other as-yet-unproven physics theories attempting to account for dark matter and/or dark energy could also benefit from the highly accurate exascale simulations of Dark Sky Mining. For instance, quintessence scalar fields, postulated to contribute additional energy to the expanding vacuum of space, and string theory, which suggests dark matter may exist in higher dimensions than can currently be detected, could both be tested against high-resolution simulations on exascale supercomputers.
R. Colin Johnson is a Kyoto Prize Fellow who has worked as a technology journalist for two decades.