Feature Story | Argonne National Laboratory

Exploring the dark universe at the speed of petaflops

An astonishing 95% of our universe is made up of dark energy and dark matter. Understanding the physics of this sector is the foremost challenge in cosmology today. Sophisticated simulations of the evolution of the universe play a crucial role.

The primary lens through which scientists look at the night sky is no longer only a telescope—it’s also a supercomputer. The new and coming generations of supercomputers will finally be capable of modeling the universe in the detail and volume required by astronomical surveys of the sky that are now underway, or soon will be.

Scientists use large cosmological simulations to test theories about the structure of the universe and the evolution of the distribution of galaxies and clusters of galaxies. State-of-the-art supercomputers let cosmologists make predictions and test them against data from powerful telescopes and space probes. Two decades of surveying the sky have culminated in the celebrated Cosmological Standard Model. Yet two of the model’s key pillars—dark matter and dark energy, together accounting for 95% of the universe—remain mysterious. A research team led by Argonne is tackling this mystery, aided by some of the world’s fastest supercomputers.

To model the distribution of matter in the universe, the researchers are running some of the largest, most complex simulations of the large-scale structure of the universe ever undertaken.

The Argonne team has run a 1.1-trillion-particle simulation on half a million processor cores of Mira, Argonne’s new Blue Gene/Q supercomputer. The team was among a few science teams from across the country to gain early access to the system, which is now online.

“In a very real sense, we only understand 4% of the universe. To basic scientists like us, that’s a crime—that’s not allowed.”

 —Argonne physicist Steve Kuhlmann

The power and speed of supercomputers and simulation codes have significantly advanced over the past decade. Mira enables cosmology runs with greater resolution and accuracy on much larger simulation volumes—giving researchers the ability to confront theory with observational data from wide-area cosmological surveys.

Exploring the cosmic structure of the dark universe is an enormously complex problem. As the universe expands, gravitational attraction causes matter to coalesce and form structures—first sheets, then filaments where the sheets intersect, and then clumps where the filaments meet. As time progresses, one can begin to see more clearly the basic structure of an enormous web of voids, filaments, and clumps. Simulations at Argonne have calculated this web-like structure, the so-called cosmic web, in a cube of simulated space more than 13 trillion light-years across.

“Because these trillions of particles are meant to trace matter in the entire universe, they are extremely massive, something in the range of a billion suns,” said Argonne computational physicist Salman Habib, the project’s director. “We know the gravitational dynamics of how these tracer particles interact, and so we evolve them forward to see what kind of densities and structure they produce, as a result of both gravity and the expansion of the universe. That’s essentially what the simulation does: it takes an initial condition and moves it forward to the present to see if our ideas about structure formation in the universe are correct.”
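The evolution Habib describes—take an initial particle distribution and push it forward under gravity—can be sketched in miniature. Below is a toy direct-summation N-body step in Python. This is not HACC’s actual method (which is engineered for trillions of particles) and omits cosmic expansion; all names and parameter values are illustrative.

```python
import numpy as np

def step(pos, vel, masses, dt, G=1.0, soft=0.05):
    """One kick-drift-kick leapfrog step for direct-summation gravity.

    pos, vel: (N, 3) arrays; masses: (N,) array.
    `soft` is a softening length that keeps close encounters finite,
    a standard trick in N-body codes.
    """
    def accel(p):
        # Pairwise separations d[i, j] = p[j] - p[i], shape (N, N, 3)
        d = p[None, :, :] - p[:, None, :]
        r2 = (d ** 2).sum(-1) + soft ** 2        # softened squared distance
        inv_r3 = r2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)            # no self-force
        # a_i = G * sum_j m_j * d_ij / |d_ij|^3
        return G * (d * inv_r3[:, :, None] * masses[None, :, None]).sum(axis=1)

    vel = vel + 0.5 * dt * accel(pos)   # kick
    pos = pos + dt * vel                # drift
    vel = vel + 0.5 * dt * accel(pos)   # kick
    return pos, vel
```

Because the pairwise forces are antisymmetric, total momentum is conserved to floating-point precision—a useful sanity check when iterating such a step.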

“Dark energy may be the most profound mystery in all of science.”

 —University of Chicago cosmologist Michael Turner

Next-generation sky surveys will map billions of galaxies to explore the physics of the dark universe. Science requirements for these surveys demand simulations at extreme scales in order to resolve galaxy-scale mass concentrations over full survey volumes.

A key aspect of the Argonne project involves developing a major simulation suite covering approximately 100 different cosmological scenarios and combining them in a framework that can generate predictions for any scenario within the range covered by the original runs.
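The idea behind such a suite is to pay the cost of expensive simulations once, at a coarse grid of cosmological parameter values, and then answer queries for any in-between scenario by interpolation. The sketch below illustrates the concept with a single toy parameter and plain linear interpolation; the function names and the one-parameter setup are hypothetical, not the team’s actual framework, which covers many parameters and far richer outputs.

```python
import numpy as np

def expensive_simulation(omega_m):
    """Stand-in for a full simulation run: returns one summary statistic."""
    return omega_m ** 0.6  # toy growth-rate-like quantity

# Simulate once at a coarse grid of parameter values...
grid = np.linspace(0.2, 0.4, 5)
table = np.array([expensive_simulation(w) for w in grid])

def emulate(omega_m):
    """Cheaply predict the statistic for any omega_m inside the grid range."""
    return np.interp(omega_m, grid, table)
```

A production framework would replace the linear interpolation with a more sophisticated scheme and cover on the order of 100 scenarios, but the trade—heavy precomputation for cheap prediction—is the same.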

This research is supported by the U.S. Department of Energy’s Office of High Energy Physics, Advanced Scientific Computing Research, and Argonne’s Laboratory Directed Research and Development program.



What is dark matter?
Scientists studying distant galaxies noticed that something we can’t see is exerting a huge gravitational force on things we can see—like stars and supernovae. We named this “dark matter” because it doesn’t emit or absorb light. But is it ordinary matter that we don’t have a way to measure, or is it a truly new substance?

What is dark energy?
In 1998, two teams of astronomers (one from Lawrence Berkeley National Lab) discovered that not only is the universe expanding, it’s expanding faster and faster as time goes on. This means that some other force than gravity is acting on the universe. We understand very little about this mysterious force, but sky surveys and computational simulations can help bring us closer.

“HACC”ing away at the code
Few supercomputers in the world have the muscle to simulate complex problems in a reasonable time span. The Blue Gene/Q belongs to an elite class of machines now coming online at DOE national laboratories for this express purpose. The recent trillion-particle science run conducted on Mira used 32 racks of the computer, which is two-thirds of its total size.

Essential to these simulation runs is the team’s simulation code framework called HACC, short for Hardware/Hybrid Accelerated Cosmology Codes. HACC is similar to other codes written to study how the individual particles of a complex system move and interact over time. Unlike those codes, however, HACC is designed for extreme performance and in a way that easily adapts to different computers.

HACC’s performance relies on its ability to accurately and efficiently calculate the forces applied to a large number of interacting particles. Cosmology simulations can involve hundreds of billions of particles or more.
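A quick back-of-envelope check shows why efficiency matters so much at these scales: naively summing every pairwise force scales as O(N²), which is hopeless for the particle counts quoted above. This is why production N-body codes rely on mesh- and tree-based approximations rather than brute force (the specific figure below comes from the 1.1-trillion-particle Mira run mentioned earlier).

```python
# Back-of-envelope: cost of brute-force pairwise gravity.
n = 1.1e12                   # particles in the trillion-particle Mira run
pairs = n * (n - 1) / 2      # pairwise interactions per time step, O(N^2)
# pairs is on the order of 6e23 force evaluations per step -- far beyond
# what any machine can do directly, hence approximate long-range solvers.
```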

“The Blue Gene architecture allows an experienced user to program it with a reasonable level of effort and get a good performance,” said ALCF performance engineer Vitali Morozov. The team was recently awarded 40 million core-hours on Mira through the U.S. Department of Energy’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. Supported by the Office of Science’s Office of Advanced Scientific Computing Research, INCITE provides computing power and resources for computationally intensive, large-scale projects to researchers from industry, academia, and government research facilities. Another 150 million core-hours were awarded under the Early Science Program.