Feature Story | Argonne National Laboratory

Messina discusses rewards, challenges for new exascale project

The exascale initiative has an ambitious goal: to develop supercomputers a hundred times more powerful than today’s systems.

That’s the kind of speed that can help scientists make serious breakthroughs in solar and sustainable energy technology, weather forecasting, batteries and more.

Last year, President Obama announced a unified National Strategic Computing Initiative to support U.S. leadership in high-performance computing; one key objective is to pave the road toward an exascale computing system.

The U.S. Department of Energy (DOE) has been charged with carrying out that role in an initiative called the Exascale Computing Project.

Argonne National Laboratory Distinguished Fellow Paul Messina has been tapped to lead the project, heading a team with representation from the six major participating DOE national laboratories: Argonne, Los Alamos, Lawrence Berkeley, Lawrence Livermore, Oak Ridge and Sandia. The project program office is located at Oak Ridge.

Messina, who has made fundamental contributions to modern scientific computing and networking, previously served for eight years as Director of Science for the Argonne Leadership Computing Facility, a DOE Office of Science User Facility. He will now help usher in a new generation of supercomputers with the capability to change our everyday lives.

One of the project’s goals is to boost U.S. industry, so the Exascale Computing Project will be working with companies to make sure the project is in step with their goals and needs.

Engineers work on components of supercomputer Mira at Argonne Leadership Computing Facility.

Exascale-level computing could have an impact on almost everything, Messina said. It can help increase the efficiency of wind farms by determining the best locations and arrangements of turbines, as well as optimizing the design of the turbines themselves. It can also help severe weather forecasters make their models more accurate and could boost research in solar energy, nuclear energy, biofuels and combustion, among many other fields.

“For example, it’s clear from some of our pilot projects that exascale computing power could help us make real progress on batteries,” Messina said.

Brute computing force is not sufficient, however, Messina said. “We also need mathematical models that better represent phenomena and algorithms that can efficiently implement those models on the new computer architectures.”

Given those advances, researchers will be able to sort through the massive number of chemical combinations and reactions to identify good candidates for new batteries.

“Computing can help us optimize. For example, let’s say that we know we want a manganese cathode with this electrolyte; with these new supercomputers, we can more easily find the optimal chemical compositions and proportions for each,” he said.

Exascale computing will help researchers get a handle on what’s happening inside systems where the chemistry and physics are extremely complex. To stick with the battery example: the behavior of liquids and components within a working battery is intricate and constantly changing as the battery ages.

“We use approximations in many of our calculations to make the computational load lighter,” Messina said, “but what if we could afford to use the more accurate — but more computationally expensive — methods?”

Messina spoke further on the four areas where the project will focus its efforts.

Developing applications

The applications software to tackle these larger computing challenges will often evolve from current codes, but will need substantial work, Messina said.

First, simulating more challenging problems will require some brand-new methods and algorithms. Second, the architectures of these new computers will be different from the ones we have today, so to be able to use existing codes effectively, the codes will have to be modified. This is a daunting task for many of the teams that use scientific supercomputers today.

“These are huge, complex applications, often with literally millions of lines of code,” Messina said. “Maybe they took the team 500 person-years to write, and now you need to modify it to take advantage of new architectures, or even translate it into a different programming language.”

The project will support teams that can provide the people-power to tackle a number of applications of interest, he said. For example, data-intensive calculations are expected to be increasingly important and will require new software and hardware features.

The goal is to have “mission-critical” applications ready when the first exascale systems are deployed, Messina said.

The teams will also identify both what new supporting software is needed and ways that the hardware design could be improved to work with that software before the computers themselves are ever built. This “co-design” element is central to reaching the full potential of exascale, he said.

Evolving the software stack

“The software ecosystem will need to evolve both to support new functionality demanded by applications and to use new hardware features efficiently,” Messina said.

The project will enhance the software stack that DOE Office of Science and National Nuclear Security Administration (NNSA) applications rely on and evolve it for exascale, as well as conduct R&D on tools and methods to boost productivity and portability between systems.

For example, many tasks are the same from scientific application to application and are embodied as elements of software libraries. Teams writing new code use the libraries for efficiency — “so you don’t have to be an expert in every single thing,” Messina explained.

“Thus, improving libraries that handle numerical tasks, visualization, data analytics and programming languages, for example, would benefit many different users,” he said.

Teams working on these components will work closely with the applications task force, he said. “We’ll need good communication between these teams so everyone knows what’s needed and how to use the tools provided.”

In addition, as researchers are able to get more and more data from experiments, they’ll need software infrastructure to more effectively deal with that data.

Advancing hardware technology

While the computers themselves are massive, they aren’t a big part of the commercial market.

“Scientific computers are a niche market, so we make our own specs to get the best results for computational science applications,” Messina said. “That’s what we do with most of our scientific supercomputers, including here at Argonne when we collaborated with IBM and Lawrence Livermore National Laboratory on the design of Mira, and we believe it really paid off.”

“For example, companies are used to building huge banks of servers for business computing applications, for which it’s not usually important for one cabinet’s worth of chips to be able to talk to another. For us, it matters a lot,” he said.

This segment will work with computer vendors and hardware technology providers to accelerate the development of particular features for scientific and engineering applications — not just those DOE is interested in, but also priorities for other federal agencies, academia and industry, Messina said.

Prepping exascale sites

Supercomputers need very special accommodations — you can’t stick one just anywhere. They need a good deal of electricity and cooling infrastructure; they take up a fair amount of square footage, and all of the flooring needs to be reinforced. This effort will work to develop sites for computers with this kind of footprint. 

The Exascale Computing Project is a complex project with many stakeholders and moving parts, Messina said. “The challenge will be to effectively coordinate activities at many different sites in a relatively short time frame — but the rewards are clear.”

The project will be jointly funded by the U.S. Department of Energy’s Office of Science and the National Nuclear Security Administration’s Office of Defense Programs.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.