Cardinal simulation software receives 2023 R&D 100 Award
Cardinal, an open-source simulation software package for high-fidelity multiphysics solutions, has been named a 2023 R&D 100 Award winner. Developed by a team of researchers from Argonne’s Mathematics and Computer Science (MCS), Computational Science (CPS) and Nuclear Science and Engineering (NSE) divisions, Cardinal accelerates scientific discovery in nuclear fusion and fission energy with unstructured meshes and computer-aided design geometry, enabling first-of-a-kind scientific exploration.
One of the key components integrated in the Cardinal software is NekRS, a computational fluid dynamics solver developed in Argonne’s MCS division and used by Cardinal for fluid flow and thermal convection.
“NekRS leverages high-order discretizations on high-performance graphics processing units (GPUs) to provide Cardinal users with reduced simulation times, particularly on pre-exascale and exascale computing platforms,” said Misun Min, a computational mathematician in Argonne’s MCS division and head of the Center for Efficient Exascale Discretizations at Argonne.
Fluid flow and heat transfer simulations involve computing function values (fluid velocity, pressure, temperature, etc.) at many points on a computational grid for a region of interest. Typical large-scale simulations of turbulence can require millions or billions of gridpoint values to represent a complicated flow field. These values are updated at each time step to represent the time evolution of the solution.
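The gridpoint-update pattern described above can be sketched in a few lines. The example below is a deliberately minimal 1-D heat-diffusion model, not anything resembling NekRS itself: field values stored at gridpoints are advanced one explicit time step at a time.

```python
import numpy as np

# Minimal illustrative sketch: temperature values stored at gridpoints
# are updated at each time step (explicit Euler, central differences).
# Production solvers such as NekRS use far more sophisticated
# high-order discretizations; this only shows the basic pattern.
n = 101                       # number of gridpoints
dx = 1.0 / (n - 1)            # grid spacing
dt = 0.4 * dx**2              # stable step size for explicit diffusion
alpha = 1.0                   # thermal diffusivity

x = np.linspace(0.0, 1.0, n)
T = np.sin(np.pi * x)         # initial temperature field

for _ in range(100):          # advance the solution in time
    lap = (T[:-2] - 2.0 * T[1:-1] + T[2:]) / dx**2
    T[1:-1] += dt * alpha * lap   # update every interior gridpoint
    T[0] = T[-1] = 0.0            # fixed-temperature boundaries
```

Even this toy problem makes the scaling issue clear: a 3-D turbulent flow needs millions or billions of such gridpoint values, all updated every step.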
The speed and accuracy of Cardinal’s NekRS solver derive from its use of high-order piecewise-polynomial functions that represent fields as function values at gridpoints.
“This high-order approach results in an order of magnitude savings in computing time compared with traditional numerical approaches,” said Paul Fischer, a senior computational scientist in the MCS division and a principal investigator on the project. “NekRS’s use of high-order methods also means that users can expect greater accuracy than with low-order approaches for the same number of computational degrees of freedom.”
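The accuracy advantage Fischer describes can be seen in a small experiment. The sketch below (illustrative only; NekRS uses spectral-element, i.e. piecewise high-order, discretizations) compares a single high-order polynomial against piecewise-linear interpolation of a smooth field using the same number of degrees of freedom.

```python
import numpy as np

# For the same number of degrees of freedom, a high-order polynomial
# representation of a smooth field is far more accurate than a
# piecewise-linear one.
f = lambda x: np.sin(2.0 * np.pi * x)
n = 9                                   # degrees of freedom in each case
xe = np.linspace(0.0, 1.0, 1001)        # fine grid for measuring error

# Low order: piecewise-linear interpolation on uniform points.
xu = np.linspace(0.0, 1.0, n)
err_lin = np.max(np.abs(np.interp(xe, xu, f(xu)) - f(xe)))

# High order: one degree-(n-1) polynomial on Chebyshev-Lobatto points
# (clustered toward the ends to avoid Runge oscillations).
xc = 0.5 * (1.0 - np.cos(np.pi * np.arange(n) / (n - 1)))
c = np.polynomial.polynomial.polyfit(xc, f(xc), n - 1)
err_hi = np.max(np.abs(np.polynomial.polynomial.polyval(xe, c) - f(xe)))

# err_hi is orders of magnitude smaller than err_lin.
```

This is the essence of the claim: for smooth solutions, raising the polynomial order buys accuracy much faster than adding more low-order gridpoints.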
Another issue that high-resolution fluid flow simulations face is the need for enormous memory capacity. Using matrix-free methods, NekRS avoids having to calculate and manipulate large matrices. Thus, NekRS performs well on high-performance computing architectures that are limited by memory bandwidth.
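The matrix-free idea can be illustrated in miniature. Rather than assembling and storing a large matrix, the solver applies the operator's action directly; the sketch below does this for a simple 1-D Laplacian stencil (again purely illustrative, not NekRS code).

```python
import numpy as np

# Matrix-free sketch: apply the action of a discrete 1-D Laplacian to
# a vector directly from its three-point stencil, without assembling
# or storing the matrix. Skipping the stored matrix is what keeps
# memory use and memory traffic low on bandwidth-limited hardware.
def laplacian_apply(u, dx):
    """Return A @ u computed stencil-by-stencil (boundary rows zero)."""
    out = np.zeros_like(u)
    out[1:-1] = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2
    return out

# Cross-check against the explicitly assembled matrix.
n, dx = 8, 0.1
A = np.zeros((n, n))
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
A /= dx**2
u = np.arange(float(n))**2
# laplacian_apply(u, dx) and A @ u give the same result, but only the
# matrix-free version needs O(n) memory instead of O(n**2).
```

In 3-D at high polynomial order the contrast is far starker, which is why matrix-free operator evaluation is standard practice in high-order solvers.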
Today, the majority of the world’s fastest computers use GPU accelerators to tackle challenging science problems. NekRS is fast both on a single GPU and on thousands of GPUs working in concert, with close to 100% parallel efficiency.
“This capability opens doors to scientific discovery at a scale unmatched by any competitor,” Min said. “With NekRS on current exascale platforms, users can tackle simulations 50–100x larger than was possible in the past.”
Figure 1 shows parallel scalability for Cardinal’s NekRS solver applied to fluid flow in a full reactor core comprising 352,635 fuel pebbles. “At this rate,” Fischer said, “one can simulate a complete exchange of fluid in just six hours of wall-clock time – enough to allow engineers to perform multiple simulations at different design points.”
Advanced reactors also require seamless integration of a diverse set of physics. Cardinal couples the NekRS computational fluid dynamics solver with the OpenMC neutral particle transport solver, alongside a library of additional physics kernels needed to model nuclear energy systems. Solution fields from NekRS and OpenMC are mapped onto an unstructured mesh and then spatially interpolated to the other coupled physics tools in the form of source terms, boundary conditions and multiscale models.
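The field-exchange pattern described above can be sketched as a fixed-point (Picard) coupling loop between two toy solvers on a shared mesh. Everything below is hypothetical stand-in code; the function names and feedback model are invented for illustration and are not Cardinal's API.

```python
import numpy as np

# Hypothetical sketch of multiphysics coupling: two "solvers" exchange
# fields on a shared mesh until the coupled solution stops changing.
def heat_source(temperature):
    """Toy 'transport solve': power that drops as temperature rises
    (a crude stand-in for temperature feedback)."""
    return 100.0 / (1.0 + 0.001 * temperature)

def fluid_solve(power):
    """Toy 'CFD solve': temperature rise proportional to local power."""
    return 300.0 + 0.5 * power

n = 16                                   # cells on the shared mesh
T = np.full(n, 300.0)                    # initial temperature field
for it in range(50):                     # Picard (fixed-point) iterations
    q = heat_source(T)                   # transport solver -> source term
    T_new = fluid_solve(q)               # CFD solver <- source term
    if np.max(np.abs(T_new - T)) < 1e-8: # converged coupled solution
        break
    T = T_new                            # hand the field back and repeat
```

In Cardinal the exchanged quantities are real source terms and boundary conditions interpolated between meshes, but the basic loop structure, alternating solves with field transfers until convergence, is the same.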
“The work enabled by NekRS, coupled with OpenMC in the Cardinal software, is a key step toward meeting the needs of a clean and sustainable energy future,” Min said.
The full team of Cardinal developers includes the following: April Novak, Derek Gaston, Paul Fischer, Richard Martineau, Elia Merzari, Misun Min, Paul Romano, Dillon Shaver and Patrick Shriwise.