Developed by a team of researchers from Argonne’s Mathematics and Computer Science (MCS), Computational Science (CPS) and Nuclear Science and Engineering (NSE) divisions, Cardinal combines state-of-the-art physics solvers in a flexible, scalable and accurate manner to provide users with critical modeling and simulation capabilities for analyzing advanced reactors.
Simulation software is not new. It has been developed for key areas of nuclear reactor physics – neutron transport, fluid flow and heat transfer – and with different numerical methods. “But Cardinal goes a step further, providing an integrated framework for simulating the interaction of these segregated physics domains,” said April Novak, a Maria Goeppert Mayer fellow in Argonne’s CPS division.
One of the key solvers integrated in the Cardinal software is NekRS, a computational fluid dynamics solver used by Cardinal for fluid flow and thermal convection. NekRS is being developed in the Center for Efficient Exascale Discretizations (CEED) as part of the U.S. Department of Energy Exascale Computing Project (ECP). NekRS leverages high-order discretizations on high-performance graphics processing units (GPUs) to provide Cardinal users with reduced simulation times, particularly on the pre-exascale and exascale computing platforms being deployed at Argonne, Lawrence Livermore and Oak Ridge National Laboratories.
Integrated with NekRS in Cardinal is another key solver, OpenMC, a Monte Carlo radiation transport solver. Typical large-scale simulations of nuclear reactors can require millions or billions of simulated particle tracks to capture the physics with high precision. Through massive parallelization of the particle tracks, OpenMC – originally developed by Paul Romano and his team at the Massachusetts Institute of Technology – can simulate the particle histories concurrently, resulting in shorter runtimes. Romano, currently a computational scientist in Argonne’s CPS division, noted that the OpenMC project has expanded to become a communitywide effort and is part of an ECP application project to enable coupled Monte Carlo neutron transport–computational fluid dynamics at the exascale.
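The reason particle histories parallelize so well is that each history is statistically independent of the others. The toy sketch below (a deliberately simplified illustration, not OpenMC’s actual implementation) follows particles through collisions until absorption; because no history depends on another, batches like this can be distributed across thousands of processors with near-linear scaling.

```python
import random

def particle_history(rng, absorption_prob=0.3):
    # Follow one particle: at each collision it is either absorbed
    # (probability absorption_prob) or scattered; count the scattering
    # collisions that occur before absorption.
    collisions = 0
    while rng.random() > absorption_prob:
        collisions += 1
    return collisions

def run_batch(n_particles, seed):
    # Every history is independent, so in a real Monte Carlo transport
    # code this loop can be split across many processors and the batch
    # means combined afterward.
    rng = random.Random(seed)
    total = sum(particle_history(rng) for _ in range(n_particles))
    return total / n_particles  # mean collisions before absorption

mean = run_batch(100_000, seed=42)
# For absorption probability p, the expected value is (1 - p) / p,
# so with p = 0.3 the estimate converges toward about 2.33.
```

The statistical precision improves with the number of histories, which is why production calculations demand the millions or billions of tracks mentioned above.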
Both NekRS and OpenMC are highly scalable and efficient. “These capabilities mean that users can run simulations on supercomputers with hundreds to thousands of GPUs, performing multiple simulations at different design points,” said Misun Min, who is the Argonne lead for CEED.
Cardinal also provides unprecedented geometric flexibility. By mapping the solutions of NekRS and OpenMC into unstructured meshes, Cardinal can communicate physics fields among its solvers. “Cardinal software is completely agnostic of the geometry, which means that there are no a priori limitations to particular reactor types,” Novak said.
Cardinal integrates NekRS and OpenMC in the Multiphysics Object-Oriented Simulation Environment (MOOSE), a library of additional physics kernels developed at the Idaho National Laboratory. The NekRS and OpenMC solutions are mapped to an unstructured mesh, which MOOSE then interpolates in the form of source terms or boundary conditions to other coupled physics tools, enabling Cardinal to tackle a broad range of applications in a single, fully coupled system.
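The field-mapping idea described above can be illustrated with a minimal sketch. This is a hypothetical 1-D nearest-node transfer written for illustration only; it is not Cardinal’s or MOOSE’s actual API, which performs far more sophisticated interpolation on 3-D unstructured meshes.

```python
def nearest_node_transfer(source_nodes, source_values, target_nodes):
    # For each target-mesh node, take the field value at the closest
    # source-mesh node -- the simplest possible mesh-to-mesh mapping.
    transferred = []
    for tx in target_nodes:
        distances = [abs(tx - sx) for sx in source_nodes]
        idx = distances.index(min(distances))
        transferred.append(source_values[idx])
    return transferred

# Hypothetical example: a coarse fluid mesh supplies temperatures that
# become a boundary condition on a finer solid (heat-conduction) mesh.
fluid_nodes = [0.0, 0.5, 1.0]
fluid_temps = [300.0, 350.0, 400.0]
solid_nodes = [0.0, 0.25, 0.5, 0.75, 1.0]

print(nearest_node_transfer(fluid_nodes, fluid_temps, solid_nodes))
# → [300.0, 300.0, 350.0, 350.0, 400.0]
```

In the coupled setting, transfers like this run in both directions each iteration: the fluid solver’s temperatures inform the neutronics feedback, while the transport solver’s power distribution returns as a heat source.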
Cardinal is being used at universities and industries nationwide for studies including nuclear fission and fusion reactor modeling, thermal shock, material behavior under irradiation and dynamic power systems modeling. “The physical phenomena that can be simulated with Cardinal range from neutron interactions at the atomic scale to whole-system response of nuclear reactors coupled to electric grids at the kilometer scale,” said Elia Merzari, a nuclear engineer with a joint appointment at Pennsylvania State University and Argonne.
“The software bridges the gap between fundamental supercomputing research and scientific problem-solving on real-world applications,” Novak said.
The full team of Cardinal developers includes the following: April Novak, Paul Fischer, Richard Martineau, Elia Merzari, Misun Min, Paul Romano, Dillon Shaver and Patrick Shriwise.
Cardinal was named a finalist for the 2022 R&D 100 Awards.