Schunck, N.; O’Neal, J.; Grosskopf, M.; Lawrence, E.; Wild, S.
Nuclear density functional theory is the prevalent theoretical framework for accurately describing nuclear properties at the scale of the entire chart of nuclides. Given an energy functional and a many-body scheme (e.g., single- or multireference level), the predictive power of the theory depends mostly on how the parameters of the energy functional have been calibrated with experimental data. Expanded algorithms and computing power have enabled recent optimization protocols to include data on deformed nuclei in order to optimize the coupling constants of the energy functional. The primary motivation of this work is to test the robustness of such protocols with respect to some of the technical and numerical details of the underlying calculations, especially when the calibration explores a large parameter space. To this end, we quantify the effect of these uncertainties on both the optimization and statistical emulation of composite objective functions. We also emphasize that Bayesian calibration can provide better estimates of the theoretical errors used to define objective functions.
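To make the notion of a composite objective function concrete, the following is a minimal sketch of a weighted least-squares objective combining several data types, each with its own theoretical error scale (the quantities the abstract says Bayesian calibration can estimate). All names and numbers here are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

def composite_chi2(theory, data, sigma):
    """Composite weighted least-squares objective.

    theory, data : dicts mapping a data-type label (e.g. 'binding_energy')
                   to arrays of model predictions and measurements.
    sigma        : dict mapping each label to its assumed theoretical
                   error scale, which sets the relative weight of that
                   data type in the composite objective.
    """
    chi2 = 0.0
    for key in data:
        residuals = (np.asarray(theory[key]) - np.asarray(data[key])) / sigma[key]
        chi2 += float(np.sum(residuals ** 2))
    return chi2

# Hypothetical example: two data types with different error scales.
theory = {"binding_energy": [8.1, 7.9], "radius": [3.45]}
data = {"binding_energy": [8.0, 8.0], "radius": [3.40]}
sigma = {"binding_energy": 0.1, "radius": 0.05}
value = composite_chi2(theory, data, sigma)
```

Shrinking a `sigma[key]` increases the weight of that data type, which is why the calibration outcome is sensitive to how these theoretical errors are chosen.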