Seminar | Mathematics and Computer Science

A Statistical Learning Perspective on Model Reduction

LANS Seminar

Abstract: Stochastic closure models aim to make timely predictions with quantified uncertainty. We discuss a statistical learning framework that achieves this goal by accounting for the effects of the unresolved scales. A fundamental idea is to approximate the discrete-time flow map for the dynamics of the resolved variables. The flow map, which encodes the effects of the unresolved scales, is a functional of the history of the resolved scales, as suggested by the Mori-Zwanzig formalism. Thus, its inference faces the curse of dimensionality. We investigate a semi-parametric approach that derives parametric models from the structure of the full model. We show that this approach leads to effective reduced models for deterministic and stochastic PDEs, such as the Kuramoto-Sivashinsky equation and the viscous stochastic Burgers equation. In particular, we highlight the shift from classical numerical methods (such as the nonlinear Galerkin method) to statistical learning and discuss space-time reduction.
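
To fix ideas, a discrete-time closure of the kind described in the abstract might be sketched as follows; the symbols $u_n$, $R$, $\Phi_\theta$, $p$, and $\xi_{n+1}$ are illustrative and are not taken from the announcement:

$$
u_{n+1} = u_n + \Delta t \, R(u_n) + \Phi_\theta\big(u_n, u_{n-1}, \dots, u_{n-p}\big) + \xi_{n+1},
$$

where $u_n$ denotes the resolved variables at time $t_n$, $R$ is the resolved part of the dynamics obtained from the full model (for example, a truncated Galerkin projection), $\Phi_\theta$ is a parametric closure term depending on a finite history of length $p$ that approximates the memory suggested by the Mori-Zwanzig formalism, and $\xi_{n+1}$ is a stochastic residual representing the unresolved scales. The parameters $\theta$ would then be estimated from data by statistical learning.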