In this talk, we obtain new computational insights into two classical areas of statistics: generalization and sampling. In the first part, we study generalization: the performance of a learning algorithm on unseen data. We define a notion of generalization for non-converging learning algorithms via the stability of loss statistics. This notion yields generalization bounds in a similar manner to classical algorithmic stability. Then, we show that spectral information from the training dynamics provides clues to generalization performance.
In the second part, we discuss a new construction of a solution to the measure transport problem. The construction arises from an infinite-dimensional generalization of Newton's method applied to find the zero of a "score operator." We define this score operator as the difference between the score -- the gradient of the log density -- of a transported distribution and the target score. The new construction is iterative, enjoys fast convergence under smoothness assumptions, and does not make a parametric ansatz on the transport map. It is appropriate for the variational inference setting, where the target score is known, and for sampling certain chaotic dynamical systems, where a conditional score can be calculated even in the absence of a statistical model for the target.
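In symbols (notation is ours, chosen for illustration, and may differ from the speaker's): with a reference density $\rho$, a target density $\pi$, and a transport map $T$, one possible form of the score operator and its Newton iteration is

```latex
% Score operator: mismatch between the score of the pushforward T_# rho
% and the target score (zero exactly when T_# rho = pi).
\mathcal{G}[T] \;=\; \nabla \log\!\big( T_{\#}\rho \big) \;-\; \nabla \log \pi .

% Newton step: linearize G at the current iterate T_k, solve for the
% update direction v_k, and compose/add it to form the next iterate.
D\mathcal{G}[T_k]\,[v_k] \;=\; -\,\mathcal{G}[T_k],
\qquad
T_{k+1} \;=\; T_k + v_k .
```

Under suitable smoothness, such Newton-type iterations converge quadratically near the solution, which is the sense in which the construction "enjoys fast convergence."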
Bio: Nisha Chandramoorthy is an Assistant Professor in the School of Computational Science and Engineering at Georgia Tech. She received her PhD in Computational Science and Engineering at MIT.