Abstract: Heterogeneity poses many challenges for privacy. For example, standard differential privacy techniques may exacerbate issues of bias in diverse datasets, and can make it difficult to efficiently optimize problems with varying geometry.
In this talk, we explore two approaches to push the privacy-utility frontier in the face of heterogeneity. First, we discuss methods for private multi-task learning, which can enable heterogeneous data to be modeled accurately and fairly while retaining meaningful privacy guarantees. Second, we propose a framework for differentially private adaptive optimization, which can recover many of the benefits that are lost when applying state-of-the-art optimizers in private settings. We end by discussing open questions at the intersection of privacy and heterogeneity.
Bio: Virginia Smith is an assistant professor in the Machine Learning Department at Carnegie Mellon University. She was a postdoc at Stanford University and received a PhD in Computer Science from the University of California, Berkeley.