Seminar | Mathematics and Computer Science

Adaptive Mixture Models for Personalized and Efficient Learning across Distributed Environments

Abstract: In today’s data-driven landscape, personalization, efficiency, and distributed learning have become paramount for tackling complex machine learning (ML) tasks. This talk presents a unified framework built on three training techniques that address these challenges: Federated Mixture-of-Experts (FedJETs), Mixture of Prompts (MoPs), and Independent Subnet Training (IST). We will discuss FedJETs, which leverage the diversity of clients to train specialized experts (small ML models) on different subsets of classes, together with a gating function that routes each input to the most relevant expert(s). We will then discuss MoPs, which apply the same idea to parts of a larger model (such as a large language model): they identify relevant skills embedded in different groups of prompts and, via a gating function, dynamically weigh these prompt experts according to the target task. Finally, to address the challenges of distributed ML, where data and models are partitioned across multiple machines, we introduce IST.
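To make the routing mechanism concrete, below is a minimal, hypothetical sketch of mixture-of-experts gating in PyTorch: a gating network scores a pool of small experts for each input, and the outputs of the top-k experts are mixed. All names here (Expert, GatedMoE, top_k) are illustrative assumptions, not the actual FedJETs or MoPs code.

import torch
import torch.nn as nn

class Expert(nn.Module):
    # A small specialist model, e.g., trained on a subset of classes.
    def __init__(self, dim_in, dim_out):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim_in, 64), nn.ReLU(), nn.Linear(64, dim_out))

    def forward(self, x):
        return self.net(x)

class GatedMoE(nn.Module):
    # A gating network scores experts per input; the top-k outputs are mixed.
    def __init__(self, dim_in, dim_out, num_experts, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(Expert(dim_in, dim_out) for _ in range(num_experts))
        self.gate = nn.Linear(dim_in, num_experts)
        self.top_k = top_k

    def forward(self, x):
        scores = self.gate(x)                                   # (batch, num_experts)
        top_scores, top_idx = scores.topk(self.top_k, dim=-1)   # most relevant experts
        weights = top_scores.softmax(dim=-1)                    # (batch, top_k)
        all_out = torch.stack([e(x) for e in self.experts], 1)  # (batch, E, dim_out)
        idx = top_idx.unsqueeze(-1).expand(-1, -1, all_out.size(-1))
        picked = all_out.gather(1, idx)                         # (batch, top_k, dim_out)
        return (weights.unsqueeze(-1) * picked).sum(dim=1)      # weighted mixture

moe = GatedMoE(dim_in=16, dim_out=10, num_experts=4)
print(moe(torch.randn(8, 16)).shape)  # torch.Size([8, 10])

In the federated setting of the talk, each expert would be trained on a different subset of clients; here everything runs in one process purely to illustrate the gating idea.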

IST decomposes the original network into narrow subnetworks, which are trained locally before exchanging parameters to produce new subnets. The common theme across these works is the idea of “training a larger model via training smaller versions of it in a distributed fashion.”
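As a rough illustration of this decomposition, the hypothetical sketch below partitions the hidden units of a two-layer network across workers, trains each narrow subnet locally, and writes the updated slices back into the full model. The partitioning scheme, local training loop, and synthetic data are assumptions made for illustration, not the actual IST algorithm.

import torch

def partition(hidden, num_workers):
    # Randomly split hidden-unit indices into disjoint groups, one per worker.
    return torch.randperm(hidden).chunk(num_workers)

dim_in, hidden, dim_out, workers = 8, 32, 4, 4
W1 = torch.randn(hidden, dim_in) * 0.1   # first layer of the full model
W2 = torch.randn(dim_out, hidden) * 0.1  # second layer of the full model

for rnd in range(3):                     # a few decompose/train/exchange rounds
    for idx in partition(hidden, workers):
        # Each worker trains the narrow subnet induced by its hidden units.
        w1 = W1[idx].clone().requires_grad_(True)
        w2 = W2[:, idx].clone().requires_grad_(True)
        opt = torch.optim.SGD([w1, w2], lr=0.1)
        for _ in range(5):               # local steps on (synthetic) local data
            x, y = torch.randn(16, dim_in), torch.randn(16, dim_out)
            loss = ((torch.relu(x @ w1.t()) @ w2.t() - y) ** 2).mean()
            opt.zero_grad()
            loss.backward()
            opt.step()
        with torch.no_grad():            # exchange: write trained slices back
            W1[idx] = w1.detach()
            W2[:, idx] = w2.detach()

Because the hidden units are repartitioned each round, every part of the full model eventually gets trained, even though no single worker ever holds more than a narrow slice of it.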

Bio: Anastasios (Tasos) Kyrillidis is a Noah Harding Assistant Professor in the Computer Science Department at Rice University and a visiting research collaborator at Microsoft Research (MSR). He completed his PhD in the Computer Science Department at EPFL (Switzerland).