Publication

Tournament-Based Pretraining to Accelerate Federated Learning

Authors

Baughman, Matt; Hudson, Nathaniel; Chard, Ryan; Bauer, André; Foster, Ian; Chard, Kyle

Abstract

Advances in hardware, the proliferation of compute at the edge, and data creation at unprecedented scales have made federated learning (FL) necessary for the next leap forward in pervasive machine learning. For privacy and network reasons, large volumes of data remain stranded on endpoints in geographically austere (or at least network-austere) locations. However, challenges remain to the effective use of these data. To address challenges at both the system and functional levels, we present three novel variants of a serverless federated learning framework. We also present tournament-based pretraining, which we demonstrate significantly improves model performance in some experiments. Overall, these extensions to FL and our novel training method enable greater focus on science rather than on ML development.
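As an illustration of the general idea behind tournament-based pretraining (the abstract does not spell out the mechanics, so all names, model choices, and the bracket structure below are assumptions, not the paper's implementation): several candidate models are pretrained independently, compete in a single-elimination bracket judged on held-out validation loss, and the winner seeds subsequent federated averaging.

```python
import random

def make_data(n, w_true, b_true, noise, rng):
    """Synthetic 1-D linear-regression data for one client."""
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    ys = [w_true * x + b_true + rng.gauss(0, noise) for x in xs]
    return list(zip(xs, ys))

def sgd(model, data, lr=0.1, epochs=20):
    """Plain SGD on squared error for a (w, b) linear model."""
    w, b = model
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return (w, b)

def loss(model, data):
    w, b = model
    return sum(((w * x + b) - y) ** 2 for x, y in data) / len(data)

def tournament(candidates, val_data):
    """Single-elimination bracket: pair up candidates, keep the one
    with lower validation loss, repeat until one model remains."""
    pool = list(candidates)
    while len(pool) > 1:
        pool = [min(pool[i], pool[i + 1], key=lambda m: loss(m, val_data))
                for i in range(0, len(pool), 2)]
    return pool[0]

def fedavg_round(global_model, client_datasets):
    """One FedAvg round: each client refines the global model locally,
    and the server averages the resulting parameters."""
    updates = [sgd(global_model, d, epochs=5) for d in client_datasets]
    w = sum(u[0] for u in updates) / len(updates)
    b = sum(u[1] for u in updates) / len(updates)
    return (w, b)

rng = random.Random(0)
clients = [make_data(40, 2.0, -0.5, 0.1, rng) for _ in range(4)]
val = make_data(40, 2.0, -0.5, 0.1, rng)

# Pretrain four candidates from different random initializations,
# then select a starting point via the tournament.
candidates = [sgd((rng.uniform(-3, 3), rng.uniform(-3, 3)), clients[i])
              for i in range(4)]
seed = tournament(candidates, val)

# A few federated rounds starting from the tournament winner.
model = seed
for _ in range(3):
    model = fedavg_round(model, clients)
```

The intuition this sketch captures is that the tournament filters out poorly initialized or poorly pretrained candidates before federated training begins, so FedAvg starts from a vetted model rather than a random one.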