
Research Article
Multi-Model Federated Learning with Provable Guarantees
@INPROCEEDINGS{10.1007/978-3-031-31234-2_13,
  author    = {Neelkamal Bhuyan and Sharayu Moharir and Gauri Joshi},
  title     = {Multi-Model Federated Learning with Provable Guarantees},
  booktitle = {Performance Evaluation Methodologies and Tools. 15th EAI International Conference, VALUETOOLS 2022, Virtual Event, November 2022, Proceedings},
  series    = {VALUETOOLS},
  year      = {2023},
  month     = {5},
  keywords  = {Federated Learning, Distributed Learning, Optimization},
  doi       = {10.1007/978-3-031-31234-2_13}
}
- Neelkamal Bhuyan
- Sharayu Moharir
- Gauri Joshi
Year: 2023
Multi-Model Federated Learning with Provable Guarantees
VALUETOOLS
Springer
DOI: 10.1007/978-3-031-31234-2_13
Abstract
Federated Learning (FL) is a variant of distributed learning where edge devices collaborate to learn a model without sharing their data with the central server or each other. We refer to the process of training multiple independent models simultaneously in a federated setting using a common pool of clients as multi-model FL. In this work, we propose two variants of the popular FedAvg algorithm for multi-model FL, with provable convergence guarantees. We further show that for the same amount of computation, multi-model FL can have better performance than training each model separately. We supplement our theoretical results with experiments in strongly convex, convex, and non-convex settings.
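To make the setting concrete, below is a minimal sketch of one multi-model FedAvg round under a simple synthetic setup. It is not the paper's proposed variants or their client-allocation policies; it only illustrates the general round structure the abstract describes: a shared pool of clients is partitioned across M independent models, each subset runs local gradient steps on its assigned model, and the server averages each subset's result per model. All names (run_round, local_sgd) and hyperparameters are illustrative assumptions.

```python
# Sketch of multi-model FedAvg with uniform random client-to-model
# partitioning, on M synthetic linear-regression tasks. Illustrative only;
# the paper's specific allocation variants are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

NUM_MODELS = 2    # M independent models sharing the client pool
NUM_CLIENTS = 20
DIM = 5
LOCAL_STEPS = 10
LR = 0.05

# Synthetic data: each client holds an (X, y) pair for every model's task.
true_w = [rng.normal(size=DIM) for _ in range(NUM_MODELS)]
clients = []
for _ in range(NUM_CLIENTS):
    X = rng.normal(size=(30, DIM))
    clients.append([(X, X @ w + 0.1 * rng.normal(size=30)) for w in true_w])

def local_sgd(w, X, y):
    """Run LOCAL_STEPS of gradient descent on the local squared loss."""
    w = w.copy()
    for _ in range(LOCAL_STEPS):
        grad = X.T @ (X @ w - y) / len(y)
        w -= LR * grad
    return w

def run_round(weights):
    """One multi-model round: partition the client pool uniformly at
    random across the M models, then FedAvg each model's subset."""
    perm = rng.permutation(NUM_CLIENTS)
    parts = np.array_split(perm, NUM_MODELS)
    new_weights = []
    for m, part in enumerate(parts):
        updates = [local_sgd(weights[m], *clients[c][m]) for c in part]
        new_weights.append(np.mean(updates, axis=0))
    return new_weights

weights = [np.zeros(DIM) for _ in range(NUM_MODELS)]
for _ in range(50):
    weights = run_round(weights)
print("parameter error per model:",
      [np.linalg.norm(weights[m] - true_w[m]) for m in range(NUM_MODELS)])
```

The key design point this sketch captures is that every round's computation budget (the full client pool) is split across models rather than dedicated to one, which is the regime in which the paper compares multi-model FL against training each model separately.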