

Poster

FedBEns: One-Shot Federated Learning based on Bayesian Ensemble

Jacopo Talpini · Marco Savi · Giovanni Neglia

East Exhibition Hall A-B #E-1406
Thu 17 Jul 11 a.m. PDT — 1:30 p.m. PDT

Abstract:

One-Shot Federated Learning (FL) is a recent paradigm that enables multiple clients to cooperatively learn a global model in a single round of communication with a central server. In this paper, we analyze the One-Shot FL problem through the lens of Bayesian inference and propose FedBEns, an algorithm that exploits the inherent multimodality of local loss functions to find better global models. FedBEns approximates each client's local posterior with a mixture of Laplace approximations, which the server then aggregates to infer the global model. We conduct extensive experiments on various datasets, demonstrating that the proposed method outperforms competing baselines that typically rely on unimodal approximations of the local losses.
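To give a concrete feel for server-side aggregation of Laplace posteriors, the sketch below shows the standard *unimodal* special case: each client sends a diagonal Gaussian (MAP estimate plus diagonal Hessian precision), and the server combines them as a product of Gaussians via precision-weighted averaging. This is the textbook baseline that FedBEns generalizes with mixtures; the function names and the diagonal-precision assumption are illustrative, not the paper's actual implementation.

```python
import numpy as np

def laplace_client_posterior(theta_map, hessian_diag):
    """Diagonal Laplace approximation of a client's posterior.

    Returns the posterior mean (the MAP estimate) and the diagonal
    precision (the diagonal of the loss Hessian at the MAP).
    Both are 1-D arrays over model parameters.
    """
    return np.asarray(theta_map), np.asarray(hessian_diag)

def aggregate_gaussians(means, precisions):
    """Server-side product of diagonal Gaussian posteriors.

    means, precisions: arrays of shape (num_clients, num_params).
    The product of Gaussians N(mu_i, P_i^{-1}) is a Gaussian whose
    precision is the sum of precisions and whose mean is the
    precision-weighted average of the client means.
    """
    global_precision = np.sum(precisions, axis=0)
    global_mean = np.sum(precisions * means, axis=0) / global_precision
    return global_mean, global_precision

# Toy example: two clients with equal confidence but different optima.
mu1, p1 = laplace_client_posterior([0.0, 0.0], [1.0, 1.0])
mu2, p2 = laplace_client_posterior([2.0, 2.0], [1.0, 1.0])
mu, P = aggregate_gaussians(np.stack([mu1, mu2]), np.stack([p1, p2]))
# With equal precisions the global mean is the simple average [1.0, 1.0].
```

A mixture-based scheme such as FedBEns would instead send several (mean, precision) components per client and combine cross-client component pairs, better capturing multimodal local losses.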

Lay Summary:

In traditional machine learning, data is sent to a central server to train a model. Federated Learning takes a different approach: multiple clients (such as different institutions or devices) collaboratively train a model without sharing their data, preserving user privacy. This study introduces FedBEns, an improved method for these clients to combine their knowledge and train a global model in a single communication round with the central server. FedBEns uses a technique based on Bayesian inference to better capture the unique learning patterns of each client. Tests on various datasets show that FedBEns produces more accurate models than existing methods.
