

Poster in Workshop: Tiny Titans: The next wave of On-Device Learning for Foundation Models (TTODLer-FM)

First Provable Guarantees for Practical Private FL: Beyond Restrictive Assumptions

Egor Shulgin · Grigory Malinovsky · Sarit Khirirat · Peter Richtarik

Fri 18 Jul 3 p.m. PDT — 3:45 p.m. PDT

Abstract: Federated Learning (FL) enables collaborative training on decentralized data. Differential Privacy (DP) is crucial for FL, but current private methods often rely on unrealistic assumptions (e.g., bounded gradients or bounded heterogeneity), hindering practical application. Existing works that relax these assumptions typically neglect practical FL mainstays like partial client participation or multiple local updates. We introduce Fed-$\alpha$-NormEC, the first differentially private FL framework providing provable convergence and DP guarantees under standard assumptions while fully supporting these practical elements. Fed-$\alpha$-NormEC integrates local updates (full and incremental gradient steps), separate server and client stepsizes, and, crucially, partial client participation, which is essential for real-world deployment and vital for privacy amplification. Our theoretical guarantees are corroborated by experiments on private deep learning tasks.
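The abstract does not spell out the update rule, so the following is a minimal, hypothetical sketch of one round in the spirit the abstract describes, under stated assumptions: clients are Poisson-subsampled (partial participation, the source of privacy amplification by subsampling), each sampled client runs several local gradient steps with a client stepsize, each client update is normalized so its per-client sensitivity is bounded without assuming bounded gradients (a guess motivated by the "Norm" in the method's name), and the server adds Gaussian noise and applies its own stepsize. All identifiers (`dp_fl_round`, `eta_client`, `eta_server`, `local_steps`, `sigma`, `alpha`) are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_update(w, data, grad_fn, eta_client, local_steps):
    """Run several local gradient steps and return the resulting model delta."""
    w_local = w.copy()
    for _ in range(local_steps):
        w_local -= eta_client * grad_fn(w_local, data)
    return w_local - w

def dp_fl_round(w, clients, grad_fn, sample_rate=0.5, eta_client=0.1,
                eta_server=1.0, local_steps=5, sigma=0.5, alpha=1.0):
    """One hypothetical DP-FL round: subsample clients, normalize their
    updates to norm alpha, aggregate, and privatize with Gaussian noise."""
    # Partial participation via Poisson subsampling (enables privacy
    # amplification by subsampling in the DP accounting).
    sampled = [c for c in clients if rng.random() < sample_rate]
    if not sampled:
        return w  # no participants this round; model unchanged
    agg = np.zeros_like(w)
    for data in sampled:
        delta = client_update(w, data, grad_fn, eta_client, local_steps)
        # Normalizing the update bounds per-client sensitivity by alpha
        # without assuming bounded gradients (assumed rule, not the paper's).
        agg += alpha * delta / max(np.linalg.norm(delta), 1e-12)
    # Gaussian mechanism: noise scale proportional to the sensitivity alpha.
    noise = sigma * alpha * rng.standard_normal(w.shape)
    return w + eta_server * (agg + noise) / len(sampled)
```

A quick smoke test can drive the round on a toy least-squares problem, where each "client" holds a private regression dataset:

```python
# Toy driver: each client holds a private least-squares problem.
def grad_fn(w, data):
    X, y = data
    return X.T @ (X @ w - y) / len(y)

clients = [(rng.standard_normal((20, 5)), rng.standard_normal(20))
           for _ in range(10)]
w = np.zeros(5)
for _ in range(100):
    w = dp_fl_round(w, clients, grad_fn)
```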
