

Poster

Federated Learning for Feature Generalization with Convex Constraints

Dongwon Kim · Donghee Kim · Sung Kuk Shyn · Kwangsu Kim

East Exhibition Hall A-B #E-2005
[ Project Page ]
Wed 16 Jul 4:30 p.m. PDT — 7 p.m. PDT

Abstract:

Federated learning (FL) often struggles with generalization due to heterogeneous client data. Local models are prone to overfitting their local data distributions, and even transferable features can be distorted during aggregation. To address these challenges, we propose FedCONST, an approach that adaptively modulates update magnitudes based on the global model’s parameter strength. This prevents over-emphasizing well-learned parameters while reinforcing underdeveloped ones. Specifically, FedCONST employs linear convex constraints to ensure training stability and preserve locally learned generalization capabilities during aggregation. A Gradient Signal-to-Noise Ratio (GSNR) analysis further validates FedCONST’s effectiveness in enhancing feature transferability and robustness. As a result, FedCONST effectively aligns local and global objectives, mitigating overfitting and promoting stronger generalization across diverse FL environments, achieving state-of-the-art performance.
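To make the core idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of a magnitude-modulated, convexly weighted update: each coordinate's step size is a coefficient in [alpha_min, alpha_max], shrunk where the global parameter is already strong and enlarged where it is weak. The normalization scheme, the coefficient range, and the function name are illustrative assumptions.

```python
import numpy as np

def modulated_update(global_param, local_update, alpha_min=0.1, alpha_max=1.0):
    # Proxy for "parameter strength": magnitude of the global weights,
    # normalized to [0, 1] per tensor (an illustrative choice, not the paper's).
    strength = np.abs(global_param)
    rng = strength.max() - strength.min()
    s = (strength - strength.min()) / (rng + 1e-12)
    # Convex combination coefficient: strong (well-learned) coordinates get a
    # weight near alpha_min, underdeveloped ones near alpha_max.
    alpha = alpha_max - (alpha_max - alpha_min) * s
    return global_param + alpha * local_update
```

For example, with `global_param = [0.0, 1.0, 2.0]` and a unit update everywhere, the weakest coordinate takes the full step while the strongest is damped toward `alpha_min`.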

Lay Summary:

Federated learning (FL) allows multiple devices to collaboratively train machine learning models without sharing their data. However, when data is distributed unevenly across devices, models often overfit to local patterns and fail to generalize well. Our method, FedCONST, addresses this challenge by adjusting how much each part of the model is updated, based on how strongly that part has already learned. This helps prevent over-updating well-trained parts while encouraging weaker parts to catch up. By adding lightweight constraints during training, FedCONST improves stability and helps preserve useful features. As a result, it achieves stronger generalization and state-of-the-art performance across a variety of FL scenarios.
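The abstract validates feature transferability with a Gradient Signal-to-Noise Ratio analysis. As background, the standard per-coordinate GSNR is the squared mean of gradient samples divided by their variance; a sketch of that quantity (the function name and epsilon are illustrative, and this is not the paper's exact evaluation code):

```python
import numpy as np

def gsnr(grad_samples, eps=1e-12):
    """Per-coordinate gradient signal-to-noise ratio: E[g]^2 / Var[g].

    grad_samples: a list of gradient arrays, e.g. one per client or mini-batch.
    A high GSNR means the gradient direction is consistent across samples,
    which is associated with features that transfer and generalize well.
    """
    g = np.stack(grad_samples)
    return g.mean(axis=0) ** 2 / (g.var(axis=0) + eps)
```

A coordinate whose gradient agrees across clients (zero variance) has a very large GSNR, while one whose gradients conflict has GSNR near zero.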
