Poster
EAGLES: Towards Effective, Efficient, and Economical Federated Graph Learning via Unified Sparsification
Zitong Shi · Guancheng Wan · Wenke Huang · Guibin Zhang · He Li · Carl Yang · Mang Ye
East Exhibition Hall A-B #E-2803
Federated graph learning faces serious scalability issues as model sizes and graph complexity continue to grow. In this work, we propose EAGLES to address the resulting computational and communication challenges. We design a unified framework that reduces redundant information in both graph structures and model parameters while respecting the privacy constraints of real-world applications. Through carefully crafted sparsification strategies and cross-client structural alignment, EAGLES enables efficient and scalable training while preserving performance. The method achieves up to an 80% reduction in training and communication costs while maintaining competitive accuracy.
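To make the general idea concrete, the sketch below illustrates (in plain PyTorch) what unified sparsification on a client could look like: pruning both the local graph structure and the model update before communication. This is a minimal illustration under assumed magnitude/importance-based pruning, not the authors' implementation; the helpers `sparsify_edges` and `sparsify_update` and the `edge_score` signal are hypothetical names introduced here.

```python
# Minimal sketch (not the paper's implementation): a client prunes both its
# local graph structure and its model update before communicating with the server.
import torch

def sparsify_edges(edge_index: torch.Tensor, edge_score: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    """Keep only the top-scoring fraction of edges (assumed importance-based pruning)."""
    k = max(1, int(keep_ratio * edge_index.size(1)))
    top = torch.topk(edge_score, k).indices
    return edge_index[:, top]

def sparsify_update(state_dict: dict, keep_ratio: float) -> dict:
    """Zero out all but the largest-magnitude parameter entries before upload."""
    sparse = {}
    for name, w in state_dict.items():
        flat = w.abs().flatten()
        k = max(1, int(keep_ratio * flat.numel()))
        thresh = torch.topk(flat, k).values.min()
        sparse[name] = torch.where(w.abs() >= thresh, w, torch.zeros_like(w))
    return sparse

# Toy client: 10 nodes, 40 random edges; keep 20% of edges and 20% of the
# (dummy) model's parameters before sending the update to the server.
edge_index = torch.randint(0, 10, (2, 40))
edge_score = torch.rand(40)  # stand-in for a learned edge-importance signal
pruned_edges = sparsify_edges(edge_index, edge_score, keep_ratio=0.2)

model = torch.nn.Linear(16, 4)
pruned_update = sparsify_update(model.state_dict(), keep_ratio=0.2)
print(pruned_edges.shape, {k: int((v != 0).sum()) for k, v in pruned_update.items()})
```

Combining the two kinds of pruning is what drives the reported savings: fewer edges shrink per-round computation on each client, while sparser updates shrink the payload exchanged with the server.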