Poster in Workshop: Machine Learning for Wireless Communication and Networks (ML4Wireless)
Uplink-Aware Federated Learning Based on Model Pruning in Satellite Networks
Chenyu Xu · Yijie Mao · Xiong Wang · Jingjing Zhang · Yuanming Shi
Satellite federated learning (SFL) allows satellites to collaboratively train models without sharing raw data, enhancing privacy and reducing communication costs. Traditional SFL requires a ground station (GS) to upload models to satellites, assuming adequate ground-satellite uplink (GSUL) resources. However, this assumption does not hold in dense low Earth orbit (LEO) constellations, where frequent command interactions and parameter deliveries make the bandwidth-constrained uplink a bottleneck. This work proposes FedLSMP, a satellite federated learning framework with uplink scheduling and model pruning. The key idea is to jointly optimize the GSUL bandwidth allocation and the model compression ratio so as to maximize an approximated loss reduction under bandwidth constraints. Numerical results demonstrate that FedLSMP improves convergence speed while reducing GSUL bandwidth usage, outperforming conventional SFL approaches.
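To make the key idea concrete, the joint design could take roughly the following shape; the symbols below (per-satellite bandwidth $b_k$, pruning ratio $\rho_k$, approximated loss reduction $\widehat{\Delta L}_k$, total GSUL budget $B_{\mathrm{total}}$, full model size $S$, and round deadline $\tau_{\max}$) are illustrative assumptions, not notation taken from the poster:

\begin{equation*}
\max_{\{b_k\},\,\{\rho_k\}} \;\; \sum_{k=1}^{K} \widehat{\Delta L}_k(\rho_k)
\quad \text{s.t.} \quad
\sum_{k=1}^{K} b_k \le B_{\mathrm{total}}, \qquad
\frac{\rho_k\, S}{b_k} \le \tau_{\max}, \qquad
0 < \rho_k \le 1,\; b_k \ge 0,
\end{equation*}

i.e., the GS distributes its limited uplink bandwidth across satellites and picks how aggressively each downloaded model is pruned, trading per-satellite model fidelity against uplink transmission time within the round deadline.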