Poster in Workshop: ICML 2025 Workshop on Collaborative and Federated Agentic Workflows (CFAgentic @ ICML'25)
Federated Submodular Maximization: Improved Communication Rounds and Bit Complexity
Akbar Rafiey · Sreeharsh Namani · Neophytos Charalambides
Abstract:
Subset selection is a fundamental problem in machine learning, data mining, game theory, and economics, where the objective is to select a subset of elements from a collection that maximizes a given submodular function. This renders submodular maximization an important and widely studied problem. However, in massively distributed settings with sensitive data, centralized iterative optimization is often infeasible and undesirable due to privacy concerns and computational constraints. To address these challenges, we propose a novel federated algorithm for maximizing a monotone submodular function where the data and computations are distributed across multiple clients. Our approach enables optimization without requiring a central server to directly access the local functions. To handle scalability, we employ client subsampling for aggregation and low-bit communication updates, reducing communication complexity -- which is critical in large-scale and high-dimensional settings. Our federated algorithm guarantees a $\tfrac{1}{2}$-approximation, and effectively captures the trade-off between the number of participating clients and bit complexity. We then propose a second algorithm for improved efficiency, which requires only logarithmically many communication rounds between the clients and the central server, while preserving the $\tfrac{1}{2}$-approximation guarantee. The approximation guarantees of our algorithms can be improved to $1-\tfrac{1}{e}$ through amplification techniques. Our methods are validated numerically, demonstrating their effectiveness in real-world settings and corroborating our guarantees.
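To make the setting concrete, the following is a toy sketch of a federated greedy loop in the spirit the abstract describes: per-client monotone submodular (coverage) functions, a server that queries only a subsample of clients each round, and marginal gains clipped to a low-bit range before aggregation. This is an illustrative assumption-laden stand-in, not the paper's algorithm, and it carries none of the paper's approximation or communication guarantees; all function and parameter names (`make_client`, `federated_greedy`, `sample_frac`, `bits`) are hypothetical.

```python
import random

def make_client(cover):
    """Build a client holding a local coverage function.

    cover: dict mapping element -> set of items that element covers locally.
    The local objective f(S) = |union of covered items| is monotone submodular.
    Returns a callable reporting the marginal gain of an element given S.
    """
    def marginal_gain(element, selected):
        covered = set().union(*(cover[e] for e in selected)) if selected else set()
        return len(cover[element] - covered)
    return marginal_gain

def federated_greedy(clients, ground_set, k, sample_frac=0.5, bits=4, seed=0):
    """Greedily pick k elements maximizing the average of local functions.

    Each round, the server subsamples a fraction of clients; each sampled
    client sends its marginal gain clipped to `bits` bits (a crude stand-in
    for real low-bit quantization). The server adds the element with the
    largest estimated average gain.
    """
    rng = random.Random(seed)
    scale = (1 << bits) - 1  # largest value representable in `bits` bits
    selected = []
    for _ in range(k):
        m = max(1, int(sample_frac * len(clients)))
        sampled = rng.sample(clients, m)
        best, best_gain = None, 0.0
        for e in ground_set:
            if e in selected:
                continue
            gains = [min(c(e, selected), scale) for c in sampled]
            avg = sum(gains) / m
            if avg > best_gain:
                best, best_gain = e, avg
        if best is None:
            break  # no remaining element has positive estimated gain
        selected.append(best)
    return selected
```

For example, with two clients covering different local item sets, `federated_greedy(clients, ["a", "b", "c"], k=2, sample_frac=1.0)` picks the two elements whose averaged marginal coverage is largest.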