

Poster

Unified Breakdown Analysis for Byzantine Robust Gossip

Renaud Gaucher · Aymeric Dieuleveut · Hadrien Hendrikx

West Exhibition Hall B2-B3 #W-612
Thu 17 Jul 4:30 p.m. PDT — 7 p.m. PDT

Abstract:

In decentralized machine learning, different devices communicate in a peer-to-peer manner to collaboratively learn from each other's data. Such approaches are vulnerable to misbehaving (or Byzantine) devices. We introduce F-RG, a general framework for building robust decentralized algorithms with guarantees arising from robust-sum-like aggregation rules F. We then investigate the notion of breakdown point, and show an upper bound on the number of adversaries that decentralized algorithms can tolerate. We introduce a practical robust aggregation rule, coined CS+, such that CS+-RG has a near-optimal breakdown. Other choices of aggregation rules lead to existing algorithms such as ClippedGossip or NNA. We give experimental evidence to validate the effectiveness of CS+-RG and highlight the gap with NNA, in particular against a novel attack tailored to decentralized communications.

Lay Summary:

Can many computers train a model together when some of them send wrong information? Is it harder when each computer communicates with only a small number of the others? We investigate these challenges in a setting where computers communicate directly, without relying on a central server.

Our approach builds on a simple idea: each computer can treat its own information as a trustworthy reference and consider messages that differ too much from it as suspicious. We show that this idea can be combined with many robust averaging options, and the resulting algorithm is highly resilient: performance remains strong as long as the number of misbehaving computers does not exceed a threshold. For an averaging option we introduce, this threshold is optimal up to a factor of 2! Experimental results in machine learning support our theory.

This work opens the way to more secure distributed systems, for collaborative machine learning and beyond!
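To make the core idea above concrete, here is a minimal sketch of a clipping-based robust gossip step, in the spirit of ClippedGossip as mentioned in the abstract. It is an illustrative approximation only, not the paper's F-RG framework or CS+ rule; the function and parameter names (`clipped_gossip_step`, `clip_radius`, `weights`) are invented for this example.

```python
import numpy as np

def clipped_gossip_step(x_own, neighbor_msgs, weights, clip_radius):
    """One robust gossip update at a single node (illustrative sketch).

    The node treats its own vector x_own as a trusted reference and clips
    neighbor messages whose deviation from x_own exceeds clip_radius, so a
    Byzantine neighbor's influence on the update stays bounded.
    """
    update = np.zeros_like(x_own)
    for msg, w in zip(neighbor_msgs, weights):
        diff = msg - x_own
        norm = np.linalg.norm(diff)
        if norm > clip_radius:
            diff = diff * (clip_radius / norm)  # shrink suspicious messages toward own value
        update += w * diff
    return x_own + update

# Toy usage: one honest neighbor and one adversarial neighbor sending an extreme vector.
x = np.zeros(3)
msgs = [np.array([0.1, 0.0, 0.1]), np.array([100.0, -100.0, 100.0])]  # second message is Byzantine
print(clipped_gossip_step(x, msgs, weights=[0.3, 0.3], clip_radius=1.0))
```

With clipping, the adversarial message moves the node's state by at most `weight * clip_radius`, whereas a plain weighted average would be dragged arbitrarily far; this is the "treat your own value as a reference" principle in action.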
