Poster in Workshop: 3rd Workshop on High-dimensional Learning Dynamics (HiLD)
The Price of Robustness: Stable Classifiers Need Overparameterization
Jonas von Berg · Adalbert Fono · Massimiliano Datres · Sohir Maskey · Gitta Kutyniok
Abstract:
In this work, we show that class stability, the expected distance of an input to the decision boundary, captures what classical capacity measures, such as weight norms, fail to explain. In particular, we prove a generalization bound that improves inversely with class stability. As a corollary, interpreting class stability as a quantifiable notion of robustness, we derive a law of robustness for classification that extends the results of Bubeck and Sellke beyond smoothness assumptions to discontinuous functions. Specifically, any model interpolating $n$ data points with $p \approx n$ parameters must be unstable, implying that high stability requires substantial overparameterization. Preliminary experiments support this theory: empirical stability increases with model size, while traditional norm-based measures remain uninformative.
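As a rough sketch of the central quantity (the abstract defines class stability only in words; the notation below is an assumption, not necessarily the paper's), class stability can be written as the expected distance of an input to the classifier's decision boundary:
$$
S(f) \;=\; \mathbb{E}_{x \sim \mathcal{D}}\big[\operatorname{dist}(x,\, B_f)\big],
\qquad
B_f \;=\; \{\, x' : f \text{ changes class in every neighborhood of } x' \,\}.
$$
Under this reading, the abstract's law of robustness says that a model interpolating $n$ data points with $p \approx n$ parameters must have small $S(f)$, so achieving $S(f) \geq \varepsilon$ for a fixed $\varepsilon > 0$ forces $p$ to grow well beyond $n$.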