

Poster in Workshop: 3rd Workshop on High-dimensional Learning Dynamics (HiLD)

Data Free Metrics Are Not Reparameterisation Invariant Under the Critical and Robust Layer Phenomena

Gabryel Mason-Williams · Israel Mason-Williams · Fredrik Dahlqvist


Abstract:

Data-free methods for analysing and understanding the layers of neural networks have offered many metrics for quantifying notions of "strong" versus "weak" layers, with the promise of increased interpretability. We examine how robust data-free metrics are under random control conditions of critical and robust layers. Contrary to the literature, we find counter-examples that challenge the efficacy of data-free methods. We show that data-free metrics are not reparameterisation invariant in these conditions and lose predictive capacity across several measures: RMSE, the Pearson coefficient, and Kendall's tau. Thus, we argue that to understand neural networks fundamentally, we must rigorously analyse the interactions between data, weights, and resulting functions that contribute to their outputs, contrary to traditional Random Matrix Theory perspectives.
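
The kind of check the abstract describes can be illustrated with a minimal sketch (not the authors' code): compute an illustrative data-free metric, here the spectral norm of each weight matrix, for a bias-free ReLU stack, apply a function-preserving layer-wise rescaling, and compare the metric before and after reparameterisation with RMSE, the Pearson coefficient, and Kendall's tau. The layer sizes, the choice of metric, and the rescaling scheme are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import pearsonr, kendalltau

rng = np.random.default_rng(0)

# A small stack of bias-free ReLU layer weights (sizes are illustrative).
dims = [32, 64, 64, 64, 64, 10]
weights = [rng.normal(size=(dims[i + 1], dims[i])) for i in range(len(dims) - 1)]

def layer_metric(W):
    # Example data-free metric: spectral norm of the weight matrix.
    return np.linalg.norm(W, ord=2)

# Function-preserving reparameterisation for bias-free ReLU networks:
# multiply layer i by c_i / c_{i-1} with c_0 = c_L = 1, so positive scalings
# pass through the ReLUs and cancel, leaving the end-to-end function unchanged.
c = np.concatenate(([1.0], rng.uniform(0.1, 10.0, size=len(weights) - 1), [1.0]))
rescaled = [W * (c[i + 1] / c[i]) for i, W in enumerate(weights)]

before = np.array([layer_metric(W) for W in weights])
after = np.array([layer_metric(W) for W in rescaled])

# Agreement between the metric before and after reparameterisation.
print("RMSE:", np.sqrt(np.mean((before - after) ** 2)))
print("Pearson:", pearsonr(before, after)[0])
print("Kendall's tau:", kendalltau(before, after)[0])
```

A metric that were truly reparameterisation invariant would give identical values (zero RMSE, perfect correlation) before and after such a rescaling; large deviations under this sketch illustrate the failure mode the abstract refers to.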
