

Poster in Workshop: Actionable Interpretability

Bayesian Influence Functions for Scalable Data Attribution

Philipp Kreer · Wilson Wu · Maxwell Adam · Zach Furman · Jesse Hoogland

Sat 19 Jul 1 p.m. PDT — 2 p.m. PDT

Abstract:

Classical influence functions face significant challenges when applied to deep neural networks, primarily due to singular Hessians and high-dimensional parameter spaces. We propose the local Bayesian influence function, an extension of classical influence functions that replaces Hessian inversion with loss-landscape statistics that can be estimated via stochastic-gradient MCMC. This approach captures higher-order interactions among parameters and scales efficiently to neural networks with billions of parameters. Initial results on language and vision models indicate performance comparable to state-of-the-art methods such as EK-FAC, often at substantially reduced computational cost.
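To make the idea concrete, here is a minimal, hypothetical sketch of the kind of computation the abstract describes: instead of inverting a Hessian, one draws approximate posterior samples with stochastic-gradient Langevin dynamics (SGLD) and estimates influence as the covariance of per-example losses across those samples. The toy linear-regression setup, the SGLD hyperparameters, and all variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative toy problem (an assumption, not from the paper):
# linear regression with a small synthetic dataset.
rng = np.random.default_rng(0)
n, d = 64, 3
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def losses(w):
    # Per-example squared loss; one value per training point.
    return 0.5 * (X @ w - y) ** 2

def stochastic_grad(w, batch):
    # Minibatch gradient of the total loss, rescaled to the full dataset.
    Xb, yb = X[batch], y[batch]
    return (n / len(batch)) * Xb.T @ (Xb @ w - yb)

# SGLD: a gradient step plus Gaussian noise whose scale matches the step
# size, yielding approximate samples from the loss-landscape posterior.
eps, n_steps, batch_size = 1e-3, 500, 16
w = np.zeros(d)
loss_samples = []
for _ in range(n_steps):
    batch = rng.choice(n, size=batch_size, replace=False)
    w = w - eps * stochastic_grad(w, batch) + np.sqrt(2 * eps) * rng.normal(size=d)
    loss_samples.append(losses(w))

# Bayesian influence estimate: influence[i, j] is the sample covariance
# of the losses of examples i and j over the SGLD trajectory. No Hessian
# is formed or inverted anywhere.
L = np.array(loss_samples)            # shape (n_steps, n)
L_centered = L - L.mean(axis=0)
influence = L_centered.T @ L_centered / (len(L) - 1)
print(influence.shape)  # (64, 64)
```

The covariance matrix is symmetric by construction, and only loss evaluations along the sampling trajectory are needed, which is why this style of estimator can scale to very large models where explicit Hessian inversion is infeasible.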
