

Poster

BARNN: A Bayesian Autoregressive and Recurrent Neural Network

Dario Coscia · Max Welling · Nicola Demo · Gianluigi Rozza

East Exhibition Hall A-B #E-1405
Tue 15 Jul 4:30 p.m. PDT — 7 p.m. PDT

Abstract:

Autoregressive and recurrent networks have achieved remarkable progress across various fields, from weather forecasting to molecular generation and Large Language Models. Despite their strong predictive capabilities, these models lack a rigorous framework for addressing uncertainty, which is key in scientific applications such as PDE solving, molecular generation, and machine learning Force Fields. To address this shortcoming we present BARNN: a variational Bayesian Autoregressive and Recurrent Neural Network. BARNN aims to provide a principled way to turn any autoregressive or recurrent model into its Bayesian version. BARNN is based on the variational dropout method, which makes it applicable even to large recurrent neural networks. We also introduce a temporal version of the “Variational Mixtures of Posteriors” prior (tVAMP-prior) to make Bayesian inference efficient and well-calibrated. Extensive experiments on PDE modelling and molecular generation demonstrate that BARNN not only achieves comparable or superior accuracy to existing methods, but also excels in uncertainty quantification and in modelling long-range dependencies.
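
The abstract states that BARNN builds on variational dropout so it scales to large recurrent models; the paper's exact formulation (and its tVAMP prior) is not given here. Below is a minimal, hypothetical PyTorch sketch of variational Gaussian dropout applied to a recurrent cell, with Monte Carlo sampling used to read out predictive uncertainty. All names (VariationalDropoutRNNCell, log_alpha) are illustrative, not from the paper, and the KL regulariser that completes the variational objective is omitted.

```python
# A minimal, illustrative sketch of variational (Gaussian) dropout on a
# recurrent cell, in the spirit of the abstract's description. Not the
# BARNN method itself: names and architecture here are assumptions.
import torch
import torch.nn as nn

class VariationalDropoutRNNCell(nn.Module):
    """Elman-style RNN cell with multiplicative Gaussian noise on weights.

    Each weight w is treated as w * (1 + sqrt(alpha) * eps), eps ~ N(0, 1),
    the Gaussian form of dropout from Kingma et al. (2015), where
    alpha = p / (1 - p) links the noise variance to a dropout rate p.
    """

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.w_ih = nn.Parameter(torch.randn(input_dim, hidden_dim) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(hidden_dim, hidden_dim) * 0.1)
        self.bias = nn.Parameter(torch.zeros(hidden_dim))
        # One learnable log-noise-variance (log alpha) per hidden unit.
        self.log_alpha = nn.Parameter(torch.full((hidden_dim,), -3.0))

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        mean = x @ self.w_ih + h @ self.w_hh + self.bias
        if self.training:
            # Local reparameterisation: sample the pre-activation directly
            # instead of sampling every weight, which is cheap and low-variance.
            var = torch.exp(self.log_alpha) * (
                (x ** 2) @ (self.w_ih ** 2) + (h ** 2) @ (self.w_hh ** 2)
            )
            mean = mean + torch.sqrt(var + 1e-8) * torch.randn_like(mean)
        return torch.tanh(mean)

# Uncertainty readout: keep the noise switched on (training mode), run
# several stochastic forward passes, and interpret the spread of the
# outputs as epistemic uncertainty.
cell = VariationalDropoutRNNCell(input_dim=8, hidden_dim=16)
x = torch.randn(4, 8)    # batch of 4 inputs
h = torch.zeros(4, 16)   # initial hidden state
samples = torch.stack([cell(x, h) for _ in range(32)])
pred_mean, pred_std = samples.mean(0), samples.std(0)
```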

Lay Summary:

Sequence models in deep learning drive advances in areas like weather forecasting and molecular design, but they lack reliable ways to quantify uncertainty, which is crucial for scientific applications. We introduce BARNN, a Bayesian approach that equips these models with well-calibrated uncertainty estimates, enabling more trustworthy predictions.
