Poster
Temperature-Annealed Boltzmann Generators
Henrik Schopmans · Pascal Friederich
West Exhibition Hall B2-B3 #W-112
Efficient sampling of unnormalized probability densities such as the Boltzmann distribution of molecular systems is a longstanding challenge. In addition to conventional approaches like molecular dynamics or Markov chain Monte Carlo, variational approaches, such as training normalizing flows with the reverse Kullback-Leibler divergence, have been introduced. However, such methods are prone to mode collapse and often do not learn to sample the full configurational space. Here, we present temperature-annealed Boltzmann generators (TA-BG) to address this challenge. First, we demonstrate that training a normalizing flow with the reverse Kullback-Leibler divergence at high temperatures is possible without mode collapse. Furthermore, we introduce a reweighting-based training objective to anneal the distribution to lower target temperatures. We apply this methodology to three molecular systems of increasing complexity and, compared to the baseline, achieve better results in almost all metrics while requiring up to three times fewer target energy evaluations. For the largest system, our approach is the only method that accurately resolves its metastable states.
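To make the two-stage idea concrete, here is a minimal sketch in PyTorch of what the two objectives could look like. The interface names (`flow.sample`, `flow.log_prob`, `energy`), the unit convention for `kB`, and the exact form of the reweighted objective are assumptions for illustration, not the authors' implementation: the reverse Kullback-Leibler loss trains the flow against the Boltzmann density at a high temperature, and a self-normalized importance-sampling step then reweights samples from the higher-temperature flow into a weighted maximum-likelihood target at a lower temperature.

```python
# Minimal sketch of the two training objectives (PyTorch).
# All interfaces here -- flow.sample, flow.log_prob, energy, the value
# of kB -- are illustrative assumptions, not the authors' code.
import torch

kB = 0.008314  # Boltzmann constant in kJ/(mol*K); unit choice is an assumption

def reverse_kl_loss(flow, energy, temperature, n_samples=1024):
    # Stage 1: reverse KL against the Boltzmann density at temperature T,
    # D_KL(q || p_T) = E_q[ log q(x) + U(x) / (kB*T) ] + const.
    x, log_q = flow.sample(n_samples)  # samples and their log-density under q
    return (log_q + energy(x) / (kB * temperature)).mean()

def annealing_loss(flow, energy, x, log_q_old, t_new):
    # Stage 2 (one plausible reading of the reweighting-based objective):
    # samples x drawn from the previous, higher-temperature flow are
    # importance-reweighted toward the Boltzmann density at t_new, and the
    # flow is refit by weighted maximum likelihood (a forward-KL-style step).
    with torch.no_grad():
        log_w = -energy(x) / (kB * t_new) - log_q_old  # unnormalized log-weights
        w = torch.softmax(log_w, dim=0)                # self-normalized weights
    return -(w * flow.log_prob(x)).sum()
```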
Understanding how molecules behave and interact is key to breakthroughs in areas like drug discovery. Computer simulations of physical systems are like virtual microscopes that offer insights into a system's behavior without requiring expensive lab experiments. However, to cover all relevant interactions and processes, the movement of the atoms in a system typically has to be simulated for a very long time, requiring large amounts of computational resources.

Instead of simulating the system over time, variational sampling methods train generative machine learning models to directly match the probability distribution of the physical system. This is a promising approach to make computer simulations more efficient. However, such variational methods often suffer from a problem called mode collapse, where the machine learning model learns only a small fraction of the system's behavior.

Our work presents a simple and effective fix: we start training the model under easier conditions, simulating a higher-temperature environment where the mode collapse problem disappears. Then, we gradually "cool" the system down, guiding the model toward realistic behaviors at the target temperature. This approach allows efficient exploration of the atomistic behavior of molecular systems, without mode collapse.
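In the lay-summary terms above, "cooling" the model could look like the following hypothetical training loop: reverse-KL training at a high starting temperature, then a ladder of reweighting steps down to the target temperature, reusing the objectives sketched earlier. The temperature values, sample counts, and optimizer settings are made up for illustration and are not taken from the paper.

```python
# Hypothetical end-to-end schedule using the objectives sketched above.
# Temperatures, step counts, and hyperparameters are illustrative only.
temperatures = [1200.0, 900.0, 675.0, 500.0, 380.0, 300.0]  # in K, highest first

opt = torch.optim.Adam(flow.parameters(), lr=1e-4)

# Stage 1: mode-covering reverse-KL training at the highest temperature.
for step in range(10_000):
    opt.zero_grad()
    reverse_kl_loss(flow, energy, temperatures[0]).backward()
    opt.step()

# Stage 2: successive reweighting steps that "cool" the flow to the target.
for t_new in temperatures[1:]:
    with torch.no_grad():
        x, log_q_old = flow.sample(65_536)  # sample buffer from the current flow
    for step in range(2_000):
        opt.zero_grad()
        annealing_loss(flow, energy, x, log_q_old, t_new).backward()
        opt.step()
```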