

Poster

Zebra: In-Context Generative Pretraining for Solving Parametric PDEs

Louis Serrano · Armand Kassaï Koupaï · Thomas Wang · Pierre Erbacher · Patrick Gallinari

West Exhibition Hall B2-B3 #W-106
Thu 17 Jul 11 a.m. PDT — 1:30 p.m. PDT

Abstract:

Solving time-dependent parametric partial differential equations (PDEs) is challenging for data-driven methods, as these models must adapt to variations in parameters such as coefficients, forcing terms, and initial conditions. State-of-the-art neural surrogates perform adaptation through gradient-based optimization and meta-learning to implicitly encode the variety of dynamics from observations, which often increases inference complexity. Inspired by the in-context learning capabilities of large language models (LLMs), we introduce Zebra, a novel generative auto-regressive transformer designed to solve parametric PDEs without requiring gradient-based adaptation at inference. By leveraging in-context information during both pre-training and inference, Zebra dynamically adapts to new tasks by conditioning on input sequences that incorporate context example trajectories. As a generative model, Zebra can produce new trajectories and enables uncertainty quantification for its predictions. We evaluate Zebra across a variety of challenging PDE scenarios, demonstrating its adaptability, robustness, and superior performance compared to existing approaches.
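The in-context mechanism the abstract describes can be sketched in a toy form: context example trajectories and the query's initial frames are flattened into a single token prompt, an autoregressive model extends it, and sampling several rollouts yields an uncertainty estimate. This is a minimal sketch, not the authors' implementation: the function and token names are illustrative, and the next-token model is a random stand-in for the actual transformer (a real pipeline would also quantize PDE states into discrete tokens first, e.g. with a VQ-VAE).

```python
import random
from statistics import pstdev

SEP = -1  # illustrative separator token between trajectories

def build_prompt(context_trajectories, query_prefix):
    """Flatten context example trajectories and the query's initial
    frames into one LLM-style token prompt."""
    prompt = []
    for traj in context_trajectories:
        prompt.extend(traj)
        prompt.append(SEP)
    prompt.extend(query_prefix)
    return prompt

def toy_next_token(tokens, rng):
    """Stand-in for the transformer: samples near the last token.
    A real model would output logits over a learned vocabulary,
    conditioned on the entire prompt."""
    return tokens[-1] + rng.choice([-1, 0, 1])

def sample_rollout(prompt, steps, seed):
    """Autoregressively generate `steps` new tokens after the prompt."""
    rng = random.Random(seed)
    tokens = list(prompt)
    for _ in range(steps):
        tokens.append(toy_next_token(tokens, rng))
    return tokens[len(prompt):]

# Two context trajectories sharing the same (unknown) PDE parameters,
# plus the first frames of a new query trajectory.
context = [[3, 4, 5, 6], [2, 3, 4, 5]]
query_prefix = [1, 2]
prompt = build_prompt(context, query_prefix)

# Generative sampling: the spread across rollouts is a simple
# uncertainty estimate for the prediction.
rollouts = [sample_rollout(prompt, steps=4, seed=s) for s in range(8)]
spread = pstdev(r[-1] for r in rollouts)
print(len(prompt), len(rollouts[0]), spread)
```

Because generation is conditioned only through the prompt, adapting to new PDE parameters requires no gradient updates, only swapping in different context trajectories.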

Lay Summary:

Solving complex equations that describe how things change over time, like weather patterns or fluid flow (called time-dependent parametric PDEs), is hard for computer models, especially when factors like initial conditions or forces can vary. Existing advanced computer models try to adjust to these changes by "learning" from observations, but this can make them slow to use.

Inspired by how large language models can understand new situations without being explicitly re-trained, we developed a new model called Zebra. Zebra is a generative model, meaning it can create new solutions. It learns to adapt to different situations by looking at examples of how the PDE behaved in similar contexts, both during its initial training and when it is solving a new problem. This means Zebra does not need to be retrained for each new scenario, making it much faster. Zebra can also generate multiple possible solutions, which helps us understand the uncertainty in its predictions. We have tested Zebra on many difficult PDE problems, and it has proven to be adaptable, reliable, and better than current methods.
