Poster in Workshop: Scaling Up Intervention Models

Thompson Sampling in Function Spaces via Neural Operators

Rafael Oliveira · Xuesong Wang · Kian Ming Chai · Edwin V. Bonilla


Abstract:

We propose an extension of Thompson sampling to optimization problems over function spaces where the objective is a known functional of an unknown operator's output. We assume that functional evaluations are inexpensive, while queries to the operator (such as running a high-fidelity simulator) are costly. Our algorithm employs a sample-then-optimize approach using neural operator surrogates. This strategy avoids explicit uncertainty quantification by treating trained neural operators as approximate samples from a Gaussian process. We provide novel theoretical convergence guarantees, grounded in Gaussian process theory in the infinite-dimensional setting, under minimal assumptions. We benchmark our method against existing baselines on functional optimization tasks involving partial differential equations and other nonlinear operator-driven phenomena, demonstrating improved sample efficiency and competitive performance.
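To make the sample-then-optimize loop concrete, below is a minimal illustrative sketch in Python/PyTorch. Everything here is an assumption for illustration, not the authors' implementation: the toy true_operator stands in for a costly simulator, functional is an arbitrary cheap functional, a small MLP replaces a proper neural operator architecture, and all hyperparameters are placeholders.

import torch
import torch.nn as nn

GRID = 64  # discretization of the function domain

def true_operator(u):
    # Stand-in for the costly unknown operator (e.g. a high-fidelity PDE solve).
    return torch.sin(3 * u) + 0.1 * torch.cumsum(u, dim=-1) / GRID

def functional(v):
    # Known, inexpensive functional of the operator's output.
    return v.mean(dim=-1)

def make_surrogate():
    # Illustrative MLP in place of a neural operator; a fresh random
    # initialization per round makes the trained network play the role of
    # one approximate posterior sample (sample-then-optimize).
    return nn.Sequential(nn.Linear(GRID, 128), nn.Tanh(), nn.Linear(128, GRID))

def fit(net, U, V, steps=500):
    # Fit the surrogate to all costly observations collected so far.
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        ((net(U) - V) ** 2).mean().backward()
        opt.step()
    return net

# Initial design: a handful of costly operator queries.
U = torch.randn(5, GRID)
V = true_operator(U)

for t in range(10):
    surrogate = fit(make_surrogate(), U, V)  # approximate posterior sample
    for p in surrogate.parameters():
        p.requires_grad_(False)              # freeze; optimize the input only
    u = torch.randn(1, GRID, requires_grad=True)
    inner = torch.optim.Adam([u], lr=1e-2)
    for _ in range(200):                     # cheap inner optimization
        inner.zero_grad()
        (-functional(surrogate(u)).sum()).backward()  # maximize sampled objective
        inner.step()
    u_next = u.detach()
    U = torch.cat([U, u_next])               # one new costly operator query
    V = torch.cat([V, true_operator(u_next)])
    print(f"round {t}: objective {functional(V[-1:]).item():.4f}")

The key design choice mirrored here is the one stated in the abstract: no explicit uncertainty quantification is performed; instead, retraining the surrogate from a fresh random initialization each round yields an approximate sample from the (Gaussian process) posterior, and the cheap inner optimization over the input function selects the next costly query, as in standard Thompson sampling.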