Poster
Efficient Diffusion Models for Symmetric Manifolds
Oren Mangoubi · Neil He · Nisheeth K. Vishnoi
West Exhibition Hall B2-B3 #W-1014
Diffusion models have recently achieved remarkable success in generating synthetic data, such as realistic images, audio, and video. These models work well when data lives in flat, Euclidean space. However, in many scientific and engineering applications—such as molecular drug discovery, quantum physics, and robotics—data naturally lies on curved, non-Euclidean spaces known as manifolds. Training diffusion models on these spaces is often computationally expensive, requiring either many gradient computations or runtimes exponential in the data dimension.

In this paper, we develop a new type of diffusion model that is efficient to train and sample from on a broad class of non-Euclidean spaces called symmetric manifolds, including spheres, tori, and the special orthogonal and unitary groups. Our key idea is to design a diffusion process that incorporates a curvature-aware covariance term. This allows us to simulate the diffusion by projecting simple Euclidean noise onto the manifold, significantly reducing computational cost.

As a result, each step of our training algorithm requires only a constant number of gradient evaluations and a number of arithmetic operations nearly linear in the data dimension, narrowing the performance gap between manifold-based and Euclidean diffusion models. We also prove that our model satisfies a probabilistic smoothness condition that guarantees accurate and stable sample generation.

Experiments on synthetic datasets show that our method trains faster and produces higher-quality samples than previous approaches, across a variety of manifolds commonly used in scientific applications.
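To make the projection idea concrete, here is a minimal illustrative sketch on the unit sphere S^2: at each forward step, isotropic Gaussian noise is added in the ambient Euclidean space and the result is projected back onto the manifold by normalization. This is an assumption-laden toy example, not the paper's exact algorithm (in particular, it omits the curvature-aware covariance term); the function name `sphere_diffusion_step` and the step schedule are hypothetical.

```python
import numpy as np

def sphere_diffusion_step(x, step_size, rng):
    """One toy forward-noising step on the unit sphere.

    Illustrative only: adds Euclidean (ambient-space) Gaussian noise,
    then projects back onto the sphere by normalization. The paper's
    method additionally uses a curvature-aware covariance term.
    """
    noise = rng.standard_normal(x.shape)
    y = x + np.sqrt(step_size) * noise  # simple Euclidean noise
    return y / np.linalg.norm(y)        # project onto the manifold

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 0.0])           # a point on S^2
for _ in range(100):
    x = sphere_diffusion_step(x, 0.01, rng)
print(np.linalg.norm(x))  # the iterate remains on the sphere
```

Because each step costs only a Gaussian draw and a normalization, the per-step work is linear in the ambient dimension, which is the kind of cost profile the abstract describes.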