

Poster

Ergodic Generative Flows

Leo Brunswic · Mateo ClĂ©mente · Rui Heng Yang · Adam Sigal · Amir Rasouli · Yinchuan Li

East Exhibition Hall A-B #E-3201
Wed 16 Jul 11 a.m. PDT — 1:30 p.m. PDT

Abstract:

Generative Flow Networks (GFNs) were initially introduced on directed acyclic graphs to sample from an unnormalized probability density. Recent works have extended the theoretical framework of generative methods, allowing more flexibility and enlarging the range of applications. However, many challenges remain in training GFNs in continuous settings and for imitation learning (IL), including the intractability of the flow-matching loss, limited testing of non-acyclic training, and the need for a separate reward model in imitation learning. The present work proposes a family of generative flows called Ergodic Generative Flows (EGFs) that addresses these issues. First, we leverage ergodicity to build simple generative flows from finitely many globally defined transformations (diffeomorphisms), with universality guarantees and a tractable flow-matching (FM) loss. Second, we introduce a new loss involving cross-entropy coupled with weak flow-matching control, coined the KL-weakFM loss. It is designed for IL training without a separate reward model. We evaluate IL-EGFs on toy 2D tasks and on real-world NASA datasets on the sphere, using the KL-weakFM loss. Additionally, we conduct toy 2D reinforcement learning experiments with a target reward, using the FM loss.
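
For context only (not part of the paper's abstract): the FM loss mentioned above enforces a flow-matching constraint whose classical, graph-based form comes from the original GFlowNet framework. EGFs adapt it to continuous state spaces with finitely many diffeomorphisms, so the display below is merely a hedged reminder of the underlying constraint in its simplest DAG setting, not the paper's exact formulation.

```latex
% Classical flow-matching constraint for a GFlowNet on a DAG, where F is the
% edge flow and R the unnormalized target reward; the continuous EGF version
% in the paper replaces edge sums with flows pushed forward by diffeomorphisms.
\begin{aligned}
  \sum_{s':\, s' \to s} F(s' \to s) &= \sum_{s'':\, s \to s''} F(s \to s'')
    && \text{for every non-initial, non-terminal state } s, \\
  F(x \to \bot) &= R(x)
    && \text{at terminal states } x .
\end{aligned}
```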

Lay Summary:

At the most abstract level, a generative model belongs to a family (GANs, diffusion models, GFlowNets, ...) that specifies how the neural network is trained and how the computations it carries out are used. A GFlowNet applies transformations iteratively to modify a sample until it is satisfactory; the key part of training is building a heuristic that guides the choice of transformation at each step. Our work introduces Ergodic Generative Flows (EGFs), a new kind of GFN that simplifies training while preserving flexibility. EGFs rely on a small set of well-chosen transformations that can reach any part of the space through mixing, ensuring both expressiveness and mathematical tractability. This approach allows GFlowNets to be used for generative modeling in a more direct fashion. More generally, it opens up a new way of building expressive generative models from a small number of large-step mixing transformations, rather than a large number of small-step transformations.
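
To make the mechanism described above concrete, here is a minimal, hypothetical sketch (not the authors' code and not the exact EGF construction): it iterates a small fixed set of measure-preserving maps on the 2D torus, with a learned categorical policy choosing which map to apply at each step. The choice of irrational torus translations as the mixing maps, the network sizes, and all names are illustrative assumptions; the actual EGF transformations, losses (FM and KL-weakFM), and state spaces (e.g. the sphere) are specified in the paper.

```python
import torch
import torch.nn as nn

# A small fixed set of globally defined maps on the 2D torus [0, 1)^2.
# Irrational translations are a classical example of ergodic, measure-preserving
# maps; they stand in here for the diffeomorphisms an EGF would actually use.
SHIFTS = torch.tensor([[0.5 ** 0.5, 0.0],
                       [0.0, 3.0 ** 0.5 - 1.0],
                       [2.0 ** 0.5 - 1.0, 5.0 ** 0.5 - 2.0]])

def apply_maps(x, k):
    # Apply, to each row of the batch x, the translation indexed by k.
    return (x + SHIFTS[k]) % 1.0

class Policy(nn.Module):
    # Categorical "heuristic" over the available transformations, given the state.
    def __init__(self, n_maps=SHIFTS.shape[0], hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_maps))

    def forward(self, x):
        return torch.distributions.Categorical(logits=self.net(x))

def rollout(policy, batch_size=256, n_steps=16):
    # Generate samples by iterating the chosen maps from a uniform start.
    # log_pf accumulates each trajectory's forward log-probability, the quantity
    # a GFN-style objective would train against a target reward (FM loss) or
    # against demonstration data (the paper's KL-weakFM loss).
    x = torch.rand(batch_size, 2)
    log_pf = torch.zeros(batch_size)
    for _ in range(n_steps):
        dist = policy(x)
        a = dist.sample()
        log_pf = log_pf + dist.log_prob(a)
        x = apply_maps(x, a)
    return x, log_pf

samples, log_pf = rollout(Policy())
print(samples.shape, log_pf.shape)  # torch.Size([256, 2]) torch.Size([256])
```

The point of the sketch is the structure, not the specific maps: a handful of large-step, strongly mixing transformations, applied iteratively under a learned policy, can reach any part of the space, which is what the lay summary means by expressiveness through mixing.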
