

Poster

Neural Genetic Search in Discrete Spaces

Hyeonah Kim · Sanghyeok Choi · Jiwoo Son · Jinkyoo Park · Changhyun Kwon

East Exhibition Hall A-B #E-3407
Tue 15 Jul 4:30 p.m. PDT — 7 p.m. PDT

Abstract:

Effective search methods are crucial for improving the performance of deep generative models at test time. In this paper, we introduce a novel test-time search method, Neural Genetic Search (NGS), which incorporates the evolutionary mechanism of genetic algorithms into the generation procedure of deep models. The core idea behind NGS is its crossover, which is defined as parent-conditioned generation using trained generative models. This approach offers a versatile and easy-to-implement search algorithm for deep generative models. We demonstrate the effectiveness and flexibility of NGS through experiments across three distinct domains: routing problems, adversarial prompt generation for language models, and molecular design.

Lay Summary:

Many real-world problems, from planning delivery routes to designing new molecules or generating text, involve searching and optimizing over complex, discrete spaces. Deep generative models, especially those that build solutions step by step (sequentially), have made real progress in these areas. In practice, however, these models often settle for “good enough” answers, generating solutions in a single pass without further refinement.

In this work, we propose Neural Genetic Search (NGS), a method that brings the evolutionary ideas of genetic algorithms into the test-time generation process of deep models. NGS builds and evolves a population of solutions by combining good candidates through two key operations: crossover, where the model generates new sequences using only the tokens seen in selected parents, and mutation, which allows variation beyond the parent tokens. These operations are applied directly through the model’s generation process, enabling structured, model-aware exploration of the solution space.

NGS is simple, model-agnostic, and easy to integrate with any model that generates discrete sequences. It effectively turns a generator into a more powerful search tool capable of iterative refinement, and it holds promise for a wide range of applications, particularly given the growing popularity of autoregressive generative models.
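To make the crossover and mutation idea concrete, here is a minimal sketch (not the authors' code) of parent-conditioned autoregressive sampling: at each generation step the next-token distribution is restricted to tokens appearing in the selected parents (crossover), except with a small probability where the full vocabulary is allowed (mutation). The model interface `logits_fn`, the vocabulary size, and the hyperparameter names are illustrative assumptions.

```python
# Minimal sketch of NGS-style crossover/mutation as parent-conditioned sampling.
# All names (logits_fn, VOCAB_SIZE, mutation_rate) are illustrative assumptions.
import numpy as np

VOCAB_SIZE = 10
MAX_LEN = 8

def logits_fn(prefix):
    # Stand-in for a trained autoregressive model: returns next-token logits
    # given the current prefix. A real model would be used in practice.
    rng = np.random.default_rng(len(prefix))
    return rng.normal(size=VOCAB_SIZE)

def crossover_generate(parent_a, parent_b, mutation_rate=0.1, rng=None):
    """Sample a child sequence token by token. At each step the vocabulary is
    restricted to tokens appearing in either parent (crossover), except with
    probability `mutation_rate`, where the full vocabulary is kept (mutation)."""
    rng = rng or np.random.default_rng()
    allowed = set(parent_a) | set(parent_b)
    child = []
    for _ in range(MAX_LEN):
        logits = logits_fn(child)
        if rng.random() >= mutation_rate:
            # Crossover step: mask out tokens not present in the parents.
            mask = np.full(VOCAB_SIZE, -np.inf)
            mask[list(allowed)] = 0.0
            logits = logits + mask
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        child.append(int(rng.choice(VOCAB_SIZE, p=probs)))
    return child

# Example: recombine two selected parents from a small population.
population = [[1, 2, 3, 4], [2, 5, 6, 7]]
print(crossover_generate(population[0], population[1]))
```

In a full NGS loop, one would repeatedly select parents from the population (e.g., by fitness), generate children this way, and keep the best candidates for the next generation.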
