Poster in Workshop: Methods and Opportunities at Small Scale (MOSS)

Exploring Diverse Solutions for Underdetermined Problems

Eric Volkmann · Andreas Radler · Johannes Brandstetter · Arturs Berzins

Keywords: [ data-free ] [ theory-informed learning ] [ diversity ] [ mode collapse ]


Abstract:

This work explores the utility of a recently proposed diversity loss in training generative, theory-informed models on underdetermined problems with multiple solutions. Unlike data-driven methods, theory-informed learning often operates in data-free settings, optimizing neural networks to satisfy objectives and constraints. We demonstrate how this diversity loss encourages the generation of diverse solutions across various example problems, effectively avoiding mode collapse and enabling exploration of the solution space.
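The abstract describes a data-free setup: a generator network is trained to satisfy problem constraints while a diversity term discourages mode collapse across samples. The sketch below illustrates that general idea only; it does not reproduce the specific diversity loss proposed in the referenced work. The pairwise inverse-distance repulsion term, the unit-circle constraint, the `Generator` architecture, and the weight `lam` are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical generator: maps latent codes z to candidate solutions x.
class Generator(nn.Module):
    def __init__(self, latent_dim=8, solution_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, solution_dim),
        )

    def forward(self, z):
        return self.net(z)

def constraint_residual(x):
    # Placeholder theory-informed objective: solutions should lie on the
    # unit circle, an underdetermined problem with a continuum of solutions.
    return ((x.pow(2).sum(dim=1) - 1.0) ** 2).mean()

def diversity_loss(x, eps=1e-8):
    # Generic pairwise-repulsion term (an assumption, not the paper's loss):
    # penalize samples that collapse onto each other by rewarding spread.
    d = torch.cdist(x, x) + eps
    mask = ~torch.eye(len(x), dtype=torch.bool)
    return (1.0 / d[mask]).mean()

gen = Generator()
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
lam = 0.1  # assumed weight balancing constraint satisfaction vs. diversity

for step in range(5000):
    z = torch.randn(64, 8)   # latent samples; no training data is required
    x = gen(z)                # batch of candidate solutions
    loss = constraint_residual(x) + lam * diversity_loss(x)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Without the diversity term, such a generator typically collapses to a single point on the constraint set; adding the repulsion term spreads samples across the solution manifold, which is the behavior the abstract refers to as exploring the solution space.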
