

Poster in Workshop: 2nd AI for Math Workshop @ ICML 2025

Learning an Effective Premise Retrieval Model for Efficient Mathematical Formalization

Yicheng Tao · Haotian Liu · Shanwen Wang · Hongteng Xu


Abstract:

Formalized mathematics has recently garnered significant attention for its ability to assist mathematicians across various fields. Premise retrieval, a common step in mathematical formalization, remains challenging, particularly for inexperienced users. Existing retrieval methods that accept natural language queries require a certain level of mathematical expertise from users, while approaches based on formal languages (e.g., Lean) typically struggle with the scarcity of training data, which hinders the training of effective and generalizable retrieval models. In this work, we introduce a novel method that leverages data extracted from Mathlib to train a lightweight and effective premise retrieval model. In particular, the proposed model embeds queries (i.e., proof states provided by Lean) and premises in a latent space, featuring a tokenizer trained specifically on formal corpora. The model is learned in a contrastive learning framework, in which a fine-grained similarity calculation method and a re-ranking module are applied to enhance retrieval performance. Experimental results demonstrate that our model outperforms existing baselines, achieving higher accuracy while maintaining a lower computational load. Based on our model, we have released an open-source search engine for retrieving premises from formal states.
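To make the contrastive framework concrete, the sketch below shows one in-batch contrastive (InfoNCE) training step for a dual-encoder retriever that embeds proof states and premises in a shared latent space, as the abstract describes. It is a minimal illustration under assumed details: the toy encoder architecture, hyperparameters, and random token ids are illustrative stand-ins, not the authors' released model or their fine-grained similarity and re-ranking components.

```python
# Illustrative sketch: in-batch contrastive training for premise retrieval.
# Architecture and hyperparameters are assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy text encoder: embedding lookup + mean pooling + linear projection."""
    def __init__(self, vocab_size=30_000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, token_ids):               # token_ids: (batch, seq_len)
        h = self.embed(token_ids).mean(dim=1)   # mean-pool over tokens
        return F.normalize(self.proj(h), dim=-1)  # unit-norm embeddings

def info_nce_loss(state_emb, premise_emb, temperature=0.05):
    """In-batch contrastive loss: the i-th state should match the i-th premise."""
    logits = state_emb @ premise_emb.T / temperature  # (batch, batch) similarities
    labels = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, labels)

# One training step on random token ids standing in for tokenized Lean text.
state_encoder, premise_encoder = Encoder(), Encoder()
params = list(state_encoder.parameters()) + list(premise_encoder.parameters())
optimizer = torch.optim.AdamW(params, lr=1e-4)

states = torch.randint(0, 30_000, (8, 64))    # stand-in for tokenized proof states
premises = torch.randint(0, 30_000, (8, 32))  # stand-in for tokenized premises

optimizer.zero_grad()
loss = info_nce_loss(state_encoder(states), premise_encoder(premises))
loss.backward()
optimizer.step()
print(f"contrastive loss: {loss.item():.4f}")
```

At retrieval time, a setup like this would embed all premises offline, embed the incoming proof state, and rank premises by cosine similarity before any re-ranking stage.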
