
Learning an Effective Premise Retrieval Model for Efficient Mathematical Formalization

About

Formalized mathematics has recently garnered significant attention for its ability to assist mathematicians across various fields. Premise retrieval, a common step in mathematical formalization, has been a challenge, particularly for inexperienced users. Existing retrieval methods that support natural language queries require a certain level of mathematical expertise from users, while approaches based on formal languages (e.g., Lean) typically struggle with the scarcity of training data, hindering the training of effective and generalizable retrieval models. In this work, we introduce a novel method that leverages data extracted from Mathlib to train a lightweight and effective premise retrieval model. In particular, the proposed model embeds queries (i.e., proof states provided by Lean) and premises in a shared latent space, featuring a tokenizer trained specifically on formal corpora. The model is trained in a contrastive learning framework, in which a fine-grained similarity calculation method and a re-ranking module are applied to enhance retrieval performance. Experimental results demonstrate that our model outperforms existing baselines, achieving higher accuracy while maintaining a lower computational load. We have released an open-source search engine based on our retrieval model at https://premise-search.com/. The source code and the trained model can be found at https://github.com/ruc-ai4math/Premise-Retrieval.
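The contrastive training setup described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it assumes queries (proof states) and premises are embedded as vectors in a shared latent space and trained with an InfoNCE-style in-batch contrastive loss, with retrieval performed by similarity ranking. The embedding vectors here are toy stand-ins for the outputs of the trained encoder.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between each row of a and each row of b."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def info_nce_loss(query_emb, premise_emb, temperature=0.05):
    """In-batch contrastive loss: premise i is the positive for query i;
    all other premises in the batch serve as negatives."""
    logits = cosine_sim(query_emb, premise_emb) / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))  # cross-entropy against the diagonal

# Toy batch: 3 proof-state embeddings and their 3 matching premise embeddings.
rng = np.random.default_rng(0)
queries = rng.normal(size=(3, 4))
premises = queries + 0.01 * rng.normal(size=(3, 4))  # near-matches: low loss

# Retrieval: rank premises by similarity to each query embedding.
ranking = np.argsort(-cosine_sim(queries, premises), axis=1)
```

At inference time, the top-ranked candidates from this similarity search would then be rescored by the re-ranking module mentioned in the abstract.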

Yicheng Tao, Haotian Liu, Shanwen Wang, Hongteng Xu • 2025

Related benchmarks

Task               | Dataset               | Result           | Rank
Theorem Retrieval  | Augmented Proof State | R@1: 0.0499      | 5
Theorem Retrieval  | Raw Proof State       | Recall@1: 3.3    | 5
