Learning to Prove Theorems by Learning to Generate Theorems
About
We consider the task of automated theorem proving, a key AI task. Deep learning has shown promise for training theorem provers, but there are limited human-written theorems and proofs available for supervised learning. To address this limitation, we propose to learn a neural generator that automatically synthesizes theorems and proofs for the purpose of training a theorem prover. Experiments on real-world tasks demonstrate that synthetic data from our approach improves the theorem prover and advances the state of the art of automated theorem proving in Metamath. Code is available at https://github.com/princeton-vl/MetaGen.
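The core idea is a data-augmentation loop: a generator synthesizes (theorem, proof) pairs, which are mixed with human-written pairs to train the prover. A minimal toy sketch of that loop is below; all names (`gen_theorem`, `build_training_set`, the placeholder "proof" strings) are illustrative assumptions, not the MetaGen implementation.

```python
import random

def gen_theorem(rng):
    """Stand-in generator: emits a trivially provable statement.

    Here it 'proves' a + b = b + a for random a, b; the real system
    synthesizes Metamath theorems with full formal proofs.
    """
    a, b = rng.randint(0, 9), rng.randint(0, 9)
    theorem = f"{a} + {b} = {b} + {a}"
    proof = ["commutativity of +"]  # placeholder proof step
    return theorem, proof

def build_training_set(human_data, n_synthetic, seed=0):
    """Mix human-written (theorem, proof) pairs with synthetic ones."""
    rng = random.Random(seed)
    synthetic = [gen_theorem(rng) for _ in range(n_synthetic)]
    return human_data + synthetic

# One human-written pair plus three synthetic pairs for prover training.
human = [("1 + 1 = 2", ["arithmetic"])]
data = build_training_set(human, n_synthetic=3)
print(len(data))  # → 4
```

The augmented `data` would then feed the prover's supervised training; the paper's contribution is making the generator itself a learned neural model rather than a fixed procedure.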
Mingzhe Wang, Jia Deng • 2020
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Theorem Proving | set.mm (test) | Proofs Found (Test) | 600 | 14 |
| Theorem Proving (Relevance Ranking) | set.mm (val) | Top-1 Accuracy | 53.2 | 12 |
| Theorem Proving (Substitution Network) | set.mm (val) | Probability | 0.6847 | 12 |
| Automated Theorem Proving | Metamath (val) | Performance | 21.16 | 6 |
| Formal Theorem Proving | Metamath set.mm (val) | Performance Score | 21.16 | 3 |
| Theorem Proving | iset.mm (test) | Proofs Found | 398 | 2 |
| Theorem Proving (Relevance Ranking) | iset.mm (val) | Top-1 Accuracy | 45.1 | 2 |
| Theorem Proving (Substitution Network) | iset.mm (val) | Probability | 0.2554 | 2 |