
MathBERT: A Pre-Trained Model for Mathematical Formula Understanding

About

Large-scale pre-trained models like BERT have achieved great success in various Natural Language Processing (NLP) tasks, but adapting them to math-related tasks remains a challenge. Current pre-trained models neglect the structural features of formulas and the semantic correspondence between a formula and its context. To address these issues, we propose a novel pre-trained model, MathBERT, which is jointly trained on mathematical formulas and their corresponding contexts. In addition, to further capture the semantic-level structural features of formulas, a new pre-training task is designed to predict masked formula substructures extracted from the Operator Tree (OPT), the semantic structural representation of a formula. We conduct experiments on three downstream tasks to evaluate the performance of MathBERT: mathematical information retrieval, formula topic classification, and formula headline generation. Experimental results demonstrate that MathBERT significantly outperforms existing methods on all three tasks. Moreover, we qualitatively show that the pre-trained model effectively captures the semantic-level structural information of formulas. To the best of our knowledge, MathBERT is the first pre-trained model for mathematical formula understanding.
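To make the Operator Tree idea concrete, here is a minimal illustrative sketch. Note the assumptions: MathBERT builds OPTs from LaTeX/MathML formulas with typed operator and operand nodes, whereas this toy example reuses Python's standard `ast` module on an arithmetic string purely to show the same notion of a semantic tree, where operators are internal nodes and operands are leaves.

```python
import ast

def operator_tree(expr: str) -> str:
    """Render a simplified operator tree for an arithmetic expression.

    Illustrative only: this stands in for a formula OPT by parsing a
    Python arithmetic expression and printing operators as internal
    nodes with their operands as children.
    """
    node = ast.parse(expr, mode="eval").body

    def render(n: ast.expr) -> str:
        if isinstance(n, ast.BinOp):
            # Operator becomes an internal node, e.g. Add, Mult, Pow
            op = type(n.op).__name__
            return f"({op} {render(n.left)} {render(n.right)})"
        if isinstance(n, ast.Name):      # variables are leaves
            return n.id
        if isinstance(n, ast.Constant):  # numeric literals are leaves
            return str(n.value)
        return ast.dump(n)

    return render(node)

# Multiplication binds tighter than addition, so Mult sits below Add:
print(operator_tree("a + b * c"))  # (Add a (Mult b c))
```

A substructure-masking pre-training task in the spirit of the paper would hide a subtree such as `(Mult b c)` and train the model to predict it from the remaining tree and the surrounding context.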

Shuai Peng, Ke Yuan, Liangcai Gao, Zhi Tang • 2021

Related benchmarks

Task | Dataset | Result | Rank
Mathematical Information Retrieval | NTCIR Wiki-Formula 12 | BPref (Full): 0.614 | 17
Headline Generation | EXEQ-300k (test) | BLEU-4: 49.4 | 5
