
Improving Attention-Based Handwritten Mathematical Expression Recognition with Scale Augmentation and Drop Attention

About

Handwritten mathematical expression recognition (HMER) is an important research direction in handwriting recognition. The performance of HMER suffers from the two-dimensional structure of mathematical expressions (MEs). To address this issue, in this paper, we propose a high-performance HMER model with scale augmentation and drop attention. Specifically, to tackle MEs whose scales vary in both horizontal and vertical directions, scale augmentation improves the performance of the model on MEs of various scales. An attention-based encoder-decoder network is used to extract features and generate predictions. In addition, drop attention is proposed to further improve performance when the attention distribution of the decoder is imprecise. Compared with previous methods, our method achieves state-of-the-art performance on the two public CROHME 2014 and CROHME 2016 datasets.
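The two techniques named in the abstract can be sketched in a few lines. The sketch below is an illustration of the general ideas only, not the paper's implementation: the scale range, drop probability, and the rule of dropping the single largest attention weight are all assumptions made for the example.

```python
import numpy as np

def scale_augment(img, min_s=0.7, max_s=1.4, rng=None):
    """Randomly rescale an image by independent horizontal and vertical
    factors (nearest-neighbour resampling), simulating MEs written at
    varying scales. The scale range is an illustrative assumption."""
    if rng is None:
        rng = np.random.default_rng()
    sy, sx = rng.uniform(min_s, max_s, size=2)
    h, w = img.shape[:2]
    new_h = max(1, int(round(h * sy)))
    new_w = max(1, int(round(w * sx)))
    rows = (np.arange(new_h) * h / new_h).astype(int)
    cols = (np.arange(new_w) * w / new_w).astype(int)
    return img[rows][:, cols]

def drop_attention(attn, drop_prob=0.3, rng=None):
    """With probability drop_prob, zero out the largest attention weight(s)
    and renormalise, so the decoder cannot rely on a single (possibly
    imprecise) attended position. Hypothetical hyperparameters."""
    if rng is None:
        rng = np.random.default_rng()
    if rng.random() >= drop_prob:
        return attn
    dropped = attn * (attn < attn.max())
    s = dropped.sum()
    return dropped / s if s > 0 else attn
```

In this reading, scale augmentation is applied to training images before encoding, while drop attention perturbs the decoder's attention map during training as a regularizer.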

Zhe Li, Lianwen Jin, Songxuan Lai, Yecheng Zhu • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Handwritten Mathematical Expression Recognition | CROHME 2016 (test) | Expression Rate (Exp) | 54.6 | 164 |
| Handwritten Mathematical Expression Recognition | CROHME 2014 (test) | Expression Rate (Exp) | 56.59 | 156 |
| Handwritten Mathematical Expression Recognition | CROHME 2014 | Error Rate | 56.6 | 47 |
