
Encoder-Decoder Models Can Benefit from Pre-trained Masked Language Models in Grammatical Error Correction

About

This paper investigates how to effectively incorporate a pre-trained masked language model (MLM), such as BERT, into an encoder-decoder (EncDec) model for grammatical error correction (GEC). The answer to this question is not as straightforward as one might expect, because previously proposed methods for incorporating an MLM into an EncDec model have potential drawbacks when applied to GEC. For example, the distribution of the inputs to a GEC model can be considerably different (erroneous, clumsy, etc.) from that of the corpora used for pre-training MLMs; however, this issue is not addressed by the previous methods. Our experiments show that our proposed method, where we first fine-tune an MLM with a given GEC corpus and then use the output of the fine-tuned MLM as additional features in the GEC model, maximizes the benefit of the MLM. The best-performing model achieves state-of-the-art performance on the BEA-2019 and CoNLL-2014 benchmarks. Our code is publicly available at: https://github.com/kanekomasahiro/bert-gec.
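
The sketch below illustrates the general idea described in the abstract: representations from an MLM (here a stock BERT checkpoint standing in for one fine-tuned on the GEC corpus) are fed to an encoder-decoder GEC model as additional features. This is a minimal, hypothetical illustration, not the authors' implementation; the model name (bert-base-cased), the toy Transformer encoder, and the concatenation-plus-projection fusion are illustrative assumptions.

```python
# Minimal sketch of using MLM outputs as additional features for a GEC encoder.
# Assumptions: bert-base-cased as the MLM, a 2-layer Transformer encoder,
# and fusion by concatenation followed by a linear projection.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class BertFusedEncoder(nn.Module):
    """Toy encoder that concatenates its own token embeddings with (frozen)
    BERT hidden states and projects them back to d_model."""

    def __init__(self, vocab_size: int, d_model: int = 512,
                 bert_name: str = "bert-base-cased"):
        super().__init__()
        # In the paper's setting, this BERT would first be fine-tuned with the
        # MLM objective on the GEC training data; here a stock checkpoint is
        # loaded as a stand-in.
        self.bert = BertModel.from_pretrained(bert_name)
        self.bert.eval()  # used as a feature extractor, not updated here
        self.embed = nn.Embedding(vocab_size, d_model)
        self.fuse = nn.Linear(d_model + self.bert.config.hidden_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, src_ids, bert_ids, bert_mask):
        with torch.no_grad():
            bert_feats = self.bert(input_ids=bert_ids,
                                   attention_mask=bert_mask).last_hidden_state
        x = self.embed(src_ids)                        # (B, T, d_model)
        x = self.fuse(torch.cat([x, bert_feats], -1))  # fuse MLM features
        return self.encoder(x)                         # memory for the decoder


if __name__ == "__main__":
    tok = BertTokenizerFast.from_pretrained("bert-base-cased")
    batch = tok(["She go to school yesterday ."], return_tensors="pt")
    enc = BertFusedEncoder(vocab_size=tok.vocab_size)
    # For simplicity, the toy encoder reuses BERT's tokenization for its own
    # embeddings; a real GEC system would have its own subword vocabulary.
    memory = enc(batch["input_ids"], batch["input_ids"], batch["attention_mask"])
    print(memory.shape)  # (1, seq_len, 512)
```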

Masahiro Kaneko, Masato Mita, Shun Kiyono, Jun Suzuki, Kentaro Inui • 2020

Related benchmarks

Task                         | Dataset                     | Result           | Rank
Grammatical Error Correction | CoNLL 2014 (test)           | F0.5 Score: 65.2 | 207
Grammatical Error Correction | BEA shared task 2019 (test) | F0.5 Score: 69.8 | 139
Grammatical Error Correction | JFLEG                       | GLEU: 62         | 47
Grammatical Error Correction | CoNLL 2014 (M2)             | Precision: 72.6  | 27
Grammatical Error Correction | BEA 2019                    | Precision: 72.3  | 12
Grammatical Error Correction | FCE M2 (test)               | Precision: 65    | 10
Grammatical Error Correction | FCGEC                       | EM: 10.88        | 9

Other info

Code

https://github.com/kanekomasahiro/bert-gec
