
A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning

About

Grammatical error correction can be viewed as a low-resource sequence-to-sequence task, because publicly available parallel corpora are limited. To tackle this challenge, we first generate erroneous versions of large unannotated corpora using a realistic noising function. The resulting parallel corpora are subsequently used to pre-train Transformer models. Then, by sequentially applying transfer learning, we adapt these models to the domain and style of the test set. Combined with a context-aware neural spellchecker, our system achieves competitive results in both the restricted and low-resource tracks of the ACL 2019 BEA Shared Task. We release all of our code and materials for reproducibility.
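The core idea above is to turn clean, unannotated text into synthetic (erroneous, correct) sentence pairs for pre-training. The sketch below illustrates that idea in miniature; the error patterns (`ARTICLE_DROPS`, `PREPOSITION_SWAPS`) and the function `noise_sentence` are illustrative assumptions, not the paper's actual noising function, which derives realistic edit patterns from annotated data.

```python
import random

# Hypothetical, hand-picked error patterns for illustration only.
# The paper's noising function instead mines realistic token- and
# character-level edits from annotated GEC corpora.
ARTICLE_DROPS = {"the", "a", "an"}
PREPOSITION_SWAPS = {"in": "on", "on": "in", "at": "in", "for": "to"}

def noise_sentence(tokens, drop_prob=0.5, swap_prob=0.5, seed=0):
    """Return an artificially erroneous copy of a clean token list."""
    rng = random.Random(seed)  # seeded for reproducible corpus generation
    noisy = []
    for tok in tokens:
        low = tok.lower()
        if low in ARTICLE_DROPS and rng.random() < drop_prob:
            continue  # simulate a missing-article error
        if low in PREPOSITION_SWAPS and rng.random() < swap_prob:
            noisy.append(PREPOSITION_SWAPS[low])  # wrong-preposition error
            continue
        noisy.append(tok)
    return noisy

clean = "She is waiting at the station for an hour".split()
noisy = noise_sentence(clean, seed=1)
# (noisy, clean) is one synthetic parallel pair: the model is pre-trained
# to map the noisy side back to the clean side.
print(" ".join(noisy))
```

Running this over a large unannotated corpus yields the synthetic parallel data used for pre-training, after which the model is fine-tuned on the smaller human-annotated corpora.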

Yo Joong Choe, Jiyeon Ham, Kyubyong Park, Yeoil Yoon · 2019

Related benchmarks

Task                         | Dataset                     | Metric | Result | Rank
Grammatical Error Correction | CoNLL-2014 (test)           | F0.5   | 60.33  | 207
Grammatical Error Correction | BEA Shared Task 2019 (test) | F0.5   | 69     | 139
Grammatical Error Correction | BEA 2019 (dev)              | F0.5   | 53.27  | 19
Grammatical Error Correction | W&I+L (dev)                 | F0.5   | 52.79  | 9
Grammatical Error Correction | W&I+L (test)                | F0.5   | 69.06  | 8

Other info

Code
