
Stronger Baselines for Grammatical Error Correction Using Pretrained Encoder-Decoder Model

About

Studies on grammatical error correction (GEC) have reported the effectiveness of pretraining a Seq2Seq model with a large amount of pseudodata. However, because of the size of the pseudodata, this approach requires time-consuming pretraining specific to GEC. In this study, we explore the utility of bidirectional and auto-regressive transformers (BART) as a generic pretrained encoder-decoder model for GEC. Using this generic pretrained model eliminates the need for GEC-specific pretraining. We find that monolingual and multilingual BART models achieve high performance in GEC, with one result comparable to the current strong results in English GEC. Our implementations are publicly available on GitHub (https://github.com/Katsumata420/generic-pretrained-GEC).

Satoru Katsumata, Mamoru Komachi • 2020

Related benchmarks

Task                          Dataset                        Metric     Result   Rank
Grammatical Error Correction  CoNLL 2014 (test)              F0.5       63       207
Grammatical Error Correction  BEA shared task 2019 (test)    F0.5       66.1     139
Grammatical Error Correction  JFLEG                          GLEU       57.3     47
Grammatical Error Correction  CoNLL 2014                     F0.5       63       39
Grammatical Error Correction  RULEC-GEC Russian (test)       F0.5       44.36    14
Grammatical Error Correction  Falko-Merlin German (test)     Precision  73.97    6
Grammatical Error Correction  AKCES-GEC Czech (test)         Precision  78.48    6

Other info

Code

https://github.com/Katsumata420/generic-pretrained-GEC