Towards Neural Phrase-based Machine Translation
About
In this paper, we present Neural Phrase-based Machine Translation (NPMT). Our method explicitly models the phrase structures in output sequences using Sleep-WAke Networks (SWAN), a recently proposed segmentation-based sequence modeling method. To mitigate SWAN's monotonic alignment requirement, we introduce a new layer that performs (soft) local reordering of input sequences. Unlike existing neural machine translation (NMT) approaches, NPMT does not use attention-based decoding mechanisms. Instead, it directly outputs phrases in sequential order and can decode in linear time. Our experiments show that NPMT achieves superior performance on the IWSLT 2014 German-English/English-German and IWSLT 2015 English-Vietnamese machine translation tasks compared with strong NMT baselines. We also observe that our method produces meaningful phrases in the output languages.
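The (soft) local reordering idea can be illustrated with a small sketch: each output position mixes the embeddings in a local window with sigmoid gates, so nearby tokens can "softly" swap places before the monotonic SWAN layer. This is a minimal NumPy illustration under assumed names and parameterization (`soft_local_reorder`, window radius `tau`, gate weights `W`, projection `v` are all hypothetical), not the paper's exact implementation.

```python
import numpy as np

def soft_local_reorder(x, W, v, tau=1):
    """Sketch of a soft local reordering layer (hypothetical parameterization).

    For each position t, the embeddings in the window [t - tau, t + tau]
    are combined with scalar sigmoid gates, allowing nearby tokens to be
    softly reordered before a monotonic alignment model consumes them.

    x: (T, d) input embeddings
    W: (2*tau + 1, d, d) per-window-offset gate weight matrices (assumed)
    v: (d,) gate projection vector (assumed)
    """
    T, d = x.shape
    out = np.zeros_like(x)
    for t in range(T):
        acc = np.zeros(d)
        for i, s in enumerate(range(t - tau, t + tau + 1)):
            if 0 <= s < T:  # skip window positions outside the sequence
                gate = 1.0 / (1.0 + np.exp(-v @ (W[i] @ x[s])))  # scalar gate
                acc += gate * x[s]
        out[t] = np.tanh(acc)  # gated, bounded mix of the local window
    return out
```

Because the gates depend on position within the window, the layer can learn to emphasize a neighbor over the current token, approximating a local swap while keeping the overall sequence length and order fixed.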
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Machine Translation | IWSLT De-En 2014 (test) | BLEU | 30.1 | 146 |
| Machine Translation | IWSLT German-to-English '14 (test) | BLEU Score | 30.08 | 110 |
| Machine Translation | IWSLT En-De 2014 (test) | BLEU | 25.36 | 92 |
| Machine Translation | IWSLT English-Vietnamese 2015 (tst2013) | BLEU | 28.07 | 23 |
| Machine Translation | IWSLT En-Vi 2015 (test) | BLEU | 28.1 | 17 |
| Machine Translation (English-to-Vietnamese) | TED 2013 (test) | BLEU | 27.69 | 6 |