
On Long-Tailed Phenomena in Neural Machine Translation

About

State-of-the-art Neural Machine Translation (NMT) models struggle to generate low-frequency tokens, and addressing this remains a major challenge. Analyzing long-tailed phenomena in structured prediction tasks is further hindered by the added complexity of search during inference. In this work, we quantitatively characterize such long-tailed phenomena at two levels of abstraction, namely, token classification and sequence generation. We propose a new loss function, the Anti-Focal loss, to better adapt model training to the structural dependencies of conditional text generation by incorporating the inductive biases of beam search into the training process. We show the efficacy of the proposed technique on a number of Machine Translation (MT) datasets, demonstrating that it leads to significant gains over cross-entropy across different language pairs, especially on the generation of low-frequency words. We have released the code to reproduce our results.
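The abstract contrasts the Anti-Focal loss with plain cross-entropy. As a rough intuition only (the exact formulation is given in the paper, not here), the Anti-Focal loss can be thought of as reversing the modulation of the well-known focal loss: where focal loss down-weights confident predictions, an anti-focal modulation up-weights them, which aligns with beam search's preference for high-confidence continuations. The `(1 + p)^gamma` factor below is an illustrative assumption, not the paper's verified formula:

```python
import math

def cross_entropy(p: float) -> float:
    """Plain cross-entropy for the probability p of the target token."""
    return -math.log(p)

def focal_loss(p: float, gamma: float = 2.0) -> float:
    """Focal loss: the (1 - p)^gamma factor down-weights confident
    (easy) predictions relative to cross-entropy."""
    return -((1.0 - p) ** gamma) * math.log(p)

def anti_focal_loss(p: float, gamma: float = 1.0) -> float:
    """Illustrative anti-focal-style loss (assumed form, for intuition):
    the (1 + p)^gamma factor grows with p, so confident predictions
    keep relatively MORE weight than under focal loss.
    gamma = 0 recovers plain cross-entropy."""
    return -((1.0 + p) ** gamma) * math.log(p)
```

For example, with `gamma = 0` the anti-focal value equals cross-entropy exactly, and as `p` rises the anti-focal modulation factor `(1 + p)^gamma` increases while the focal factor `(1 - p)^gamma` shrinks toward zero.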

Vikas Raunak, Siddharth Dalmia, Vivek Gupta, Florian Metze • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Machine Translation | IWSLT De-En 2014 (test) | BLEU | 34.2 | 146 |
| Machine Translation | IWSLT En-De 2014 (test) | BLEU | 27.9 | 92 |
| Machine Translation | WMT En-De '14 | BLEU | 29.72 | 89 |
| Machine Translation | WMT14 DE-EN (test) | BLEU | 30.3 | 28 |
| Machine Translation | WMT19 Zh-En (test) | BLEU | 25.64 | 22 |
| Machine Translation | WMT En-De 2014 (test) | BLEU Score | 27.5 | 10 |
| Machine Translation | IWSLT14 en-fr (test) | BLEU | 40.5 | 10 |
| Machine Translation | IWSLT14 fr-en (test) | BLEU | 39.5 | 10 |
