
Tree-to-Sequence Attentional Neural Machine Translation

About

Most of the existing Neural Machine Translation (NMT) models focus on the conversion of sequential data and do not directly use syntactic information. We propose a novel end-to-end syntactic NMT model, extending a sequence-to-sequence model with the source-side phrase structure. Our model has an attention mechanism that enables the decoder to generate a translated word while softly aligning it with phrases as well as words of the source sentence. Experimental results on the WAT'15 English-to-Japanese dataset demonstrate that our proposed model considerably outperforms sequence-to-sequence attentional NMT models and compares favorably with the state-of-the-art tree-to-string SMT system.

Akiko Eriguchi, Kazuma Hashimoto, Yoshimasa Tsuruoka • 2016
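
The core of the model described above is an attention mechanism that lets the decoder softly align each generated word with both words and phrases of the source sentence. Below is a minimal NumPy sketch of that idea, not the authors' implementation: the bilinear scoring function, the variable names, and the dimensions are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's exact code): attention over both
# word-level and phrase-level encoder states, as in tree-to-sequence NMT.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(decoder_state, word_states, phrase_states, W_a):
    """Softly align the decoder state with source words AND source phrases.

    decoder_state : (d,)           current decoder hidden state
    word_states   : (n_words, d)   word (leaf) encoder states
    phrase_states : (n_phrases, d) phrase (internal-node) encoder states
    W_a           : (d, d)         bilinear attention parameters (assumed form)
    """
    # Treat words and phrases as one pool of attendable source units.
    source_states = np.concatenate([word_states, phrase_states], axis=0)
    # Bilinear scoring; the paper's exact scoring function may differ.
    scores = source_states @ (W_a @ decoder_state)
    weights = softmax(scores)           # soft alignment over words and phrases
    return weights @ source_states      # context vector of shape (d,)

# Toy usage with random values, just to show the shapes involved.
rng = np.random.default_rng(0)
d, n_words, n_phrases = 8, 5, 4
ctx = attention_context(rng.normal(size=d),
                        rng.normal(size=(n_words, d)),
                        rng.normal(size=(n_phrases, d)),
                        rng.normal(size=(d, d)))
print(ctx.shape)  # (8,)
```

In the paper itself, the phrase vectors come from a tree-structured encoder built over the source-side parse, stacked on a sequential word encoder; the sketch only illustrates how the decoder attends over the combined pool of word and phrase states.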

Related benchmarks

Task | Dataset | Result | Rank
Machine Translation (Chinese-to-English) | NIST 2003 (MT-03) | BLEU 41.24 | 52
Machine Translation (Chinese-to-English) | NIST MT-05 2005 | BLEU 37.86 | 42
Machine Translation | IWSLT English-Vietnamese 2015 (tst2013) | BLEU 28.51 | 23
Machine Translation | NIST Chinese-English MT03-MT06 (test) | Average Score 41.42 | 18
Machine Translation (Chinese-to-English) | NIST MT 2004 | BLEU 40.35 | 15
Machine Translation (Chinese-to-English) | NIST MT-06 | BLEU 37.32 | 15
Machine Translation | NIST MT04 | BLEU 43.38 | 10
Code Summarization | Python GitHub dataset (test) | BLEU-1 0.1887 | 9
SQL-to-text generation | WikiSQL | BLEU-4 26.67 | 6
Machine Translation | NIST MT05 | BLEU 41.04 | 4

Showing 10 of 11 rows.
