
Modeling Localness for Self-Attention Networks

About

Self-attention networks have proven to be of profound value for their strength in capturing global dependencies. In this work, we propose to model localness for self-attention networks, which enhances their ability to capture useful local context. We cast localness modeling as a learnable Gaussian bias, which indicates the center and scope of the local region to be paid more attention. The bias is then incorporated into the original attention distribution to form a revised distribution. To maintain the strength of capturing long-distance dependencies while enhancing the ability to capture short-range dependencies, we apply localness modeling only to the lower layers of self-attention networks. Quantitative and qualitative analyses on Chinese-English and English-German translation tasks demonstrate the effectiveness and universality of the proposed approach.
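The revised distribution described above can be sketched in a few lines: a Gaussian bias, centered on a per-query position with a per-query width, is added to the raw attention logits before the softmax. This is a minimal NumPy illustration, assuming the center P_i and width sigma_i are given as plain arrays (in the paper they are predicted from the hidden states; the names `centers`, `widths`, and `localness_attention` are illustrative, not from the paper's code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def localness_attention(scores, centers, widths):
    """Revise attention logits with a Gaussian localness bias.

    scores:  (n, n) raw attention logits over n query/key positions
    centers: (n,)   central position P_i of each query's local region
    widths:  (n,)   scope sigma_i (> 0) of each local region
    """
    positions = np.arange(scores.shape[1])
    # Gaussian bias G[i, j] = -(j - P_i)^2 / (2 * sigma_i^2).
    # It is always <= 0, so it down-weights keys far from the center
    # while leaving keys near the center almost untouched.
    bias = -((positions[None, :] - centers[:, None]) ** 2) / (
        2.0 * widths[:, None] ** 2
    )
    return softmax(scores + bias, axis=-1)

# Toy usage: center each query's local region on its own position.
n = 6
rng = np.random.default_rng(0)
scores = rng.normal(size=(n, n))
centers = np.arange(n, dtype=float)
widths = np.full(n, 2.0)
attn = localness_attention(scores, centers, widths)
```

Because the bias is added to the logits rather than multiplied into the probabilities, the result is still a proper distribution over key positions, and a large sigma recovers (approximately) the original global attention.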

Baosong Yang, Zhaopeng Tu, Derek F. Wong, Fandong Meng, Lidia S. Chao, Tong Zhang • 2018

Related benchmarks

Task                | Dataset                         | Result     | Rank
Machine Translation | WMT En-De 2014 (test)           | BLEU 29.18 | 379
Machine Translation | WMT14 En-De newstest2014 (test) | BLEU 29.2  | 65
Machine Translation | WMT Chinese-English 2017 (test) | BLEU 25.28 | 21
Machine Translation | WMT Zh-En (test)                | BLEU 25.03 | 14
