Modeling Localness for Self-Attention Networks
About
Self-attention networks have proven to be of profound value for their strength in capturing global dependencies. In this work, we propose to model localness for self-attention networks, which enhances their ability to capture useful local context. We cast localness modeling as a learnable Gaussian bias, which indicates the center and scope of the local region that should receive more attention. The bias is then incorporated into the original attention distribution to form a revised distribution. To maintain the strength of capturing long-distance dependencies while enhancing the ability to capture short-range dependencies, we apply localness modeling only to the lower layers of the self-attention networks. Quantitative and qualitative analyses on Chinese-English and English-German translation tasks demonstrate the effectiveness and universality of the proposed approach.
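The core idea above can be sketched in a few lines: a Gaussian bias centered at a predicted position is added to the raw attention scores before the softmax, so nearby positions receive extra probability mass. This is a minimal NumPy illustration, not the paper's implementation; in the paper the center and window size are predicted from the hidden states, whereas here they are simply passed in as arrays for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def localized_attention(scores, centers, widths):
    """Revise attention with a Gaussian localness bias.

    scores  : (queries, keys) raw attention logits
    centers : (queries,) predicted center p_i of each local region
    widths  : (queries,) predicted scope sigma_i of each local region

    The bias G[i, j] = -(j - p_i)^2 / (2 * sigma_i^2) is added to the
    logits, so the softmax concentrates mass around each center while
    still letting strong long-distance logits win.
    """
    n_keys = scores.shape[-1]
    j = np.arange(n_keys)
    bias = -((j[None, :] - centers[:, None]) ** 2) / (2.0 * widths[:, None] ** 2)
    return softmax(scores + bias, axis=-1)
```

With uniform (all-zero) logits, the revised distribution simply peaks at the predicted center, which illustrates how the bias reshapes attention when the content signal is weak.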
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Machine Translation | WMT En-De 2014 (test) | BLEU | 29.18 | 379 |
| Machine Translation | WMT14 En-De newstest2014 (test) | BLEU | 29.2 | 65 |
| Machine Translation | WMT Chinese-English 2017 (test) | BLEU | 25.28 | 21 |
| Machine Translation | WMT Zh-En (test) | BLEU score | 25.03 | 14 |