Talking-Heads Attention
About
We introduce "talking-heads attention" - a variation on multi-head attention which includes linear projections across the attention-heads dimension, immediately before and after the softmax operation. While inserting only a small number of additional parameters and a moderate amount of additional computation, talking-heads attention leads to better perplexities on masked language modeling tasks, as well as better quality when transfer-learning to language comprehension and question answering tasks.
Noam Shazeer, Zhenzhong Lan, Youlong Cheng, Nan Ding, Le Hou · 2020
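The core change is easiest to see in code. Below is a minimal NumPy sketch of the mechanism: standard multi-head attention, plus two small learned matrices that mix information across the heads dimension immediately before and after the softmax. The separate head counts `h_k`, `h`, and `h_v` (for logits, softmax weights, and values) follow the paper's decomposition, but the names `P_l` and `P_w`, the function signature, and the shapes here are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def talking_heads_attention(Q, K, V, P_l, P_w):
    """Illustrative sketch (shapes and names are assumptions, not the paper's code).

    Q:   [n, h_k, d_k]  queries, already split into h_k logit heads
    K:   [m, h_k, d_k]  keys
    V:   [m, h_v, d_v]  values, split into h_v value heads
    P_l: [h_k, h]       learned projection across heads, applied pre-softmax
    P_w: [h, h_v]       learned projection across heads, applied post-softmax
    """
    d_k = Q.shape[-1]
    # Per-head scaled dot-product logits: [h_k, n, m].
    logits = np.einsum('nhd,mhd->hnm', Q, K) / np.sqrt(d_k)
    # Talking-heads step 1: mix logits across the heads dimension -> [h, n, m].
    logits = np.einsum('hnm,hs->snm', logits, P_l)
    weights = softmax(logits, axis=-1)
    # Talking-heads step 2: mix attention weights across heads -> [h_v, n, m].
    weights = np.einsum('snm,sv->vnm', weights, P_w)
    # Weighted sum of values: [n, h_v, d_v].
    return np.einsum('vnm,mvd->nvd', weights, V)

# Tiny usage example with arbitrary sizes.
rng = np.random.default_rng(0)
n, m, h_k, h, h_v, d_k, d_v = 4, 6, 8, 8, 8, 16, 16
Q = rng.standard_normal((n, h_k, d_k))
K = rng.standard_normal((m, h_k, d_k))
V = rng.standard_normal((m, h_v, d_v))
out = talking_heads_attention(Q, K, V,
                              rng.standard_normal((h_k, h)),
                              rng.standard_normal((h, h_v)))
print(out.shape)  # (4, 8, 16)
```

Note that with `h_k = h = h_v` and both projections set to the identity, this reduces to standard multi-head attention, which is why the extra parameter count is small: two heads-by-heads matrices per attention layer.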
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Abstractive Text Summarization | CNN/Daily Mail (test) | ROUGE-L | 38.06 | 169 |
| Machine Translation | WMT En-Ro 2016 (test) | BLEU | 34.35 | 39 |
| Grammar Error Correction | CoNLL (test) | Precision | 64.32 | 5 |