
Query-Key Normalization for Transformers

About

Low-resource language translation is a challenging but socially valuable NLP task. Building on recent work adapting the Transformer's normalization to this setting, we propose QKNorm, a normalization technique that modifies the attention mechanism to make the softmax function less prone to arbitrary saturation without sacrificing expressivity. Specifically, we apply $\ell_2$ normalization along the head dimension of each query and key matrix prior to multiplying them and then scale up by a learnable parameter instead of dividing by the square root of the embedding dimension. We show improvements averaging 0.928 BLEU over state-of-the-art bilingual benchmarks for 5 low-resource translation pairs from the TED Talks corpus and IWSLT'15.
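To make the mechanism concrete, below is a minimal PyTorch sketch of the attention variant the abstract describes: queries and keys are $\ell_2$-normalized along the head dimension before the dot product, and the result is scaled up by a learnable parameter rather than divided by $\sqrt{d_k}$. The function name `qknorm_attention`, the tensor shapes, and the treatment of the learnable scale `g` as a single scalar are illustrative assumptions; the paper's exact parameterization and initialization of `g` are not reproduced here.

```python
import torch
import torch.nn.functional as F

def qknorm_attention(q, k, v, g):
    # q, k, v: (batch, heads, seq_len, head_dim)
    # l2-normalize queries and keys along the head dimension so each
    # query-key dot product becomes a cosine similarity in [-1, 1],
    # which keeps the softmax from saturating arbitrarily
    q = F.normalize(q, p=2, dim=-1)
    k = F.normalize(k, p=2, dim=-1)
    # scale up by the learnable parameter g instead of dividing by sqrt(d_k)
    scores = g * torch.matmul(q, k.transpose(-2, -1))
    weights = torch.softmax(scores, dim=-1)
    return torch.matmul(weights, v)

# illustrative shapes; in a real module g would be registered as an nn.Parameter
q = torch.randn(2, 8, 16, 64)
k = torch.randn(2, 8, 16, 64)
v = torch.randn(2, 8, 16, 64)
g = torch.nn.Parameter(torch.tensor(1.0))
out = qknorm_attention(q, k, v, g)  # shape: (2, 8, 16, 64)
```

Because the normalized dot products are bounded, the learnable scale `g` controls the sharpness of the attention distribution directly, taking over the role the fixed $1/\sqrt{d_k}$ factor plays in standard scaled dot-product attention.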

Alex Henry, Prudhvi Raj Dachapally, Shubham Pawar, Yuxuan Chen • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|------|---------|--------|--------|------|
| Click-Through Rate Prediction | Industrial | AUC | 74.68 | 104 |
| Conversion Rate (CVR) Prediction | Industrial-scale Recommender Dataset | AUC | 91.04 | 14 |
| Add-to-Cart Prediction | Industrial-scale Recommender Dataset | AUC | 86.7 | 14 |
