
KERPLE: Kernelized Relative Positional Embedding for Length Extrapolation

About

Relative positional embeddings (RPE) have received considerable attention since RPEs effectively model the relative distance among tokens and enable length extrapolation. We propose KERPLE, a framework that generalizes relative position embedding for extrapolation by kernelizing positional differences. We achieve this goal using conditionally positive definite (CPD) kernels, a class of functions known for generalizing distance metrics. To maintain the inner product interpretation of self-attention, we show that a CPD kernel can be transformed into a PD kernel by adding a constant offset. This offset is implicitly absorbed in the Softmax normalization during self-attention. The diversity of CPD kernels allows us to derive various RPEs that enable length extrapolation in a principled way. Experiments demonstrate that the logarithmic variant achieves excellent extrapolation performance on three large language modeling datasets. Our implementation and pretrained checkpoints are released at https://github.com/chijames/KERPLE.git.
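As a minimal sketch of the two ideas above: the logarithmic KERPLE variant biases the attention logits by -r1 * log(1 + r2 * |m - n|) for query position m and key position n (r1, r2 are learnable positives; the values below are placeholders), and any constant offset added to the logits cancels inside the Softmax normalization:

```python
import numpy as np

# KERPLE logarithmic bias: -r1 * log(1 + r2 * |m - n|).
# r1, r2 are learned per attention head; 1.0 here is an assumed toy value.
r1, r2 = 1.0, 1.0
L = 6  # toy sequence length
pos = np.arange(L)
bias = -r1 * np.log(1.0 + r2 * np.abs(pos[:, None] - pos[None, :]))

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy attention logits with the relative-position bias added.
logits = np.random.default_rng(0).normal(size=(L, L)) + bias

# A constant offset c on every logit leaves the attention weights unchanged,
# which is why the CPD-to-PD constant is implicitly absorbed by Softmax.
c = 5.0
assert np.allclose(softmax(logits), softmax(logits + c))
```

The bias is zero on the diagonal (distance 0) and decays logarithmically with distance, which is what allows attention to degrade gracefully on positions longer than those seen in training.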

Ta-Chung Chi, Ting-Han Fan, Peter J. Ramadge, Alexander I. Rudnicky · 2022

Related benchmarks

Task                 Dataset                Result             Rank
Language Modeling    WikiText-103           Perplexity 17.56   146
Language Modeling    arXiv (test)           Perplexity 4.45    137
Language Modeling    GitHub (test)          Perplexity 2.42    113
Language Modeling    OpenWebText2 (test)    Perplexity 16.2    104
