
ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction

About

Neural information retrieval (IR) has greatly advanced search and other knowledge-intensive language tasks. While many neural IR methods encode queries and documents into single-vector representations, late interaction models produce multi-vector representations at the granularity of each token and decompose relevance modeling into scalable token-level computations. This decomposition has been shown to make late interaction more effective, but it inflates the space footprint of these models by an order of magnitude. In this work, we introduce ColBERTv2, a retriever that couples an aggressive residual compression mechanism with a denoised supervision strategy to simultaneously improve the quality and space footprint of late interaction. We evaluate ColBERTv2 across a wide range of benchmarks, establishing state-of-the-art quality within and outside the training domain while reducing the space footprint of late interaction models by 6–10×.
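The token-level relevance computation the abstract refers to is ColBERT's MaxSim operator: each query token embedding is matched against its most similar document token embedding, and these maxima are summed. The sketch below illustrates it with random stand-in embeddings; real ColBERTv2 produces the vectors with a BERT-based encoder and stores the document side in compressed form.

```python
import numpy as np

def maxsim_score(query_emb, doc_emb):
    """Late-interaction relevance: for each query token vector, take its
    maximum similarity over all document token vectors, then sum across
    query tokens (ColBERT's MaxSim operator)."""
    # (num_query_tokens, num_doc_tokens) cosine similarities,
    # assuming both sets of vectors are L2-normalized
    sim = query_emb @ doc_emb.T
    return sim.max(axis=1).sum()

def normalize(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

rng = np.random.default_rng(0)

# Toy stand-ins for token-level embeddings (a real system would use
# contextual encoder outputs, not random vectors).
query = normalize(rng.standard_normal((4, 8)))   # 4 query tokens
doc_a = normalize(rng.standard_normal((10, 8)))  # 10 unrelated doc tokens
doc_b = normalize(np.vstack([query, rng.standard_normal((6, 8))]))  # contains the query tokens

# A document whose token vectors cover the query's should score higher.
print(maxsim_score(query, doc_a) < maxsim_score(query, doc_b))  # True
```

Because each query token contributes only its best match, scoring decomposes into independent per-token nearest-neighbor lookups, which is what makes late interaction amenable to scalable indexing.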

Keshav Santhanam, Omar Khattab, Jon Saad-Falcon, Christopher Potts, Matei Zaharia • 2021

Related benchmarks

Task                          Dataset                   Result              Rank
Multi-hop Question Answering  2WikiMultihopQA           EM 35.4             387
Multi-hop Question Answering  2Wiki                     Exact Match 33.4    152
Passage Retrieval             MS MARCO (dev)            MRR@10 39.7         116
Information Retrieval         BEIR (test)               FiQA-2018 35.6      90
Question Answering            NQ (test)                 --                  86
Retrieval                     MS MARCO (dev)            MRR@10 0.397        84
Question Answering            2WikiMultiHopQA (test)    --                  81
Passage Ranking               MS MARCO (dev)            MRR@10 39.7         73
Reranking                     MS MARCO (dev)            MRR@10 0.397        71
Information Retrieval         SciFact (test)            NDCG@10 0.693       65

Showing 10 of 75 rows
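The 6–10× space savings the abstract claims come from its residual compression mechanism: each token vector is stored as the id of its nearest centroid plus a low-bit quantized residual. The following is a simplified sketch of that idea with uniform scalar quantization, not the actual ColBERTv2 index format; all names and parameters here are illustrative.

```python
import numpy as np

def compress(vectors, centroids, bits=2):
    """Store each vector as (nearest-centroid id, coarsely quantized
    residual) -- a simplified take on residual compression."""
    # nearest centroid per vector (squared Euclidean distance)
    dists = ((vectors[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    ids = dists.argmin(axis=1)
    residual = vectors - centroids[ids]
    # uniform scalar quantization of residuals into 2**bits levels
    lo, hi = residual.min(), residual.max()
    levels = 2 ** bits
    codes = np.clip(np.round((residual - lo) / (hi - lo) * (levels - 1)),
                    0, levels - 1).astype(np.uint8)
    return ids, codes, (lo, hi)

def decompress(ids, codes, bounds, centroids, bits=2):
    """Approximately reconstruct vectors from ids + residual codes."""
    lo, hi = bounds
    levels = 2 ** bits
    residual = codes.astype(np.float64) / (levels - 1) * (hi - lo) + lo
    return centroids[ids] + residual

rng = np.random.default_rng(1)
vectors = rng.standard_normal((32, 8))    # toy stand-ins for token embeddings
centroids = rng.standard_normal((4, 8))   # a real index learns these via clustering
ids, codes, bounds = compress(vectors, centroids)
approx = decompress(ids, codes, bounds, centroids)
# per-element error is bounded by half a quantization step
print(np.abs(vectors - approx).max())
```

Storing a small centroid id and a few bits per residual dimension, instead of a full floating-point vector per token, is what shrinks the multi-vector index while keeping reconstructions close enough for MaxSim scoring.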
