
Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval

About

Conducting text retrieval in a dense learned representation space has many intriguing advantages over sparse retrieval. Yet the effectiveness of dense retrieval (DR) often requires combination with sparse retrieval. In this paper, we identify that the main bottleneck is in the training mechanisms, where the negative instances used in training are not representative of the irrelevant documents in testing. This paper presents Approximate nearest neighbor Negative Contrastive Estimation (ANCE), a training mechanism that constructs negatives from an Approximate Nearest Neighbor (ANN) index of the corpus, which is parallelly updated with the learning process to select more realistic negative training instances. This fundamentally resolves the discrepancy between the data distribution used in the training and testing of DR. In our experiments, ANCE boosts the BERT-Siamese DR model to outperform all competitive dense and sparse retrieval baselines. It nearly matches the accuracy of sparse-retrieval-and-BERT-reranking using dot-product in the ANCE-learned representation space and provides almost 100x speed-up.
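The core of the training mechanism can be illustrated with a toy sketch. This is not the paper's implementation: a linear map stands in for the BERT-Siamese encoder, brute-force dot-product search stands in for the ANN index (the paper uses a real ANN index refreshed in parallel with training), and a simple pairwise hinge loss stands in for the paper's contrastive objective. All data and hyperparameters below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and queries in a shared feature space (hypothetical data).
DIM, N_DOCS, N_QUERIES = 16, 200, 32
docs = rng.normal(size=(N_DOCS, DIM)).astype(np.float32)
queries = rng.normal(size=(N_QUERIES, DIM)).astype(np.float32)
# Each query's relevant ("positive") document, chosen arbitrarily here.
positives = rng.integers(0, N_DOCS, size=N_QUERIES)

# Linear "encoder" standing in for BERT-Siamese; both towers share W.
W = np.eye(DIM, dtype=np.float32)

def encode(x, W):
    return x @ W

def build_ann_index(W):
    # In ANCE this is an ANN index over the whole corpus, refreshed
    # in parallel with training; brute-force search stands in for it here.
    return encode(docs, W)

def hardest_negative(q_emb, doc_embs, pos_idx):
    # The key idea: negatives are the top-retrieved irrelevant documents
    # under the CURRENT model, not random or BM25-sampled ones.
    scores = doc_embs @ q_emb
    scores[pos_idx] = -np.inf   # exclude the relevant document
    return int(np.argmax(scores))

lr, refresh_every = 0.001, 10
doc_embs = build_ann_index(W)
for step in range(100):
    i = step % N_QUERIES
    q = encode(queries[i], W)
    pos = encode(docs[positives[i]], W)
    neg_idx = hardest_negative(q, doc_embs, positives[i])
    neg = encode(docs[neg_idx], W)
    # Pairwise hinge: push s(q, pos) above s(q, neg) by a margin of 1.
    if q @ pos - q @ neg < 1.0:
        # Gradient of -(s_pos - s_neg) w.r.t. the shared weights W.
        grad = -(np.outer(queries[i], pos - neg)
                 + np.outer(docs[positives[i]] - docs[neg_idx], q))
        W -= lr * grad
    if (step + 1) % refresh_every == 0:
        doc_embs = build_ann_index(W)  # periodic/parallel index refresh
```

Because the negatives come from the same index used at inference time, the training distribution of negatives tracks what the retriever actually sees at test time, which is the discrepancy the abstract describes.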

Lee Xiong, Chenyan Xiong, Ye Li, Kwok-Fung Tang, Jialin Liu, Paul Bennett, Junaid Ahmed, Arnold Overwijk • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Open Question Answering | Natural Questions (NQ) (test) | Exact Match (EM) | 46 | 134 |
| Passage retrieval | MsMARCO (dev) | MRR@10 | 33 | 116 |
| Document Ranking | TREC DL Track 2019 (test) | nDCG@10 | 64.5 | 96 |
| Retrieval | MS MARCO (dev) | MRR@10 | 0.33 | 84 |
| Open-domain Question Answering | TriviaQA (test) | -- | -- | 80 |
| Information Retrieval | BEIR (test) | TREC-COVID Score | 73.3 | 76 |
| Passage Ranking | MS MARCO (dev) | MRR@10 | 33 | 73 |
| Retrieval | TREC DL 2019 | NDCG@10 | 67.7 | 71 |
| Reranking | MS MARCO (dev) | MRR@10 | 0.33 | 71 |
| Passage retrieval | TriviaQA (test) | Top-100 Acc | 85.3 | 67 |

Showing 10 of 91 rows.

Other info

Code
