
Unsupervised Dense Retrieval with Relevance-Aware Contrastive Pre-Training

About

Dense retrievers have achieved impressive performance, but their demand for abundant training data limits their application scenarios. Contrastive pre-training, which constructs pseudo-positive examples from unlabeled data, has shown great potential to solve this problem. However, the pseudo-positive examples crafted by data augmentations can be irrelevant. To address this, we propose relevance-aware contrastive learning. It takes the intermediate-trained model itself as an imperfect oracle to estimate the relevance of positive pairs and adaptively weights the contrastive loss of different pairs according to the estimated relevance. Our method consistently improves the SOTA unsupervised Contriever model on the BEIR and open-domain QA retrieval benchmarks. Further exploration shows that our method can not only beat BM25 after further pre-training on the target corpus but also serve as a good few-shot learner. Our code is publicly available at https://github.com/Yibin-Lei/ReContriever.

Yibin Lei, Liang Ding, Yu Cao, Changtong Zan, Andrew Yates, Dacheng Tao · 2023

Related benchmarks

Task | Dataset | Result | Rank
--- | --- | --- | ---
Information Retrieval | BEIR (TREC-COVID) | 0.405 | 59
Open-domain retrieval | NQ | Recall@20: 69.4 | 9
Open-domain retrieval | WQ | Recall@20: 68.0 | 9
Open-domain retrieval | TriviaQA | Recall@20: 75.9 | 9
Personalized Long-Form Generation | LONGLAMP Product Review (user-based split) | ROUGE-1: 31.76 | 9
Personalized Long-Form Generation | LONGLAMP Topic Writing (user-based split) | ROUGE-1: 25.33 | 9
Personalized Long-Form Generation | LONGLAMP Abstract Generation (user-based split) | ROUGE-1: 31.68 | 9

Other info

Code: https://github.com/Yibin-Lei/ReContriever
