
ConTextual Masked Auto-Encoder for Dense Passage Retrieval

About

Dense passage retrieval aims to retrieve the passages relevant to a query from a large corpus, based on dense representations (i.e., vectors) of the query and the passages. Recent studies have explored improving pre-trained language models to boost dense retrieval performance. This paper proposes CoT-MAE (ConTextual Masked Auto-Encoder), a simple yet effective generative pre-training method for dense passage retrieval. CoT-MAE employs an asymmetric encoder-decoder architecture that learns to compress sentence semantics into a dense vector through self-supervised and context-supervised masked auto-encoding. Specifically, self-supervised masked auto-encoding learns to model the semantics of the tokens inside a text span, while context-supervised masked auto-encoding learns to model the semantic correlation between text spans. We conduct experiments on large-scale passage retrieval benchmarks and show considerable improvements over strong baselines, demonstrating the effectiveness of CoT-MAE. Our code is available at https://github.com/caskcsg/ir/tree/main/cotmae.
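To make the two objectives concrete, below is a minimal sketch of how the two pre-training inputs could be constructed. This is an illustration only, not the authors' implementation: it assumes a toy whitespace tokenizer, a placeholder `[A_VEC]` token standing in for span A's dense vector, and hypothetical helper names (`mask_tokens`, `build_cotmae_inputs`).

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_rate, rng):
    """Replace a fraction of tokens with [MASK]; return the masked
    sequence and the reconstruction labels (None = not predicted)."""
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            masked.append(MASK)
            labels.append(tok)   # decoder must reconstruct this token
        else:
            masked.append(tok)
            labels.append(None)
    return masked, labels

def build_cotmae_inputs(span_a, span_b, mask_rate=0.45, seed=0):
    """Build the two views used by CoT-MAE-style pre-training:
    - self-supervised: the encoder reconstructs masked span A, forcing
      its semantics into a single dense vector;
    - context-supervised: the decoder reconstructs masked span B while
      conditioned on span A's vector (placeholder [A_VEC] here), so the
      vector must also capture cross-span semantic correlation."""
    rng = random.Random(seed)
    enc_input, enc_labels = mask_tokens(span_a.split(), mask_rate, rng)
    dec_input, dec_labels = mask_tokens(span_b.split(), mask_rate, rng)
    return {
        "encoder": enc_input,              # masked span A
        "decoder": ["[A_VEC]"] + dec_input, # A's vector + masked span B
        "enc_labels": enc_labels,
        "dec_labels": dec_labels,
    }
```

In actual pre-training the decoder is deliberately shallow, so reconstruction of span B cannot succeed from the decoder alone and the pressure falls on the encoder's dense vector.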

Xing Wu, Guangyuan Ma, Meng Lin, Zijia Lin, Zhongyuan Wang, Songlin Hu • 2022

Related benchmarks

Task                   Dataset          Metric   Result  Rank
Retrieval              MS MARCO (dev)   MRR@10   0.399   84
Information Retrieval  MS MARCO DL2019  nDCG@10  70      26
Information Retrieval  MS MARCO DL2020  nDCG@10  67.8    12
