
Saving Dense Retriever from Shortcut Dependency in Conversational Search

About

Conversational search (CS) requires a holistic understanding of conversational inputs to retrieve relevant passages. In this paper, we demonstrate the existence of a retrieval shortcut in CS, which causes models to retrieve passages solely by relying on partial history while disregarding the latest question. With in-depth analysis, we first show that naively trained dense retrievers heavily exploit the shortcut and hence perform poorly when asked to answer history-independent questions. To build models more robust against shortcut dependency, we explore various hard negative mining strategies. Experimental results show that training with model-based hard negatives effectively mitigates the dependency on the shortcut, significantly improving dense retrievers on recent CS benchmarks. In particular, our retriever outperforms the previous state-of-the-art model by 11.0 in Recall@10 on QReCC.
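To make the idea concrete, here is a minimal sketch of why mining hard negatives matters when training a dense retriever with a contrastive objective. This is an illustration, not the paper's implementation: the `info_nce_loss` helper, the vector dimensions, and the synthetic "shortcut" passages are all invented for the example. The intuition is that a model-mined hard negative (e.g. a passage relevant only to the conversation history) sits close to the query in embedding space and therefore yields a much larger, more informative training signal than a random negative.

```python
import numpy as np

def info_nce_loss(q, pos, negs, temperature=1.0):
    """Contrastive (InfoNCE-style) loss: the query embedding should score
    its positive passage above every negative under dot-product similarity."""
    cands = np.vstack([pos[None, :], negs])   # (1 + n_neg, dim) candidates
    scores = cands @ q / temperature          # similarity logits
    scores -= scores.max()                    # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum())
    return -log_probs[0]                      # positive passage sits at index 0

rng = np.random.default_rng(0)
dim = 8
q = rng.normal(size=dim)                      # query embedding (synthetic)

# Hypothetical embeddings: hard negatives mimic shortcut passages that the
# model itself ranks highly (close to the query), easy negatives are random.
pos = q + 0.1 * rng.normal(size=dim)
hard_negs = q + 0.5 * rng.normal(size=(4, dim))
easy_negs = rng.normal(size=(4, dim))

loss_hard = info_nce_loss(q, pos, hard_negs)
loss_easy = info_nce_loss(q, pos, easy_negs)
print(loss_hard > loss_easy)  # hard negatives give the larger gradient signal
```

With random (easy) negatives the positive already dominates the softmax and the loss is near zero, so the retriever never learns to distinguish the latest question from the history; model-mined hard negatives keep the loss, and hence the learning signal, large.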

Sungdong Kim, Gangwoo Kim • 2022

Related benchmarks

Task                            | Dataset          | Result          | Rank
Conversational Retrieval        | QReCC (test)     | Recall@10: 69.8 | 43
Conversational Search Retrieval | TopiOCQA (test)  | MRR: 26.1       | 21
