
Context-Aware Transformer Pre-Training for Answer Sentence Selection

About

Answer Sentence Selection (AS2) is a core component of accurate Question Answering pipelines. AS2 models rank a set of candidate sentences by how likely each is to answer a given question. The state of the art in AS2 exploits pre-trained transformers by transferring them to large annotated datasets, while using local contextual information around each candidate sentence. In this paper, we propose three pre-training objectives designed to mimic the downstream fine-tuning task of contextual AS2. This allows language models to be specialized before fine-tuning for contextual AS2. Our experiments on three public and two large-scale industrial datasets show that our pre-training approaches, applied to RoBERTa and ELECTRA, can improve baseline contextual AS2 accuracy by up to 8% on some datasets.
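The ranking task described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's method: each candidate is scored jointly with its local context (the sentences immediately before and after it), and a simple token-overlap scorer stands in for the fine-tuned transformer cross-encoder a real system would use. The function names `score` and `rank_candidates`, the `window` size, and the `ctx_weight` blend are all assumptions made for this sketch.

```python
# Toy sketch of contextual AS2 ranking. A real pipeline would replace
# `score` with a fine-tuned transformer cross-encoder over
# (question, candidate, context) inputs.

def score(question: str, text: str) -> float:
    """Toy relevance score: fraction of question tokens found in the text."""
    q_tokens = set(question.lower().split())
    t_tokens = set(text.lower().split())
    return len(q_tokens & t_tokens) / max(len(q_tokens), 1)

def rank_candidates(question, sentences, window=1, ctx_weight=0.5):
    """Rank candidates by their own score plus a weighted score of the
    candidate together with its local context window."""
    scored = []
    for i, sent in enumerate(sentences):
        # Local context: the candidate plus `window` sentences on each side.
        context = " ".join(sentences[max(0, i - window): i + window + 1])
        s = score(question, sent) + ctx_weight * score(question, context)
        scored.append((s, sent))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [sent for _, sent in scored]

question = "when was the transformer architecture introduced"
candidates = [
    "Earlier models relied on recurrence.",
    "The transformer architecture was introduced in 2017.",
    "It has since become the dominant model family.",
]
best = rank_candidates(question, candidates)[0]
```

The context term lets sentences whose neighbors carry answer-bearing vocabulary rise in the ranking, which is the intuition behind contextual AS2.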

Luca Di Liello, Siddhant Garg, Alessandro Moschitti • 2023

Related benchmarks

Task                        Dataset        Metric  Result  Rank
Answer Sentence Selection   WikiQA         P@1     85.2    36
Answer Sentence Selection   ASNQ           P@1     70.5    24
Answer Sentence Selection   NewsAS2        MAP     83      12
Answer Sentence Selection   IQAD Bench 1   MAP     1.7     11
Answer Sentence Selection   IQAD Bench 2   MAP     0.014   11
