
Domain-Adaptive Pretraining Methods for Dialogue Understanding

About

Language models such as BERT and SpanBERT pretrained on open-domain data have obtained impressive gains on various NLP tasks. In this paper, we probe the effectiveness of domain-adaptive pretraining objectives on downstream tasks. In particular, three objectives, including a novel objective focusing on modeling predicate-argument relations, are evaluated on two challenging dialogue understanding tasks. Experimental results demonstrate that domain-adaptive pretraining with proper objectives can significantly improve the performance of a strong baseline on these tasks, achieving new state-of-the-art performance.

Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, Linqi Song · 2021
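
To illustrate the general recipe behind the paper, the sketch below continues masked-language-model pretraining of BERT on an in-domain dialogue corpus before downstream fine-tuning. This is a minimal sketch assuming the HuggingFace Transformers and Datasets libraries; the corpus file name, hyperparameters, and output directory are illustrative placeholders, and the paper's novel predicate-argument objective is not reproduced here.

```python
# Minimal sketch of domain-adaptive pretraining: continue masked-LM
# training of BERT on in-domain dialogue text before fine-tuning.
# File paths and hyperparameters below are illustrative placeholders.
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical in-domain corpus: one dialogue turn per line.
raw = load_dataset("text", data_files={"train": "dialogue_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard MLM objective: randomly mask 15% of input tokens.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-dialogue-adapted",
    per_device_train_batch_size=32,
    num_train_epochs=1,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

The adapted checkpoint in `bert-dialogue-adapted` would then serve as the initialization for fine-tuning on a dialogue understanding task such as conversational semantic role labeling.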

Related benchmarks

Task                                   | Dataset    | Metric    | Result | Rank
Conversational Semantic Role Labeling | DuConv     | F1 (All)  | 89.97  | 7
Conversational Semantic Role Labeling | NewsDialog | F1 (All)  | 81.9   | 7
Spoken Language Understanding         | CrossWOZ   | Intent F1 | 96.97  | 7
