
An Effective Domain Adaptive Post-Training Method for BERT in Response Selection

About

We focus on multi-turn response selection in a retrieval-based dialog system. In this paper, we utilize the powerful pre-trained language model Bidirectional Encoder Representations from Transformers (BERT) for a multi-turn dialog system and propose a highly effective post-training method on a domain-specific corpus. Although BERT is easily adapted to various NLP tasks and outperforms previous baselines on each of them, it still has limitations when the task corpus is concentrated in a particular domain. Post-training on a domain-specific corpus (e.g., the Ubuntu Corpus) helps the model learn contextualized representations and words that do not appear in a general corpus (e.g., English Wikipedia). Experimental results show that our approach achieves a new state of the art on two response selection benchmarks (i.e., Ubuntu Corpus V1 and Advising Corpus), improving R@1 by 5.9% and 6%, respectively.
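
The post-training step simply continues BERT's original pre-training objectives on in-domain text before fine-tuning for response selection. Below is a minimal sketch using the Hugging Face `transformers` and `datasets` libraries; it uses masked language modeling only (the paper's post-training also involves BERT's sentence-pair objective), and the file name `ubuntu_corpus.txt` and all hyperparameters are illustrative, not taken from the paper.

```python
# Minimal domain-adaptive post-training sketch (MLM only), assuming the
# Hugging Face `transformers` and `datasets` libraries are installed.
from datasets import load_dataset
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Raw domain-specific text (e.g., Ubuntu dialog logs), one utterance per line.
# `ubuntu_corpus.txt` is a hypothetical file name for illustration.
raw = load_dataset("text", data_files={"train": "ubuntu_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens, the standard BERT pre-training recipe.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bert-ubuntu-post",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()

# The post-trained checkpoint is then fine-tuned on the response selection task.
model.save_pretrained("bert-ubuntu-post")
```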

Taesun Whang, Dongyub Lee, Chanhee Lee, Kisu Yang, Dongsuk Oh, HeuiSeok Lim • 2019

Related benchmarks

Task                           Dataset                              Metric              Result  Rank
Multi-turn Response Selection  Ubuntu Dialogue Corpus V1 (test)     R10@1               86.7    102
Response Selection             Douban Conversation Corpus (test)    MAP                 0.612   94
Response Selection             E-commerce (test)                    Recall@1 (R10)      0.725   81
Multi-turn Response Selection  E-commerce Dialogue Corpus (test)    R@1 (Top 10 Set)    61      70
Response Selection             Ubuntu (test)                        Recall@1 (Top 10)   0.862   58
Multi-turn Response Selection  Douban (test)                        MAP                 60.9    16
Multi-turn Response Selection  Advising Corpus DSTC 7 (test)        Recall@1 (R=100)    0.274   6
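For reference, the R_n@k numbers above follow the standard response selection protocol: each dialog context comes with n candidate responses, and a context counts as a hit when the ground-truth response is ranked within the model's top k. A minimal sketch follows; the convention that index 0 holds the true response is an assumption for illustration.

```python
# R_n@k sketch: fraction of contexts where the ground-truth response
# (assumed to sit at candidate index 0) is ranked within the top k.
from typing import List

def recall_at_k(score_lists: List[List[float]], k: int) -> float:
    hits = 0
    for scores in score_lists:
        # Rank candidate indices by model score, highest first.
        ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
        if 0 in ranking[:k]:
            hits += 1
    return hits / len(score_lists)

# Two contexts with 10 candidates each: R10@1 counts only top-1 hits.
example = [
    [0.9] + [0.1] * 9,            # truth ranked first  -> hit at k=1
    [0.2, 0.8] + [0.1] * 8,       # truth ranked second -> miss at k=1
]
print(recall_at_k(example, k=1))  # 0.5
```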
