DialBERT: A Hierarchical Pre-Trained Model for Conversation Disentanglement

About

Disentanglement is the problem that arises when multiple conversations occur simultaneously in the same channel and a listener must decide which utterances belong to the conversation they will respond to. We propose a new model, named Dialogue BERT (DialBERT), which integrates local and global semantics in a single stream of messages to disentangle conversations that are mixed together. We employ BERT to capture the matching information in each utterance pair at the utterance level, and use a BiLSTM to aggregate and incorporate context-level information. With only a 3% increase in parameters, DialBERT attains a 12% improvement over BERT in F1 score. The model achieves a state-of-the-art result on a new dataset proposed by IBM and surpasses previous work by a substantial margin.
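The abstract describes a two-level design: BERT scores each utterance pair locally, and a BiLSTM aggregates those pair representations across the context before classification. Below is a minimal PyTorch sketch of that idea; the class name, hidden sizes, and the use of the [CLS] vector as the pair representation are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class DialBERTSketch(nn.Module):
    """Illustrative two-level model: BERT encodes each utterance pair,
    a BiLSTM aggregates the pair representations across the context."""

    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=384):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # The BiLSTM and linear head add only a small parameter overhead
        # on top of BERT itself.
        self.bilstm = nn.LSTM(hidden, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, 1)

    def forward(self, input_ids, attention_mask):
        # input_ids: (batch, num_pairs, seq_len); each row encodes one
        # "[CLS] context utterance [SEP] candidate utterance [SEP]" pair
        b, n, s = input_ids.shape
        out = self.bert(input_ids=input_ids.view(b * n, s),
                        attention_mask=attention_mask.view(b * n, s))
        cls = out.last_hidden_state[:, 0]        # (b*n, hidden) pair vectors
        pair_reprs = cls.view(b, n, -1)          # regroup pairs per dialogue
        ctx, _ = self.bilstm(pair_reprs)         # (b, n, 2*lstm_hidden)
        return self.classifier(ctx).squeeze(-1)  # one matching score per pair
```

Under these assumptions, the BiLSTM and classifier are the only components added beyond BERT, which is consistent with the roughly 3% parameter increase quoted in the abstract.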

Tianda Li, Jia-Chen Gu, Xiaodan Zhu, Quan Liu, Zhen-Hua Ling, Zhiming Su, Si Wei • 2020

Related benchmarks

Task                      Dataset            Result     Rank
Dialogue Disentanglement  Ubuntu IRC (test)  VI: 93.2   17
Dialogue Disentanglement  Ubuntu IRC (dev)   VI: 0.941  9
