
Implicit Discourse Relation Classification via Multi-Task Neural Networks

About

Without explicit discourse connectives, classifying implicit discourse relations is a challenging task and a bottleneck for building a practical discourse parser. Previous research usually relies on a single discourse framework, such as PDTB or RST, to improve classification performance. In fact, multiple corpora annotated under different discourse frameworks exist and are internally connected. To exploit the combination of these corpora, we design discourse classification tasks specific to each corpus and propose a novel Convolutional Neural Network embedded multi-task learning system that synthesizes these tasks by learning both unique and shared representations for each task. Experimental results on the PDTB implicit discourse relation classification task demonstrate that our model achieves significant gains over baseline systems.
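To make the shared-plus-unique representation idea concrete, here is a minimal sketch (not the authors' code; all names, sizes, and the two-task setup are illustrative assumptions): each task passes the same input through a shared convolutional encoder and its own task-specific encoder, then concatenates the two pooled feature vectors.

```python
import numpy as np

# Hedged sketch of multi-task feature extraction with shared and
# task-specific convolutional encoders, as described in the abstract.
# Filter shapes, task names, and dimensions are illustrative only.

rng = np.random.default_rng(0)

def conv1d_max_pool(x, filters):
    """Convolve token embeddings x (seq_len, emb) with filters
    (n_filters, width, emb) and max-pool over time."""
    seq_len, emb = x.shape
    n_filters, width, _ = filters.shape
    out = np.empty((seq_len - width + 1, n_filters))
    for t in range(seq_len - width + 1):
        window = x[t:t + width]                       # (width, emb)
        out[t] = np.tensordot(filters, window,
                              axes=([1, 2], [0, 1]))  # one value per filter
    return out.max(axis=0)                            # (n_filters,) pooled

seq_len, emb, width, n_filters = 10, 8, 3, 4
x = rng.normal(size=(seq_len, emb))   # toy embedding of an argument pair

shared_f = rng.normal(size=(n_filters, width, emb))    # shared across tasks
task_f = {t: rng.normal(size=(n_filters, width, emb))  # unique per task
          for t in ("pdtb", "rst")}

# Each task's feature vector = [shared representation ; own representation]
features = {t: np.concatenate([conv1d_max_pool(x, shared_f),
                               conv1d_max_pool(x, task_f[t])])
            for t in task_f}
```

The shared half of each feature vector is identical across tasks (it comes from the same encoder), so gradients from every task would update it jointly during training, while the second half stays task-specific.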

Yang Liu, Sujian Li, Xiaodong Zhang, Zhifang Sui • 2016

Related benchmarks

Task                                               | Dataset             | Metric   | Result | Rank
Top-level Implicit Discourse Relation Recognition  | PDTB 2.0 (Ji split)  | F1 Score | 63.39 | 61
Second-level Implicit Discourse Relation Recognition | PDTB 2.0 (Ji split)  | Accuracy | 58.13 | 54
Second-level Implicit Discourse Relation Recognition | PDTB 2.0 (Lin split) | Accuracy | 53.96 | 41
Top-level Implicit Discourse Relation Recognition  | PDTB 2.0 (Lin split) | F1 Score | 58.54 | 32
