
Infusing Hierarchical Guidance into Prompt Tuning: A Parameter-Efficient Framework for Multi-level Implicit Discourse Relation Recognition

About

Multi-level implicit discourse relation recognition (MIDRR) aims to identify hierarchical discourse relations among arguments. Previous methods achieve improvements by fine-tuning pre-trained language models (PLMs). However, due to data scarcity and the task gap, the pre-trained feature space cannot be accurately tuned to the task-specific space, which can even aggravate the collapse of the original space. Moreover, the need to comprehend hierarchical semantics in MIDRR makes this conversion even harder. In this paper, we propose a prompt-based Parameter-Efficient Multi-level IDRR (PEMI) framework to address these problems. First, we leverage parameter-efficient prompt tuning to steer the input arguments toward the pre-trained space, approximating the task-specific space with few parameters. Furthermore, we propose a hierarchical label refining (HLR) method for the prompt verbalizer to deeply integrate hierarchical guidance into prompt tuning. Finally, our model achieves comparable results on PDTB 2.0 and 3.0 using only about 0.1% of the trainable parameters of the baselines, and visualization demonstrates the effectiveness of our HLR method.
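To illustrate the idea behind a hierarchical label-refining verbalizer, here is a minimal, hypothetical sketch: bottom-level label vectors are aggregated upward to form parent-label vectors, and each level is scored against a `[MASK]` representation of the argument pair. The sense names, vectors, and helper functions (`refine_parent`, `classify`) are illustrative assumptions, not the paper's actual API or parameters.

```python
# Hypothetical sketch of hierarchical label refining for a prompt verbalizer.
# Child (second-level) label vectors are mean-pooled into parent (top-level)
# vectors, so hierarchical structure is baked into the verbalizer.

def refine_parent(child_vecs):
    """Aggregate child label vectors into a parent label vector (mean pooling)."""
    dim = len(child_vecs[0])
    return [sum(v[i] for v in child_vecs) / len(child_vecs) for i in range(dim)]

def classify(mask_vec, label_vecs):
    """Return the label whose vector best matches the [MASK] representation."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    scores = {name: dot(mask_vec, vec) for name, vec in label_vecs.items()}
    return max(scores, key=scores.get)

# Toy second-level label vectors grouped under PDTB-style top-level senses.
children = {
    "Comparison": {"Contrast": [1.0, 0.0], "Concession": [0.8, 0.2]},
    "Expansion":  {"Conjunction": [0.0, 1.0], "Instantiation": [0.2, 0.9]},
}

# Bottom-up refinement: each top-level vector is built from its children.
top_level = {top: refine_parent(list(kids.values()))
             for top, kids in children.items()}

mask_vec = [0.1, 0.9]  # pretend [MASK] representation of an argument pair
top = classify(mask_vec, top_level)        # top-level prediction
second = classify(mask_vec, children[top])  # second-level prediction
print(top, second)
```

In practice the label vectors would live in the PLM's embedding space and the refinement would be learned jointly with the soft prompts; mean pooling here stands in for whatever aggregation the method actually uses.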

Haodong Zhao, Ruifang He, Mengnan Xiao, Jing Xu • 2024

Related benchmarks

Task | Dataset | Result | Rank
Top-level Implicit Discourse Relation Recognition | PDTB 3.0 (test) | Macro F1: 69.06 | 7
Implicit Discourse Relation Recognition | PDTB 3.0 | -- | 7
Connective Prediction | PDTB 3.0 (test) | Macro F1: 10.52 | 2

Other info

Code
