
ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning

About

While pre-trained language models (PTLMs) have achieved noticeable success on many NLP tasks, they still struggle with tasks that require event temporal reasoning, which is essential for event-centric applications. We present a continual pre-training approach that equips PTLMs with targeted knowledge about event temporal relations. We design self-supervised learning objectives to recover masked-out event and temporal indicators and to discriminate sentences from their corrupted counterparts (where event or temporal indicators were replaced). By further pre-training a PTLM with these objectives jointly, we reinforce its attention to event and temporal information, yielding enhanced capability on event temporal reasoning. This effective continual pre-training framework for event temporal reasoning (ECONET) improves the PTLMs' fine-tuning performance across five relation extraction and question answering tasks, achieving new or on-par state-of-the-art results on most of the downstream tasks.
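The two self-supervised objectives described above can be sketched in a few lines. This is a minimal illustration only, not the authors' implementation: the indicator lexicon, function names, and the toy sentence are hypothetical, and a real setup would feed the outputs into a masked-language-model head and a binary discriminator head.

```python
import random

# Hypothetical mini-lexicon; ECONET uses a curated list of temporal
# indicators (and a separate event-trigger vocabulary, omitted here).
TEMPORAL_INDICATORS = ["before", "after", "during", "while", "until"]

def mask_temporal_indicators(tokens, mask_token="[MASK]"):
    """Objective 1 (sketch): mask out temporal indicators so the model
    must recover them from context (masked-token recovery)."""
    masked, targets = [], []
    for i, tok in enumerate(tokens):
        if tok.lower() in TEMPORAL_INDICATORS:
            masked.append(mask_token)
            targets.append((i, tok))
        else:
            masked.append(tok)
    return masked, targets

def corrupt_temporal_indicators(tokens, rng=random):
    """Objective 2 (sketch): swap one temporal indicator for a different
    one; a discriminator learns to label the sentence as corrupted (1)
    versus original (0)."""
    corrupted = list(tokens)
    positions = [i for i, t in enumerate(tokens)
                 if t.lower() in TEMPORAL_INDICATORS]
    if not positions:
        return corrupted, 0  # nothing to corrupt; keep label "original"
    i = rng.choice(positions)
    alternatives = [w for w in TEMPORAL_INDICATORS if w != tokens[i].lower()]
    corrupted[i] = rng.choice(alternatives)
    return corrupted, 1

sentence = "The market rallied before the announcement".split()
masked, targets = mask_temporal_indicators(sentence)
corrupted, label = corrupt_temporal_indicators(sentence)
```

In the paper's framework both signals are optimized jointly during continual pre-training, which is what pushes the model's attention toward event and temporal cues.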

Rujun Han, Xiang Ren, Nanyun Peng • 2020

Related benchmarks

Task                                      Dataset        Metric    Result  Rank
Temporal Relation Classification          TB-DENSE       F-score   66.8    25
Event TEMPREL Extraction                  MATRES         F1        79.3    24
Relation Extraction                       MATRES         F1        0.793   10
Temporal Machine Reading Comprehension    TORQUE (test)  F1        76.3    8
Temporal Relation Extraction              RED            F1        43.8    8
Temporal Commonsense Reasoning            MCTACO (test)  F1        76.8    8

Other info

Code
