
Temporal Common Sense Acquisition with Minimal Supervision

About

Temporal common sense (e.g., the duration and frequency of events) is crucial for understanding natural language. However, its acquisition is challenging, partly because such information is often not expressed explicitly in text, and human annotation of such concepts is costly. This work proposes a novel sequence modeling approach that exploits explicit and implicit mentions of temporal common sense, extracted from a large corpus, to build TACOLM, a temporal common sense language model. Our method is shown to produce quality predictions along various dimensions of temporal common sense (on UDST and a newly collected dataset from RealNews). It also produces event representations that outperform standard BERT on relevant tasks such as duration comparison, parent-child relations, event coreference, and temporal QA (on TimeBank, HiEve, and MCTACO). Thus, it will be an important component of temporal NLP.
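To make the corpus-extraction idea concrete, here is a minimal sketch of harvesting explicit duration mentions from raw text as distant supervision. The regular expression, trigger words, and unit buckets below are illustrative assumptions for this example, not the authors' actual extraction rules.

```python
import re

# Illustrative pattern for explicit duration mentions such as
# "lasted an hour" or "took 3 weeks". Trigger verbs/prepositions
# and unit list are assumptions, not the paper's exact patterns.
DURATION_PATTERN = re.compile(
    r"\b(?:for|in|within|lasted|took)\s+"
    r"(?:about\s+|around\s+)?"
    r"(a|an|\d+)\s+"
    r"(second|minute|hour|day|week|month|year)s?\b",
    re.IGNORECASE,
)

def extract_durations(sentence):
    """Return (quantity, unit) pairs for explicit duration mentions."""
    results = []
    for m in DURATION_PATTERN.finditer(sentence):
        qty = m.group(1).lower()
        qty = 1 if qty in ("a", "an") else int(qty)
        results.append((qty, m.group(2).lower()))
    return results

print(extract_durations("The meeting lasted an hour and the trial took 3 weeks."))
# → [(1, 'hour'), (3, 'week')]
```

Mentions mined this way can then serve as weak labels for training a sequence model over event durations, avoiding costly human annotation.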

Ben Zhou, Qiang Ning, Daniel Khashabi, Dan Roth • 2020

Related benchmarks

Task                                     Dataset         Result           Rank
Temporal Relation Classification         TB-DENSE        F-score: 64.8    25
Event TEMPREL Extraction                 MATRES          F1 score: 70.9   24
Temporal Relation Extraction             RED             F1 score: 40.3   8
Temporal Commonsense Reasoning           MCTACO (test)   F1 score: 69.3   8
Temporal Machine Reading Comprehension   TORQUE (test)   F1 score: 65.4   8
