
Multi$^2$OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT

About

In this paper, we propose Multi$^2$OIE, which performs open information extraction (open IE) by combining BERT with multi-head attention. Our model is a sequence-labeling system with an efficient and effective argument extraction method. We use a query, key, and value setting inspired by the Multimodal Transformer to replace the previously used bidirectional long short-term memory architecture with multi-head attention. Multi$^2$OIE outperforms existing sequence-labeling systems with high computational efficiency on two benchmark evaluation datasets, Re-OIE2016 and CaRB. Additionally, we apply the proposed method to multilingual open IE using multilingual BERT. Experimental results on new benchmark datasets introduced for two languages (Spanish and Portuguese) demonstrate that our model outperforms other multilingual systems without training data for the target languages.
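The abstract describes swapping the usual BiLSTM labeler for multi-head attention over BERT hidden states, arranged in a query/key/value setting for argument extraction. The sketch below is a minimal NumPy illustration of that mechanism, not the paper's implementation: the function name, dimensions, and the choice of the predicate span's hidden states as keys and values are illustrative assumptions.

```python
import numpy as np

def multi_head_attention(query, key, value, n_heads):
    """Scaled dot-product attention with n_heads parallel heads.

    query: (Lq, d) e.g. BERT hidden states for the whole sentence
    key/value: (Lk, d) e.g. hidden states of the extracted predicate span
    d must be divisible by n_heads.
    """
    Lq, d = query.shape
    Lk = key.shape[0]
    dh = d // n_heads
    # Split the model dimension into heads: (n_heads, L, dh)
    q = query.reshape(Lq, n_heads, dh).transpose(1, 0, 2)
    k = key.reshape(Lk, n_heads, dh).transpose(1, 0, 2)
    v = value.reshape(Lk, n_heads, dh).transpose(1, 0, 2)
    # Attention scores, scaled by sqrt of the per-head dimension
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(dh)   # (n_heads, Lq, Lk)
    # Softmax over the key positions
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)
    out = weights @ v                                  # (n_heads, Lq, dh)
    # Re-merge heads back into the model dimension
    return out.transpose(1, 0, 2).reshape(Lq, d)

# Hypothetical usage: 12 tokens, hidden size 64, predicate at positions 3-4
rng = np.random.default_rng(0)
hidden = rng.standard_normal((12, 64))
pred_states = hidden[3:5]
features = multi_head_attention(hidden, pred_states, pred_states, n_heads=8)
```

Each token's output is an attention-weighted mixture of the predicate states, so the features fed to the argument tag classifier are predicate-aware for every position.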

Youngbin Ro, Yukyung Lee, Pilsung Kang • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Open Information Extraction | CaRB (test) | F1 Score | 52.3 | 53 |
| Open Information Extraction | Re-OIE 2016 (test) | AUC | 74.6 | 20 |
| Open Information Extraction | CaRB-nary English | F1 Score | 52.3 | 10 |
| Open Information Extraction | BenchIE binary English | F1 Score | 22.8 | 10 |
| Open Information Extraction | CaRB Manual Subset (100 sentences) (val) | Precision | 78.6 | 6 |
| Open Information Extraction | BenchIE | Precision | 39.2 | 6 |
| Open Information Extraction | Wire57 (test) | Precision (P) | 33.4 | 6 |
| Binary Open Information Extraction | MultiOIE EN 2016 (test) | F1 Score | 69.3 | 5 |
| Binary Open Information Extraction | MultiOIE ES 2016 (test) | F1 Score | 60.2 | 5 |
| Binary Open Information Extraction | MultiOIE PT 2016 (test) | F1 Score | 59.1 | 5 |

Showing 10 of 19 rows

Other info

Code
