
JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs

About

Existing pre-trained models for knowledge-graph-to-text (KG-to-text) generation simply fine-tune text-to-text pre-trained models such as BART or T5 on KG-to-text datasets, an approach that largely ignores the graph structure during encoding and lacks elaborate pre-training tasks to explicitly model graph-text alignments. To tackle these problems, we propose a graph-text joint representation learning model called JointGT. During encoding, we devise a structure-aware semantic aggregation module that is plugged into each Transformer layer to preserve the graph structure. Furthermore, we propose three new pre-training tasks to explicitly enhance graph-text alignment: text reconstruction from graphs, graph reconstruction from text, and graph-text alignment in the embedding space via Optimal Transport. Experiments show that JointGT obtains new state-of-the-art performance on various KG-to-text datasets.
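To make the encoding idea concrete, a common way to preserve graph structure inside a Transformer layer is to mask the attention scores with the graph's adjacency matrix, so each node aggregates information only from its neighbors. The sketch below is a deliberately simplified, hypothetical illustration of that idea in NumPy; it is not JointGT's exact aggregation module, and the function name and single-head formulation are our own assumptions.

```python
import numpy as np

def structure_aware_attention(node_emb, adj):
    """Toy single-head self-attention masked by a graph adjacency matrix.

    node_emb: (n, d) array of node embeddings.
    adj:      (n, n) 0/1 adjacency matrix (should include self-loops).
    Returns:  (n, d) aggregated embeddings where each node attends
              only to its graph neighbors. Hypothetical simplification,
              not the paper's exact module.
    """
    d = node_emb.shape[-1]
    scores = node_emb @ node_emb.T / np.sqrt(d)
    # Mask out pairs with no edge so aggregation follows the KG structure.
    scores = np.where(adj > 0, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ node_emb
```

A usage note: with embeddings for nodes 0-1-2 connected in a chain, node 0's output receives effectively zero weight from node 2, since the masked score underflows to zero after the softmax.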

Pei Ke, Haozhe Ji, Yu Ran, Xin Cui, Liwei Wang, Linfeng Song, Xiaoyan Zhu, Minlie Huang · 2021

Related benchmarks

Task                                 Dataset                            Metric            Result   Rank
Knowledge Base Question Generation   WebQuestions (test)                METEOR            32.05    12
Knowledge Base Question Generation   PathQuestions (test)               METEOR            48.25    12
Graph-to-Text                        WebNLG v2.0 (test)                 BLEU              65.92    9
RDF-to-Text Generation               WebNLG Unconstrained 2.0 (test)    BLEU              0.6614   6
RDF-to-Text Generation               WebNLG Constrained 2.0 (test)      BLEU              61.01    5
Data-to-text generation              WebNLG KG                          BLEU (Unigram)    37.2     5
Graph-to-text generation             EventNarrative (test)              BLEU              31.19    5
KG-to-text generation                WebNLG U (test)                    Fluency Win Rate  29       2
