Knowledge Graph Completion with Pre-trained Multimodal Transformer and Twins Negative Sampling

About

Knowledge graphs (KGs), which model world knowledge as structured triples, are inevitably incomplete, and the same problem affects multimodal knowledge graphs (MMKGs). Knowledge graph completion (KGC), which predicts the missing triples in an existing KG, is therefore of great importance. Among existing KGC methods, embedding-based approaches rely on manual design to leverage multimodal information, while fine-tuning-based approaches do not outperform embedding-based ones on link prediction. To address these problems, we propose a VisualBERT-enhanced Knowledge Graph Completion model (VBKGC for short). VBKGC captures deeply fused multimodal information for entities and integrates it into the KGC model. Furthermore, we co-design the KGC model and negative sampling through a new strategy called twins negative sampling, which is suited to multimodal scenarios and aligns the different embeddings of each entity. Extensive experiments show the strong performance of VBKGC on the link prediction task, and we provide further analysis of the model.
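The abstract does not spell out the mechanics of twins negative sampling, but the stated idea, a single sampling strategy that serves both the structural and the multimodal embeddings of an entity, can be sketched roughly as below. Everything in this sketch (the TransE-style score, tail-only uniform corruption, and sharing one negative entity across the two embedding spaces) is an assumption for illustration, not the paper's exact formulation.

```python
# Illustrative sketch only: function names, the TransE-style score, and the way
# the "twin" negatives are shared across the structural and multimodal branches
# are assumptions, not the VBKGC paper's exact method.
import torch
import torch.nn.functional as F

def transe_score(h, r, t):
    # TransE-style plausibility: higher (less negative) means more plausible.
    return -torch.norm(h + r - t, p=2, dim=-1)

def twins_negative_loss(struct_emb, mm_emb, rel_emb, triples, num_entities,
                        margin=1.0):
    """Margin ranking loss where the SAME sampled negative entities ("twins")
    corrupt each triple in both the structural and the multimodal embedding
    space, so the two spaces are pushed to agree on what is implausible."""
    h, r, t = triples[:, 0], triples[:, 1], triples[:, 2]
    # One shared corrupted tail per positive triple (hypothetical choice:
    # tail corruption only, sampled uniformly over all entities).
    neg_t = torch.randint(0, num_entities, (triples.size(0),))

    loss = 0.0
    for emb in (struct_emb, mm_emb):  # the two "twin" branches
        pos = transe_score(emb(h), rel_emb(r), emb(t))
        neg = transe_score(emb(h), rel_emb(r), emb(neg_t))
        loss = loss + F.relu(margin - pos + neg).mean()
    return loss

# Toy usage with random embeddings.
num_entities, num_relations, dim = 100, 10, 32
struct_emb = torch.nn.Embedding(num_entities, dim)  # structure-based entity embeddings
mm_emb = torch.nn.Embedding(num_entities, dim)      # stand-in for VisualBERT-derived embeddings
rel_emb = torch.nn.Embedding(num_relations, dim)
h = torch.randint(0, num_entities, (8,))
r = torch.randint(0, num_relations, (8,))
t = torch.randint(0, num_entities, (8,))
triples = torch.stack([h, r, t], dim=1)
print(twins_negative_loss(struct_emb, mm_emb, rel_emb, triples, num_entities))
```

Sharing the same corrupted entity across both branches is the (hypothesized) alignment mechanism: both embedding spaces are penalized on an identical negative, so their plausibility orderings are pulled toward agreement.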

Yichi Zhang, Wen Zhang • 2022

Related benchmarks

| Task | Dataset | Result (MRR) | Rank |
| --- | --- | --- | --- |
| Knowledge Graph Completion | MKG-Y | 37.04 | 22 |
| Knowledge Graph Completion | DB15K | 30.61 | 22 |
| Knowledge Graph Completion | Overall (DB15K, MKG-W, MKG-Y) | 32.75 | 22 |
| Knowledge Graph Completion | MKG-W | 30.61 | 22 |
