
MEKER: Memory Efficient Knowledge Embedding Representation for Link Prediction and Question Answering

About

Knowledge Graphs (KGs) are symbolically structured stores of facts. KG embeddings concisely encode the implicit real-world knowledge required by many NLP tasks. However, the KGs useful in practical NLP applications are enormous, and training embeddings over them incurs a high memory cost. We represent a KG as a 3rd-order binary tensor and move beyond standard CP decomposition by using a data-specific generalized version of it. This generalization of the standard CP-ALS algorithm yields optimization gradients without a backpropagation mechanism, reducing the memory needed for training while providing computational benefits. We propose MEKER, a memory-efficient KG embedding model that achieves SOTA-comparable performance on link prediction and KG-based question answering.
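To make the tensor view concrete, here is a minimal NumPy sketch of representing KG triples as a 3rd-order binary tensor and fitting it with plain CP-ALS. The toy triples, dimensions, and rank are illustrative assumptions, and this is the standard CP-ALS algorithm, not the paper's data-specific generalized version; it only shows why ALS needs no backpropagation (each factor update is a closed-form least-squares solve).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy KG with 4 entities and 2 relations; the triples are illustrative,
# not drawn from the paper's datasets.
triples = [(0, 0, 1), (1, 0, 2), (2, 1, 3), (0, 1, 3)]
n_ent, n_rel, rank = 4, 2, 3

# 3rd-order binary tensor: X[h, r, t] = 1 iff (h, r, t) is a known fact.
X = np.zeros((n_ent, n_rel, n_ent))
for h, r, t in triples:
    X[h, r, t] = 1.0

# Factor matrices: head-entity (A), relation (B), tail-entity (C) embeddings.
A = rng.standard_normal((n_ent, rank))
B = rng.standard_normal((n_rel, rank))
C = rng.standard_normal((n_ent, rank))

def khatri_rao(U, V):
    """Column-wise Kronecker product, shape (rows(U) * rows(V), rank)."""
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

# CP-ALS sweeps: each factor update is a closed-form least-squares solve,
# so no backpropagation graph has to be stored during training.
for _ in range(20):
    A = X.reshape(n_ent, -1) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
    B = np.moveaxis(X, 1, 0).reshape(n_rel, -1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
    C = np.moveaxis(X, 2, 0).reshape(n_ent, -1) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))

# A triple's score is the trilinear product of its three embeddings.
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - X_hat))  # reconstruction error shrinks over sweeps
```

For link prediction, one would rank candidate tails t for a query (h, r) by the scores X_hat[h, r, :].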

Viktoriia Chekalina, Anton Razzhigaev, Albert Sayapin, Evgeny Frolov, Alexander Panchenko • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Link Prediction | FB15k-237 (test) | Hits@10 | 53.93 | 419 |
| Link Prediction | WN18RR (test) | Hits@10 | 54.47 | 380 |
| Link Prediction | Wikidata5M (test) | MRR | 0.211 | 58 |
| Link Prediction | Wiki4M Russian (test) | MRR | 26.9 | 4 |
| Knowledge Base Question Answering | SimpleQuestions aligned with FB5M and Wikidata5m | 1-Hop Accuracy | 61.81 | 3 |
| Knowledge Graph Question Answering | SimpleQuestions-Wikidata (Wiki4M) | F1 Score | 61.8 | 2 |

Other info

Code
