
Entity-Agnostic Representation Learning for Parameter-Efficient Knowledge Graph Embedding

About

We propose an entity-agnostic representation learning method to address the inefficient parameter storage costs of embedding knowledge graphs. Conventional knowledge graph embedding methods map the elements of a knowledge graph, including entities and relations, into continuous vector spaces by assigning each of them one or more specific embeddings (i.e., vector representations). Hence, the number of embedding parameters grows linearly with the size of the knowledge graph. In our proposed model, Entity-Agnostic Representation Learning (EARL), we learn embeddings only for a small set of entities, which we call reserved entities. To obtain embeddings for the full set of entities, we encode their distinguishable information from their connected relations, k-nearest reserved entities, and multi-hop neighbors. We learn universal, entity-agnostic encoders that transform this distinguishable information into entity embeddings. This design gives EARL a static parameter count that is smaller than that of conventional knowledge graph embedding methods. Experimental results show that EARL uses fewer parameters and performs better on link prediction tasks than baselines, reflecting its parameter efficiency.
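The core idea can be illustrated with a minimal sketch: instead of storing one embedding per entity, store embeddings only for reserved entities and relations, and compose every other entity's embedding from its connected relations and its k-nearest reserved entities via a shared encoder. All sizes, weights, and the encoder form below are illustrative assumptions, not the paper's actual architecture or training setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, chosen only for illustration.
n_entities = 10_000   # full entity set: NO per-entity embedding table is stored
n_reserved = 100      # small set of reserved entities with learned embeddings
n_relations = 50
dim = 32

# Only reserved entities and relations get trainable embedding tables.
reserved_emb = rng.normal(size=(n_reserved, dim))
relation_emb = rng.normal(size=(n_relations, dim))

# Universal, entity-agnostic encoder parameters shared by ALL entities.
W_rel = rng.normal(size=(dim, dim)) * 0.1
W_res = rng.normal(size=(dim, dim)) * 0.1

def encode_entity(connected_relations, knn_reserved, knn_weights):
    """Compose an entity embedding from its distinguishable information:
    the relations it is connected to, and a similarity-weighted mix of
    its k-nearest reserved entities (a simplified stand-in encoder)."""
    rel_feat = relation_emb[connected_relations].mean(axis=0)
    res_feat = (knn_weights[:, None] * reserved_emb[knn_reserved]).sum(axis=0)
    return np.tanh(rel_feat @ W_rel + res_feat @ W_res)

# Example: an entity connected to relations {3, 7}, whose 2 nearest
# reserved entities are {5, 42} with similarity weights 0.7 / 0.3.
e = encode_entity(np.array([3, 7]), np.array([5, 42]), np.array([0.7, 0.3]))
print(e.shape)  # (32,)

# Parameter storage: conventional per-entity table vs. this scheme.
conventional = n_entities * dim
earl_style = n_reserved * dim + n_relations * dim + 2 * dim * dim
print(conventional, earl_style)  # the encoder cost does not grow with n_entities
```

Because the encoder weights are shared across all entities, the stored parameter count stays constant as the entity set grows, which is the source of the parameter efficiency the abstract describes.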

Mingyang Chen, Wen Zhang, Zhen Yao, Yushan Zhu, Yang Gao, Jeff Z. Pan, Huajun Chen • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Link Prediction | FB15k-237 | MRR | 31 | 280 |
| Link Prediction | WN18RR | Hits@10 | 52.7 | 175 |
| Link Prediction | YAGO3-10 | MRR | 0.302 | 33 |
| Link Prediction | CoDEx-L | MRR | 0.238 | 5 |
