
KGTuner: Efficient Hyper-parameter Search for Knowledge Graph Learning

About

While hyper-parameters (HPs) are important for knowledge graph (KG) learning, existing methods fail to search them efficiently. To address this problem, we first analyze the properties of different HPs and measure how well configurations transfer from a small subgraph to the full graph. Based on this analysis, we propose KGTuner, an efficient two-stage search algorithm that explores HP configurations on a small subgraph in the first stage and transfers the top-performing configurations to the full graph for fine-tuning in the second stage. Experiments show that our method consistently finds better HPs than the baseline algorithms within the same time budget, achieving a 9.1% average relative improvement for four embedding models on the large-scale KGs in the Open Graph Benchmark.
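The two-stage procedure described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the search space, the evaluation callbacks `eval_on_subgraph` and `eval_on_full_graph`, and the random-exploration strategy in stage 1 are all assumptions made for the example.

```python
import random

def two_stage_hp_search(search_space, eval_on_subgraph, eval_on_full_graph,
                        n_explore=50, top_k=5, seed=0):
    """Two-stage HP search in the spirit of KGTuner (illustrative sketch):
    stage 1 samples many configurations and scores them cheaply on a small
    subgraph; stage 2 re-evaluates only the top performers on the full graph.
    """
    rng = random.Random(seed)

    # Stage 1: random exploration on the small subgraph (cheap evaluations).
    candidates = []
    for _ in range(n_explore):
        config = {name: rng.choice(values) for name, values in search_space.items()}
        candidates.append((eval_on_subgraph(config), config))

    # Keep only the top-k configurations by subgraph score.
    candidates.sort(key=lambda pair: pair[0], reverse=True)
    top_configs = [config for _, config in candidates[:top_k]]

    # Stage 2: transfer the surviving configs and re-evaluate (or fine-tune)
    # them on the full graph, which is expensive but now rarely invoked.
    scored = [(eval_on_full_graph(c), i, c) for i, c in enumerate(top_configs)]
    best_score, _, best_config = max(scored)
    return best_config, best_score
```

The efficiency gain comes from the budget split: the many cheap subgraph evaluations prune the space, so only `top_k` expensive full-graph runs are needed.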

Yongqi Zhang, Zhanke Zhou, Quanming Yao, Yong Li • 2022

Related benchmarks

Task | Dataset | Result | Rank
Link Prediction | FB15k-237 | MRR 35.2 | 280
Knowledge Graph Completion | FB15k-237 (test) | MRR 0.358 | 179
Knowledge Graph Completion | WN18RR (test) | MRR 0.484 | 177
Link Prediction | WN18RR | -- | 175
Link Prediction | ogbl-wikikg2 (test) | MRR 0.5222 | 95
Link Prediction | ogbl-wikikg2 (val) | MRR 0.5397 | 87
Link Prediction | ogbl-biokg (test) | MRR 0.8385 | 36
Link Prediction | ogbl-wikikg2 v1 (test) | MRR 0.5222 | 28
Link Prediction | ogbl-biokg v1 (test) | MRR 0.8385 | 15
Link Prediction | ogbl-biokg v1 (val) | MRR 0.8394 | 15

Other info

Code
