
PALT: Parameter-Lite Transfer of Language Models for Knowledge Graph Completion

About

This paper presents a parameter-lite transfer learning approach that adapts pretrained language models (LMs) to knowledge graph (KG) completion. Instead of finetuning, which modifies all LM parameters, we tune only a few new parameters while keeping the original LM parameters fixed. We achieve this by reformulating KG completion as a "fill-in-the-blank" task and introducing a parameter-lite encoder on top of the original LM. We show that, by tuning far fewer parameters than finetuning, LMs transfer non-trivially to most tasks and are competitive with prior state-of-the-art approaches. For instance, we outperform fully finetuned approaches on a KG completion benchmark while tuning only 1% of the parameters. The code and datasets are available at https://github.com/yuanyehome/PALT.
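The "fill-in-the-blank" reformulation above can be sketched as follows. This is a minimal illustration, not the paper's exact prompt format: the templates, verbalizations, and the `[MASK]` placeholder are assumptions for the example.

```python
# Hedged sketch of the "fill-in-the-blank" reformulation of KG completion:
# a triple (head, relation, tail) becomes a cloze-style sentence, so a
# pretrained LM can score or fill the missing entity. Templates here are
# illustrative assumptions, not PALT's exact format.

def triple_to_cloze(head: str, relation: str, mask_token: str = "[MASK]") -> str:
    """Build a link-prediction query: the LM predicts the tail entity
    at the mask position."""
    return f"{head} {relation} {mask_token} ."

def triple_to_statement(head: str, relation: str, tail: str) -> str:
    """Build a triple-classification input: the model scores whether
    the complete statement (hence the triple) is plausible."""
    return f"{head} {relation} {tail} ."

query = triple_to_cloze("Barack Obama", "was born in")
statement = triple_to_statement("Barack Obama", "was born in", "Honolulu")
```

For link prediction the LM ranks candidate entities at the mask position; for triple classification it scores the full statement, while only the small added encoder on top of the frozen LM is trained.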

Jianhao Shen, Chenguang Wang, Ye Yuan, Jiawei Han, Heng Ji, Koushik Sen, Ming Zhang, Dawn Song • 2022

Related benchmarks

Task                   Dataset      Metric     Result   Rank
Link Prediction        FB15k-237    –          –        280
Link Prediction        WN18RR       Hits@10    69.3     175
Link Prediction        UMLS         Hits@10    99       56
Triple classification  WN11 (test)  Accuracy   93.8     55
Triple classification  FB13 (test)  Accuracy   91.7     55
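The Hits@10 figures in the table report the fraction of test triples whose correct entity is ranked within the top 10 candidates. A minimal sketch of the metric (the function name and sample ranks are illustrative, not from the paper):

```python
# Hedged sketch of the Hits@k metric used in the link-prediction rows above:
# the fraction of test triples whose correct entity appears in the top-k
# ranked candidates.

def hits_at_k(ranks: list[int], k: int = 10) -> float:
    """Fraction of ranks (1-indexed) that fall within the top k."""
    return sum(1 for r in ranks if r <= k) / len(ranks)

# Toy example: ranks 1, 3, and 7 are within the top 10, so Hits@10 = 3/5.
score = hits_at_k([1, 3, 12, 7, 50])  # 0.6
```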
