Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors
About
We propose Fast-and-Frugal Text-Graph (FnF-TG) Transformers, a Transformer-based framework that unifies textual and structural information for inductive link prediction in text-attributed knowledge graphs. We demonstrate that, by effectively encoding ego-graphs (1-hop neighbourhoods), we can reduce the reliance on resource-intensive textual encoders. This makes the model fast at both training and inference time, and frugal in terms of compute cost. We perform a comprehensive evaluation on three popular datasets and show that FnF-TG achieves superior performance compared to previous state-of-the-art methods. We also extend inductive learning to a fully inductive setting, where relations do not rely on transductive (fixed) representations, as in previous work, but are instead a function of their textual description. Additionally, we introduce new variants of existing datasets, specifically designed to test the performance of models on unseen relations at inference time, thus offering a new test bed for fully inductive link prediction.
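To make the ego-graph idea concrete, here is a minimal sketch in plain Python of extracting a 1-hop neighbourhood from a toy text-attributed knowledge graph and pooling textual embeddings over it. All names, the hash-based `text_embed` stand-in, and the mean-pooling step are illustrative assumptions, not the paper's actual architecture (which uses a Transformer and a pretrained textual encoder).

```python
# Toy text-attributed knowledge graph: (head, relation, tail) triples.
# Entity and relation names stand in for textual descriptions.
TRIPLES = [
    ("berlin", "capital_of", "germany"),
    ("germany", "member_of", "eu"),
    ("paris", "capital_of", "france"),
]

def ego_graph(entity, triples):
    """Return the 1-hop neighbourhood (ego-graph) of an entity:
    every triple in which it appears as head or tail."""
    return [t for t in triples if entity in (t[0], t[2])]

def text_embed(text, dim=8):
    """Stand-in for a textual encoder: a deterministic character-hash
    embedding (a real system would use a pretrained language model)."""
    vec = [0.0] * dim
    for i, ch in enumerate(text):
        vec[(ord(ch) + i) % dim] += 1.0
    return vec

def encode_entity(entity, triples, dim=8):
    """Encode an entity by mean-pooling the embeddings of its own text
    and its ego-graph's relation and neighbour texts -- a crude proxy
    for structure-aware attention over the 1-hop neighbourhood."""
    parts = [text_embed(entity, dim)]
    for h, r, t in ego_graph(entity, triples):
        neighbour = t if h == entity else h
        parts.append(text_embed(r, dim))
        parts.append(text_embed(neighbour, dim))
    n = len(parts)
    return [sum(p[d] for p in parts) / n for d in range(dim)]

print(len(ego_graph("germany", TRIPLES)))  # → 2 (two incident triples)
```

Because relations are embedded from their text rather than looked up in a fixed table, an unseen relation at inference time still receives a representation, which is the essence of the fully inductive setting described above.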
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Inductive Link Prediction | FB15k-237 inductive (test) | Hits@10 | 0.524 | 37 |
| Inductive Link Prediction | WN18RR inductive (test) | MRR | 0.737 | 30 |
| Inductive Link Prediction | Wikidata5M IND (test) | MRR | 0.799 | 13 |
| Link Prediction | Wikidata5M (IND) | MRR | 0.761 | 12 |
| Link Prediction | Wikidata5M IND FIR | MRR | 0.726 | 4 |