
Graph Fusion Across Languages using Large Language Models

About

Combining multiple knowledge graphs (KGs) across linguistic boundaries is a persistent challenge due to semantic heterogeneity and the structural complexity of the graphs involved. We propose a framework for cross-lingual graph fusion that leverages the in-context reasoning and multilingual semantic priors of Large Language Models (LLMs). The framework implements structural linearization by mapping triplets directly into natural-language sequences (e.g., [head] [relation] [tail]), enabling the LLM to align relations and reconcile entities between an evolving fused graph ($G_{c}^{(t-1)}$) and a new candidate graph ($G_{t}$). Evaluated on the DBP15K dataset, this exploratory study demonstrates that LLMs can serve as a universal semantic bridge to resolve cross-lingual discrepancies. Results show the successful sequential agglomeration of multiple heterogeneous graphs, offering a scalable, modular solution for continuous knowledge synthesis in multi-source, multilingual environments.
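The structural linearization step described above can be sketched in a few lines: each KG triplet is flattened into a "[head] [relation] [tail]" string so both graphs can be placed side by side in an LLM prompt. The function names, the toy triplets, and the prompt wording below are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of structural linearization for cross-lingual graph fusion.
# Each (head, relation, tail) triplet is mapped to a natural-language
# sequence of the form "[head] [relation] [tail]".
# All identifiers here are hypothetical, not taken from the paper's code.

def linearize(triplets):
    """Flatten KG triplets into '[head] [relation] [tail]' strings."""
    return [f"[{h}] [{r}] [{t}]" for h, r, t in triplets]

# Toy example: an evolving fused graph G_c^(t-1) and a candidate graph G_t
# in a different language (here, English and Chinese).
g_c = [("Beijing", "capital_of", "China")]
g_t = [("北京", "首都", "中国")]

# Assemble a prompt asking the LLM to reconcile entities across the graphs.
prompt = (
    "Align entities and relations between the two graphs below.\n"
    "Fused graph:\n" + "\n".join(linearize(g_c)) + "\n"
    "Candidate graph:\n" + "\n".join(linearize(g_t))
)
print(prompt)
```

In practice the prompt would be sent to an LLM, whose output (matched entity pairs) updates the fused graph before the next candidate graph is processed.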

Kaung Myat Kyaw, Khush Agarwal, Jonathan Chan• 2026

Related benchmarks

Task                           | Dataset                        | Result                 | Rank
Cross-lingual Entity Alignment | DBP15K zh_en Full (train test) | Hits: 3.54e+3          | 3
Cross-lingual Entity Alignment | DBP15K Chinese-English (full)  | Total Batch Pairs: 768 | 1
