
HyperRAG: Reasoning N-ary Facts over Hypergraphs for Retrieval Augmented Generation

About

Graph-based retrieval-augmented generation (RAG) methods, typically built on knowledge graphs (KGs) with binary relational facts, have shown promise in multi-hop open-domain QA. However, their rigid retrieval schemes and dense similarity search often introduce irrelevant context, increase computational overhead, and limit relational expressiveness. In contrast, n-ary hypergraphs encode higher-order relational facts that capture richer inter-entity dependencies and enable shallower, more efficient reasoning paths. To address these limitations, we propose HyperRAG, a RAG framework tailored for n-ary hypergraphs with two complementary retrieval variants: (i) HyperRetriever learns structural-semantic reasoning over n-ary facts to construct query-conditioned relational chains, enabling accurate factual tracking, adaptive high-order traversal, and interpretable multi-hop reasoning under context constraints. (ii) HyperMemory leverages the LLM's parametric memory to guide beam search, dynamically scoring n-ary facts and entities for query-aware path expansion. Extensive evaluations on WikiTopics (11 closed-domain datasets) and three open-domain QA benchmarks (HotpotQA, MuSiQue, and 2WikiMultiHopQA) validate HyperRAG's effectiveness. HyperRetriever achieves the highest answer accuracy overall, with average gains of 2.95% in MRR and 1.23% in Hits@10 over the strongest baseline. Qualitative analysis further shows that HyperRetriever bridges reasoning gaps through adaptive and interpretable n-ary chain construction, benefiting both open- and closed-domain QA.
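To make the HyperMemory idea concrete, the following is a minimal sketch of query-aware beam search over n-ary facts. It is an illustration only, not the paper's implementation: the fact format and the `score` function are assumptions, and the LLM parametric-memory scorer is stubbed out with a simple token-overlap heuristic.

```python
# Hedged sketch of LLM-guided beam search over n-ary hypergraph facts.
# An n-ary fact is modeled as a tuple: (relation, entity_1, ..., entity_n).
# The example facts and the `score` heuristic are illustrative assumptions;
# the real system would query an LLM to score candidate paths.
FACTS = [
    ("award_received", "Marie Curie", "Nobel Prize in Physics", "1903"),
    ("shared_with", "Nobel Prize in Physics", "Pierre Curie", "1903"),
    ("spouse", "Marie Curie", "Pierre Curie"),
]

def score(query, path):
    """Stand-in scorer: token overlap between the query and path elements.
    (HyperMemory would instead score paths with the LLM's parametric memory.)"""
    query_tokens = set(query.lower().split())
    path_tokens = {w.lower() for fact in path for part in fact for w in part.split()}
    return len(query_tokens & path_tokens)

def beam_search(query, start_entity, beam_width=2, max_hops=2):
    """Expand relational chains hop by hop, keeping the top-`beam_width`
    paths; a fact extends a path if it shares an entity with the last fact."""
    beams = [[f] for f in FACTS if start_entity in f]
    for _ in range(max_hops - 1):
        candidates = list(beams)  # a path may also stop early
        for path in beams:
            last_entities = set(path[-1][1:])
            for f in FACTS:
                if f not in path and last_entities & set(f[1:]):
                    candidates.append(path + [f])
        beams = sorted(candidates, key=lambda p: score(query, p), reverse=True)
        beams = beams[:beam_width]
    return beams

paths = beam_search("Who shared the 1903 physics prize with Marie Curie?",
                    "Marie Curie")
```

Each returned path is a chain of n-ary facts; because a single n-ary fact bundles several entities (award, co-recipient, year), the chain to the answer is shallower than the equivalent sequence of binary KG triples.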

Wen-Sheng Lien, Yu-Kai Chan, Hao-Lung Hsiao, Bo-Kai Ruan, Meng-Fen Chiang, Chien-An Chen, Yi-Ren Yeh, Hong-Han Shuai · 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Multi-hop Question Answering | 2WikiMultihopQA | EM | 34 | 278 |
| Multi-hop Question Answering | HotpotQA | F1 | 43.65 | 79 |
| Closed-domain Question Answering | WikiTopics-CLQA ART | MRR | 19.31 | 6 |
| Closed-domain Question Answering | WikiTopics-CLQA AWARD | MRR | 52.66 | 6 |
| Closed-domain Question Answering | WikiTopics-CLQA EDU | MRR | 44.79 | 6 |
| Closed-domain Question Answering | WikiTopics-CLQA HEALTH | MRR | 32.68 | 6 |
| Closed-domain Question Answering | WikiTopics-CLQA INFRA | MRR | 38.92 | 6 |
| Closed-domain Question Answering | WikiTopics-CLQA LOC | MRR | 31.8 | 6 |
| Closed-domain Question Answering | WikiTopics-CLQA PEOPLE | MRR | 21.62 | 6 |
| Closed-domain Question Answering | WikiTopics-CLQA SPORT | MRR | 39.37 | 6 |

Showing 10 of 14 rows.
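For reference, the two ranking metrics reported above (MRR and, in the abstract, Hits@10) are standard and easy to compute from the gold answer's rank in each query's candidate list. A minimal sketch with illustrative data (not the paper's numbers):

```python
# Standard ranking metrics over 1-based gold-answer ranks.
# `None` marks a query whose gold answer was not retrieved at all.

def mrr(ranks):
    """Mean reciprocal rank: average of 1/rank, with misses contributing 0."""
    return sum(1.0 / r for r in ranks if r is not None) / len(ranks)

def hits_at_k(ranks, k):
    """Fraction of queries whose gold answer appears in the top k."""
    return sum(1 for r in ranks if r is not None and r <= k) / len(ranks)

ranks = [1, 3, None, 2, 15]       # illustrative gold-answer ranks
print(round(mrr(ranks), 2))       # → 0.38
print(hits_at_k(ranks, 10))       # → 0.6
```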
