
RouteRAG: Efficient Retrieval-Augmented Generation from Text and Graph via Reinforcement Learning

About

Retrieval-Augmented Generation (RAG) integrates non-parametric knowledge into Large Language Models (LLMs), typically from unstructured texts and structured graphs. While recent progress has advanced text-based RAG to multi-turn reasoning through Reinforcement Learning (RL), extending these advances to hybrid retrieval introduces additional challenges. Existing graph-based or hybrid systems typically depend on fixed or handcrafted retrieval pipelines and lack the ability to integrate supplementary evidence as reasoning unfolds. Moreover, although graph evidence provides the relational structure crucial for multi-hop reasoning, it is substantially more expensive to retrieve than text. To address these limitations, we introduce RouteRAG, an RL-based framework that enables LLMs to perform multi-turn, adaptive graph-text hybrid RAG. RouteRAG jointly optimizes the entire generation process via RL, allowing the model to learn when to reason, what to retrieve from either texts or graphs, and when to produce final answers, all within a unified generation policy. To guide this learning process, we design a two-stage training framework that accounts for both task outcome and retrieval efficiency, enabling the model to exploit hybrid evidence while avoiding unnecessary retrieval overhead. Experimental results across five question answering benchmarks demonstrate that RouteRAG significantly outperforms existing RAG baselines, highlighting the benefits of end-to-end RL in supporting adaptive and efficient retrieval for complex reasoning.
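The abstract describes a policy that, at each turn, chooses to reason, retrieve from a text or graph index, or answer, trained with a reward balancing task outcome against retrieval cost. The sketch below illustrates that loop under stated assumptions; the action names, cost weights, and function signatures are hypothetical, not RouteRAG's actual interface.

```python
# Illustrative sketch of a multi-turn adaptive hybrid-RAG rollout and a
# reward that trades off answer correctness against retrieval overhead.
# All identifiers here are assumptions for exposition.

def rollout(policy, question, text_index, graph_index, max_turns=6):
    """The policy chooses per turn: reason, retrieve (text/graph), or answer."""
    context = [question]
    n_text, n_graph = 0, 0
    for _ in range(max_turns):
        action, payload = policy(context)          # e.g. ("search_graph", query)
        if action == "reason":
            context.append(payload)                # free-form reasoning step
        elif action == "search_text":
            context.append(text_index(payload))    # retrieved passages
            n_text += 1
        elif action == "search_graph":
            context.append(graph_index(payload))   # retrieved triples/subgraph
            n_graph += 1
        elif action == "answer":
            return payload, n_text, n_graph
    return None, n_text, n_graph                   # no answer within budget

def reward(answer, gold, n_text, n_graph, c_text=0.05, c_graph=0.15):
    """Outcome reward minus retrieval cost; graph calls are priced higher
    than text calls, reflecting their greater retrieval expense."""
    outcome = 1.0 if answer is not None and answer == gold else 0.0
    return outcome - c_text * n_text - c_graph * n_graph
```

Pricing graph retrieval above text retrieval encourages the policy to reach for relational evidence only when multi-hop structure is actually needed, matching the efficiency objective described above.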

Yucan Guo, Miao Su, Saiping Guan, Zihao Sun, Xiaolong Jin, Jiafeng Guo, Xueqi Cheng • 2025

Related benchmarks

Task               | Dataset                                        | Metric | Result | Rank
Question Answering | HotpotQA                                       | F1     | 72.5   | 114
Question Answering | MuSiQue                                        | EM     | 39.6   | 84
Question Answering | PopQA                                          | EM     | 50.6   | 80
Question Answering | 2Wiki                                          | F1     | 64.6   | 75
Question Answering | Average (PopQA, NQ, HotpotQA, 2Wiki, MuSiQue)  | EM     | 0.519  | 17
