
Looping Back to Move Forward: Recursive Transformers for Efficient and Flexible Large Multimodal Models

About

Large Multimodal Models (LMMs) have achieved remarkable success in vision-language tasks, yet their vast parameter counts are often underutilized during both training and inference. In this work, we embrace the idea of looping back to move forward: reusing model parameters through recursive refinement to extract stronger multimodal representations without increasing model size. We propose RecursiveVLM, a recursive Transformer architecture tailored for LMMs. Two key innovations enable effective looping: (i) a Recursive Connector that aligns features across recursion steps by fusing intermediate-layer hidden states and applying modality-specific projections, respecting the distinct statistical structures of vision and language tokens; (ii) a Monotonic Recursion Loss that supervises every step and guarantees performance improves monotonically with recursion depth. This design turns recursion into an on-demand refinement mechanism: delivering strong results with few loops on resource-constrained devices and progressively improving outputs when more computational resources are available. Experiments show consistent gains of +3% over standard Transformers and +7% over vanilla recursive baselines, demonstrating that strategic looping is a powerful path toward efficient, deployment-adaptive LMMs.
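The two components above can be sketched in a few lines of NumPy. The paper's actual implementation is not shown here; this is an illustrative toy, and the function names (`block`, `connector`, `monotonic_recursion_loss`), the fusion by simple addition, and the MSE-style per-step supervision are all assumptions made for clarity, not the authors' exact design.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8      # hidden size (toy value)
STEPS = 3  # recursion depth

# Shared Transformer stand-in: one weight matrix reused at every recursion step.
W_shared = rng.normal(scale=0.1, size=(D, D))

# Hypothetical Recursive Connector weights: modality-specific projections used
# to fuse intermediate-layer and final hidden states between loops.
W_vis = rng.normal(scale=0.1, size=(D, D))
W_txt = rng.normal(scale=0.1, size=(D, D))

def block(h):
    """Stand-in for the shared Transformer layers (reused each loop)."""
    return np.tanh(h @ W_shared)

def connector(h_out, h_mid, is_vision):
    """Fuse final and intermediate states, projected per modality."""
    W = W_vis if is_vision else W_txt
    return (h_out + h_mid) @ W

def recursive_forward(tokens, is_vision, steps=STEPS):
    """Run the shared block `steps` times, collecting every step's output."""
    outs = []
    h = tokens
    for _ in range(steps):
        h_mid = block(h)      # intermediate-layer hidden state
        h_out = block(h_mid)  # output of this recursion step
        outs.append(h_out)
        h = connector(h_out, h_mid, is_vision)  # align features for next loop
    return outs

def monotonic_recursion_loss(outs, target):
    """Supervise every step, and penalize any step that is worse than
    the previous one, so deeper recursion cannot hurt the objective."""
    per_step = [np.mean((o - target) ** 2) for o in outs]
    base = sum(per_step)
    mono = sum(max(0.0, per_step[i + 1] - per_step[i])
               for i in range(len(per_step) - 1))
    return base + mono

tokens = rng.normal(size=(4, D))  # 4 toy visual tokens
target = rng.normal(size=(4, D))
outs = recursive_forward(tokens, is_vision=True)
loss = monotonic_recursion_loss(outs, target)
```

Because every step is supervised and non-monotonic steps are penalized, inference can stop after any number of loops: early exit for constrained devices, more loops when compute allows.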

Ruihan Xu, Yuting Gao, Lan Wang, Jianing Li, Weihao Chen, Qingpei Guo, Ming Yang, Shiliang Zhang • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Mathematical Reasoning | MathVista | Score | 58.5 | 322 |
| Multimodal Capability Evaluation | MM-Vet | Score | 65.64 | 282 |
| Massive Multi-discipline Multimodal Understanding | MMMU | -- | -- | 88 |
| Multimodal Understanding | MMB | Score | 76.63 | 30 |
| Multimodal Hallucination Evaluation | HallusionBench | Hallucination Score | 47.22 | 14 |
| Complex Multimodal Reasoning | MM-Star | Reasoning Score | 55.44 | 10 |
| OCR Robustness | OCR Bench | Score | 83.7 | 10 |
