
Less is More: Recursive Reasoning with Tiny Networks

About

Hierarchical Reasoning Model (HRM) is a novel approach using two small neural networks recursing at different frequencies. This biologically inspired method beats Large Language Models (LLMs) on hard puzzle tasks such as Sudoku, Maze, and ARC-AGI, despite using small models (27M parameters) trained on small data (around 1000 examples). HRM holds great promise for solving hard problems with small networks, but it is not yet well understood and may be suboptimal. We propose Tiny Recursive Model (TRM), a much simpler recursive reasoning approach that achieves significantly higher generalization than HRM, while using a single tiny network with only 2 layers. With only 7M parameters, TRM obtains 45% test accuracy on ARC-AGI-1 and 8% on ARC-AGI-2, higher than most LLMs (e.g., DeepSeek R1, o3-mini, Gemini 2.5 Pro) with less than 0.01% of their parameters.
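The recursive scheme described above can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: it assumes a single tiny 2-layer network that refines a latent reasoning state z several times per cycle (conditioned on the input x and the current answer y), then updates the answer from that latent, repeating for a few outer cycles. All dimensions, step counts, and the weight initialization are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 16  # hypothetical hidden width for illustration
W1 = rng.normal(scale=0.1, size=(3 * d, d))  # layer 1 of the tiny 2-layer net
W2 = rng.normal(scale=0.1, size=(d, d))      # layer 2

def tiny_net(x, y, z):
    """One pass of a single tiny 2-layer network over (input, answer, latent)."""
    h = np.tanh(np.concatenate([x, y, z]) @ W1)
    return np.tanh(h @ W2)

def recursive_reasoning(x, n_inner=6, t_outer=3):
    """Sketch of TRM-style recursion: refine the latent z n_inner times,
    then update the answer y from it; repeat for t_outer cycles."""
    y = np.zeros(d)  # current answer embedding
    z = np.zeros(d)  # latent reasoning state
    for _ in range(t_outer):
        for _ in range(n_inner):
            z = tiny_net(x, y, z)  # inner loop: refine the reasoning latent
        y = tiny_net(x, y, z)      # outer step: improve the answer
    return y

x = rng.normal(size=d)
answer = recursive_reasoning(x)
print(answer.shape)  # (16,)
```

The point of the sketch is that depth comes from recursion, not parameters: the same two weight matrices are reused at every inner and outer step, which is how a 2-layer, few-million-parameter model can perform deep iterative reasoning.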

Alexia Jolicoeur-Martineau • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Abstract Visual Reasoning | ARC-AGI-1 | Accuracy (Pass@2) | 44.6 | 15 |
| Abstract Visual Reasoning | ARC-AGI-2 | Accuracy (Pass@2) | 7.8 | 14 |
| Visual Reasoning | ARC 1.0 (test) | Accuracy | 44.6 | 9 |
| Visual Reasoning | ARC-2 1.0 (test) | Accuracy | 7.8 | 7 |
