
Temporal Credit Is Free

About

Recurrent networks do not need Jacobian propagation to adapt online. The hidden state already carries temporal credit through the forward pass; immediate derivatives suffice if you stop corrupting them with stale trace memory and normalize gradient scales across parameter groups. An architectural rule predicts when normalization is needed: β₂ is required when gradients must pass through a nonlinear state update with no output bypass, and unnecessary otherwise. Across ten architectures, real primate neural data, and streaming ML benchmarks, immediate derivatives with RMSprop match or exceed full RTRL, scaling to n = 1024 with 1000× less memory.
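A minimal sketch of the idea, assuming a vanilla tanh RNN and a squared-error streaming loss (neither is specified in the abstract): the gradient at each step treats the previous hidden state as a constant (a stop-gradient), so no Jacobian products are carried through time as in full RTRL, and RMSprop's second-moment accumulator (the β₂ above) normalizes gradient scales across parameter groups.

```python
import numpy as np

# Sketch only -- not the paper's code. Online adaptation of an RNN with
# immediate (one-step) derivatives plus RMSprop normalization.

rng = np.random.default_rng(0)
n, d = 32, 1                                   # hidden size, input/output size
Wh = rng.normal(0, 1 / np.sqrt(n), (n, n))
Wx = rng.normal(0, 1 / np.sqrt(d), (n, d))
Wo = rng.normal(0, 1 / np.sqrt(n), (d, n))

# One second-moment accumulator per parameter group, so gradient scales
# are normalized across groups (the role of beta_2 in the abstract).
lr, beta2, eps = 1e-2, 0.999, 1e-8
v = {name: np.zeros_like(p)
     for name, p in [("Wh", Wh), ("Wx", Wx), ("Wo", Wo)]}

h = np.zeros(n)
for t in range(10_000):
    x = np.array([np.sin(0.1 * t)])            # toy streaming input
    target = np.array([np.sin(0.1 * (t + 1))]) # one-step prediction target

    a = Wh @ h + Wx @ x                        # pre-activation
    h_new = np.tanh(a)                         # nonlinear state update
    y = Wo @ h_new
    err = y - target                           # dL/dy for squared error

    # Immediate derivatives: h (the previous state) is treated as a
    # constant, so credit flows through exactly one step -- no RTRL
    # influence matrix, no eligibility-trace memory to go stale.
    dWo = np.outer(err, h_new)
    da = (Wo.T @ err) * (1.0 - h_new ** 2)     # back through the tanh once
    dWh = np.outer(da, h)
    dWx = np.outer(da, x)

    # RMSprop update, applied per parameter group.
    for name, p, g in [("Wh", Wh, dWh), ("Wx", Wx, dWx), ("Wo", Wo, dWo)]:
        v[name] = beta2 * v[name] + (1 - beta2) * g ** 2
        p -= lr * g / (np.sqrt(v[name]) + eps)

    h = h_new  # the forward state itself keeps carrying temporal credit
```

Memory here is O(n²) for the weights and accumulators alone; full RTRL would additionally store an n × n² influence matrix, which is the source of the 1000× figure at n = 1024.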

Aur Shalev Merin • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Cross-session BCI decoding | BCI | Recovery (%) | 106 | 4 |
| Online adaptation recovery | Delayed (t+50) | Recovery (%) | 179 | 4 |
| Online adaptation recovery | Sine n=64 | Recovery (%) | 102 | 4 |
| Online adaptation recovery | Lorenz chaotic | Recovery (%) | 113 | 2 |
| Language Modeling | Language | Cross-Entropy Loss | 2.716 | 2 |
| Online adaptation recovery | Sine n=1024 | Recovery (%) | 378 | 1 |
