
Temporal Straightening for Latent Planning

About

Learning good representations is essential for latent planning with world models. While pretrained visual encoders produce strong semantic visual features, they are not tailored to planning and contain information irrelevant -- or even detrimental -- to planning. Inspired by the perceptual straightening hypothesis in human visual processing, we introduce temporal straightening to improve representation learning for latent planning. Using a curvature regularizer that encourages locally straightened latent trajectories, we jointly learn an encoder and a predictor. We show that reducing curvature this way makes the Euclidean distance in latent space a better proxy for the geodesic distance and improves the conditioning of the planning objective. We demonstrate empirically that temporal straightening makes gradient-based planning more stable and yields significantly higher success rates across a suite of goal-reaching tasks.
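To make the idea of a curvature regularizer concrete, here is a minimal sketch of a local-straightening penalty in the spirit the abstract describes: measure the turning angle between consecutive displacement vectors along a latent trajectory and penalize deviation from a straight line. This is an illustrative assumption, not the paper's exact loss; the function name and trajectory shape `(T, D)` are hypothetical.

```python
import numpy as np

def curvature_penalty(z):
    """Mean local curvature of a latent trajectory z of shape (T, D).

    Hypothetical sketch of a straightening regularizer: penalize the
    angle between consecutive displacement vectors, so the penalty is
    0 when the trajectory is a straight line in latent space.
    """
    v = np.diff(z, axis=0)                                   # (T-1, D) displacements
    v = v / (np.linalg.norm(v, axis=-1, keepdims=True) + 1e-8)
    cos = np.sum(v[1:] * v[:-1], axis=-1)                    # cosine of each turning angle
    return float(np.mean(1.0 - cos))                         # 0 for a straight trajectory
```

Under this sketch, minimizing the penalty jointly with a prediction loss would push the encoder toward trajectories whose Euclidean chords approximate geodesic path lengths, which is the geometric property the abstract argues improves gradient-based planning.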

Ying Wang, Oumayma Bounou, Gaoyue Zhou, Randall Balestriero, Tim G. J. Rudner, Yann LeCun, Mengye Ren • 2026

Related benchmarks

Task                      Dataset                                    Success Rate (%)   Rank
Goal-reaching Planning    PointMaze Medium Long-Horizon (50 steps)   99.33              30
Goal Reaching             Wall 50 samples (test)                     100                20
Goal Reaching             PointMaze UMaze (test)                     100                20
Goal Reaching             PushT 50 Samples (test)                    91.33              20
Goal-reaching Planning    PushT Long-Horizon (50 steps)              33.33              10
