
Hierarchical Diffusion Motion Planning with Task-Conditioned Uncertainty-Aware Priors

About

We propose a novel hierarchical diffusion planner that embeds task and motion structure directly into the noise model. Unlike standard diffusion-based planners that rely on zero-mean, isotropic Gaussian corruption, we introduce task-conditioned structured Gaussians whose means and covariances are derived from Gaussian Process Motion Planning (GPMP), explicitly encoding trajectory smoothness and task semantics in the prior. We first generalize the standard diffusion process to biased, non-isotropic corruption with closed-form forward and posterior expressions. Building on this formulation, our hierarchical design separates prior instantiation from trajectory denoising. At the upper level, the model predicts sparse, task-centric key states and their associated timings, which instantiate a structured Gaussian prior (mean and covariance). At the lower level, the full trajectory is denoised under this fixed prior, treating the upper-level outputs as noisy observations. Experiments on Maze2D goal-reaching and KUKA block stacking show consistently higher success rates and smoother trajectories than isotropic baselines, achieving dataset-level smoothness substantially earlier during training. Ablation studies further show that explicitly structuring the corruption process provides benefits beyond conditioning the denoising network alone. Overall, our approach concentrates the prior's probability mass near feasible and semantically meaningful trajectories. Our project page is available at https://hta-diffusion.github.io.

Amelie Minji Kim, Anqi Wu, Ye Zhao • 2025

Related benchmarks

Task          Dataset                      Result  Rank
Success Rate  Maze2D (100 samples)         81      4
Success Rate  KUKA Stacking (100 samples)  70      4
