
UniGarmentManip: A Unified Framework for Category-Level Garment Manipulation via Dense Visual Correspondence

About

Garment manipulation (e.g., unfolding, folding, and hanging clothes) is essential for future robots to accomplish home-assistant tasks, yet it is highly challenging due to the diversity of garment configurations, geometries, and deformations. Although previous works can manipulate similarly shaped garments within a given task, they mostly design separate policies for different tasks, fail to generalize to garments with diverse geometries, and often rely heavily on human-annotated data. In this paper, we leverage the property that garments in a given category share similar structures, and learn topological dense (point-level) visual correspondence among garments of that category, under different deformations, in a self-supervised manner. The topological correspondence can easily be adapted into functional correspondence to guide manipulation policies for various downstream tasks, using only one-shot or few-shot demonstrations. Experiments on garments from 3 categories across 3 representative tasks in diverse scenarios (using one or two arms, taking one or more steps, and starting from flat or messy garments) demonstrate the effectiveness of our proposed method. Project page: https://warshallrho.github.io/unigarmentmanip.
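The abstract's key mechanism, adapting dense visual correspondence into functional correspondence, amounts to matching a point annotated in a demonstration to its most similar point on a new garment. The sketch below illustrates that idea only; it is not the authors' implementation, and the per-point descriptors, the cosine-similarity matching rule, and the function name `transfer_grasp_point` are all assumptions for illustration.

```python
# Illustrative sketch (not the paper's code): transferring a demonstrated
# grasp point to a new garment via dense per-point visual descriptors.
# The descriptor dimensionality and cosine-similarity rule are assumptions.
import numpy as np

def transfer_grasp_point(demo_feats, demo_grasp_idx, target_feats):
    """Map a grasp point annotated on a demo garment to the most
    visually corresponding point on a target garment.

    demo_feats:     (N, D) per-point descriptors of the demo garment
    demo_grasp_idx: index of the demonstrated grasp point
    target_feats:   (M, D) per-point descriptors of the target garment
    """
    query = demo_feats[demo_grasp_idx]                           # (D,)
    # Cosine similarity between the query descriptor and every target point
    q = query / np.linalg.norm(query)
    t = target_feats / np.linalg.norm(target_feats, axis=1, keepdims=True)
    sims = t @ q                                                 # (M,)
    return int(np.argmax(sims))          # index of the best-matching point

# Toy usage: with identical one-hot descriptors, each point matches itself
feats = np.eye(4)
assert transfer_grasp_point(feats, 2, feats) == 2
```

Because the descriptors are learned to be consistent across deformations within a category, a single demonstrated grasp can in principle be re-localized on unseen garments of the same category.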

Ruihai Wu, Haoran Lu, Yiyan Wang, Yubo Wang, Hao Dong • 2024

Related benchmarks

| Task      | Dataset                      | Metric              | Result | Rank |
|-----------|------------------------------|---------------------|--------|------|
| Folding   | Garments Dress               | Fold-FLAT           | 84     | 3    |
| Folding   | Garments (Trouser)           | Fold-FLAT           | 83.3   | 3    |
| Hanging   | Garments Hang-RAND (test)    | Top Success Rate    | 81.9   | 3    |
| Hanging   | Garments (Hang-FLING) (test) | Top Success Rate    | 83.8   | 3    |
| Unfolding | CLOTH3D Unfold-RAND (test)   | Success Rate (Tops) | 83.6   | 3    |
| Unfolding | CLOTH3D Unfold-DROP (test)   | Success Rate (Tops) | 85.3   | 3    |
| Folding   | Garments (Tops)              | Fold-FLAT           | 0.835  | 3    |

Other info

Code
