
BCNet: Learning Body and Cloth Shape from A Single Image

About

In this paper, we consider the problem of automatically reconstructing garment and body shapes from a single near-front-view RGB image. To this end, we propose a layered garment representation on top of SMPL and, as a novel contribution, make the skinning weights of the garment independent of the body mesh, which significantly improves the expressiveness of our garment model. Compared with existing methods, our method supports more garment categories and recovers more accurate geometry. To train our model, we construct two large-scale datasets with ground-truth body and garment geometries as well as paired color images. Compared with single-mesh or non-parametric representations, our method achieves more flexible control through separate meshes, enabling applications such as re-posing, garment transfer, and garment texture mapping. Code and some data are available at https://github.com/jby1993/BCNet.
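The key idea above is that the garment layer is deformed by linear blend skinning with its own learned weight matrix, rather than inheriting weights from the body mesh. The sketch below illustrates plain linear blend skinning under that setup; all function names, shapes, and the per-layer weight split are illustrative assumptions, not BCNet's actual API.

```python
import numpy as np

def blend_skinning(vertices, weights, joint_transforms):
    """Deform vertices by blending per-joint rigid transforms (LBS).

    vertices:         (V, 3) rest-pose vertex positions
    weights:          (V, J) skinning weights, each row summing to 1
    joint_transforms: (J, 4, 4) world transforms of the skeleton joints
    """
    V = vertices.shape[0]
    homo = np.concatenate([vertices, np.ones((V, 1))], axis=1)  # (V, 4)
    # Per-vertex blended transform: weighted sum of joint transforms, (V, 4, 4)
    blended = np.einsum('vj,jab->vab', weights, joint_transforms)
    posed = np.einsum('vab,vb->va', blended, homo)
    return posed[:, :3]

# Because garment weights are learned independently of the body mesh,
# body and garment can share one skeleton yet use different weight
# matrices, so the two layers deform consistently while staying separate:
#   posed_body    = blend_skinning(body_verts,    body_weights,    T)
#   posed_garment = blend_skinning(garment_verts, garment_weights, T)
```

Keeping the layers as separate meshes with separate weights is what makes re-posing and garment transfer straightforward: the same joint transforms can be applied to a garment fitted on a different body.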

Boyi Jiang, Juyong Zhang, Yang Hong, Jinhao Luo, Ligang Liu, Hujun Bao · 2020

Related benchmarks

Task | Dataset | Metric | Result | Rank
3D clothed human reconstruction | 3DPW | Chamfer Distance | 118.8 | 6
3D Mesh Reconstruction | BUFF (rough A-pose) | Upper Error | 1.07 | 5
Single-view 3D shape reconstruction | 3D garment dataset | Chamfer Dist. | 4.69 | 4
3D Garment Reconstruction | Synthetic Sequence Female1 | CD (cm) | 3.184 | 4
3D Garment Reconstruction | Synthetic Sequence Female3 | CD (cm) | 3.447 | 4
3D Garment Reconstruction | Synthetic Sequence Male1 | CD (cm) | 2.929 | 4
3D Garment Reconstruction | Synthetic Sequence Male2 | CD (cm) | 5.234 | 4
3D Mesh Reconstruction | Digital Wardrobe | Upper Component Error | 1.44 | 3
3D clothed human reconstruction | MSCOCO | Accuracy (Upper Body) | 41.5 | 3
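Most of the benchmarks above report a Chamfer distance between the reconstructed and ground-truth surfaces: the mean nearest-neighbour distance from each point set to the other. A minimal brute-force sketch follows; real evaluations sample points from meshes, use KD-trees or GPU batching for speed, and conventions (squared vs. unsquared distances, units) vary between benchmarks, so the exact numbers above are not directly comparable across rows.

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point sets a (N, 3) and b (M, 3).

    Brute-force O(N*M) version: mean distance from each point in a to its
    nearest neighbour in b, plus the same in the other direction.
    """
    # Pairwise Euclidean distances, shape (N, M)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean() + d.min(axis=0).mean()
```

Identical point sets give a distance of 0, and the measure falls as the reconstruction approaches the ground truth, which is why lower values rank higher in the table.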
