
Wear-Any-Way: Manipulable Virtual Try-on via Sparse Correspondence Alignment

About

This paper introduces a novel framework for virtual try-on, termed Wear-Any-Way. Unlike previous methods, Wear-Any-Way is a customizable solution: besides generating high-fidelity results, it allows users to precisely manipulate the wearing style. To achieve this, we first construct a strong pipeline for standard virtual try-on that supports single- and multi-garment try-on as well as model-to-model settings in complicated scenarios. To make the pipeline manipulable, we propose sparse correspondence alignment, which uses point-based control to guide generation at specific locations. With this design, Wear-Any-Way achieves state-of-the-art performance in the standard setting and provides a novel form of interaction for customizing the wearing style. For instance, users can drag a sleeve to roll it up, drag a coat open, or use clicks to control the style of tuck. Wear-Any-Way enables freer and more flexible expression of attire, with significant implications for the fashion industry.
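The paper does not include implementation details on this page, but the point-based control it describes can be sketched as pairs of clicked points: a source point on the garment image and a target point on the person image where that part of the garment should land. A minimal, hypothetical way to turn such pairs into a dense conditioning signal for a generator (function names and the heatmap-plus-displacement encoding are assumptions, not the authors' method) is:

```python
import numpy as np

def point_control_map(src_pts, tgt_pts, hw, sigma=8.0):
    """Rasterize sparse point correspondences into a dense control map.

    src_pts: list of (x, y) pixel coords clicked on the garment image.
    tgt_pts: list of (x, y) pixel coords where those points should land
             on the generated person image.
    hw:      (H, W) resolution of the output map.
    Returns an (H, W, 3) float32 array: channel 0 is a Gaussian heatmap
    centered on each target point; channels 1-2 carry the (dx, dy)
    displacement from source to target, weighted by the same Gaussian.
    """
    h, w = hw
    ys, xs = np.mgrid[0:h, 0:w]
    heat = np.zeros((h, w), dtype=np.float32)
    flow = np.zeros((h, w, 2), dtype=np.float32)
    for (sx, sy), (tx, ty) in zip(src_pts, tgt_pts):
        g = np.exp(-((xs - tx) ** 2 + (ys - ty) ** 2) / (2.0 * sigma ** 2))
        heat = np.maximum(heat, g)          # keep the strongest point at each pixel
        flow[..., 0] += g * (tx - sx)       # horizontal displacement
        flow[..., 1] += g * (ty - sy)       # vertical displacement
    return np.concatenate([heat[..., None], flow.astype(np.float32)], axis=-1)

# Example: one click dragging a sleeve point from (10, 10) to (20, 20)
ctrl = point_control_map([(10, 10)], [(20, 20)], (64, 64))
```

Such a map could be concatenated with the diffusion model's input so the denoiser sees, at every pixel, both where the user clicked and how far the garment should move.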

Mengting Chen, Xi Chen, Zhonghua Zhai, Chen Ju, Xuewen Hong, Jinsong Lan, Shuai Xiao • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Virtual Try-On | VITON-HD (test) | SSIM | 87.7 | 48 |
| Image-based Virtual Try-On | VITON-HD (paired/unpaired) | SSIM | 0.877 | 11 |
| Multi-garment Virtual Try-On | DressCode (multiple) | FID | 21.11 | 4 |
