H-GAN: the power of GANs in your Hands

About

We present HandGAN (H-GAN), a cycle-consistent adversarial learning approach with multi-scale perceptual discriminators, designed to translate synthetic images of hands to the real domain. Synthetic hands come with complete ground-truth annotations, yet they are not representative of the distribution of real-world data. We aim to combine a realistic hand appearance with synthetic annotations: relying on image-to-image translation, we improve the appearance of synthetic hands to approximate the statistical distribution underlying a collection of real hand images. H-GAN tackles not only cross-domain tone mapping but also structural differences in localized areas such as shading discontinuities. Results are evaluated qualitatively and quantitatively, improving on previous works. Furthermore, we use a hand classification task to show that our generated hands are statistically similar to the real domain of hands.
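To illustrate the cycle-consistency idea the abstract refers to, here is a minimal toy sketch (not the paper's implementation): a generator G maps synthetic to real and a second generator F maps real back to synthetic, and the cycle loss penalizes the L1 distance between an input and its round-trip reconstruction. The names G, F, and the toy vector mappings below are hypothetical stand-ins for the learned networks.

```python
def l1(a, b):
    # Mean absolute (L1) distance between two equal-length vectors.
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def cycle_consistency_loss(G, F, x, y):
    # G: synthetic -> real, F: real -> synthetic (hypothetical toy mappings).
    # Forward cycle: x -> G(x) -> F(G(x)) should recover x.
    # Backward cycle: y -> F(y) -> G(F(y)) should recover y.
    return l1(F(G(x)), x) + l1(G(F(y)), y)

# Toy example: if F exactly inverts G, the cycle loss is zero.
G = lambda v: [2 * e for e in v]
F = lambda v: [e / 2 for e in v]
print(cycle_consistency_loss(G, F, [1.0, 2.0], [4.0, 6.0]))  # 0.0
```

In the full method this term is combined with adversarial losses from the multi-scale perceptual discriminators, which push G's outputs toward the real-image distribution while the cycle term preserves content.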

Sergiu Oprea, Giorgos Karvounas, Pablo Martinez-Gonzalez, Nikolaos Kyriazis, Sergio Orts-Escolano, Iason Oikonomidis, Alberto Garcia-Garcia, Aggeliki Tsoli, Jose Garcia-Rodriguez, Antonis Argyros• 2021

Related benchmarks

Task | Dataset | Result | Rank
Hand Appearance Recovery | A2 → B (object-occluded to bare hand) | FID (Input): 62.94 | 6
Hand Appearance Recovery | A1 → B (marker-contained to bare hand) | FID (Initial): 93.02 | 6
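The FID scores above compare feature statistics of generated and real images as multivariate Gaussians via the Fréchet distance. As a hedged sketch of the underlying formula, the one-dimensional special case reduces to a closed form: for Gaussians N(μ1, σ1²) and N(μ2, σ2²), the distance is (μ1 − μ2)² + (σ1 − σ2)². The function below illustrates only this 1-D case, not the full matrix computation used for the benchmark numbers.

```python
def fid_univariate(mu1, sigma1, mu2, sigma2):
    # Frechet distance between two 1-D Gaussians:
    # (mu1 - mu2)^2 + sigma1^2 + sigma2^2 - 2*sigma1*sigma2
    # which simplifies to (mu1 - mu2)^2 + (sigma1 - sigma2)^2.
    return (mu1 - mu2) ** 2 + (sigma1 - sigma2) ** 2

print(fid_univariate(0.0, 1.0, 0.0, 1.0))  # 0.0  (identical distributions)
print(fid_univariate(0.0, 1.0, 3.0, 2.0))  # 10.0 (mean gap 9 + std gap 1)
```

In the multivariate setting used for image features, the cross term 2·σ1·σ2 becomes 2·Tr((Σ1Σ2)^1/2), so lower FID means the generated-hand statistics are closer to the real-hand statistics.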
