
Projected GANs Converge Faster

About

Generative Adversarial Networks (GANs) produce high-quality images but are challenging to train. They need careful regularization, vast amounts of compute, and expensive hyper-parameter sweeps. We make significant headway on these issues by projecting generated and real samples into a fixed, pretrained feature space. Motivated by the finding that the discriminator cannot fully exploit features from deeper layers of the pretrained model, we propose a more effective strategy that mixes features across channels and resolutions. Our Projected GAN improves image quality, sample efficiency, and convergence speed. It is further compatible with resolutions of up to one megapixel and advances the state-of-the-art Fréchet Inception Distance (FID) on twenty-two benchmark datasets. Importantly, Projected GANs match the previously lowest FIDs up to 40 times faster, cutting the wall-clock time from 5 days to less than 3 hours given the same computational resources.
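The core idea, training the discriminator on features from a fixed, pretrained network rather than on raw pixels, can be sketched in a minimal NumPy toy. Here the frozen random projection stands in for the pretrained feature network, and the hinge loss is one common choice of GAN discriminator objective; the dimensions, weights, and loss are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a pretrained feature network: a frozen random
# projection with a ReLU. In the paper this role is played by a real
# pretrained CNN whose weights are never updated during GAN training.
W_frozen = rng.standard_normal((64, 16))  # maps 64-dim "images" to 16-dim features

def project(x):
    """Map samples into the fixed, pretrained feature space (weights frozen)."""
    return np.maximum(x @ W_frozen, 0.0)

def hinge_d_loss(real_feats, fake_feats, d_weights):
    """Hinge discriminator loss computed on projected features.

    The discriminator (here just a linear layer, an illustrative
    simplification) never sees raw pixels, only projected features.
    """
    real_logits = real_feats @ d_weights
    fake_logits = fake_feats @ d_weights
    return (np.mean(np.maximum(0.0, 1.0 - real_logits))
            + np.mean(np.maximum(0.0, 1.0 + fake_logits)))

# Toy batches of "real" and "generated" samples.
real = rng.standard_normal((8, 64))
fake = rng.standard_normal((8, 64))
d_w = rng.standard_normal(16) * 0.01

loss = hinge_d_loss(project(real), project(fake), d_w)
```

Because `W_frozen` is never trained, only the small discriminator head receives gradients, which is the source of the sample-efficiency and speed gains the abstract describes.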

Axel Sauer, Kashyap Chitta, Jens Müller, Andreas Geiger • 2021

Related benchmarks

| Task                          | Dataset                       | Result (FID) | Rank |
|-------------------------------|-------------------------------|--------------|------|
| Image Generation              | LSUN church                   | 1.59         | 95   |
| Image Generation              | STL-10                        | 13.68        | 66   |
| Image Generation              | LSUN bedroom                  | 1.52         | 56   |
| Image Generation              | FFHQ                          | 3.39         | 52   |
| Image Generation              | CIFAR-10 unconditional (test) | 8.48         | 39   |
| Unconditional image synthesis | FFHQ 256x256 (test)           | 3.08         | 31   |
| Image Generation              | Obama 100-shot (train)        | 11.21        | 28   |
| Image Generation              | Grumpy cat 100-shot (train)   | 15.8         | 28   |
| Image Generation              | Panda 100-shot (train)        | 3.98         | 28   |
| Image Generation              | FFHQ                          | 3.39         | 22   |

Showing 10 of 71 rows.
