
Balanced Training for Sparse GANs

About

Over the past few years, there has been growing interest in developing larger and deeper neural networks, including deep generative models such as generative adversarial networks (GANs). However, GANs typically come with high computational cost, leading researchers to explore methods for reducing the expense of training and inference. One approach gaining popularity in supervised learning is dynamic sparse training (DST), which maintains strong performance while offering excellent training efficiency. Despite these potential benefits, applying DST to GANs is challenging because of the adversarial nature of the training process. In this paper, we propose a novel metric, the balance ratio (BR), to study the balance between the sparse generator and discriminator. We also introduce a new method, bAlanced DynAmic sParse Training (ADAPT), which seeks to control the BR during GAN training to achieve a good trade-off between performance and computational cost. Our proposed method shows promising results on multiple datasets, demonstrating its effectiveness.
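To make the DST idea mentioned above concrete, here is a minimal, generic sketch of one prune-and-regrow step as used in dynamic sparse training: the smallest-magnitude active weights are dropped and the same number of connections are regrown elsewhere, so the overall sparsity level stays fixed. This is an illustrative generic DST update, not the ADAPT algorithm itself; the function name `dst_step` and the random-regrowth criterion are assumptions for the sketch.

```python
import numpy as np

def dst_step(weights, mask, drop_frac=0.3, rng=None):
    """One generic dynamic-sparse-training update.

    Drops the `drop_frac` fraction of active weights with the smallest
    magnitude, then regrows the same number of connections at randomly
    chosen inactive positions, keeping total sparsity constant.
    (Illustrative sketch only -- real DST methods often regrow by
    gradient magnitude rather than at random.)
    """
    rng = np.random.default_rng(rng)
    flat_mask = mask.ravel().copy()
    active = np.flatnonzero(flat_mask)
    n_drop = int(drop_frac * active.size)
    if n_drop == 0:
        return mask
    flat_w = np.abs(weights.ravel())
    # Drop: deactivate the n_drop active weights with smallest magnitude.
    drop_idx = active[np.argsort(flat_w[active])[:n_drop]]
    flat_mask[drop_idx] = 0
    # Grow: reactivate n_drop connections chosen uniformly at random
    # among the currently inactive positions.
    inactive = np.flatnonzero(flat_mask == 0)
    grow_idx = rng.choice(inactive, size=n_drop, replace=False)
    flat_mask[grow_idx] = 1
    return flat_mask.reshape(mask.shape)
```

In a sparse GAN setting, a step like this would be applied periodically to the generator's and discriminator's weight masks during training; ADAPT additionally monitors the balance ratio to decide how the two networks' sparsity is adjusted.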

Yite Wang, Jing Wu, Naira Hovakimyan, Ruoyu Sun • 2023

Related benchmarks

Task | Dataset | Result | Rank
Image Generation | CIFAR-10 (test) | -- | 471
Image Generation | CIFAR-10 | Inception Score 9.1 | 178
Image Synthesis | CIFAR-10 | FID 7.9 | 79
Image Generation | STL-10 | FID 29.11 | 66
Generative Image Synthesis | CIFAR-10 SNGAN | FID 10.6 | 62
Generative Image Synthesis | STL-10 SNGAN | FID 29.96 | 62
Generative Image Synthesis | CIFAR-10 BigGAN | FID 8.22 | 62
Image Generation | STL-10 (test) | Inception Score 9.29 | 59
Image Generation | Tiny ImageNet (test) | IS 15.77 | 35
Image Generation | Tiny-ImageNet | Inception Score 14.4 | 34

(Showing 10 of 12 rows)

Other info

Code
