
Manifold Learning Benefits GANs

About

In this paper, we improve Generative Adversarial Networks by incorporating a manifold learning step into the discriminator. We consider locality-constrained linear and subspace-based manifolds, and locality-constrained non-linear manifolds. In our design, the manifold learning and coding steps are intertwined with layers of the discriminator, with the goal of attracting intermediate feature representations onto manifolds. We adaptively balance the discrepancy between feature representations and their manifold view, which is a trade-off between denoising on the manifold and refining the manifold. We find that locality-constrained non-linear manifolds outperform linear manifolds due to their non-uniform density and smoothness. We also substantially outperform state-of-the-art baselines.
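The manifold learning and coding step described above can be sketched roughly as a locality-constrained linear coding pass over an intermediate feature: code the feature with its nearest anchors, reconstruct it on the local manifold, then blend the reconstruction back with the original feature. This is an illustrative NumPy sketch only, not the paper's implementation; the function name, the affine-coding rule, and the fixed blending weight `lam` are assumptions (the paper balances this trade-off adaptively).

```python
import numpy as np

def locality_constrained_coding(f, anchors, k=3, lam=0.5):
    """Pull a feature vector toward a locality-constrained linear
    manifold spanned by its k nearest anchors (illustrative sketch).

    f       : (d,) intermediate discriminator feature
    anchors : (m, d) dictionary of anchor points defining the manifold
    k       : number of nearest anchors used for local coding
    lam     : blend weight between the feature and its manifold view
              (stands in for the paper's adaptive balancing)
    """
    # 1. Locality constraint: keep only the k nearest anchors.
    dists = np.linalg.norm(anchors - f, axis=1)
    B = anchors[np.argsort(dists)[:k]]        # (k, d) local basis

    # 2. Solve for coding weights that reconstruct f from the local
    #    basis, normalized to sum to one (affine/LLC-style coding).
    G = B @ B.T                               # (k, k) local Gram matrix
    G += 1e-6 * np.trace(G) * np.eye(k)       # regularize for stability
    w = np.linalg.solve(G, B @ f)
    w /= w.sum()

    # 3. Manifold view of the feature, blended with the original:
    #    large lam denoises toward the manifold, small lam keeps the
    #    feature and lets gradients refine the manifold instead.
    f_manifold = w @ B
    return (1 - lam) * f + lam * f_manifold
```

In the paper this operation is intertwined with the discriminator's layers rather than applied once, and the linear variant above is the one the authors find weaker than locality-constrained non-linear manifolds.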

Yao Ni, Piotr Koniusz, Richard Hartley, Richard Nock • 2021

Related benchmarks

| Task             | Dataset                    | Result    | Rank |
|------------------|----------------------------|-----------|------|
| Image Generation | CIFAR-10 (test)            | --        | 471  |
| Image Generation | ImageNet 64x64 (train/val) | FID 4.26  | 83   |
| Image Generation | ImageNet 128x128           | --        | 51   |
| Image Generation | CIFAR-100 (20% data)       | IS 13.78  | 41   |
| Image Generation | CIFAR-100 (10% data)       | IS 12.67  | 41   |
| Image Generation | CIFAR-10 (20% data)        | IS 10.12  | 35   |
| Image Generation | CIFAR-10 (10% data)        | IS 10.04  | 35   |
| Image Generation | CIFAR-100 (full data)      | IS 13.8   | 35   |
| Image Generation | CIFAR-100 (test)           | IS 13.88  | 35   |
| Image Generation | CIFAR-10 (train)           | --        | 32   |

IS = Inception Score (higher is better); FID = Fréchet Inception Distance (lower is better). Showing 10 of 14 rows.

Other info

Code
