# Alternating Back-Propagation for Generator Network

## About
This paper proposes an alternating back-propagation algorithm for learning the generator network model. The model is a non-linear generalization of factor analysis in which the mapping from the continuous latent factors to the observed signal is parametrized by a convolutional neural network. The alternating back-propagation algorithm iterates two steps: (1) inferential back-propagation, which infers the latent factors by Langevin dynamics or gradient descent; and (2) learning back-propagation, which updates the parameters by gradient descent given the inferred latent factors. The gradient computations in both steps are powered by back-propagation and share most of their code. We show that the alternating back-propagation algorithm can learn realistic generator models of natural images, video sequences, and sounds. Moreover, it can also learn from incomplete or indirect training data.
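The two alternating steps can be sketched on the factor-analysis special case mentioned above, where the generator is a linear map instead of a convolutional network. This is a minimal illustration, not the paper's implementation: the step sizes and iteration count are hypothetical, and the inferential step uses the plain gradient-descent variant (Langevin dynamics would add Gaussian noise to that update).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from the factor-analysis special case: observations X are a
# linear map of latent factors plus Gaussian noise.
d_obs, d_lat, n = 20, 3, 200
W_true = rng.normal(size=(d_obs, d_lat))
X = W_true @ rng.normal(size=(d_lat, n)) + 0.1 * rng.normal(size=(d_obs, n))

W = rng.normal(size=(d_obs, d_lat))   # generator parameters (theta)
Z = np.zeros((d_lat, n))              # latent factors, one column per example

def recon_error(W, Z):
    return np.mean((X - W @ Z) ** 2)

err_start = recon_error(W, Z)
eta_z, eta_w = 0.02, 0.1              # hypothetical step sizes (not from the paper)
for _ in range(500):
    # (1) Inferential back-propagation: gradient step on log p(Z, X) w.r.t. Z,
    #     combining the likelihood term with the N(0, I) prior on Z.
    #     (Langevin dynamics would add noise to this update.)
    Z += eta_z * (W.T @ (X - W @ Z) - Z)
    # (2) Learning back-propagation: gradient step on the log-likelihood
    #     w.r.t. W, given the currently inferred Z.
    W += eta_w * ((X - W @ Z) @ Z.T) / n
err_end = recon_error(W, Z)
```

Both updates differentiate the same reconstruction term, which is why the paper notes that the two back-propagation steps share most of their code; with a deep generator, each would be a single autodiff call.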
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| RGBD Saliency Detection | DES | S-measure | 0.94 | 102 |
| RGBD Saliency Detection | NLPR | S-measure | 0.919 | 85 |
| RGBD Saliency Detection | SSB | S-measure | 0.904 | 48 |
| RGBD Saliency Detection | LFSD | S-measure | 0.866 | 43 |
| RGBD Saliency Detection | NJU2K | S-measure | 0.9 | 42 |
| RGBD Saliency Detection | SIP | S-measure | 0.876 | 38 |
| Image Generation | SVHN (test) | FID | 49.71 | 14 |
| Image Generation and Reconstruction | CelebA (test) | FID | 51.5 | 11 |
| Image Generation and Reconstruction | CIFAR-10 (test) | MSE | 0.018 | 9 |
| Anomaly Detection | MNIST Heldout Digit 1 (test) | AUPRC | 9.5 | 7 |