
Image Transformer

About

Image generation has been successfully cast as an autoregressive sequence generation or transformation problem. Recent work has shown that self-attention is an effective way of modeling textual sequences. In this work, we generalize a recently proposed model architecture based on self-attention, the Transformer, to a sequence modeling formulation of image generation with a tractable likelihood. By restricting the self-attention mechanism to attend to local neighborhoods we significantly increase the size of images the model can process in practice, despite maintaining significantly larger receptive fields per layer than typical convolutional neural networks. While conceptually simple, our generative models significantly outperform the current state of the art in image generation on ImageNet, improving the best published negative log-likelihood on ImageNet from 3.83 to 3.77. We also present results on image super-resolution with a large magnification ratio, applying an encoder-decoder configuration of our architecture. In a human evaluation study, we find that images generated by our super-resolution model fool human observers three times more often than the previous state of the art.
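The local-neighborhood restriction described above can be illustrated with a minimal single-head sketch: each position in the flattened pixel sequence attends only to a fixed window of preceding positions (itself included). This is an assumption-laden toy, not the paper's implementation — the actual model uses 2-D local blocks, multiple heads, and learned query/key/value projections, all omitted here for brevity.

```python
import numpy as np

def local_self_attention(x, window):
    """Single-head causal self-attention restricted to a local window.

    Each position i attends only to positions j with i - window < j <= i,
    mimicking the local-neighborhood restriction from the abstract.
    x: (seq_len, d) array of pixel embeddings.
    NOTE: identity projections stand in for learned Q/K/V weights (toy choice).
    """
    seq_len, d = x.shape
    q, k, v = x, x, x
    scores = q @ k.T / np.sqrt(d)              # (seq_len, seq_len) similarity
    # Build the local causal mask.
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    mask = (j <= i) & (j > i - window)
    scores = np.where(mask, scores, -np.inf)   # forbid out-of-window attention
    # Numerically stable softmax over the allowed positions.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v
```

With `window` equal to the full sequence length this reduces to ordinary causal self-attention; shrinking the window is what makes cost grow with window size rather than image size, which is the practical point the abstract makes.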

Niki Parmar, Ashish Vaswani, Jakob Uszkoreit, Łukasz Kaiser, Noam Shazeer, Alexander Ku, Dustin Tran · 2018

Related benchmarks

Task | Dataset | Metric | Result | Rank
Long-range sequence modeling | Long Range Arena (LRA) | Text Accuracy | 52.98 | 164
Density Estimation | CIFAR-10 (test) | Bits/dim | 2.9 | 134
Density Estimation | ImageNet 32x32 (test) | Bits per Sub-pixel | 5.439 | 66
Generative Modeling | CIFAR-10 (test) | NLL (bits/dim) | 2.9 | 62
Generative Modeling | CIFAR-10 | BPD | 2.9 | 46
Density Estimation | CIFAR-10 | bpd | 2.89 | 40
Image Modeling | CIFAR-10 (test) | NLL (bits/dim) | 2.9 | 36
Unconditional Image Generation | CIFAR10 | BPD | 2.9 | 33
Unconditional Image Generation | ImageNet-32 | BPD | 3.77 | 31
Generative Modeling | ImageNet 32x32 downsampled | Bits Per Dimension | 3.77 | 24
Showing 10 of 25 benchmark rows.
