
On Investigating the Conservative Property of Score-Based Generative Models

About

Existing Score-Based Models (SBMs) can be categorized as constrained SBMs (CSBMs) or unconstrained SBMs (USBMs) according to their parameterization approaches. CSBMs model probability density functions as Boltzmann distributions and assign their predictions as the negative gradients of scalar-valued energy functions. In contrast, USBMs employ flexible architectures that directly estimate scores without explicitly modeling energy functions. In this paper, we demonstrate that the architectural constraints of CSBMs may limit their modeling ability. In addition, we show that USBMs' inability to preserve the property of conservativeness may lead to degraded performance in practice. To address these issues, we propose Quasi-Conservative Score-Based Models (QCSBMs), which retain the advantages of both CSBMs and USBMs. Our theoretical derivations show that the training objective of QCSBMs can be efficiently incorporated into the training process by leveraging Hutchinson's trace estimator. Furthermore, our experimental results on the CIFAR-10, CIFAR-100, ImageNet, and SVHN datasets validate the effectiveness of QCSBMs. Finally, we justify the advantage of QCSBMs through an example of a one-layer autoencoder.
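The abstract mentions that QCSBMs make their training objective tractable via Hutchinson's trace estimator, which approximates the trace of a large matrix using random probe vectors: tr(A) ≈ E[vᵀAv] for random v with E[vvᵀ] = I. The sketch below illustrates the estimator itself on a generic matrix; the matrix `A` is a stand-in for the Jacobian-derived quantity in the QCSBM objective, not the paper's exact implementation.

```python
import numpy as np

# Hutchinson's trace estimator: tr(A) ~= E[v^T A v], where v is drawn with
# E[v v^T] = I (here, Rademacher vectors with +/-1 entries). In QCSBMs this
# trick avoids materializing the full Jacobian of the score network; A below
# is just an illustrative dense matrix.
rng = np.random.default_rng(0)
d = 50
A = rng.standard_normal((d, d))

n_probes = 20000
vs = rng.choice([-1.0, 1.0], size=(n_probes, d))  # Rademacher probes

# Compute v^T A v for every probe in one batched contraction, then average.
estimates = np.einsum('ni,ij,nj->n', vs, A, vs)
trace_est = estimates.mean()

print(trace_est, np.trace(A))  # the estimate concentrates around the exact trace
```

In practice, the quadratic form vᵀJv for a network Jacobian J is computed with a single vector-Jacobian product per probe, so the cost scales with the number of probes rather than with the input dimension.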

Chen-Hao Chao, Wei-Fang Sun, Bo-Wun Cheng, Chun-Yi Lee • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Generation | CIFAR-10 (test) | FID | 2.48 | 471 |
| Generative Modeling | CIFAR-10 (test) | NLL (bits/dim) | 3.38 | 62 |
| Image Generation | CIFAR-100 | FID | 8.9 | 51 |
| Image Generation | CIFAR-100 (test) | IS | 9.75 | 35 |
| Image Generation | SVHN | FID | 15.15 | 20 |
| Image Generation | ImageNet-32 | FID | 19.62 | 20 |
| Image Generation | ImageNet 32x32 (test) | FID | 19.62 | 15 |
| Image Generation | SVHN (test) | FID | 13.88 | 14 |
| Generative Modeling | ImageNet 32x32 (test) | NLL | 3.83 | 12 |
| Image Generation | ImageNet 32x32 | FID | 16.62 | 11 |

Showing 10 of 13 rows.

Other info

Code
