
Gradient-based Discrete Sampling with Automatic Cyclical Scheduling

About

Discrete distributions, particularly in high-dimensional deep models, are often highly multimodal due to inherent discontinuities. While gradient-based discrete sampling has proven effective, it is susceptible to becoming trapped in local modes because the gradient guides proposals toward nearby modes. To tackle this challenge, we propose an automatic cyclical scheduling, designed for efficient and accurate sampling in multimodal discrete distributions. Our method contains three key components: (1) a cyclical step size schedule where large steps discover new modes and small steps exploit each mode; (2) a cyclical balancing schedule, ensuring "balanced" proposals for given step sizes and high efficiency of the Markov chain; and (3) an automatic tuning scheme for adjusting the hyperparameters in the cyclical schedules, allowing adaptability across diverse datasets with minimal tuning. We prove the non-asymptotic convergence and inference guarantee for our method in general discrete distributions. Extensive experiments demonstrate the superiority of our method in sampling complex multimodal discrete distributions.
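The first two components pair a cyclically decaying step size with a cyclically increasing balancing parameter: each cycle opens with large, more aggressive steps (exploration) and closes with small, near-balanced steps (exploitation). The sketch below illustrates this idea with cosine-shaped schedules; the function names, the cosine form, and the specific ranges are illustrative assumptions, not the paper's exact formulas.

```python
import math

def cyclical_step_size(k, steps_per_cycle, alpha_min, alpha_max):
    """Cosine cyclical schedule (assumed form): each cycle starts at
    alpha_max to jump between modes, then decays to alpha_min to
    exploit the current mode."""
    t = (k % steps_per_cycle) / steps_per_cycle  # position in [0, 1)
    return alpha_min + 0.5 * (alpha_max - alpha_min) * (1 + math.cos(math.pi * t))

def cyclical_balancing(k, steps_per_cycle, beta_min=0.5, beta_max=1.0):
    """Balancing parameter cycled in tandem (assumed pairing): small
    beta accompanies large steps early in a cycle, and beta approaches
    beta_max as the step size shrinks, keeping proposals "balanced"."""
    t = (k % steps_per_cycle) / steps_per_cycle
    return beta_max - 0.5 * (beta_max - beta_min) * (1 + math.cos(math.pi * t))
```

For example, with a 100-step cycle and step sizes in [0.1, 1.0], iteration 0 uses the largest step (1.0) with the smallest balancing parameter (0.5), and both schedules meet their opposite extremes as the cycle ends before resetting at the next cycle.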

Patrick Pynadath, Riddhiman Bhattacharya, Arun Hariharan, Ruqi Zhang • 2024

Related benchmarks

Task                     Dataset                      Metric                 Result     Rank
Conditional estimation   Dynamic MNIST (test)         Test Log Likelihood    -79.634    18
Generative Modeling      Omniglot (test)              Log Likelihood         -91.487    8
RBM learning             EMNIST (test)                Log Likelihood (AIS)   -305       4
RBM learning             Caltech Silhouettes (test)   Log Likelihood (AIS)   -396       4
RBM learning             MNIST (test)                 Log Likelihood (AIS)   -249.6     4
RBM learning             KMNIST (test)                Log Likelihood (AIS)   -407.4     4
EBM Learning             Static MNIST (test)          Log Likelihood         -79.905    3
EBM Learning             Caltech (test)               Log Likelihood         -89.262    3
Text Infilling           Grimm                        Perplexity             369.4      2
Text Infilling           SST2                         Perplexity             307.1      2

Other info

Code
