
Particle-based Variational Inference with Generalized Wasserstein Gradient Flow

About

Particle-based variational inference methods (ParVIs), such as Stein variational gradient descent (SVGD), update the particles based on the kernelized Wasserstein gradient flow of the Kullback-Leibler (KL) divergence. However, the design of kernels is often non-trivial and can restrict the flexibility of the method. Recent works show that functional gradient flow approximations with quadratic-form regularization terms can improve performance. In this paper, we propose a ParVI framework, called generalized Wasserstein gradient descent (GWG), based on a generalized Wasserstein gradient flow of the KL divergence, which can be viewed as a functional gradient method with a broader class of regularizers induced by convex functions. We show that GWG exhibits strong convergence guarantees. We also provide an adaptive version that automatically chooses the Wasserstein metric to accelerate convergence. In experiments, we demonstrate the effectiveness and efficiency of the proposed framework on both simulated and real data problems.

Ziheng Cheng, Shiyue Zhang, Longlin Yu, Cheng Zhang• 2023
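To make the baseline concrete: the kernelized Wasserstein gradient flow that SVGD follows updates each particle by a kernel-smoothed score term plus a repulsion term. Below is a minimal, hedged sketch of that update in NumPy, assuming an RBF kernel with fixed bandwidth and a toy standard-normal target; the function names (`rbf_kernel`, `svgd_step`) and hyperparameters are illustrative, not the paper's implementation of GWG.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel k(x_i, x_j) = exp(-||x_i - x_j||^2 / (2 h^2))
    # and its gradient with respect to the first argument x_i.
    diffs = X[:, None, :] - X[None, :, :]        # shape (n, n, d)
    sq = np.sum(diffs ** 2, axis=-1)             # shape (n, n)
    K = np.exp(-sq / (2 * h ** 2))
    grad_K = -diffs / h ** 2 * K[:, :, None]     # d k(x_i, x_j) / d x_i
    return K, grad_K

def svgd_step(X, grad_log_p, eps=0.05, h=1.0):
    # One SVGD update along the kernelized Wasserstein gradient of KL(q || p):
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + eps * phi

# Toy example: drive particles toward a standard normal target,
# for which grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=0.5, size=(50, 2))  # start far from the target
for _ in range(500):
    X = svgd_step(X, lambda x: -x)
print(np.abs(X.mean(axis=0)))  # particle mean should drift close to 0
```

GWG replaces the fixed kernel above with a learned functional gradient direction regularized by a convex function, so the update direction is no longer confined to the RKHS induced by one hand-picked kernel.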

Related benchmarks

| Task                     | Dataset               | Result     | Rank |
|--------------------------|-----------------------|------------|------|
| Bayesian Neural Networks | UCI Boston (test)     | RMSE 2.721 | 10   |
| Bayesian Neural Networks | UCI CONCRETE (test)   | RMSE 3.871 | 8    |
| Bayesian Neural Networks | UCI POWER (test)      | RMSE 3.944 | 4    |
| Bayesian Neural Networks | UCI WineWhite (test)  | RMSE 0.66  | 4    |
| Bayesian Neural Networks | UCI WineRed (test)    | RMSE 0.575 | 4    |
| Bayesian Neural Networks | UCI PROTEIN (test)    | RMSE 4.686 | 4    |

Other info

Code
