Variational Message Passing with Structured Inference Networks
About
Recent efforts to combine deep models with probabilistic graphical models promise flexible models that are also easy to interpret. We propose a variational message-passing algorithm for variational inference in such models. We make three contributions. First, we propose structured inference networks that incorporate the structure of the graphical model into the inference network of variational auto-encoders (VAE). Second, we establish conditions under which such inference networks enable fast amortized inference similar to VAE. Finally, we derive a variational message-passing algorithm to perform efficient natural-gradient inference while retaining the efficiency of amortized inference. By simultaneously enabling structured, amortized, and natural-gradient inference for deep structured models, our method simplifies and generalizes existing methods.
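To make the idea of a structured inference network concrete, here is a minimal, hedged sketch (not the paper's implementation): a hypothetical per-timestep "encoder" maps each observation to a Gaussian potential over the latent state, and those potentials are combined with a linear-Gaussian chain prior by forward message passing (a scalar Kalman filter). The function names, the toy encoder, and the chain parameters `a` and `q` are all illustrative assumptions.

```python
import numpy as np

def encoder_potentials(x, W, b):
    # Hypothetical "inference network": maps one observation x_t to a
    # Gaussian potential N(mu_t, sigma2_t) over the scalar latent z_t.
    h = np.tanh(W @ x)
    mu = h[0] + b[0]
    sigma2 = np.exp(h[1] + b[1])  # exp keeps the variance positive
    return mu, sigma2

def structured_posterior(xs, W, b, a=0.9, q=0.1):
    # Combine per-step encoder potentials with a linear-Gaussian chain
    # prior z_t = a * z_{t-1} + noise(q) via forward message passing.
    # This mirrors the general idea of keeping the graphical-model
    # structure inside the inference network, in toy scalar form.
    mean, var = 0.0, 1.0
    means, vars_ = [], []
    for x in xs:
        mu_e, s2_e = encoder_potentials(x, W, b)
        # predict step: propagate through the chain dynamics
        mean, var = a * mean, a * a * var + q
        # update step: multiply in the encoder's Gaussian potential
        k = var / (var + s2_e)
        mean = mean + k * (mu_e - mean)
        var = (1.0 - k) * var
        means.append(mean)
        vars_.append(var)
    return np.array(means), np.array(vars_)

# Usage on random data with a randomly initialized toy encoder.
rng = np.random.default_rng(0)
xs = rng.normal(size=(5, 3))          # 5 timesteps, 3-dim observations
W = 0.1 * rng.normal(size=(2, 3))
b = np.zeros(2)
post_mean, post_var = structured_posterior(xs, W, b)
```

In a full model, the encoder would be a deep network and the message passing would run over the actual graphical-model structure with natural-gradient updates; the scalar chain here only illustrates how amortized potentials and prior structure combine.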
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Interpolation | Human Motion Capture h3.6m | FID (0.0-0.8) | 12.38 | 10 |
| Generative Modeling | Human Motion Capture h3.6m | Log Likelihood | 2.36 | 10 |
| Generative Modeling | WSJ0 Audio Spectrogram | Log P(x) | 1.54 | 10 |
| Interpolation | WSJ0 Audio Spectrogram | Interpolation FID (0.0-0.8) | 17.2 | 10 |