Stochastic Variational Inference

About

We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from Wikipedia. Stochastic inference can easily handle data sets of this size and outperforms traditional variational inference, which can only handle a smaller subset. (We also show that the Bayesian nonparametric topic model outperforms its parametric counterpart.) Stochastic variational inference lets us apply complex Bayesian models to massive data sets.

Matt Hoffman, David M. Blei, Chong Wang, John Paisley · 2012
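The core idea above can be illustrated on a toy conjugate model rather than the paper's topic models: at each step, sample one data point, form the intermediate global parameter as if that point were replicated across the whole data set, and take a noisy natural-gradient step with a decaying step size. The sketch below (our own illustrative example, not the authors' code) infers the posterior over the mean of Gaussian data with a N(0, 1) prior, where the exact answer is available for comparison; the step-size schedule constants `tau` and `kappa` are assumptions.

```python
import numpy as np

# Toy SVI sketch (illustrative, not the paper's topic-model code):
# posterior over the mean mu of N(mu, 1) data under a N(0, 1) prior.
# Natural parameters of the Gaussian posterior: eta = (precision * mean, precision).
rng = np.random.default_rng(0)
N = 10_000
x = rng.normal(2.0, 1.0, size=N)

eta = np.array([0.0, 1.0])        # start at the prior N(0, 1)
tau, kappa = 1.0, 0.7             # step-size schedule rho_t = (t + tau)^(-kappa), assumed values
for t in range(1000):
    i = rng.integers(N)           # sample a single data point
    # Intermediate global parameter: prior + sampled point replicated N times
    eta_hat = np.array([0.0 + N * x[i], 1.0 + N])
    rho = (t + 1 + tau) ** -kappa
    eta = (1 - rho) * eta + rho * eta_hat   # noisy natural-gradient step

post_mean = eta[0] / eta[1]
exact_mean = x.sum() / (1.0 + N)  # exact conjugate posterior mean
print(post_mean, exact_mean)
```

Because the model is conjugate, the natural-gradient step reduces to a convex combination of the old parameter and the intermediate one, which is what makes SVI cheap per iteration: each update touches one data point, never the full collection.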

Related benchmarks

| Task | Dataset | Metric | Value | Rank |
| --- | --- | --- | --- | --- |
| Neural Network Inference | ARM Embedded Processors inference latency | Latency (ms) | 306.9 | 56 |
| Regression | Yacht | RMSE | 10.45 | 49 |
| Language Modeling | Yahoo (test) | NLL | 329.8 | 48 |
| Regression | UCI ENERGY (test) | NLL | 5.38 | 42 |
| Regression | UCI CONCRETE (test) | NLL | 5.63 | 37 |
| Regression | UCI YACHT (test) | NLL | 3.77 | 33 |
| Image Modeling | Omniglot (test) | NLL | 90.65 | 27 |
| Regression | UCI KIN8NM (test) | NLL | -0.59 | 25 |
| Regression | UCI WINE (test) | NLL | 3.51 | 24 |
| Regression | UCI NAVAL (test) | NLL | 8.27 | 21 |
Showing 10 of 65 benchmark rows.
