
Discovering Discrete Latent Topics with Neural Variational Inference

About

Topic models have been widely explored as probabilistic generative models of documents. Traditional inference methods have sought closed-form derivations for updating the models; however, as the expressiveness of these models grows, so does the difficulty of performing fast and accurate inference over their parameters. This paper presents alternative neural approaches to topic modelling by providing parameterisable distributions over topics that permit training by backpropagation in the framework of neural variational inference. In addition, with the help of a stick-breaking construction, we propose a recurrent network that is able to discover a notionally unbounded number of topics, analogous to Bayesian non-parametric topic models. Experimental results on the MXM Song Lyrics, 20NewsGroups and Reuters News datasets demonstrate the effectiveness and efficiency of these neural topic models.
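The framework in the abstract lends itself to a compact implementation. Below is a minimal sketch, assuming PyTorch, of the paper's Gaussian-softmax construction: an inference network maps a bag-of-words vector to the parameters of a Gaussian, a reparameterised sample is pushed through a softmax to obtain topic proportions, and the evidence lower bound is optimised by backpropagation. Layer sizes, vocabulary size and topic count here are illustrative defaults, not the paper's exact configuration.

```python
# Minimal Gaussian-softmax neural topic model: a sketch, assuming PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GSMTopicModel(nn.Module):
    def __init__(self, vocab_size=2000, num_topics=50, hidden=256):
        super().__init__()
        # Inference network q(z|x): bag-of-words counts -> Gaussian params
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, num_topics)
        self.log_sigma = nn.Linear(hidden, num_topics)
        # Generative side: map z to topic proportions, topics to words
        self.z_to_theta = nn.Linear(num_topics, num_topics)
        self.topic_word = nn.Parameter(torch.randn(num_topics, vocab_size))

    def forward(self, x):
        h = self.encoder(x)
        mu, log_sigma = self.mu(h), self.log_sigma(h)
        # Reparameterisation trick: gradients flow through the sample
        z = mu + torch.randn_like(mu) * log_sigma.exp()
        theta = F.softmax(self.z_to_theta(z), dim=-1)   # topic mixture
        beta = F.softmax(self.topic_word, dim=-1)       # K topic-word dists
        word_probs = theta @ beta                       # p(w | theta)
        # Negative ELBO: reconstruction term + KL(q(z|x) || N(0, I))
        rec = -(x * (word_probs + 1e-10).log()).sum(-1)
        kl = 0.5 * (mu.pow(2) + (2 * log_sigma).exp()
                    - 1 - 2 * log_sigma).sum(-1)
        return (rec + kl).mean()

model = GSMTopicModel()
loss = model(torch.rand(32, 2000))  # a batch of bag-of-words vectors
loss.backward()
```

The unbounded variant replaces the fixed softmax with a stick-breaking construction: a recurrent network emits breaking fractions in (0, 1), and the stick-breaking product converts them into topic proportions of notionally unbounded length. A sketch of that transformation follows; the recurrent network producing the fractions (e.g. via a sigmoid output) is omitted.

```python
import torch

def stick_breaking(eta):
    """eta: (batch, K-1) breaking fractions in (0, 1).
    Returns (batch, K) topic proportions that sum to 1."""
    # Stick remaining after each break: prod_{j<=k} (1 - eta_j)
    remaining = torch.cumprod(1.0 - eta, dim=-1)
    ones = torch.ones_like(eta[:, :1])
    # theta_k = eta_k * prod_{j<k} (1 - eta_j); the final topic takes
    # whatever stick is left, so the proportions always sum to one.
    pieces = eta * torch.cat([ones, remaining[:, :-1]], dim=-1)
    return torch.cat([pieces, remaining[:, -1:]], dim=-1)
```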

Yishu Miao, Edward Grefenstette, Phil Blunsom • 2017

Related benchmarks

Task                         Dataset                     Metric       Result    Rank
Topic Modeling               20 Newsgroups (test)        Perplexity   785       39
Topic Coherence              20News                      NPMI         0.224     26
Document Modeling            RCV1 v2 (test)              Perplexity   521       18
Topic Modeling               AGNews                      Diversity    57.6      14
Topic Modeling               MXM official (test)         Perplexity   272       12
Topic Modeling               RCV1-v2 processed (test)    Perplexity   602       12
Document Modeling            MXM song lyrics (test)      Perplexity   267       11
Topic Coherence Evaluation   Grolier (test)              C_P          0.1974    8
Topic Coherence Evaluation   NYTimes (test)              C_P          0.3426    8
Topic Coherence Evaluation   20Newsgroups (test)         C_P          -0.2318   8

(10 of 16 rows shown)
