
Neural Variational Inference for Text Processing

About

Recent advances in neural variational inference have spawned a renaissance in deep latent variable models. In this paper we introduce a generic variational inference framework for generative and conditional models of text. While traditional variational methods derive an analytic approximation for the intractable distributions over latent variables, here we construct an inference network conditioned on the discrete text input to provide the variational distribution. We validate this framework on two very different text modelling applications: generative document modelling and supervised question answering. Our neural variational document model combines a continuous stochastic document representation with a bag-of-words generative model and achieves the lowest reported perplexities on two standard test corpora. The neural answer selection model employs a stochastic representation layer within an attention mechanism to extract the semantics between a question and answer pair. On two question answering benchmarks this model surpasses all previously published results.
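The neural variational document model (NVDM) described above can be sketched in a few lines: an inference network maps the bag-of-words input to the mean and log-variance of a Gaussian variational distribution, a latent vector is drawn via the reparameterisation trick, and a softmax decoder scores every vocabulary word to give the bag-of-words likelihood. The NumPy sketch below is illustrative only, assuming single-layer networks and toy dimensions; all variable names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H, Z = 100, 32, 16  # vocab size, encoder hidden size, latent size (toy values)

# Inference network: bag-of-words counts -> Gaussian variational parameters
W_h = rng.normal(0, 0.1, (V, H)); b_h = np.zeros(H)
W_mu = rng.normal(0, 0.1, (H, Z)); b_mu = np.zeros(Z)
W_lv = rng.normal(0, 0.1, (H, Z)); b_lv = np.zeros(Z)
# Generative network: latent document vector -> word logits
R = rng.normal(0, 0.1, (Z, V)); b_v = np.zeros(V)

def nvdm_elbo(x_counts):
    """One-sample Monte Carlo estimate of the ELBO for a single document."""
    h = np.tanh(x_counts @ W_h + b_h)
    mu = h @ W_mu + b_mu
    log_var = h @ W_lv + b_lv
    eps = rng.normal(size=Z)                   # reparameterisation trick
    z = mu + np.exp(0.5 * log_var) * eps
    logits = z @ R + b_v
    log_p = logits - np.log(np.sum(np.exp(logits)))   # log-softmax over vocab
    log_lik = float(x_counts @ log_p)          # bag-of-words log-likelihood
    # KL divergence between N(mu, diag(exp(log_var))) and the N(0, I) prior
    kl = 0.5 * float(np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var))
    return log_lik - kl

doc = rng.integers(0, 3, size=V).astype(float)  # toy word-count vector
elbo = nvdm_elbo(doc)
```

Training would maximise this ELBO with stochastic gradient ascent; the reparameterised sample is what lets gradients flow through the stochastic document representation.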

Yishu Miao, Lei Yu, Phil Blunsom · 2015

Related benchmarks

Task                         Dataset                  Metric               Result    Rank
Answer Selection             WikiQA (test)            MAP                  0.689     149
Topic Modeling               20 Newsgroups (test)     Perplexity           836       39
Topic Coherence              20News                   NPMI                 0.186     26
Topic Modeling               20NG                     NPMI                 0.08      23
Document Modeling            RCV1 v2 (test)           Perplexity           550       18
Document Modeling            MXM song lyrics (test)   Perplexity           345       11
Topic Modeling               RCV1                     Avg Topic Coherence  0.07      8
Topic Coherence Evaluation   Grolier (test)           C_P                  -0.1877   8
Topic Coherence Evaluation   20Newsgroups (test)      C_P                  -0.2558   8
Topic Coherence Evaluation   NYTimes (test)           C_P                  -0.413    8
