
Neural Processes

About

A neural network (NN) is a parameterised function that can be tuned via gradient descent to approximate a labelled collection of data with high precision. A Gaussian process (GP), on the other hand, is a probabilistic model that defines a distribution over possible functions and is updated in light of data via the rules of probabilistic inference. GPs are probabilistic, data-efficient and flexible; however, they are also computationally intensive and thus limited in their applicability. We introduce a class of neural latent variable models which we call Neural Processes (NPs), combining the best of both worlds. Like GPs, NPs define distributions over functions, are capable of rapid adaptation to new observations, and can estimate the uncertainty in their predictions. Like NNs, NPs are computationally efficient during training and evaluation, but also learn to adapt their priors to data. We demonstrate the performance of NPs on a range of learning tasks, including regression and optimisation, and compare and contrast with related models in the literature.

Marta Garnelo, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S.M. Ali Eslami, Yee Whye Teh • 2018
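The data flow the abstract describes can be sketched in a few lines: an encoder maps each observed (x, y) context pair to a representation, a permutation-invariant mean aggregates these into a single summary, and a decoder conditions on that summary to emit a predictive mean and uncertainty at new inputs. The sketch below is a minimal deterministic variant with randomly initialised weights; it omits the latent-variable sampling path and training loop, and all layer sizes and names (`R_DIM`, `np_predict`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(params, x):
    # Apply a stack of dense layers with ReLU between them.
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.maximum(x, 0.0)
    return x

def init_mlp(sizes):
    return [(rng.normal(0.0, 0.1, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

# Illustrative dimensions for a 1D regression task.
R_DIM = 16
encoder = init_mlp([2, 32, R_DIM])       # (x_i, y_i) -> r_i
decoder = init_mlp([R_DIM + 1, 32, 2])   # (r, x*) -> (mu, log_sigma)

def np_predict(x_context, y_context, x_target):
    # Encode each context pair, then aggregate with a mean:
    # a permutation-invariant summary of the observed data.
    pairs = np.stack([x_context, y_context], axis=-1)   # (n, 2)
    r = mlp(encoder, pairs).mean(axis=0)                # (R_DIM,)
    # Condition the decoder on the summary at every target input.
    rep = np.tile(r, (len(x_target), 1))
    out = mlp(decoder, np.concatenate([rep, x_target[:, None]], axis=-1))
    mu, log_sigma = out[:, 0], out[:, 1]
    return mu, np.exp(log_sigma)   # predictive mean and (positive) scale

mu, sigma = np_predict(np.array([-1.0, 0.0, 1.0]),
                       np.array([0.5, 0.0, 0.5]),
                       np.linspace(-2.0, 2.0, 5))
```

Because adaptation to a new context set is a single forward pass rather than a matrix inversion, evaluation cost scales linearly in the number of context points, which is the efficiency advantage over GPs the abstract refers to.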

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Image Classification | Office-31 (test) | Avg Accuracy | 40.52 | 93
Episodic multi-task classification | Office-Home meta (test) | Avg Accuracy | 53.99 | 36
Episodic multi-task classification | DomainNet meta (test) | Accuracy | 20.58 | 36
Classification | S-CIFAR-10 | Accuracy | 59.3 | 26
Classification | S-CIFAR-100 | Accuracy | 38.7 | 26
Classification | P-MNIST | Accuracy | 79.44 | 23
Sim2Real Regression | Predator-Prey Real | Context Likelihood | -2.489 | 16
1D Regression | Synthetic 1D Regression RBF kernel with noises | Context Likelihood | -0.151 | 16
1D Regression | Synthetic 1D Regression RBF kernel | Context Likelihood | 0.122 | 16
1D Regression | Synthetic 1D Regression Matern kernel GP | Context Likelihood | -0.228 | 16

Showing 10 of 19 rows.
