
Transformer Neural Processes: Uncertainty-Aware Meta Learning Via Sequence Modeling

About

Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to Gaussian Processes (GPs), NPs define distributions over functions and can estimate uncertainty in their predictions. However, unlike GPs, NPs and their variants suffer from underfitting and often have intractable likelihoods, which limits their applications in sequential decision making. We propose Transformer Neural Processes (TNPs), a new member of the NP family that casts uncertainty-aware meta-learning as a sequence modeling problem. We learn TNPs via an autoregressive likelihood-based objective and instantiate it with a novel transformer-based architecture. The model architecture respects the inductive biases inherent to the problem structure, such as invariance to the ordering of the observed data points and equivariance to the unobserved points. We further investigate knobs within the TNP framework that trade off the expressivity of the decoding distribution against extra computation. Empirically, we show that TNPs achieve state-of-the-art performance on various benchmark problems, outperforming all previous NP variants on meta regression, image completion, contextual multi-armed bandits, and Bayesian optimization.
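The abstract's key architectural idea is an attention mask that keeps the model invariant to the ordering of observed (context) points while factorizing the target predictions autoregressively. Below is a minimal sketch of one plausible masking scheme consistent with that description; the function name and exact mask layout are illustrative assumptions, not necessarily the paper's exact construction.

```python
import numpy as np

def tnp_attention_mask(num_ctx, num_tgt):
    """Build a boolean attention mask (True = token i may attend to token j)
    for a TNP-style transformer over [context tokens; target tokens].

    - Context tokens attend to all context tokens; with no positional
      encodings, this makes the encoding permutation-invariant in the
      context set.
    - Each target token attends to the full context plus itself and
      earlier targets, giving an autoregressive factorization of the
      predictive likelihood over targets.
    """
    n = num_ctx + num_tgt
    mask = np.zeros((n, n), dtype=bool)
    # Context-to-context: full bidirectional attention.
    mask[:num_ctx, :num_ctx] = True
    # Every target sees the whole context.
    mask[num_ctx:, :num_ctx] = True
    # Causal attention among targets (self + predecessors).
    for i in range(num_tgt):
        mask[num_ctx + i, num_ctx : num_ctx + i + 1] = True
    return mask
```

In practice this mask would be passed to a standard transformer's attention layer (e.g. as `attn_mask`), and a Gaussian output head on the target positions would supply the likelihood terms for the autoregressive training objective.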

Tung Nguyen, Aditya Grover • 2022

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | Office-31 (test) | Avg Accuracy: 69.69 | 93
Episodic multi-task classification | Office-Home meta (test) | Avg Accuracy: 78.94 | 36
Episodic multi-task classification | DomainNet meta (test) | Accuracy: 67.39 | 36
Regression | elevators (test) | -- | 19
Regression | Protein (test) | Test Log-Likelihood: -1.152 | 18
Regression | Skillcraft (test) | Test Log-Likelihood: -0.954 | 17
Interpolation | HADISD Interp (test) | Log-Likelihood: 0.031 | 11
Regression | 1D GP (test) | Log-Likelihood: 0.767 | 11
Forecasting | HADISD Forecast (test) | Log-Likelihood: 0.035 | 11
Regression | Tabular Synthetic (test) | Log-Likelihood: 0.154 | 10

Showing 10 of 19 rows.
