Bootstrapping Neural Processes
About
Unlike traditional statistical modeling, in which a user typically hand-specifies a prior, Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural networks. Given a data stream, an NP learns the stochastic process that best describes the data. While this "data-driven" way of learning stochastic processes has proven to handle various types of data, NPs still rely on the assumption that uncertainty in the stochastic process is modeled by a single latent variable, which potentially limits flexibility. To this end, we propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap. The bootstrap is a classical data-driven technique for estimating uncertainty, which allows BNP to learn the stochasticity in NPs without assuming a particular form. We demonstrate the efficacy of BNP on various types of data and its robustness in the presence of model-data mismatch.
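The classical bootstrap mentioned above can be sketched in a few lines: resample the observed data with replacement many times, recompute a statistic on each resample, and read off the spread of those estimates as uncertainty. The sketch below is a minimal, generic illustration of this resampling idea (the function name `bootstrap_estimates` and the toy statistic are hypothetical, not from the paper; BNP would instead feed each resampled context set through an NP encoder).

```python
import numpy as np

def bootstrap_estimates(context_x, context_y, statistic, n_boot=100, rng=None):
    """Apply `statistic` to bootstrap resamples of a (x, y) context set.

    Each resample draws len(context_x) indices with replacement,
    which is the classical (nonparametric) bootstrap.
    """
    rng = np.random.default_rng(rng)
    n = len(context_x)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)  # sample indices with replacement
        estimates.append(statistic(context_x[idx], context_y[idx]))
    return np.asarray(estimates)

# Toy example: uncertainty of the mean of the context targets.
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(0).standard_normal(20)
means = bootstrap_estimates(x, y, lambda cx, cy: cy.mean(), n_boot=500, rng=0)
print(means.std())  # spread of the bootstrap estimates, a data-driven uncertainty measure
```

No parametric form is assumed anywhere: the uncertainty estimate comes entirely from the variability across resamples, which is the property BNP exploits in place of a single latent variable.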
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Sim2Real Regression | Predator-Prey Real | Context Likelihood | 2.451 | 16 |
| Sim2Real Regression | Predator-Prey Simulation | Context Likelihood | 253.7 | 16 |
| Image Completion | CelebA | Context Likelihood (Avg) | 3.172 | 14 |
| Likelihood Estimation | MovieLens 10k (test) | Context Likelihood | -16.267 | 14 |
| Geomagnetic map interpolation | A-InZ | RMSE | 1.931 | 7 |
| Geomagnetic map interpolation | A-InX | RMSE | 2.363 | 7 |
| Geomagnetic map interpolation | A-OutZ | RMSE | 2.525 | 7 |