
Product Kernel Interpolation for Scalable Gaussian Processes

About

Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM-based learning that exploits product kernel structure. We demonstrate that this technique is broadly applicable, resulting in linear rather than exponential runtime with dimension for SKI, as well as state-of-the-art asymptotic complexity for multi-task GPs.
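The two ingredients above can be sketched concretely. This is a minimal NumPy/SciPy illustration, not the paper's implementation: for a product kernel over a grid, K = K1 ⊗ K2, the Kronecker identity (K1 ⊗ K2)·vec(V) = vec(K1 V K2ᵀ) gives an MVM that never forms the full matrix, and an iterative solver such as conjugate gradients then computes (K + σ²I)⁻¹y using only those MVMs. All names here (`rbf`, `kron_mvm`, the grids) are hypothetical choices for the sketch.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def rbf(x, lengthscale=1.0):
    # Dense RBF kernel matrix on a 1-D set of points.
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x1 = np.linspace(0.0, 1.0, 30)   # grid points for dimension 1
x2 = np.linspace(0.0, 1.0, 40)   # grid points for dimension 2
K1, K2 = rbf(x1), rbf(x2)        # per-dimension kernel matrices
n = len(x1) * len(x2)            # full kernel would be 1200 x 1200
noise = 0.1                      # observation noise sigma^2

def kron_mvm(v):
    # (K1 ⊗ K2) v via vec(K1 V K2^T), using row-major reshape;
    # costs O(n (n1 + n2)) instead of O(n^2), and K is never built.
    V = v.reshape(len(x1), len(x2))
    return (K1 @ V @ K2.T).reshape(-1)

# Wrap the MVM so CG can solve (K + noise * I) alpha = y.
A = LinearOperator((n, n), matvec=lambda v: kron_mvm(v) + noise * v)
y = rng.standard_normal(n)
alpha, info = cg(A, y)           # info == 0 on convergence
```

The same pattern generalizes: any approximate kernel with a fast MVM (SKI's interpolated kernels, sums and products of structured kernels) can be dropped into the `matvec` without changing the solver.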

Jacob R. Gardner, Geoff Pleiss, Ruihan Wu, Kilian Q. Weinberger, Andrew Gordon Wilson · 2018

Related benchmarks

Task         Dataset                    Metric   Result   Rank
Regression   Energy UCI (test)          RMSE     5.762    27
Regression   Concrete UCI (test)        RMSE     12.727   21
Regression   UCI Kin40k d=8 (test)      RMSE     0.174    5
Regression   UCI Fertility d=9 (test)   RMSE     0.183    5
Regression   UCI Solar d=10 (test)      RMSE     0.78     5
Regression   UCI Pendulum d=9 (test)    RMSE     2.947    5
Regression   UCI Protein d=9 (test)     RMSE     0.778    5
