
NGBoost: Natural Gradient Boosting for Probabilistic Prediction

About

We present Natural Gradient Boosting (NGBoost), an algorithm for generic probabilistic prediction via gradient boosting. Typical regression models return a point estimate, conditional on covariates, but probabilistic regression models output a full probability distribution over the outcome space, conditional on the covariates. This allows for predictive uncertainty estimation, which is crucial in applications like healthcare and weather forecasting. NGBoost generalizes gradient boosting to probabilistic regression by treating the parameters of the conditional distribution as targets for a multiparameter boosting algorithm. Furthermore, we show how the Natural Gradient is required to correct the training dynamics of our multiparameter boosting approach. NGBoost can be used with any base learner, any family of distributions with continuous parameters, and any scoring rule. NGBoost matches or exceeds the performance of existing methods for probabilistic prediction while offering additional benefits in flexibility, scalability, and usability. An open-source implementation is available at github.com/stanfordmlgroup/ngboost.
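The abstract's central claim, that the natural gradient is needed to correct the training dynamics of multiparameter boosting, can be illustrated with a toy sketch. The following is a hypothetical pure-Python example, not the NGBoost implementation itself: it fits the parameters (mu, log sigma) of a single Normal distribution by natural-gradient steps on the negative log likelihood. For a Normal in this parameterization, the Fisher information matrix is diag(1/sigma^2, 2), so the natural gradient is the ordinary gradient rescaled by its inverse. (In NGBoost proper, the per-example natural gradients become regression targets for a base learner at each boosting stage; here, with constant "base learners", boosting reduces to plain parameter updates.)

```python
import math

def natural_grad_step(y, mu, log_sigma, lr=0.1):
    """One natural-gradient step on the Normal NLL for data y.

    Parameterization: theta = (mu, s) with sigma = exp(s).
    Ordinary gradient of the per-sample NLL:
        d/dmu = -(y - mu) / sigma^2
        d/ds  = 1 - (y - mu)^2 / sigma^2
    Fisher information: F = diag(1 / sigma^2, 2).
    Natural gradient = F^{-1} @ ordinary gradient:
        ng_mu = -(y - mu)
        ng_s  = (1 - (y - mu)^2 / sigma^2) / 2
    """
    sigma2 = math.exp(2 * log_sigma)
    n = len(y)
    ng_mu = sum(-(yi - mu) for yi in y) / n
    ng_s = sum(1 - (yi - mu) ** 2 / sigma2 for yi in y) / (2 * n)
    return mu - lr * ng_mu, log_sigma - lr * ng_s

def fit_normal(y, iters=500):
    """Fit mu and sigma by repeated natural-gradient steps."""
    mu, s = 0.0, 0.0
    for _ in range(iters):
        mu, s = natural_grad_step(y, mu, s)
    return mu, math.exp(s)
```

Note how the natural-gradient update for mu, namely -(y - mu), no longer depends on sigma, whereas the ordinary gradient is scaled by 1/sigma^2. This decoupling is one concrete sense in which the natural gradient stabilizes joint training of the two parameters.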

Tony Duan, Anand Avati, Daisy Yi Ding, Khanh K. Thai, Sanjay Basu, Andrew Y. Ng, Alejandro Schuler • 2019

Related benchmarks

Task       | Dataset            | Metric              | Result | Rank
---------- | ------------------ | ------------------- | ------ | ----
Regression | Energy UCI (test)  | RMSE                | 1.952  | 33
Regression | Boston UCI (test)  | RMSE                | 3.13   | 32
Regression | Concrete UCI (test)| RMSE                | 7.222  | 27
Regression | Yacht UCI (test)   | RMSE                | 1.197  | 26
Regression | UCI KIN8NM (test)  | --                  | --     | 25
Regression | Protein (test)     | Test Log Likelihood | 2.86   | 24
Regression | Naval UCI (test)   | RMSE                | 0.009  | 22
Regression | Kin8nm UCI (test)  | RMSE                | 0.197  | 14
Regression | PROTEIN            | NLL                 | 2.81   | 10
Regression | Kin8nm             | NLL                 | -0.49  | 10

Showing 10 of 64 rows
