
Cheap Bootstrap for Fast Uncertainty Quantification of Stochastic Gradient Descent

About

Stochastic gradient descent (SGD), or stochastic approximation, has been widely used in model training and stochastic optimization. While there is a huge literature analyzing its convergence, inference on the solutions obtained from SGD has been studied only recently, yet it is important given the growing need for uncertainty quantification. We investigate two computationally cheap resampling-based methods to construct confidence intervals for SGD solutions. One runs multiple, but few, SGDs in parallel via resampling with replacement from the data; the other operates in an online fashion. Our methods can be regarded as enhancements of established bootstrap schemes that substantially reduce the computational effort in terms of resampling requirements, while bypassing the intricate mixing conditions in existing batching methods. We achieve this via a recent so-called cheap bootstrap idea and a refinement of a Berry-Esseen-type bound for SGD.
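The parallel variant described in the abstract can be sketched roughly as follows: run one SGD on the original data, plus a small number B of SGDs on datasets resampled with replacement, then form a t-interval with B degrees of freedom around the original estimate. This is a minimal NumPy sketch under assumptions not taken from the paper (plain constant-step SGD for least squares, B = 5, a hard-coded t critical value); all function names and parameters here are illustrative.

```python
import numpy as np

def sgd_linear(X, y, lr=0.01, epochs=5, seed=0):
    """Plain constant-step SGD for least squares; returns the final iterate."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ theta - y[i]) * X[i]  # per-sample squared-loss gradient
            theta -= lr * grad
    return theta

T_CRIT = 2.5706  # t quantile at level 0.975 with 5 degrees of freedom (B = 5)

def cheap_bootstrap_ci(X, y, B=5):
    """Cheap bootstrap: only B (few) resampled SGD runs besides the original."""
    n = X.shape[0]
    rng = np.random.default_rng(1)
    theta_hat = sgd_linear(X, y)                 # estimate on the original data
    reps = []
    for b in range(B):
        idx = rng.integers(0, n, size=n)         # resample with replacement
        reps.append(sgd_linear(X[idx], y[idx], seed=b + 1))
    reps = np.array(reps)
    S2 = ((reps - theta_hat) ** 2).mean(axis=0)  # bootstrap variance proxy
    half = T_CRIT * np.sqrt(S2)                  # t-interval half-width, B d.o.f.
    return theta_hat, theta_hat - half, theta_hat + half

# Synthetic linear-regression example
rng = np.random.default_rng(0)
n, d = 2000, 5
X = rng.standard_normal((n, d))
theta_star = np.arange(1, d + 1, dtype=float)
y = X @ theta_star + rng.standard_normal(n)
theta_hat, lo, hi = cheap_bootstrap_ci(X, y)
```

The point of the construction is that B can stay in the single digits: the t distribution with B degrees of freedom absorbs the noise of the small bootstrap sample, so the interval remains valid without the hundreds of resamples a classical bootstrap would need.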

Henry Lam, Zitong Wang • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Linear regression | Toeplitz Σ (n=10^5, d=5) | Coverage (%) | 95.84 | 15 |
| Logistic regression | Toeplitz Σ (n=10^5, d=5) | Coverage (%) | 95.4 | 15 |
| Logistic regression | Toeplitz Σ (n=10^5, d=20) | Coverage (%) | 94.81 | 15 |
| Linear regression | Toeplitz Σ (n=10^5, d=20) | Coverage (%) | 95.43 | 15 |
| Logistic regression | Toeplitz Σ (n=10^5, d=200) | Coverage (%) | 99.31 | 15 |
| Linear regression | Toeplitz Σ (n=10^5, d=200) | Coverage (%) | 95.78 | 15 |
| Sparse linear regression | Sparse Linear Regression (n=100, Toeplitz Σ, p=3) | Coverage (%) | 99.96 | 8 |
| Sparse linear regression | Sparse Linear Regression (n=100, Toeplitz Σ, p=15) | Coverage (%) | 99.98 | 8 |
