
Revisiting differentially private linear regression: optimal and adaptive prediction & estimation in unbounded domain

About

We revisit the problem of linear regression under a differential privacy constraint. By consolidating existing pieces in the literature, we clarify the correct dependence of the optimization error and estimation error on the feature, label, and coefficient domains, thereby revealing the delicate price of differential privacy in statistical estimation and statistical learning. Moreover, we propose simple modifications of two existing DP algorithms, (a) posterior sampling (yielding AdaOPS) and (b) sufficient statistics perturbation (yielding AdaSSP), and show that they can be upgraded into **adaptive** algorithms that exploit data-dependent quantities and behave nearly optimally **for every instance**. Extensive experiments on both simulated and real data show that AdaOPS and AdaSSP outperform the existing techniques on nearly all 36 data sets that we test on.
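To make the sufficient statistics perturbation (SSP) idea concrete, here is a minimal sketch in Python/NumPy: the sufficient statistics X&#8317;X and X&#8317;y are released via the Gaussian mechanism and the perturbed normal equations are solved with a ridge term. The function name, the naive equal split of the privacy budget across the two statistics, and the fixed regularizer `lam` are illustrative assumptions; the paper's AdaSSP chooses the regularizer adaptively from a privately estimated minimum eigenvalue, which this sketch does not do.

```python
import numpy as np

def dp_ssp_linear_regression(X, y, epsilon, delta, x_bound=1.0, y_bound=1.0,
                             lam=1e-3, rng=None):
    """Sketch of sufficient statistics perturbation (SSP) for (epsilon, delta)-DP
    linear regression. Assumes each row satisfies ||x_i|| <= x_bound and
    |y_i| <= y_bound (in practice one would clip the data to enforce this).
    The budget is split equally between the two released statistics; `lam` is
    a fixed ridge term for numerical stability (AdaSSP would pick it adaptively).
    """
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    # Gaussian-mechanism noise scale for half the budget; the sensitivity of
    # X^T X is x_bound**2 and that of X^T y is x_bound * y_bound.
    eps_half, delta_half = epsilon / 2.0, delta / 2.0
    sigma = np.sqrt(2.0 * np.log(1.25 / delta_half)) / eps_half
    # Symmetric Gaussian noise for X^T X.
    E = rng.normal(0.0, sigma * x_bound**2, size=(d, d))
    E = np.triu(E) + np.triu(E, 1).T
    xtx_noisy = X.T @ X + E
    # Gaussian noise for X^T y.
    xty_noisy = X.T @ y + rng.normal(0.0, sigma * x_bound * y_bound, size=d)
    # Solve the ridge-regularized perturbed normal equations.
    return np.linalg.solve(xtx_noisy + lam * np.eye(d), xty_noisy)
```

Because the data touch the mechanism only through the two perturbed statistics, the least-squares solve itself is post-processing and incurs no further privacy cost.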

Yu-Xiang Wang • 2018

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Regression | D3 | Average Relative MSE | 0.669 | 11 |
| Regression | D5 | Average Relative MSE | 0.08 | 11 |
| Regression | D2 | Average Relative MSE | 0.229 | 10 |
| Regression | D1 | Average Relative MSE | 0.69 | 10 |
| Linear regression | UCI D8 | Relative MSE | 0.5 | 4 |
| Linear regression | UCI D7 | Relative MSE | 0.682 | 4 |
| Linear regression | UCI D4 | Relative MSE | 0.082 | 4 |
| Linear regression | UCI D6 | Relative MSE | 0.203 | 4 |
| Linear regression | UCI D9 | Relative MSE | 0.429 | 4 |
