Coverage Guarantees for Pseudo-Calibrated Conformal Prediction under Distribution Shift
About
Conformal prediction (CP) offers distribution-free marginal coverage guarantees under an exchangeability assumption, but these guarantees can fail when the data distribution shifts. We analyze pseudo-calibration as a tool for countering this loss of coverage under a bounded label-conditional covariate shift model. Using tools from domain adaptation, we derive a lower bound on target coverage in terms of the source-domain loss of the classifier and a Wasserstein measure of the shift. Building on this bound, we show how to design pseudo-calibrated prediction sets that inflate the conformal threshold by a slack parameter so that target coverage stays above a prescribed level. Finally, we propose a source-tuned pseudo-calibration algorithm that interpolates between hard pseudo-labels and randomized labels as a function of classifier uncertainty. Numerical experiments show that our bounds qualitatively track pseudo-calibration behavior and that the source-tuned scheme mitigates coverage degradation under distribution shift while maintaining nontrivial prediction set sizes.
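To make the slack-inflated threshold concrete, here is a minimal split-conformal sketch. It is an illustration under stated assumptions, not the paper's implementation: calibration points are pseudo-labeled with the classifier's argmax, the usual conformal quantile of the nonconformity scores is computed, and a slack parameter `delta` (standing in for the paper's Wasserstein-based correction, whose exact value would come from the coverage bound) is added before forming prediction sets.

```python
import numpy as np

def pseudo_calibrated_sets(probs_cal, probs_test, alpha=0.1, delta=0.05):
    """Split-conformal sets calibrated on pseudo-labeled target data.

    probs_cal:  (n, K) softmax outputs on unlabeled target calibration points
    probs_test: (m, K) softmax outputs on target test points
    alpha:      prescribed miscoverage level
    delta:      slack added to the threshold; stands in for the paper's
                Wasserstein-based correction (exact form not given here)
    """
    n = probs_cal.shape[0]
    pseudo = probs_cal.argmax(axis=1)                 # hard pseudo-labels
    scores = 1.0 - probs_cal[np.arange(n), pseudo]    # nonconformity scores
    # Standard split-conformal quantile with the finite-sample correction.
    level = min(np.ceil((n + 1) * (1.0 - alpha)) / n, 1.0)
    qhat = np.quantile(scores, level, method="higher")
    threshold = min(qhat + delta, 1.0)                # inflate by the slack
    # A label enters a test point's set if its score is within the threshold.
    return (1.0 - probs_test) <= threshold            # (m, K) boolean matrix
```

A larger `delta` yields larger sets and, per the bound, higher target coverage; choosing the smallest slack compatible with the prescribed coverage level is the design problem the abstract describes.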
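The abstract describes the source-tuned scheme only as interpolating between hard pseudo-labels and randomized labels as a function of classifier uncertainty. One plausible reading is sketched below; the confidence cutoff `tau`, the sampling rule, and the function name `source_tuned_pseudo_labels` are illustrative assumptions, and the paper's actual interpolation and source-tuning rule may differ.

```python
import numpy as np

def source_tuned_pseudo_labels(probs, tau=0.8, rng=None):
    """Interpolate between hard and randomized pseudo-labels (illustrative).

    probs: (n, K) softmax outputs on unlabeled calibration points
    tau:   confidence cutoff, assumed tuned on held-out source data;
           this cutoff form is a guess at the paper's rule
    """
    rng = np.random.default_rng() if rng is None else rng
    n, K = probs.shape
    hard = probs.argmax(axis=1)                        # confident regime
    # Uncertain regime: sample a label from the predictive distribution.
    randomized = np.array([rng.choice(K, p=p / p.sum()) for p in probs])
    confident = probs.max(axis=1) >= tau               # uncertainty gate
    return np.where(confident, hard, randomized)
```

One motivation for such a rule is that randomizing the labels of low-confidence points hedges against confident-but-wrong pseudo-labels dominating the calibration scores under shift, while keeping hard labels on confident points preserves informative set sizes.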
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Conformal Prediction | CIFAR-10 (test) | -- | 21 |
| Conformal Prediction | MNIST (test) | -- | 12 |
| Conformal Prediction | CIFAR-100 (test) | -- | 12 |