
Kernel dimension reduction in regression

About

We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate $X$ from the response $Y$, given the projection of $X$ on the central subspace [cf. J. Amer. Statist. Assoc. 86 (1991) 316--342 and Regression Graphics (1998) Wiley]. We show that this conditional independence assertion can be characterized in terms of conditional covariance operators on reproducing kernel Hilbert spaces and we show how this characterization leads to an $M$-estimator for the central subspace. The resulting estimator is shown to be consistent under weak conditions; in particular, we do not have to impose linearity or ellipticity conditions of the kinds that are generally invoked for SDR methods. We also present empirical results showing that the new methodology is competitive in practice.
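To make the idea concrete, below is a minimal numerical sketch of a trace-type criterion of the kind the paper builds on: the dependence of $Y$ on the projected covariate $Z = B^\top X$ is scored through centered kernel Gram matrices, and a projection that captures the central subspace yields a smaller value. The helper names (`centered_gram`, `kdr_objective`), the Gaussian kernel, and the bandwidth/regularization settings are illustrative assumptions, not taken from the paper's text.

```python
import numpy as np

def centered_gram(data, sigma):
    """Centered Gaussian (RBF) Gram matrix of a sample (rows = points)."""
    sq = np.sum(data ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * data @ data.T
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    n = len(data)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return H @ K @ H

def kdr_objective(X, Y, B, sigma=1.0, eps=1e-3):
    """Trace criterion Tr[G_Y (G_Z + n*eps*I)^{-1}] for projection B.

    A smaller value indicates that Z = X @ B retains more of the
    dependence of Y on X. The regularizer eps and bandwidth sigma
    are tuning choices for this sketch, not values from the paper.
    """
    n = len(X)
    Z = X @ B                              # projected covariates
    G_Z = centered_gram(Z, sigma)
    G_Y = centered_gram(Y.reshape(n, -1), sigma)
    return np.trace(G_Y @ np.linalg.inv(G_Z + n * eps * np.eye(n)))

# Toy check: Y depends on X only through its first coordinate, so
# projecting onto e1 should typically score lower (better) than onto e2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
Y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
good = kdr_objective(X, Y, np.array([[1.0], [0.0]]))
bad = kdr_objective(X, Y, np.array([[0.0], [1.0]]))
print(good, bad)
```

In the full method, this objective is minimized over projection matrices $B$ with orthonormal columns, which is what makes the estimator an $M$-estimator for the central subspace.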

Kenji Fukumizu, Francis R. Bach, Michael I. Jordan • 2009

Related benchmarks

Task                  | Dataset              | Metric                 | Result | Rank
----------------------|----------------------|------------------------|--------|-----
Classification        | Breast Cancer (test) | Misclassification Rate | 27.14  | 16
Classification        | German (test)        | Error Rate             | 23.47  | 16
Binary Classification | Heart (test)         | Error Rate             | 17.15  | 16
Binary Classification | UCI Waveform (test)  | Misclassification Rate | 12.54  | 16
Binary Classification | thyroid (test)       | Misclassification Rate | 27.13  | 16
Binary Classification | flaresolar (test)    | Error Rate             | 36.73  | 16
