Gradient-Based Kernel Dimension Reduction for Regression
This article proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive-definite kernels, or reproducing kernel Hilbert spaces (RKHSs). The purpose of the dimension reduction is to find directions in the space of explanatory variables that suffice to explain the response: this is called "sufficient dimension reduction." The proposed method is based on an estimator of the gradient of the regression function, considered for feature vectors mapped into RKHSs. It is proved that the method estimates directions that achieve sufficient dimension reduction. In comparison with existing methods, the proposed one is widely applicable without strong assumptions on the distributions or the types of variables, and requires only an eigendecomposition to estimate the projection matrix. The theoretical analysis shows that the estimator is consistent at a certain rate under some conditions. The experimental results demonstrate that the proposed method finds effective directions with efficient computation, even for high-dimensional explanatory variables.
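The procedure the abstract describes can be sketched numerically: estimate the gradients of the regression function via kernel ridge regression in an RKHS, average their outer products into a candidate matrix, and take its top eigenvectors as the projection directions. The sketch below is a minimal NumPy illustration of this gradient-plus-eigendecomposition idea, not the authors' reference implementation; the function name `gkdr`, the Gaussian-kernel bandwidths, and the regularization constant are all assumptions chosen for the example.

```python
import numpy as np

def gaussian_gram(X, sigma):
    """Gram matrix of the Gaussian kernel k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def gkdr(X, Y, d, sigma_x=2.0, sigma_y=1.0, eps=1e-3):
    """Sketch of gradient-based kernel dimension reduction.

    Returns a (p, d) matrix whose columns estimate the sufficient
    directions, obtained from the top-d eigenvectors of an averaged
    outer product of estimated regression-function gradients.
    """
    n, p = X.shape
    Kx = gaussian_gram(X, sigma_x)
    Ky = gaussian_gram(Y, sigma_y)
    # Regularized inverse from kernel ridge regression.
    G = np.linalg.inv(Kx + n * eps * np.eye(n))
    F = G @ Ky @ G
    M = np.zeros((p, p))
    for i in range(n):
        # Gradient of k(x, X_j) w.r.t. x, evaluated at x = X_i:
        # grad = k(X_i, X_j) * (X_j - X_i) / sigma_x^2, one row per j.
        Dk = Kx[i][:, None] * (X - X[i]) / sigma_x**2  # shape (n, p)
        M += Dk.T @ F @ Dk
    M /= n
    # Eigendecomposition: eigh returns ascending order, so reverse.
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, ::-1][:, :d]
```

As a quick usage check, if the response depends on only one direction of the explanatory variables, the leading eigenvector should align with that direction; because only an eigendecomposition of a p-by-p matrix is needed, the per-direction cost stays modest even when p is large.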
Year of publication: 2014
Authors: Fukumizu, Kenji; Leng, Chenlei
Published in: Journal of the American Statistical Association. - Taylor & Francis Journals, ISSN 0162-1459. - Vol. 109.2014, 505, p. 359-370
Publisher: Taylor & Francis Journals
Similar items by person
- New Trends in Statistical Information Processing / Fukumizu, Kenji (2003)
- Parameter estimation for von Mises–Fisher distributions / Tanabe, Akihiro (2007)
- Fukumizu, Kenji (2003)