Minimax adaptive dimension reduction for regression
In this paper, we address the problem of regression estimation in the context of a p-dimensional predictor when p is large. We propose a general model in which the regression function is a composite function. Our model is a nonlinear extension of the usual sufficient dimension reduction setting. The strategy followed for estimating the regression function is based on the estimation of a new parameter, called the reduced dimension. We adopt a minimax point of view and provide both lower and upper bounds on the optimal rates of convergence for the estimation of the regression function in the context of our model. We prove that our estimate adapts, in the minimax sense, to the unknown value d of the reduced dimension and therefore achieves fast rates of convergence when d≪p.
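As an illustrative sketch of the setting described in the abstract (the symbols m, g, f, B and the smoothness index s are assumed notation, not taken from the paper), the composite model and the role of the reduced dimension d can be written as:

% Composite regression model: the p-dimensional predictor X enters the
% regression function only through a lower-dimensional map f (illustrative).
\[
  m(x) \;=\; \mathbb{E}[Y \mid X = x] \;=\; g\bigl(f(x)\bigr),
  \qquad f : \mathbb{R}^p \to \mathbb{R}^d, \quad d \le p .
\]
% The usual (linear) sufficient dimension reduction setting is the special
% case in which f is linear, f(x) = B^\top x for a p x d matrix B:
\[
  m(x) \;=\; g\bigl(B^{\top} x\bigr), \qquad B \in \mathbb{R}^{p \times d}.
\]
% Heuristically, nonparametric rates in such a model depend on d rather
% than p, e.g. of order n^{-2s/(2s+d)} instead of n^{-2s/(2s+p)} for an
% s-smooth link g, which is why adapting to an unknown d << p yields
% fast rates of convergence.

The rate displayed in the comment is the classical nonparametric benchmark for an s-smooth function of d variables, given here only to illustrate why the effective dimension d, rather than p, governs the achievable rates; the paper's exact rates should be taken from the article itself.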
Year of publication: 2014
Authors: Paris, Quentin
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 128.2014, C, p. 186-202
Publisher: Elsevier
Subject: Regression estimation | Dimension reduction | Minimax rates of convergence | Empirical risk minimization | Metric entropy
Similar items by subject
- Biau, Gérard, (2012)
- A note on global suprema of band-limited spherical random functions / Marinucci, Domenico, (2015)
- On the Kernel Rule for Function Classification / Abraham, C., (2006)
Similar items by person
- Cadre, Benoît, (2012)