Dimension reduction and predictor selection in semiparametric models
Dimension reduction in semiparametric regressions includes construction of informative linear combinations and selection of contributing predictors. To reduce the predictor dimension in semiparametric regressions, we propose an ℓ₁-minimization of sliced inverse regression with the Dantzig selector, and establish a non-asymptotic error bound for the resulting estimator. We also generalize the regularization concept to sliced inverse regression with an adaptive Dantzig selector. This ensures that all contributing predictors are selected with high probability, and that the resulting estimator is asymptotically normal even when the predictor dimension diverges to infinity. Numerical studies confirm our theoretical observations and demonstrate that our proposals are superior to existing estimators in terms of both dimension reduction and predictor selection. Copyright 2013, Oxford University Press.
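The paper's proposal adds a Dantzig-selector-type ℓ₁ regularization on top of sliced inverse regression. As a rough illustration of the unpenalized building block only, the following is a minimal NumPy sketch of ordinary sliced inverse regression on synthetic data; the function name, slice count, and toy single-index model are illustrative assumptions and do not reproduce the authors' regularized estimator.

```python
import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_directions=1):
    """Estimate SIR directions: standardize X, slice y, form the
    between-slice covariance of slice means, and eigen-decompose it."""
    n, p = X.shape
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)

    # Symmetric inverse square root of the predictor covariance.
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mean) @ inv_sqrt

    # Slice the response into roughly equal-sized groups.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted between-slice covariance of the slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original predictor scale.
    w, v = np.linalg.eigh(M)
    directions = inv_sqrt @ v[:, ::-1][:, :n_directions]
    directions /= np.linalg.norm(directions, axis=0)
    return directions

# Toy usage (hypothetical data): y depends on X only through (x1 + x2)/sqrt(2).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + X[:, 1]) / np.sqrt(2) + 0.1 * rng.normal(size=500)
beta_hat = sliced_inverse_regression(X, y, n_slices=10, n_directions=1)
print(beta_hat.ravel())  # should load mainly on the first two coordinates
```

In the paper, this eigen-decomposition step is replaced by an ℓ₁-constrained (Dantzig-selector-type) estimation so that non-contributing predictors receive zero coefficients; that regularized step is not shown here.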
Year of publication: 2013
Authors: Yu, Zhou; Zhu, Liping; Peng, Heng; Zhu, Lixing
Published in: Biometrika. - Biometrika Trust, ISSN 0006-3444. - Vol. 100.2013, 3, p. 641-654
Publisher: Biometrika Trust
Similar items by person
- Inference on the primary parameter of interest with the aid of dimension reduction estimation (Li, Lexin, 2011)
- Sufficient dimension reduction through discretization-expectation estimation (Zhu, Liping, 2010)
- Robust inverse regression for dimension reduction (Dong, Yuexiao, 2015)