New Robust Variable Selection Methods for Linear Regression Models
type="main" xml:id="sjos12057-abs-0001"> <title type="main">ABSTRACT</title>Motivated by an entropy inequality, we propose for the first time a penalized profile likelihood method for simultaneously selecting significant variables and estimating unknown coefficients in multiple linear regression models in this article. The new method is robust to outliers or errors with heavy tails and works well even for error with infinite variance. Our proposed approach outperforms the adaptive lasso in both theory and practice. It is observed from the simulation studies that (i) the new approach possesses higher probability of correctly selecting the exact model than the least absolute deviation lasso and the adaptively penalized composite quantile regression approach and (ii) exact model selection via our proposed approach is robust regardless of the error distribution. An application to a real dataset is also provided.
Year of publication: 2014
Authors: Chen, Ziqi; Tang, Man-Lai; Gao, Wei; Shi, Ning-Zhong
Published in: Scandinavian Journal of Statistics. - Danish Society for Theoretical Statistics, ISSN 0303-6898. - Vol. 41.2014, 3, p. 725-741
Publisher: Danish Society for Theoretical Statistics; Finnish Statistical Society; Norwegian Statistical Association; Swedish Statistical Association
Similar items by person
- Efficient semiparametric estimation via Cholesky decomposition for longitudinal data. Chen, Ziqi (2011)
- Nonparametric estimation of the log odds ratio for sparse data by kernel smoothing. Chen, Ziqi (2011)
- Unified generalized iterative scaling and its applications. Gao, Wei (2010)