Model Selection via Bayesian Information Criterion for Quantile Regression Models
Bayesian information criterion (BIC) is known to identify the true model consistently as long as the predictor dimension is finite. Recently, its moderate modifications have been shown to be consistent in model selection even when the number of variables diverges. Those works have been done mostly in mean regression, but rarely in quantile regression. The best-known results about BIC for quantile regression are for linear models with a fixed number of variables. In this article, we investigate how BIC can be adapted to high-dimensional linear quantile regression and show that a modified BIC is consistent in model selection when the number of variables diverges as the sample size increases. We also discuss how it can be used for choosing the regularization parameters of penalized approaches that are designed to conduct variable selection and shrinkage estimation simultaneously. Moreover, we extend the results to structured nonparametric quantile models with a diverging number of covariates. We illustrate our theoretical results via some simulated examples and a real data analysis on human eye disease. Supplementary materials for this article are available online.
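The record does not reproduce the article's criterion; as a hedged illustration, BIC-type criteria for quantile regression commonly combine the log of the summed check loss with a log(n) penalty on model size, inflated by a slowly diverging factor to handle a growing number of variables. A minimal sketch, assuming this generic form (the name `Cn` and its default are assumptions, not the authors' published choice):

```python
import numpy as np

def check_loss(r, tau):
    """Quantile check loss: rho_tau(r) = r * (tau - 1{r < 0})."""
    r = np.asarray(r, dtype=float)
    return r * (tau - (r < 0).astype(float))

def quantile_bic(residuals, tau, df, Cn=1.0):
    """Generic BIC-type criterion for a fitted quantile regression model.

    residuals : in-sample residuals y_i - x_i' beta_hat
    tau       : quantile level in (0, 1)
    df        : number of active parameters in the candidate model
    Cn        : slowly diverging inflation factor (Cn = 1 recovers a
                classical fixed-dimension form); its rate here is an
                assumption, not the article's specification.
    """
    n = len(residuals)
    loss = np.sum(check_loss(residuals, tau))
    return np.log(loss) + df * np.log(n) / (2.0 * n) * Cn
```

In use, each candidate model (or each value of a regularization parameter in a penalized fit) would be estimated, its residuals passed to `quantile_bic`, and the minimizer selected.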
Year of publication: 2014
Authors: Lee, Eun Ryung; Noh, Hohsuk; Park, Byeong U.
Published in: Journal of the American Statistical Association. - Taylor & Francis Journals, ISSN 0162-1459. - Vol. 109.2014, 505, p. 216-229
Publisher: Taylor & Francis Journals
Similar items by person
- Sparse estimation in functional linear regression / Lee, Eun Ryung (2012)
- Data envelope fitting with constrained polynomial splines / Daouia, Abdelaati (2013)
- A cross-validatory choice of smoothing parameter in adaptive location estimation / Park, Byeong U. (1992)