Least squares approximation with a diverging number of parameters
Regularized regression with the l1 penalty is a popular approach to variable selection and coefficient estimation. For a unified treatment of l1-constrained model selection, Wang and Leng (2007) proposed the least squares approximation (LSA) method for fixed-dimensional problems. LSA uses a quadratic expansion of the loss function and takes full advantage of the fast Lasso algorithm of Efron et al. (2004). In this paper, we extend the fixed-dimension LSA to the setting with a diverging number of parameters. We show that LSA possesses the oracle properties under appropriate conditions when the number of variables grows with the sample size, and we propose a new tuning parameter selection method that achieves these oracle properties. Extensive simulation studies confirm the theoretical results.
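The LSA idea described above replaces the original loss with a quadratic form centered at an unpenalized consistent estimate, so the l1-penalized problem becomes a Lasso-type problem. The sketch below is a hypothetical illustration, not the authors' implementation: it minimizes (beta - beta_tilde)' Sigma_hat (beta - beta_tilde) + lam * ||beta||_1 by cyclic coordinate descent, where `beta_tilde` (e.g. the OLS estimate) and `Sigma_hat` (e.g. X'X/n) are assumed inputs.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lsa_lasso(beta_tilde, Sigma_hat, lam, n_iter=500, tol=1e-10):
    """Minimize (b - beta_tilde)' Sigma_hat (b - beta_tilde) + lam * ||b||_1
    by cyclic coordinate descent. Illustrative sketch of an LSA-type
    objective; beta_tilde is an unpenalized consistent estimate and
    Sigma_hat a positive-definite matrix (assumed inputs)."""
    p = len(beta_tilde)
    beta = beta_tilde.copy()
    for _ in range(n_iter):
        beta_old = beta.copy()
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = Sigma_hat[j] @ (beta - beta_tilde) \
                - Sigma_hat[j, j] * (beta[j] - beta_tilde[j])
            # Closed-form coordinate update via soft-thresholding.
            z = Sigma_hat[j, j] * beta_tilde[j] - r
            beta[j] = soft_threshold(z, lam / 2.0) / Sigma_hat[j, j]
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return beta
```

With `lam = 0` the minimizer is `beta_tilde` itself; increasing `lam` shrinks small coefficients exactly to zero, which is the selection behavior the oracle-property analysis concerns.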
Year of publication: 2010
Authors: Leng, Chenlei; Li, Bo
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 80.2010, 3-4, p. 254-261
Publisher: Elsevier
Similar items by person
- Shrinkage tuning parameter selection with a diverging number of parameters. Wang, Hansheng (2009)
- Forward adaptive banding for estimating large covariance matrices. Leng, Chenlei (2011)
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters. Wang, Hansheng (2008)