A generalized Dantzig selector with shrinkage tuning
The Dantzig selector performs variable selection and model fitting in linear regression. It uses an L<sub>1</sub> penalty to shrink the regression coefficients towards zero, in a similar fashion to the lasso. While both the lasso and Dantzig selector potentially do a good job of selecting the correct variables, they tend to overshrink the final coefficients. This results in an unfortunate trade-off. One can either select a high shrinkage tuning parameter that produces an accurate model but poor coefficient estimates or a low shrinkage parameter that produces more accurate coefficients but includes many irrelevant variables. We extend the Dantzig selector to fit generalized linear models while eliminating overshrinkage of the coefficient estimates, and develop a computationally efficient algorithm, similar in nature to least angle regression, to compute the entire path of coefficient estimates. A simulation study illustrates the advantages of our approach relative to others. We apply the methodology to two datasets. Copyright 2009, Oxford University Press.
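As background (not taken from the paper itself), the standard Dantzig selector of Candès and Tao for linear regression can be written as the constrained L<sub>1</sub> problem below, where λ is the shrinkage tuning parameter whose choice drives the trade-off described in the abstract:

$$
\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^p} \|\beta\|_1
\quad \text{subject to} \quad
\big\| X^{\top}(y - X\beta) \big\|_{\infty} \le \lambda .
$$

A larger λ yields a sparser but more heavily shrunken coefficient vector, while a smaller λ reduces shrinkage at the cost of retaining more irrelevant variables; the paper's generalization is aimed at breaking this trade-off.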
Year of publication: 2009
Authors: James, Gareth M.; Radchenko, Peter
Published in: Biometrika, Vol. 96, No. 2 (2009), pp. 323-337. ISSN 0006-3444
Publisher: Biometrika Trust
Similar items by person
- DASSO: connections between the Dantzig selector and lasso. James, Gareth M. (2009)
- Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions. Radchenko, Peter (2010)
- Variable Inclusion and Shrinkage Algorithms. Radchenko, Peter (2008)