Cross-validation-based point estimates of prediction accuracy are frequently reported in microarray class prediction problems. However, these point estimates can be highly variable, particularly for small sample sizes, and it would be useful to provide confidence intervals of prediction...
Persistent link: https://www.econbiz.de/10005246518
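The variability that this abstract describes is easy to reproduce: on one fixed small dataset with one fixed classifier, re-running k-fold cross-validation with different random fold assignments yields noticeably different accuracy estimates. The sketch below is illustrative only (a toy nearest-centroid classifier on simulated data, not the paper's setup); all function names are hypothetical.

```python
import random

def make_data(n=30, seed=None):
    """Two overlapping classes with a one-dimensional feature."""
    rng = random.Random(seed)
    X = [rng.gauss(0.0, 1.0) + (0.8 if i % 2 else 0.0) for i in range(n)]
    y = [i % 2 for i in range(n)]
    return X, y

def cv_accuracy(X, y, k=5, seed=0):
    """k-fold cross-validated accuracy of a nearest-centroid classifier;
    only the random fold assignment depends on `seed`."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    correct = 0
    for fold in folds:
        train = [i for i in idx if i not in fold]
        n0 = max(1, sum(1 for i in train if y[i] == 0))
        n1 = max(1, sum(1 for i in train if y[i] == 1))
        m0 = sum(X[i] for i in train if y[i] == 0) / n0
        m1 = sum(X[i] for i in train if y[i] == 1) / n1
        for i in fold:
            pred = 0 if abs(X[i] - m0) < abs(X[i] - m1) else 1
            correct += pred == y[i]
    return correct / len(X)

X, y = make_data(seed=42)
# same data, same estimator: only the fold assignment changes
estimates = [cv_accuracy(X, y, seed=s) for s in range(20)]
print(min(estimates), max(estimates))
```

The spread between the minimum and maximum estimate is exactly the point-estimate variability that motivates reporting a confidence interval instead.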
This paper proposes two consistent model selection procedures for factor-augmented regressions in finite samples. We first demonstrate that the usual cross-validation is inconsistent, but that a generalization, leave-d-out cross-validation, selects the smallest basis for the space spanned by the...
Persistent link: https://www.econbiz.de/10011939442
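Leave-d-out cross-validation, the generalization the abstract refers to, refits the model on every subset that omits d observations and averages the held-out error. A minimal sketch for a simple least-squares line on toy data (illustrative only; not the paper's factor-augmented setting):

```python
from itertools import combinations

def leave_d_out_error(X, y, d=2):
    """Average squared prediction error of a least-squares line,
    refit on every subset that omits d of the n observations."""
    n = len(X)
    total, count = 0.0, 0
    for held_out in combinations(range(n), d):
        train = [i for i in range(n) if i not in held_out]
        xbar = sum(X[i] for i in train) / len(train)
        ybar = sum(y[i] for i in train) / len(train)
        sxx = sum((X[i] - xbar) ** 2 for i in train)
        slope = sum((X[i] - xbar) * (y[i] - ybar) for i in train) / sxx
        intercept = ybar - slope * xbar
        for i in held_out:
            total += (y[i] - (intercept + slope * X[i])) ** 2
            count += 1
    return total / count

X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.1, 0.9, 2.2, 2.8, 4.1, 5.0]  # roughly y = x
err = leave_d_out_error(X, y, d=2)
print(err)
```

With d = 1 this reduces to ordinary leave-one-out cross-validation; the paper's consistency result is precisely about choosing d larger than that.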
We develop two new methods for selecting the penalty parameter for the ℓ1-penalized high-dimensional M-estimator, which we refer to as the analytic and bootstrap-after-cross-validation methods. For both methods, we derive nonasymptotic error bounds for the corresponding ℓ1-penalized M-estimator...
Persistent link: https://www.econbiz.de/10013253002
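The baseline these papers improve on is selecting the ℓ1 penalty λ by plain cross-validation over a grid. A generic sketch of that baseline, using the closed-form one-predictor lasso (soft-thresholding) so it stays self-contained; this is not the authors' analytic or bootstrap-after-cross-validation method, and all names here are illustrative:

```python
def soft_threshold(z, t):
    """Shrink z toward zero by t, clipping at zero."""
    return (z - t) if z > t else (z + t) if z < -t else 0.0

def lasso_1d(X, y, lam):
    """Closed-form lasso with a single predictor:
    argmin_b  sum_i (y_i - b*x_i)^2 + lam*|b|."""
    s = sum(x * v for x, v in zip(X, y))
    q = sum(x * x for x in X)
    return soft_threshold(s, lam / 2.0) / q

def cv_pick_lambda(X, y, grid, k=3):
    """Pick the penalty from `grid` minimizing k-fold held-out error."""
    n = len(X)
    best_lam, best_err = None, float("inf")
    for lam in grid:
        err = 0.0
        for f in range(k):
            test = list(range(f, n, k))
            train = [i for i in range(n) if i not in test]
            b = lasso_1d([X[i] for i in train], [y[i] for i in train], lam)
            err += sum((y[i] - b * X[i]) ** 2 for i in test)
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam

X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [1.1, 2.0, 2.9, 4.2, 4.8, 6.1]  # roughly y = x
chosen = cv_pick_lambda(X, y, grid=[0.0, 0.5, 2.0, 10.0])
print(chosen)
```

The analytic and bootstrap-after-cross-validation methods replace this grid search with choices of λ that admit the nonasymptotic error bounds the abstract mentions.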
We develop two new methods for selecting the penalty parameter for the ℓ1-penalized high-dimensional M-estimator, which we refer to as the analytic and bootstrap-after-cross-validation methods. For both methods, we derive nonasymptotic error bounds for the corresponding ℓ1-penalized M-estimator...
Persistent link: https://www.econbiz.de/10012621158
The problem of prediction is revisited with a view towards going beyond the typical nonparametric setting and reaching a fully model-free environment for predictive inference, i.e., point predictors and predictive intervals. A basic principle of model-free prediction is laid out based on the...
Persistent link: https://www.econbiz.de/10010676431
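One simple instance of a model-free predictive interval, assuming the principle reduces to using empirical quantiles of out-of-sample predictive residuals (a hypothetical illustration with the sample-mean predictor, not the paper's general construction):

```python
def predictive_interval(y, alpha=0.2):
    """Point prediction and (1 - alpha) predictive interval for the next
    observation, built from leave-one-out predictive residuals of the
    sample-mean predictor (an illustrative model-free-style sketch)."""
    n = len(y)
    point = sum(y) / n
    # leave-one-out residuals: each value minus the mean of the others
    resid = sorted(v - (sum(y) - v) / (n - 1) for v in y)
    lo = resid[int((alpha / 2) * n)]
    hi = resid[int((1 - alpha / 2) * n) - 1]
    return point, point + lo, point + hi

y = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.1, 2.0, 2.2]
point, lower, upper = predictive_interval(y)
print(point, lower, upper)
```

No distributional model is fitted: the interval's width comes entirely from the empirical spread of the out-of-sample residuals.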
Persistent link: https://www.econbiz.de/10012627495
This paper proposes two consistent model selection procedures for factor-augmented regressions in finite samples. We first demonstrate that the usual cross-validation is inconsistent, but that a generalization, leave-d-out cross-validation, selects the smallest basis for the space spanned by the...
Persistent link: https://www.econbiz.de/10011756075
We develop two new methods for selecting the penalty parameter for the ℓ1-penalized high-dimensional M-estimator, which we refer to as the analytic and bootstrap-after-cross-validation methods. For both methods, we derive nonasymptotic error bounds for the corresponding ℓ1-penalized M-estimator...
Persistent link: https://www.econbiz.de/10012501445
We develop two new methods for selecting the penalty parameter for the ℓ1-penalized high-dimensional M-estimator, which we refer to as the analytic and bootstrap-after-cross-validation methods. For both methods, we derive nonasymptotic error bounds for the corresponding ℓ1-penalized M-estimator...
Persistent link: https://www.econbiz.de/10012800795