"Selection of Variables in Multivariate Regression Models for Large Dimensions"
The Akaike information criterion (AIC) and Mallows' Cp statistic have been proposed for selecting a smaller number of regressor variables in multivariate regression models with a fully unknown covariance matrix. These criteria are, however, based on the implicit assumption that the sample size is substantially larger than the dimension of the covariance matrix: obtaining a stable estimator of the covariance matrix requires that its dimension be much smaller than the sample size. When the dimension is close to the sample size, a ridge-type estimator of the covariance matrix becomes necessary. In this paper, we use a ridge-type estimator of the covariance matrix and derive a modified AIC and a modified Cp statistic under an asymptotic theory in which both the sample size and the dimension go to infinity. Numerical results show that these modified procedures perform very well, in the sense of selecting the true model, in large-dimensional cases.
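The motivating issue above can be illustrated with a minimal sketch of a ridge-type covariance estimator, S + lambda * I, which stays positive definite even when the dimension is close to the sample size. The default choice of the ridge constant lambda below is a hypothetical illustration, not the prescription derived in the paper:

```python
import numpy as np

def ridge_covariance(X, lam=None):
    """Ridge-type covariance estimate S + lam * I for an (n, p) data matrix X.

    The default lam = tr(S) / n is an illustrative shrinkage choice,
    not the tuning proposed by Srivastava and Kubokawa.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n  # sample covariance; nearly singular when p is close to n
    if lam is None:
        lam = np.trace(S) / n  # hypothetical default ridge constant
    return S + lam * np.eye(p)

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 25))  # dimension p = 25 close to sample size n = 30
Sig = ridge_covariance(X)
# the ridge term bounds the smallest eigenvalue away from zero
print(np.linalg.eigvalsh(Sig).min() > 0)
```

Because every eigenvalue of S + lam * I is at least lam > 0, the estimate is invertible, which is what a criterion like AIC or Cp needs when it evaluates a residual covariance in large dimensions.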
Year of publication: 2010-01
Authors: Srivastava, Muni S.; Kubokawa, Tatsuya
Institutions: Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics
Availability: freely available
Similar items by person
- Ikeda, Yuki, (2015)
- "Tests for Covariance Matrices in High Dimension with Less Sample Size", Srivastava, Muni S., (2014)
- "Tests for Multivariate Analysis of Variance in High Dimension Under Non-Normality", Srivastava, Muni S., (2011)
- More ...