Conditional and unconditional methods for selecting variables in linear mixed models
For the problem of selecting explanatory variables in the linear mixed model, we address the derivation of the (unconditional or marginal) Akaike information criterion (AIC) and the conditional AIC (cAIC). The covariance matrices of the random effects and the error terms involve unknown parameters such as variance components, and the selection procedures proposed in the literature are limited to cases where these parameters are known or only partly unknown. In this paper, AIC and cAIC are extended to the situation where the parameters are completely unknown and are estimated by general consistent estimators, including the maximum likelihood (ML), the restricted maximum likelihood (REML), and other unbiased estimators. Related to AIC and cAIC, we derive marginal and conditional prediction error criteria that select models minimizing the prediction error under quadratic loss functions. Finally, the numerical performance of the proposed selection procedures is investigated through simulation studies.
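To illustrate the kind of selection problem the abstract describes, the sketch below compares candidate fixed-effect designs in a nested error regression (random-intercept) model by marginal AIC, with the variance components treated as unknown and estimated by ML. This is a minimal, self-contained illustration, not the paper's procedure: the grid search over the variance ratio psi = sigma_u^2 / sigma_e^2, the simulated data, and the helper `marginal_aic` are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a nested error regression (random-intercept) model:
# y_ij = 1 + 0.8 * x_ij + u_i + e_ij,  u_i ~ N(0, 0.5),  e_ij ~ N(0, 1).
m, n = 30, 5                       # number of groups, observations per group
N = m * n
x = rng.normal(size=N)
groups = np.repeat(np.arange(m), n)
u = rng.normal(scale=np.sqrt(0.5), size=m)
y = 1.0 + 0.8 * x + u[groups] + rng.normal(size=N)

def marginal_aic(X):
    """Marginal AIC = -2 * max marginal log-likelihood + 2 * (p + 2).

    The marginal likelihood N(X beta, sigma_e^2 (I + psi Z Z')) is
    profiled over beta (GLS) and sigma_e^2 (closed form); the variance
    ratio psi = sigma_u^2 / sigma_e^2 is maximized by grid search.
    """
    p = X.shape[1]
    best = -np.inf
    for psi in np.linspace(0.0, 5.0, 501):
        c = psi / (1.0 + n * psi)
        # Blockwise whitening: V^{-1} = (I_n - c * J_n) / sigma_e^2 per group.
        W = np.eye(N)
        for g in range(m):
            idx = slice(g * n, (g + 1) * n)
            W[idx, idx] -= c * np.ones((n, n))
        XtW = X.T @ W
        beta = np.linalg.solve(XtW @ X, XtW @ y)   # GLS fixed effects
        r = y - X @ beta
        sigma2 = (r @ W @ r) / N                   # profiled error variance
        llf = -0.5 * (N * np.log(2 * np.pi * sigma2)
                      + m * np.log(1.0 + n * psi) + N)
        best = max(best, llf)
    # p regression coefficients + two variance components are counted.
    return -2.0 * best + 2.0 * (p + 2)

X_null = np.ones((N, 1))                     # candidate 1: intercept only
X_full = np.column_stack([np.ones(N), x])    # candidate 2: intercept + x
aic_null, aic_full = marginal_aic(X_null), marginal_aic(X_full)
print(aic_null, aic_full)  # the model containing x should attain lower AIC
```

Since the data are generated with a nonzero slope on x, the full model attains the lower marginal AIC; a conditional criterion such as cAIC would instead use an effective degrees of freedom for the predicted random effects.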
Year of publication: 2011
Authors: Kubokawa, Tatsuya
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 102.2011, 3, p. 641-660
Publisher: Elsevier
Keywords: Akaike information criterion; Best linear unbiased predictor; Fay-Herriot model; Linear mixed model; Maximum likelihood estimator; Nested error regression model; Prediction error; Restricted maximum likelihood estimator; Small area estimation
Similar items by person
- Closer estimators of a common mean in the sense of Pitman / Kubokawa, Tatsuya, (1989)
- Kubokawa, Tatsuya, (1997)
- "Estimation of Several Wishart Mean Matrices" / Tsai, Ming-Tien, (2005)