The Kullback information criterion for mixture regression models
We consider the problem of jointly selecting the number of components and the variables in finite mixture regression models. Classical model selection criteria, such as AIC and BIC, may not be satisfactory in this setting, especially when the sample size is small or the number of variables is large: they tend to fit too many components and retain too many variables. An alternative mixture regression criterion, called MRC, which simultaneously determines the number of components and variables in mixture regression models, was proposed by Naik et al. (2007). In the same setting, we propose a new information criterion for the simultaneous determination of the number of components and predictors. The new criterion is based on the Kullback symmetric divergence instead of the Kullback directed divergence underlying MRC. A small simulation study shows that the new criterion performs better than MRC.
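The distinction between the directed and symmetric Kullback divergences mentioned in the abstract can be illustrated with univariate normal densities, for which both have standard closed forms. This sketch is for illustration only and is not taken from the paper:

```python
import numpy as np

def kl_normal(mu0, sigma0, mu1, sigma1):
    """Directed Kullback divergence KL(N0 || N1) between two
    univariate normal distributions (standard closed form)."""
    return (np.log(sigma1 / sigma0)
            + (sigma0**2 + (mu0 - mu1)**2) / (2 * sigma1**2)
            - 0.5)

def kl_symmetric(mu0, sigma0, mu1, sigma1):
    """Kullback symmetric (J-) divergence: the sum of the two
    directed divergences, hence invariant to swapping arguments."""
    return (kl_normal(mu0, sigma0, mu1, sigma1)
            + kl_normal(mu1, sigma1, mu0, sigma0))

# The directed divergence is asymmetric; the symmetric one is not.
d01 = kl_normal(0.0, 1.0, 1.0, 2.0)   # approx. 0.443
d10 = kl_normal(1.0, 2.0, 0.0, 1.0)   # approx. 1.307
print(d01, d10)
print(kl_symmetric(0.0, 1.0, 1.0, 2.0))  # d01 + d10, approx. 1.75
```

A criterion built on the symmetric divergence penalizes discrepancy in both directions between the candidate and the true model, which is the motivation the abstract gives for replacing the directed divergence used by MRC.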
Year of publication: 2010
Authors: Hafidi, Bezza; Mkhadri, Abdallah
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 80.2010, 9-10, p. 807-815
Publisher: Elsevier
Similar items by person
-
A small-sample criterion based on Kullback's symmetric divergence for vector autoregressive modeling
Hafidi, Bezza, (2006)
-
A group VISA algorithm for variable selection
Mkhadri, Abdallah, (2015)
-
An extended variable inclusion and shrinkage algorithm for correlated variables
Mkhadri, Abdallah, (2013)