Model selection by sequentially normalized least squares
Model selection by means of the predictive least squares (PLS) principle has been thoroughly studied in the context of regression model selection and autoregressive (AR) model order estimation. We introduce a new criterion based on sequentially minimized squared deviations, which are smaller than both the usual least squares residuals and the squared prediction errors used in PLS. We also prove that our criterion has a probabilistic interpretation as a model that is asymptotically optimal within the given class of distributions, reaching the lower bound on the logarithmic prediction errors given by the so-called stochastic complexity and approximated by BIC. This holds when the regressor (design) matrix is non-random or determined by the observed data, as in AR models. The advantages of the criterion include that it can be evaluated efficiently and exactly, without asymptotic approximations, and, importantly, that there are no adjustable hyper-parameters, which makes it applicable to both small and large amounts of data.
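A minimal sketch of the distinction drawn in the abstract is given below. It uses assumed simulated data and hypothetical variable names, and it is not the paper's exact SNLS criterion, which adds normalization terms beyond the raw deviations. The ordinary least squares residuals come from a single fit on all observations, the PLS prediction error at time t uses the estimate based on the first t-1 observations, and the sequentially minimized deviation at time t uses the estimate that also includes observation t.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated regression data: n observations, m regressors.
n, m = 200, 3
X = rng.standard_normal((n, m))
beta_true = np.array([1.0, -0.5, 0.25])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def ls_estimate(X_seg, y_seg):
    """Ordinary least squares estimate on a data segment."""
    return np.linalg.lstsq(X_seg, y_seg, rcond=None)[0]

# Usual least squares residuals: one fit on all n observations.
beta_full = ls_estimate(X, y)
ls_residuals = y - X @ beta_full

# Sequential quantities for t = m+1, ..., n.
pls_errors = []  # PLS prediction errors: estimate from the first t-1 observations
snls_devs = []   # sequentially minimized deviations: estimate from the first t observations
for t in range(m + 1, n + 1):
    beta_prev = ls_estimate(X[: t - 1], y[: t - 1])
    beta_curr = ls_estimate(X[:t], y[:t])
    pls_errors.append(y[t - 1] - X[t - 1] @ beta_prev)
    snls_devs.append(y[t - 1] - X[t - 1] @ beta_curr)

# The full-fit sum runs over all n observations; the sequential sums over t = m+1, ..., n.
print("residual sum of squares, full LS fit:", np.sum(ls_residuals ** 2))
print("sum of squared PLS prediction errors:", np.sum(np.square(pls_errors)))
print("sum of squared sequential deviations:", np.sum(np.square(snls_devs)))
```

With this setup the sum of squared sequential deviations comes out smallest, matching the ordering stated in the abstract: each sequential deviation is a residual of a fit that already includes the observation being evaluated, whereas the PLS error predicts that observation from earlier data only.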
Year of publication: 2010
Authors: Rissanen, Jorma; Roos, Teemu; Myllymäki, Petri
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 101 (2010), No. 4, p. 839-849
Publisher: Elsevier
Keywords: Linear regression; Time series; Model selection; Order estimation; Predictive least squares
Similar items by person
- General - Coding and Compression: A Happy Union of Theory and Practice. Rissanen, Jorma (2000)
- Stochastic complexity and the MDL principle. Rissanen, Jorma (1987)
- Information, complexity and the MDL principle. Rissanen, Jorma (2001)
- More ...