Statistical inference of minimum BD estimators and classifiers for varying-dimensional models
Stochastic modeling for large-scale datasets usually involves a varying-dimensional model space. This paper investigates the asymptotic properties, when the number of parameters grows with the available sample size, of minimum-BD estimators and classifiers under a broad and important class of Bregman divergence (BD), which encompasses nearly all of the commonly used loss functions in the regression analysis, classification procedures, and machine learning literature. Unlike maximum likelihood estimators, which require the joint likelihood of observations, minimum-BD estimators are useful for a range of models where the joint likelihood is unavailable or incomplete. Statistical inference tools developed for the class of large-dimensional minimum-BD estimators and related classifiers are evaluated via simulation studies and illustrated by analysis of a real dataset.
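As background for readers unfamiliar with BD (a standard definition; the paper itself may use an equivalent generating-function convention that differs in sign or parametrization), the Bregman divergence generated by a differentiable convex function \phi is

D_\phi(y, \mu) = \phi(y) - \phi(\mu) - (y - \mu)\,\phi'(\mu).

Taking \phi(t) = t^2 recovers the squared-error loss (y - \mu)^2, while \phi(t) = t \log t gives the Kullback-Leibler (deviance) loss y \log(y/\mu) - y + \mu, which indicates how a single BD class unifies the common regression and classification losses mentioned in the abstract.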
Year of publication: 2010
Authors: Zhang, Chunming
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 101.2010, 7, p. 1574-1593
Publisher: Elsevier
Keywords: A diverging number of parameters; Exponential family; Hemodynamic response function; Loss function; Optimal Bayes rule
Similar items by person
- Calibrating the degrees of freedom for automatic data smoothing and effective curve checking
  Zhang, Chunming, (2003)
- A power comparison between nonparametric regression tests
  Zhang, Chunming, (2003)
- A reexamination of diffusion estimators with applications to financial model validation
  Fan, Jianqing, (2003)