Large deviations and estimation in infinite-dimensional models
Consider a random sample from a statistical model with an unknown, and possibly infinite-dimensional, parameter - e.g., a nonparametric or semiparametric model - and a real-valued functional T of this parameter which is to be estimated. The objective is to develop bounds on the (negative) exponential rate at which consistent estimates converge in probability to T, or, equivalently, lower bounds for the asymptotic effective standard deviation of such estimates - that is, to extend work of R.R. Bahadur from parametric models to more general (semiparametric and nonparametric) models. The approach is to define a finite-dimensional submodel, determine Bahadur's bounds for a finite-dimensional model, and then 'sup' or 'inf' the bounds with respect to ways of defining the submodels; this can be construed as a 'directional approach', the submodels being in a specified 'direction' from a specific model. Extension is made to the estimation of vector-valued and infinite-dimensional functionals T, by expressing consistency in terms of a distance, or, alternatively, by treating classes of real functionals of T. Several examples are presented.
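To make the abstract's notions concrete, here is a minimal sketch of Bahadur's bound in the simplest one-parameter case, an assumed N(θ, 1) location model (an illustration of the general idea, not a computation from the paper): the exponential rate for consistent estimation of θ is bounded by the infimum of the Kullback-Leibler information over alternatives at distance ε, and for small ε this rate behaves like ε²/(2σ_eff²), defining the effective standard deviation.

```python
import math

def kl_normal(theta_alt, theta, sigma2=1.0):
    # Kullback-Leibler divergence K(theta_alt, theta) between
    # N(theta_alt, sigma2) and N(theta, sigma2): (theta_alt - theta)^2 / (2 sigma2)
    return (theta_alt - theta) ** 2 / (2.0 * sigma2)

def bahadur_rate(theta, eps, kl=kl_normal):
    # Bahadur-type bound: no consistent estimate T_n of theta can have
    # P_theta(|T_n - theta| >= eps) decay faster than exp(-n * r) with
    # r = inf { K(theta', theta) : |theta' - theta| >= eps }.
    # In a one-dimensional location model the infimum is attained on the
    # boundary theta' = theta +/- eps, so it suffices to check two points.
    return min(kl(theta + eps, theta), kl(theta - eps, theta))

def effective_sd(theta, eps, kl=kl_normal):
    # Small-eps "effective standard deviation": r(eps) ~ eps^2 / (2 sigma_eff^2),
    # hence sigma_eff = eps / sqrt(2 r(eps)).
    r = bahadur_rate(theta, eps, kl)
    return eps / math.sqrt(2.0 * r)

# For N(theta, 1) the rate is eps^2 / 2 and the effective s.d. is 1,
# matching the Fisher-information benchmark for the normal mean.
print(bahadur_rate(0.0, 0.1))  # -> 0.005
print(effective_sd(0.0, 0.1))  # -> 1.0
```

In the paper's infinite-dimensional setting, this computation would be carried out along each finite-dimensional submodel and then optimized over the choice of submodel ("direction"); supplying a different `kl` function here plays the role of choosing a different direction.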
Year of publication: 1988
Authors: Hall, W. J.; Huang, Wei-Min
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 6.1988, 6, p. 433-439
Publisher: Elsevier
Keywords: consistency rates; effective standard deviation; Kullback-Leibler information; nonparametric estimation; semiparametric estimation
Similar items by person
- Yen, David C., (2016)
- Asymptotic theorems for estimating the distribution function under random truncation / Huang, Wei-Min, (1988)
- Optimum bandwidths and kernels for estimating certain discontinuous densities / Ghosh, B., (1992)
- More ...