A note on the interpretation of the Bahadur bound and the rate of convergence of the maximum likelihood estimator
Some interpretation of the Bahadur bound and the rate of convergence of the maximum likelihood estimator is provided using a theorem of Fu (1982) and the geometrical methods discussed in Kass (1984). We focus on replicated nonlinear regression and show that, in the sense of the rate of convergence of the least-squares estimator in a small neighborhood of the true model, the most important characteristic distinguishing one family of models from another is its statistical curvature (which is a multiple of the 'intrinsic curvature' of Bates and Watts, 1980).
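The curvature quantity the abstract refers to can be illustrated numerically. The sketch below is not the authors' method; it is a minimal finite-difference estimate of the (unscaled) intrinsic curvature of a one-parameter expectation surface, using the standard geometric definition (the normal component of the acceleration divided by the squared speed) and omitting the Bates-Watts standard-radius scaling. The exponential-decay model, the design points, and the function names are assumptions for illustration only.

```python
# Minimal numerical sketch (illustrative, not from Fu and Kass, 1984):
# intrinsic curvature of a one-parameter nonlinear regression surface
# eta(theta) in R^n, estimated by central finite differences.
import numpy as np

def intrinsic_curvature(eta, theta0, h=1e-5):
    """Unscaled intrinsic curvature of the curve theta -> eta(theta) at theta0."""
    d1 = (eta(theta0 + h) - eta(theta0 - h)) / (2 * h)                 # velocity (tangent)
    d2 = (eta(theta0 + h) - 2 * eta(theta0) + eta(theta0 - h)) / h**2  # acceleration
    speed2 = d1 @ d1
    # Component of the acceleration orthogonal to the tangent direction.
    normal_part = d2 - (d2 @ d1) / speed2 * d1
    return np.linalg.norm(normal_part) / speed2

x = np.linspace(0.5, 3.0, 10)                   # hypothetical design points
eta = lambda theta: np.exp(-theta * x)          # hypothetical expectation surface
print(intrinsic_curvature(eta, theta0=1.0))
```

A straight (linear) expectation surface gives curvature zero under this definition, so larger values indicate a model family that departs more strongly from local linearity, which is the role curvature plays in the rate-of-convergence comparison described above.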
Year of publication: 1984
Authors: Fu, James C.; Kass, Robert E.
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 2.1984, 5, p. 269-273
Publisher: Elsevier
Similar items by person
- Shrinkage Estimators for Covariance Matrices. Daniels, Michael J. (2001)
- Nonconjugate Bayesian Analysis of Variance Component Models. Wolfinger, Russell D. (2000)
- Theory and Methods - Reference Bayesian Methods for Generalized Linear Mixed Models. Natarajan, Ranjini (2000)