Weak consistency of least-squares estimators in linear models
Let Y_n, n ≥ 1, be a sequence of integrable random variables with EY_n = x_{n1}β_1 + x_{n2}β_2 + ... + x_{np}β_p, where the x_{nj} are known constants and β^T = (β_1, β_2, ..., β_p) is unknown. Let b_n be the least-squares estimator of β based on Y_1, Y_2, ..., Y_n. Weak consistency of b_n, n ≥ 1, has been considered in the literature under the assumption that each Y_n is square integrable. In this paper, we study weak consistency of b_n, n ≥ 1, and the associated rates of convergence under the minimal assumption that each Y_n is integrable.
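The abstract describes the standard linear-model setup, so a minimal sketch of the estimator it refers to may help. The code below is not from the paper: it assumes a hypothetical design matrix X whose i-th row is (x_{i1}, ..., x_{ip}) and a response vector Y, and computes b_n, the ordinary least-squares estimate of β from the first n observations. The toy errors use a Student t distribution with between 1 and 2 degrees of freedom, which has a finite mean but infinite variance and is the kind of case the paper's weaker integrability assumption is meant to cover.

```python
import numpy as np

# Minimal sketch (not from the paper): least-squares estimator b_n of beta
# based on Y_1, ..., Y_n.  X is a hypothetical design matrix whose i-th row
# is (x_{i1}, ..., x_{ip}); Y holds the observed responses.
def least_squares_estimator(X, Y, n):
    Xn = np.asarray(X, dtype=float)[:n]   # first n rows of the design
    Yn = np.asarray(Y, dtype=float)[:n]   # first n observations
    # b_n minimises ||Yn - Xn b||^2; lstsq also handles rank-deficient designs.
    bn, _, _, _ = np.linalg.lstsq(Xn, Yn, rcond=None)
    return bn

# Toy usage: t errors with 1 < df < 2 are integrable but not square integrable.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
beta = np.array([1.0, -2.0, 0.5])
Y = X @ beta + rng.standard_t(df=1.5, size=500)
print(least_squares_estimator(X, Y, n=500))
```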
Year of publication: 1982
Authors: Kaffes, D.; Bhaskara Rao, M.
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 12.1982, 2, p. 186-198
Publisher: Elsevier
Keywords: Linear models; convergence in probability; weak consistency; estimable linear functions; rates of convergence
Similar items by person
- Some results on strong limit theorems for (LB)-space-valued random variables. Wang, Xaingchen, (1995)
- Complete convergence of moving average processes. Li, Deli, (1992)
- The Law of the Iterated Logarithm and Central Limit Theorem for L-Statistics. Li, Deli, (2001)