Asymptotic estimate of probability of misclassification for discriminant rules based on density estimates
Let X1,...,Xl and Y1,...,Yn be independent random samples from the distribution functions (d.f.) F and G respectively. Assume that F' = f and G' = g. The discriminant rule that classifies an independently sampled observation Z to F if f̂(Z) > ĝ(Z), and to G otherwise, where f̂ and ĝ are estimates of f and g based on a common kernel function and the training X- and Y-samples, is optimal in some sense. Let Pf denote the probability measure under the assumption that Z ~ F, and set P0 = Pf(f(Z) > g(Z)) and PN = Pf(f̂(Z) > ĝ(Z)). In this article we derive the rate at which PN → P0 as N = l + n → ∞, for the situation where l = n, F(x) = M(x − θ2) and G(x) = M(x − θ1) for some symmetric d.f. M and parameters θ1, θ2. We examine a few special cases of M and establish that the rate of convergence of PN to P0 depends critically on the tail behavior of m = M'.
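The discriminant rule described in the abstract can be sketched as follows. This is a minimal illustration, not the article's analysis: the Gaussian kernel, the bandwidth h, and the choice M = standard normal with shifts θ2 = 0 and θ1 = 2 are all assumptions made here for concreteness (the article treats a general common kernel and general symmetric M).

```python
import numpy as np

def kde(z, sample, h):
    """Kernel density estimate at z with a Gaussian kernel and
    bandwidth h (both are assumptions; the article leaves the
    common kernel unspecified)."""
    u = (z - sample) / h
    return np.mean(np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)) / h

def classify(z, x_sample, y_sample, h):
    """The rule from the abstract: assign z to F if fhat(z) > ghat(z),
    and to G otherwise."""
    return 'F' if kde(z, x_sample, h) > kde(z, y_sample, h) else 'G'

# Illustrative setup: equal training sizes l = n (the case analysed
# in the article), F = M(. - theta2), G = M(. - theta1).
rng = np.random.default_rng(0)
l = n = 500
x = rng.normal(0.0, 1.0, l)   # X-sample from F (theta2 = 0)
y = rng.normal(2.0, 1.0, n)   # Y-sample from G (theta1 = 2)
h = 0.5                        # bandwidth: an arbitrary choice here

# Monte Carlo estimate of P_N = P_f(fhat(Z) > ghat(Z)) under Z ~ F;
# 1 - P_N is the corresponding probability of misclassification.
z = rng.normal(0.0, 1.0, 2000)
p_n = np.mean([classify(zi, x, y, h) == 'F' for zi in z])
```

For this normal-shift setup the limiting value P0 is Φ(1) ≈ 0.84, since the optimal boundary lies halfway between the two means, and p_n should fall near that value for moderate l = n.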
Year of publication: 1989
Authors: Chanda, Kamal C.; Ruymgaart, F. H.
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 8, 1989, 1, p. 81-88
Publisher: Elsevier
Keywords: optimal classification rule; probability of misclassification; kernel function; density estimates
Similar items by person
-
Density Estimation for a Class of Stationary Nonlinear Processes
Chanda, Kamal C., (2003)
-
Sampling distribution for a class of estimators for nonregular linear processes
Chanda, Kamal C., (1985)
-
Large sample properties of spectral estimators for a class of stationary nonlinear processes
Chanda, Kamal C., (2005)