An Almost Surely Optimal Combined Classification Rule
We propose a data-based procedure for combining a number of individual classifiers in order to construct more effective classification rules. Under some regularity conditions, the resulting combined classifier turns out to be almost surely superior to each of the individual classifiers, where superiority means a lower misclassification error rate.
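The abstract does not spell out the combining mechanism. As a rough illustration only, the sketch below follows the general idea of combining classifiers via their prediction patterns (in the spirit of the author's 1999 combining-by-discretization paper listed under similar items): the vector of the individual classifiers' predictions is treated as a discretized covariate, and a new point is assigned the majority class among training points that exhibit the same prediction pattern. All names (`combine_classifiers`, `pred_train`, etc.) are hypothetical, and this is not claimed to be the exact procedure of the paper.

```python
import numpy as np

def combine_classifiers(pred_train, y_train, pred_new):
    """Combine m individual classifiers by matching prediction patterns.

    pred_train : (n, m) array, predictions of the m classifiers on the n training points
    y_train    : (n,) array of true class labels
    pred_new   : (m,) array, the classifiers' predictions on a new point

    Returns the majority class among training points whose prediction
    pattern coincides with that of the new point.
    """
    # Training points whose m-vector of predictions matches the new pattern
    match = np.all(pred_train == pred_new, axis=1)
    if not match.any():
        # No matching pattern observed: fall back to the overall majority class
        labels, counts = np.unique(y_train, return_counts=True)
        return labels[np.argmax(counts)]
    labels, counts = np.unique(y_train[match], return_counts=True)
    return labels[np.argmax(counts)]

# Minimal usage: three classifiers, five training points, one new point
pred_train = np.array([[0, 0, 1],
                       [0, 0, 1],
                       [1, 1, 1],
                       [0, 1, 0],
                       [0, 0, 1]])
y_train = np.array([0, 0, 1, 1, 1])
print(combine_classifiers(pred_train, y_train, np.array([0, 0, 1])))  # -> 0 or 1 by majority
```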
Year of publication: 2002
Authors: Mojirsheibani, Majid
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 81.2002, 1, p. 28-46
Publisher: Elsevier
Keywords: Bayes rule; misclassification error; consistency; Vapnik-Chervonenkis
Similar items by person
- A Note on the Strong Approximation of the Smoothed Empirical Process of α-mixing Sequences. Mojirsheibani, Majid (2006)
- Theory and Methods - Combining Classifiers via Discretization. Mojirsheibani, Majid (1999)
- An iterated classification rule based on auxiliary pseudo-predictors. Mojirsheibani, Majid (2001)