Incorporating prior probabilities into high-dimensional classifiers
In standard parametric classifiers, or classifiers based on nonparametric methods but where there is an opportunity for estimating population densities, the prior probabilities of the respective populations play a key role. However, those probabilities are largely ignored in the construction of high-dimensional classifiers, partly because there are no likelihoods to be constructed or Bayes risks to be estimated. Nevertheless, including information about prior probabilities can reduce the overall error rate, particularly in cases where doing so is most important, i.e. when the classification problem is particularly challenging and error rates are not close to zero. In this paper we suggest a general approach to reducing error rate in this way, by using a method derived from Breiman's bagging idea. The potential improvements in performance are identified in theoretical and numerical work, the latter involving both applications to real data and simulations. The method is simple and explicit to apply, and does not involve choice of any tuning parameters. Copyright 2010, Oxford University Press.
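The paper's specific construction is not reproduced in this record. As a hypothetical illustration of the general idea only, the sketch below bags a nearest-centroid base classifier (a common choice in high-dimensional settings) over bootstrap resamples and then tilts the aggregated vote proportions by the class prior probabilities before classifying; the function names, the centroid base classifier, and the vote-tilting rule are all assumptions for illustration, not the authors' method:

```python
import numpy as np

def centroid_predict(Xtr, ytr, X):
    """Nearest-centroid base classifier: assign each row of X to the
    class whose training-sample centroid is closest in Euclidean distance."""
    classes = np.unique(ytr)
    cents = np.array([Xtr[ytr == c].mean(axis=0) for c in classes])
    d = ((X[:, None, :] - cents[None, :, :]) ** 2).sum(axis=2)
    return classes[np.argmin(d, axis=1)]

def bagged_prior_classifier(Xtr, ytr, X, priors, B=50, rng=None):
    """Bag B bootstrap replicates of the base classifier, then weight
    each class's vote proportion by its prior probability (illustrative
    tilting rule, not the construction from the paper)."""
    rng = np.random.default_rng(rng)
    classes = np.unique(ytr)
    votes = np.zeros((len(X), len(classes)))
    n = len(ytr)
    for _ in range(B):
        idx = rng.integers(0, n, n)                 # bootstrap resample
        pred = centroid_predict(Xtr[idx], ytr[idx], X)
        for j, c in enumerate(classes):
            votes[:, j] += pred == c                # tally votes per class
    weighted = (votes / B) * np.asarray(priors)     # tilt votes by priors
    return classes[np.argmax(weighted, axis=1)]
```

With equal priors this reduces to a plain majority vote over the bagged replicates; unequal priors shift the decision toward the more probable population, which is where the error-rate reduction described in the abstract would be expected to matter.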
Year of publication: 2010
Authors: Hall, Peter; Xue, Jing-Hao
Published in: Biometrika. - Biometrika Trust, ISSN 0006-3444. - Vol. 97.2010, 1, p. 31-48
Publisher: Biometrika Trust
Similar items by person
- On selecting interacting features from high-dimensional data (Hall, Peter, 2014)
- Tilting methods for assessing the influence of components in a classifier (Hall, Peter, 2009)
- Median-Based Classifiers for High-Dimensional Data (Hall, Peter, 2009)