A note on margin-based loss functions in classification
In many classification procedures, the classification function is obtained by minimizing an empirical risk over the training sample, and classification is then based on the sign of that function. In recent years, a host of classification methods have been proposed that use different margin-based loss functions. These loss functions are often motivated as upper bounds on the misclassification loss, but this alone cannot explain the statistical properties of the resulting classification procedures. We show that a large family of margin-based loss functions is Fisher consistent for classification; that is, the population minimizer of the loss function leads to the Bayes optimal classification rule. Our result covers almost all margin-based loss functions proposed in the literature. We give an inequality that links the Fisher consistency of margin-based loss functions with the consistency of methods based on these loss functions, and we use this inequality to obtain the rate of convergence for the method of sieves based on a class of margin-based loss functions.
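Fisher consistency, as described in the abstract, means that the minimizer f* of the conditional risk p(x)·φ(f) + (1 − p(x))·φ(−f) satisfies sign(f*) = sign(p(x) − 1/2), which is the Bayes rule. A minimal numerical sketch (not from the paper itself; the specific losses, the grid search, and the function names are illustrative assumptions) checks this property for three common margin-based losses:

```python
import numpy as np

def phi_hinge(v):
    # Hinge loss phi(v) = max(0, 1 - v), used by the support vector machine.
    return np.maximum(0.0, 1.0 - v)

def phi_logistic(v):
    # Logistic loss phi(v) = log(1 + exp(-v)), used by logistic regression.
    return np.log1p(np.exp(-v))

def phi_exponential(v):
    # Exponential loss phi(v) = exp(-v), associated with AdaBoost.
    return np.exp(-v)

def population_minimizer(phi, p, grid=np.linspace(-5.0, 5.0, 10001)):
    # Conditional risk at a point x with P(Y = 1 | X = x) = p:
    #   C(f) = p * phi(f) + (1 - p) * phi(-f).
    # Approximate its minimizer by brute-force search over a grid.
    risk = p * phi(grid) + (1.0 - p) * phi(-grid)
    return grid[np.argmin(risk)]

# Fisher consistency check: the sign of the population minimizer should
# agree with the Bayes rule sign(p - 1/2) for every p != 1/2.
for p in (0.2, 0.7, 0.9):
    for name, phi in [("hinge", phi_hinge),
                      ("logistic", phi_logistic),
                      ("exponential", phi_exponential)]:
        f_star = population_minimizer(phi, p)
        assert np.sign(f_star) == np.sign(p - 0.5), (name, p, f_star)
```

The grid search also recovers the known closed-form minimizers where they exist, e.g. log(p/(1 − p)) for the logistic loss, illustrating how different margin-based losses reach the same Bayes classification rule through different population minimizers.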
Year of publication: 2004
Authors: Lin, Yi
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 68 (2004), 1, p. 73-82
Publisher: Elsevier
Keywords: Bayes rule of classification; Fisher consistency; Margin; Method of regularization; Method of sieves
Similar items by person
- Grey game theory and its applications in economic decision-making. Fang, Zhigeng, (2010)
- Hedging, Hedge Accounting, and Earnings Predictability. Ranasinghe, Tharindra, (2021)
- Chiu, Chui-Yu, (2014)