Optimal vs. classical linear dimension reduction
We describe a computer-intensive method for linear dimension reduction that minimizes the classification error directly. Simulated annealing (Bohachevsky et al., 1986) is used to solve this optimization problem. The classification error is determined by exact integration; we avoid distance and scatter measures, which are only surrogates for the classification error. Simulations in two dimensions and analytical approximations demonstrate the superiority of optimal classification over the classical procedures. We compare our procedure to the well-known canonical discriminant analysis (homoscedastic case), as described in McLachlan (1992), and to a method by Young et al. (1986) for the heteroscedastic case. Special emphasis is put on cases in which the distance-based methods collapse. The computer-intensive algorithm always achieves the minimal classification error.
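The following is a minimal sketch, not the authors' implementation, of the underlying idea: search by simulated annealing for a linear projection that directly minimizes the classification error rather than a scatter-based criterion. It assumes two synthetic Gaussian classes in two dimensions, projects to one dimension, and estimates the error empirically on a sample (the paper instead computes it by exact integration); the data, cooling schedule, and discriminant rule are illustrative assumptions.

```python
# Sketch: simulated annealing over a 1-D projection direction, minimizing the
# empirical classification error (the paper uses exact integration instead).
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical Gaussian classes in 2-D, for illustration only
n = 500
X0 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 1.0]], n)
X1 = rng.multivariate_normal([1.5, 0.5], [[1.0, -0.2], [-0.2, 0.5]], n)
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

def error_rate(theta):
    """Empirical misclassification rate after projecting onto angle theta."""
    w = np.array([np.cos(theta), np.sin(theta)])
    z = X @ w
    m0, m1 = z[y == 0].mean(), z[y == 1].mean()
    s0, s1 = z[y == 0].std(), z[y == 1].std()
    # Heteroscedastic (quadratic) discriminant rule in the projected space
    ll0 = -0.5 * ((z - m0) / s0) ** 2 - np.log(s0)
    ll1 = -0.5 * ((z - m1) / s1) ** 2 - np.log(s1)
    pred = (ll1 > ll0).astype(float)
    return np.mean(pred != y)

# Simulated annealing over the projection angle
theta = rng.uniform(0, np.pi)
best_theta, best_err = theta, error_rate(theta)
T = 1.0
for step in range(2000):
    cand = theta + rng.normal(scale=0.2)          # random neighbour proposal
    delta = error_rate(cand) - error_rate(theta)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        theta = cand                              # accept (possibly uphill) move
    if error_rate(theta) < best_err:
        best_theta, best_err = theta, error_rate(theta)
    T *= 0.995                                    # geometric cooling schedule

print(f"best projection angle: {best_theta:.3f} rad, error rate: {best_err:.3f}")
```

Because the error rate is a step function of the projection, gradient-based optimizers are not applicable, which is why a stochastic search such as simulated annealing is a natural choice here.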
Year of publication: 1998
Authors: Röhl, Michael C.; Weihs, Claus
Institutions: Institut für Wirtschafts- und Sozialstatistik, Universität Dortmund
Similar items by person
- Variance reduction with Monte Carlo estimates of error rates in multivariate classification. Weihs, Claus (1999)
- Direct minimization of error rates in multivariate classification. Röhl, Michael C. (1999)
- Multivariate classification of business phases. Weihs, Claus (1999)