Optimal vs. classical linear dimension reduction
We describe a computer-intensive method for linear dimension reduction which minimizes the classification error directly. Simulated annealing (Bohachevsky et al., 1986) is used to solve this optimization problem, and the classification error is determined by exact integration. We thereby avoid distance and scatter measures, which are only surrogates for the classification error. Simulations in two dimensions and analytical approximations demonstrate the superiority of optimal classification over the classical procedures. We compare our procedure to the well-known canonical discriminant analysis (homoscedastic case) as described in McLachlan (1992) and to a method by Young et al. (1986) for the heteroscedastic case. Special emphasis is put on cases where the distance-based methods collapse. The computer-intensive algorithm always achieves minimal classification error.
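The idea in the abstract can be sketched in a few lines: search over projection directions with simulated annealing, scoring each candidate direction by the classification error it induces. This is only an illustrative sketch, not the authors' implementation; the function names (`projected_error`, `anneal_direction`), the toy data, and the cooling schedule are all assumptions, and the empirical error rate here stands in for the paper's exact integration of the classification error.

```python
import numpy as np

def projected_error(w, X, y):
    """Empirical misclassification rate after projecting X onto direction w.
    Two classes; each point is assigned to the nearer projected class mean."""
    z = X @ w
    m0, m1 = z[y == 0].mean(), z[y == 1].mean()
    pred = (np.abs(z - m1) < np.abs(z - m0)).astype(int)
    return float(np.mean(pred != y))

def anneal_direction(X, y, n_iter=2000, t0=0.1, step=0.2, seed=0):
    """Simulated annealing over unit directions w, minimizing projected_error."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    err = projected_error(w, X, y)
    best_w, best_err = w, err
    for k in range(n_iter):
        t = t0 * (1 - k / n_iter)          # linear cooling schedule (assumed)
        cand = w + step * rng.normal(size=w.size)
        cand /= np.linalg.norm(cand)       # stay on the unit sphere
        e = projected_error(cand, X, y)
        # accept improvements always, worse moves with Boltzmann probability
        if e < err or rng.random() < np.exp(-(e - err) / max(t, 1e-9)):
            w, err = cand, e
            if err < best_err:
                best_w, best_err = w, err
    return best_w, best_err

# Toy example: two Gaussian classes in the plane, separated along the x-axis,
# so the error-optimal projection direction is close to (1, 0).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([-2.0, 0.0], 1.0, size=(200, 2)),
               rng.normal([ 2.0, 0.0], 1.0, size=(200, 2))])
y = np.repeat([0, 1], 200)
w_opt, err_opt = anneal_direction(X, y)
```

Because the objective is the (estimated) error rate itself rather than a distance or scatter surrogate, the same search also applies in heteroscedastic settings where canonical discriminant analysis breaks down.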
Year of publication: 1998
Authors: Röhl, Michael C.; Weihs, Claus
Publisher: Dortmund : Universität Dortmund, Sonderforschungsbereich 475 - Komplexitätsreduktion in Multivariaten Datenstrukturen
Series: Technical Report ; 1998,12
Type of publication: Book / Working Paper
Type of publication (narrower categories): Working Paper
Language: English
Availability: freely available
Other identifiers: 816121737 [GVK]; hdl:10419/77322 [Handle]; RePEc:zbw:sfb475:199812 [RePEc]
Persistent link: https://www.econbiz.de/10010316665
Similar items by person
- Optimal vs. classical linear dimension reduction. Röhl, Michael C. (1998)
- Variance reduction with Monte Carlo estimates of error rates in multivariate classification. Weihs, Claus (1999)
- Direct minimization of error rates in multivariate classification. Röhl, Michael C. (1999)