Learning the optimal kernel for Fisher discriminant analysis via second order cone programming
Kernel Fisher discriminant analysis (KFDA) is a popular classification technique that requires the user to predefine an appropriate kernel. Since the performance of KFDA depends on the choice of kernel, kernel selection becomes an important problem. In this paper we treat kernel selection as an optimization problem over the convex hull of finitely many basic kernels and formulate it as a second order cone programming (SOCP) problem. This formulation is attractive because the resulting SOCP can be solved efficiently by interior point methods. The efficacy of the optimal kernel, selected from a given convex set of basic kernels, is demonstrated on UCI machine learning benchmark datasets.
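The setting can be illustrated with a small sketch. The code below is not the paper's SOCP formulation; it is a minimal, assumed baseline that forms a convex combination of a few RBF base kernels (the bandwidths, the toy data, and the coarse simplex grid search are all illustrative choices), fits a standard regularized KFDA discriminant for each combination, and keeps the weights with the largest training Fisher ratio. The paper's contribution is to replace this naive search with a single second order cone program solved by interior point methods.

```python
# Minimal sketch, NOT the paper's SOCP formulation: KFDA over convex combinations
# of base kernels, with the combination weights chosen by a naive grid search
# on the training Fisher ratio.
import numpy as np
from itertools import product

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def kfda_direction(K, y, reg=1e-3):
    """Expansion coefficients alpha of the KFDA discriminant for labels y in {0, 1}."""
    n = K.shape[0]
    m1 = K[:, y == 1].mean(axis=1)   # kernel-space mean of class 1
    m0 = K[:, y == 0].mean(axis=1)   # kernel-space mean of class 0
    N = np.zeros((n, n))             # within-class scatter in the kernel expansion
    for c in (0, 1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    return np.linalg.solve(N + reg * np.eye(n), m1 - m0)

def fisher_ratio(K, y, alpha):
    """Between-class over within-class variance of the projections z = K alpha."""
    z = K @ alpha
    z1, z0 = z[y == 1], z[y == 0]
    return (z1.mean() - z0.mean()) ** 2 / (z1.var() + z0.var() + 1e-12)

def simplex_grid(m, steps=10):
    """Coarse grid of convex-combination weights over m base kernels."""
    ticks = [i / steps for i in range(steps + 1)]
    for w in product(ticks, repeat=m):
        if abs(sum(w) - 1.0) < 1e-9:
            yield np.array(w)

# Toy data: two Gaussian blobs (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (30, 2)), rng.normal(2.0, 1.0, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

# A finite set of basic kernels (assumed here to be an RBF family).
base = [rbf_kernel(X, X, g) for g in (0.1, 1.0, 10.0)]

best = None
for w in simplex_grid(len(base)):
    K = sum(wi * Ki for wi, Ki in zip(w, base))   # convex combination of basic kernels
    alpha = kfda_direction(K, y)
    score = fisher_ratio(K, y, alpha)
    if best is None or score > best[0]:
        best = (score, w)
print("best weights:", best[1], "training Fisher ratio: %.3f" % best[0])
```

For fixed weights, the discriminant above is the standard regularized KFDA solution; what the paper optimizes is the weight selection itself, cast as a convex (SOCP) problem rather than a search over candidate combinations.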
Year of publication: 2010
Authors: Khemchandani, Reshma; Jayadeva; Chandra, Suresh
Published in: European Journal of Operational Research, Vol. 203, No. 3, 2010, pp. 692-697. Elsevier, ISSN 0377-2217
Publisher: Elsevier
Keywords: Fisher discriminant analysis; Kernel methods; Machine learning; Kernel optimization; Support vector machines; Convex optimization; Second order cone programming; Semidefinite programming
Similar items by person
- Knowledge based proximal support vector machines / Khemchandani, Reshma (2009)
- Learning the optimal kernel for Fisher discriminant analysis via second order cone programming / Khemchandani, Reshma (2010)