Adapting Kernels by Variational Approach in SVM
Support vector machines are increasingly used for both regression and classification problems. An inherent limitation of the original formulation is that the model's output cannot be interpreted within a probabilistic framework. This issue has recently been addressed, opening the possibility of a Bayesian treatment of the SVM. However, a problem that remains with Bayesian inference is the intractability of the multi-dimensional integrals it typically requires, which makes exact evaluation impossible. Variational learning has been proposed as a method for approximating these integrals. This paper proposes a variational Bayesian approach to SVM regression based on a likelihood model given by an infinite mixture of Gaussians. The method is evaluated on both synthetic and real-world datasets and compared with the standard SVM algorithm as well as other well-established methods such as Gaussian processes; the results show the new approach to be competitive.
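The baseline comparison described in the abstract, standard SVM regression against a Gaussian process, can be illustrated with a minimal sketch. This is not the paper's variational algorithm; it only shows the point the abstract makes: plain SVR yields a single prediction, while a Gaussian process additionally returns a predictive variance, i.e. a probabilistic output. The target function, kernel choices, and hyperparameters below are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (not the paper's method): compare plain SVR with a
# Gaussian process regressor on a synthetic 1-D problem. The sinc target,
# noise level, and kernel settings are arbitrary assumptions.
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sinc(X).ravel() + rng.normal(0.0, 0.1, size=80)  # noisy sinc data

# Standard (non-probabilistic) SVM regression with an RBF kernel.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)

# Gaussian process with an RBF kernel plus a white-noise term.
gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1),
    random_state=0,
).fit(X, y)

Xt = np.linspace(-3, 3, 200).reshape(-1, 1)
yt = np.sinc(Xt).ravel()  # noiseless ground truth

svr_mse = mean_squared_error(yt, svr.predict(Xt))
# The GP also returns a per-point predictive standard deviation.
gp_mean, gp_std = gpr.predict(Xt, return_std=True)
gp_mse = mean_squared_error(yt, gp_mean)
print("SVR MSE:", svr_mse)
print("GP  MSE:", gp_mse)
```

Only the Gaussian process (and, in the paper, the variational Bayesian SVM) provides the error bars that a probabilistic framework requires.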
Year of publication: 2018
Authors: Gao, J.B.
Other persons: Gunn, S.R. (contributor); Kandola, J.S. (contributor)
Publisher: [2018]: [S.l.] : SSRN
Subjects: Core; Pattern recognition; Estimation theory
Availability: freely available
Similar items by subject
- Kota, Krishna (2015)
- Wang, Jujie (2023)
- Support vector regression for time series analysis / De Leone, Renato (2011)
- More ...
Similar items by person