Which is better? Regularization in RKHS vs R^m on Reduced SVMs

  • Shuisheng Zhou School of Mathematics and Statistics, Xidian University

Abstract

In the SVM community, the learned classifier is always a combination of selected functions, and SVMs have two main regularization models for determining the combination coefficients. The most popular model, given m input samples, norm-regularizes the classification function in a reproducing kernel Hilbert space (RKHS); by duality or the representer theorem, it is converted to an optimization problem in R^m. Another important model is the generalized support vector machine (GSVM), in which the coefficients of the classification function are norm-regularized directly in the Euclidean space R^m. In this work, we analyze the differences between the two models in computing stability, computational complexity, and the efficiency of Newton-type algorithms, especially on reduced SVMs for large-scale training problems. Many typical loss functions are considered. Our studies show that the GSVM model has more advantages than the other model. Experiments are given to support our analysis.
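To make the contrast concrete, the following sketch sets up both regularization models on a reduced SVM with a least-squares loss (an assumption made here for a closed-form solution; the paper also treats hinge-type losses and Newton-type solvers). The data, kernel bandwidth, and reduced-set choice are all illustrative, not taken from the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (an assumption; the paper's experiments differ).
m, r = 200, 20                      # m training samples, r reduced basis points
X = rng.normal(size=(m, 2))
y = np.sign(X[:, 0] + X[:, 1])      # labels in {-1, +1}

def rbf(A, B, gamma=1.0):
    """Gaussian kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

Xr = X[:r]                          # reduced set (here simply the first r samples)
K = rbf(X, Xr)                      # m x r kernel: f(x) = sum_j a_j k(x, Xr_j)
Kt = rbf(Xr, Xr)                    # r x r kernel on the reduced set
lam = 1e-2

# Model 1 (RKHS regularization): min_a  lam * a' Kt a + ||K a - y||^2.
# Normal equations: (K'K + lam * Kt) a = K' y.  Kt can be near-singular,
# which is the computing-stability issue the abstract refers to.
a_rkhs = np.linalg.solve(K.T @ K + lam * Kt, K.T @ y)

# Model 2 (GSVM-style regularization): min_a  lam * ||a||^2 + ||K a - y||^2.
# Normal equations: (K'K + lam * I) a = K' y, well-conditioned for any lam > 0.
a_gsvm = np.linalg.solve(K.T @ K + lam * np.eye(r), K.T @ y)

for name, a in [("RKHS", a_rkhs), ("GSVM", a_gsvm)]:
    acc = np.mean(np.sign(K @ a) == y)
    print(f"{name}: training accuracy = {acc:.2f}")
```

Note how the only difference is the regularization matrix in the r-by-r linear system: the RKHS model penalizes with the reduced kernel matrix Kt, while the GSVM model penalizes with the identity, which keeps the system uniformly well-conditioned.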

Author Biography

Shuisheng Zhou, School of Mathematics and Statistics, Xidian University
Published
2013-11-28
How to Cite
Zhou, S. (2013). Which is better? Regularization in RKHS vs R^m on Reduced SVMs. Statistics, Optimization & Information Computing, 1(1), 82-106. https://doi.org/10.19139/soic.v1i1.27
Section
Research Articles