Contributions

Non-Euclidean Support Vector Machine Classifiers in Reproducing Kernel Banach Spaces

Time: 2020-07-08 09:25:48

Focusing on sparse machine learning via 1-norm (ℓ1-norm in infinite dimensions) regularization, this study generalizes the classical Support Vector Machine (SVM) classifier with the Euclidean margin to SVM classifiers with non-Euclidean margins, together with the related algorithms and theorems. We first extend the linear models of the SVM classifier to non-Euclidean margins, covering both the hard-margin and the soft-margin cases. In particular, the SVM classifier with the ∞-norm margin can be solved as the 1-norm regularized SVM classifier, which yields sparsity. Next, we show that the nonlinear models of the SVM classifiers with q-norm margins can be equivalently transformed into SVMs in p-norm reproducing kernel Banach spaces (RKBSs) given by the hinge loss, where 1/p + 1/q = 1. In this way, we establish the connection between non-Euclidean SVM classifiers and RKBSs.
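
To make the margin generalization concrete, here is a schematic linear hard-margin formulation; the notation is our own assumed standard setup, not quoted from [1].

```latex
% Schematic hard-margin SVM with the q-norm margin (a sketch under
% assumed standard notation, not quoted from [1]).  By Hoelder
% duality, the q-norm margin of a hyperplane is
% |<w, x> + b| / ||w||_p, so maximizing it amounts to:
\min_{w,\,b}\ \|w\|_p
\quad\text{subject to}\quad
y_i\bigl(\langle w, x_i\rangle + b\bigr) \ge 1,
\qquad i = 1,\dots,n,
\qquad \tfrac{1}{p} + \tfrac{1}{q} = 1.
% The Euclidean case is p = q = 2; the infinity-norm margin gives
% p = 1, i.e. the sparsity-inducing 1-norm regularized SVM classifier.
```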
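As a concrete illustration of the sparsity claim, the following minimal sketch (not the authors' implementation) solves the 1-norm regularized soft-margin SVM as a linear program with SciPy; the variable split w = u - v and the trade-off parameter C are our assumptions, not ingredients taken from [1].

```python
# A minimal sketch of the 1-norm regularized soft-margin SVM as a
# linear program (assumed formulation, not code from [1]).  The split
# w = u - v with u, v >= 0 linearizes ||w||_1.
import numpy as np
from scipy.optimize import linprog

def l1_svm(X, y, C=1.0):
    """min ||w||_1 + C*sum(xi)  s.t.  y_i(<w, x_i> + b) >= 1 - xi_i, xi >= 0."""
    n, d = X.shape
    # Variable layout: z = [u (d), v (d), b (1), xi (n)], with w = u - v.
    c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
    # Margin constraints rewritten as A_ub @ z <= -1:
    #   -y_i <u - v, x_i> - y_i b - xi_i <= -1.
    yX = y[:, None] * X
    A_ub = np.hstack([-yX, yX, -y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w = res.x[:d] - res.x[d:2 * d]
    return w, res.x[2 * d]

# Toy usage: the second feature is uninformative, so the 1-norm
# penalty typically drives its weight to exactly zero.
X = np.array([[1.5, 0.2], [2.0, -0.1], [-1.5, 0.1], [-2.0, -0.2]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = l1_svm(X, y, C=10.0)
print("w =", w, "b =", b)
```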


Furthermore, motivated by the interest in sparsity, we explore the reasons for the sparsity of the 1-norm SVM classifier and of the regularization network in RKBSs with the ℓ1 norm. Using a representer theorem for minimal norm interpolation problems, derived from tools of convex analysis, we present a sparse representer theorem in this study. It states that regularization networks in RKBSs with the ℓ1 norm admit a sparse minimizer that can be represented as a linear combination of kernel functions centered at several adaptive points rather than at the training points, where the number of adaptive points is at most the number of training points. Finding these adaptive points remains an open problem.
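
A schematic statement of the sparse representer theorem described above; the notation (loss L, parameter λ, kernel K, centers z_j) is assumed for illustration, not quoted from [2].

```latex
% Schematic regularization network in an RKBS \mathcal{B} with the
% l1 norm (assumed notation, not quoted from [2]):
\min_{f \in \mathcal{B}}\
\frac{1}{n}\sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr)
\;+\; \lambda\,\|f\|_{\mathcal{B}}.
% The sparse representer theorem states that this problem admits a
% sparse minimizer of the form
f^{*} \;=\; \sum_{j=1}^{m} c_j\, K(\cdot, z_j), \qquad m \le n,
% where the centers z_j are adaptive points, not necessarily the
% training points x_i, and m is at most the number of training data.
```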



References:

  • [1] Ying Lin & Qi Ye, Support vector machine classifiers by non-Euclidean margins, MFC (2020).

  • [2] Ying Lin, Rongrong Lin & Qi Ye, Sparse regularized learning in the reproducing kernel Banach spaces with the ℓ1 norm, MFC (2020).