Publication Date:
2018
Abstract:
The Support Vector Machine (SVM) is one of the most important classes of machine learning models and algorithms, and it has been successfully applied in various fields. Nonlinear optimization plays a crucial role in SVM methodology, both in defining the machine learning models and in designing convergent and efficient algorithms for large-scale training problems. In this paper we present the convex programming problems underlying SVM, focusing on supervised binary classification. We analyze the most important and widely used optimization methods for SVM training problems, and we discuss how the properties of these problems can be exploited in the design of useful algorithms.
CRIS Type:
01.01 Journal article
Keywords:
Convex quadratic programming; Kernel functions; Nonlinear optimization methods; Statistical learning theory; Support vector machine; Wolfe's dual theory
Authors:
Sciandrone, Marco
Link to full record:
Published in: