Tuning support vector machines for minimax and Neyman-Pearson classification.
IEEE Trans Pattern Anal Mach Intell; 32(10): 1888-98, 2010 Oct.
Article | En | MEDLINE | ID: mdl-20724764
This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM parameters, such as cross-validation, can lead to poor classifier performance. To address this issue, we first prove that the usual cost-sensitive SVM, here called the 2C-SVM, is equivalent to another formulation called the 2nu-SVM. We then exploit a characterization of the 2nu-SVM parameter space to develop a simple yet powerful approach to error estimation based on smoothing. In an extensive experimental study, we demonstrate that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains. Furthermore, we propose coordinate descent strategies that offer significant gains in computational efficiency, with little to no loss in performance.
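As a rough illustration of the tuning problem the abstract describes, the sketch below trains a cost-sensitive SVM over a grid of cost parameters, estimates class-conditional error rates by cross-validation, smooths those estimates across neighboring grid points, and then selects parameters under a Neyman-Pearson constraint. It is a minimal sketch under stated assumptions, not the authors' method: it uses scikit-learn's SVC with class weights as the cost-sensitive SVM (rather than the 2nu-SVM parameterization studied in the paper), a simple moving-average filter as the smoother, and illustrative names (cv_error_rates, tune_neyman_pearson), kernel, and grid ranges chosen here for brevity.

```python
# Minimal sketch: cost-sensitive SVM tuning with smoothed cross-validation
# error estimates under a Neyman-Pearson constraint (illustrative only).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold
from scipy.ndimage import uniform_filter  # placeholder smoother, not the paper's


def cv_error_rates(X, y, C, w1, n_splits=5):
    """Cross-validated false-positive and false-negative rates for one (C, w1)."""
    fp, fn = [], []
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    for tr, te in skf.split(X, y):
        clf = SVC(C=C, kernel="rbf", class_weight={0: 1.0, 1: w1}).fit(X[tr], y[tr])
        pred = clf.predict(X[te])
        neg, pos = y[te] == 0, y[te] == 1
        fp.append(np.mean(pred[neg] == 1) if neg.any() else 0.0)
        fn.append(np.mean(pred[pos] == 0) if pos.any() else 0.0)
    return np.mean(fp), np.mean(fn)


def tune_neyman_pearson(X, y, alpha=0.1,
                        Cs=np.logspace(-2, 2, 9), ws=np.logspace(-1, 1, 9)):
    """Pick (C, w1) minimizing the miss rate subject to false-positive rate <= alpha."""
    FP = np.zeros((len(Cs), len(ws)))
    FN = np.zeros_like(FP)
    for i, C in enumerate(Cs):
        for j, w1 in enumerate(ws):
            FP[i, j], FN[i, j] = cv_error_rates(X, y, C, w1)
    # Smooth the raw CV estimates over neighboring grid points to reduce their
    # variance before imposing the constraint (a crude stand-in for the paper's
    # smoothing over the 2nu-SVM parameter space).
    FPs, FNs = uniform_filter(FP, size=3), uniform_filter(FN, size=3)
    feasible = FPs <= alpha
    if feasible.any():
        masked = np.where(feasible, FNs, np.inf)
        i, j = np.unravel_index(np.argmin(masked), masked.shape)
    else:
        # No grid point meets the constraint: fall back to the least-violating one.
        i, j = np.unravel_index(np.argmin(FPs), FPs.shape)
    return Cs[i], ws[j]
```

In this sketch the Neyman-Pearson criterion is enforced by restricting the search to grid points whose smoothed false-positive estimate is at most alpha and minimizing the smoothed miss rate among them; smoothing matters because the raw per-point cross-validation estimates are too noisy for reliable constrained selection, which is the issue the paper addresses.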
Full text: 1
Collection: 01-internacional
Database: MEDLINE
Language: En
Journal: IEEE Trans Pattern Anal Mach Intell
Journal subject: MEDICAL INFORMATICS
Year: 2010
Document type: Article
Affiliation country: United States
Country of publication: United States