DirectSVM: A Simple Support Vector Machine Perceptron

Author: Danny Roobaert

Affiliation: (1) Artificial Intelligence Lab, Department of Computer Science, University of Toronto, Toronto, ON, M5S 3H5, Canada

Abstract: We propose a very simple learning algorithm, DirectSVM, for constructing support vector machine classifiers. The algorithm is based on the proposition that the two closest training points of opposite class in a training set are support vectors, provided that the training points in the set are linearly independent. The latter condition is always satisfied for soft-margin support vector machines with quadratic penalties. Further support vectors are found using the following conjecture: the training point that maximally violates the current hyperplane is also a support vector. We show that DirectSVM converges to a maximal-margin hyperplane in M − 2 iterations if the number of support vectors is M. DirectSVM is evaluated empirically on a number of standard databases. In terms of generalization, the algorithm performs similarly to other SVM implementations. In terms of speed, the proposed method is faster than a standard quadratic-programming approach, and it has the potential to be competitive with current state-of-the-art SVM implementations.

Keywords: support vector machines; perceptron; maximal soft-margin hyperplane; pattern recognition
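The iterative construction described in the abstract can be sketched as follows. This is a hedged illustration under simplifying assumptions, not the authors' implementation: it uses a linear hard-margin classifier, seeds the candidate support vector set with the closest pair of opposite-class points, recovers (w, b) by assuming every candidate lies exactly on the margin (solving the resulting KKT equalities as a small linear system), and then adds the training point that most violates the current hyperplane. The function name `fit_directsvm` and its parameters are illustrative.

```python
import numpy as np

def fit_directsvm(X, y, tol=1e-8, max_iter=100):
    """Sketch of a DirectSVM-style iteration (linear, hard margin).

    Assumption: every point kept in the candidate set S lies exactly on
    the margin, so (alpha, b) solve the equalities
        y_i (w . x_i + b) = 1  for i in S,   sum_i alpha_i y_i = 0,
    with w = sum_i alpha_i y_i x_i.
    """
    # Seed S with the two closest training points of opposite class.
    pos, neg = np.where(y > 0)[0], np.where(y < 0)[0]
    d = np.linalg.norm(X[pos][:, None, :] - X[neg][None, :, :], axis=2)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    S = [int(pos[i]), int(neg[j])]

    for _ in range(max_iter):
        Xs, ys = X[S], y[S]
        m = len(S)
        # Solve the KKT equalities for (alpha, b) as one linear system.
        K = Xs @ Xs.T
        A = np.zeros((m + 1, m + 1))
        A[:m, :m] = (ys[:, None] * ys[None, :]) * K
        A[:m, m] = ys
        A[m, :m] = ys
        rhs = np.concatenate([np.ones(m), [0.0]])
        sol = np.linalg.lstsq(A, rhs, rcond=None)[0]
        alpha, b = sol[:m], sol[m]
        w = (alpha * ys) @ Xs
        # Conjecture step: the worst violator of the current hyperplane
        # is taken to be a further support vector.
        margins = y * (X @ w + b)
        worst = int(np.argmin(margins))
        if margins[worst] >= 1.0 - tol:
            break  # no violators: maximal-margin hyperplane reached
        S.append(worst)
    return w, b

# Toy usage on a linearly separable set: the separating hyperplane is x1 = 1.
X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 0.0], [2.0, 1.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
w, b = fit_directsvm(X, y)
```

Each outer iteration adds one support vector to the candidate set, which matches the abstract's claim of reaching the solution in M − 2 iterations when the final set holds M support vectors (two are known at the start).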
|