Abstract

We propose a very simple learning algorithm, DirectSVM, for constructing support vector machine classifiers. The algorithm is based on the proposition that the two closest training points of opposite class in a training set are support vectors, on condition that the training points in the set are linearly independent. The latter condition is always satisfied for soft-margin support vector machines with quadratic penalties. Further support vectors are found using the following conjecture: the training point that maximally violates the current hyperplane is also a support vector. We show that DirectSVM converges to a maximal margin hyperplane in M − 2 iterations if the number of support vectors is M. DirectSVM is evaluated empirically on a number of standard databases. In terms of generalization, the algorithm performs comparably to other implementations. In terms of speed, it is faster than a standard quadratic programming approach and has the potential to be competitive with current state-of-the-art SVM implementations.
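As a rough illustration of the procedure the abstract describes, here is a minimal NumPy sketch of the selection loop under a linear, hard-margin reading. The function name directsvm_sketch and the least-squares refit are assumptions for illustration only; the paper's actual update maintains the exact maximal margin hyperplane over the current support set and covers the soft-margin, quadratic-penalty case.

```python
import numpy as np

def directsvm_sketch(X, y, max_iters=100):
    """Illustrative loop for DirectSVM's support-vector selection.

    X : (n, d) array of training points; y : (n,) labels in {-1, +1}.
    The refit step is a least-squares surrogate, not the paper's
    exact maximal-margin update over the current support set.
    """
    # Initialization (the paper's proposition): the two closest
    # training points of opposite class are support vectors.
    pos, neg = np.where(y == 1)[0], np.where(y == -1)[0]
    dists = np.linalg.norm(X[pos][:, None, :] - X[neg][None, :, :], axis=2)
    i, j = np.unravel_index(dists.argmin(), dists.shape)
    support = [pos[i], neg[j]]

    w = np.zeros(X.shape[1] + 1)
    for _ in range(max_iters):
        # Refit a hyperplane on the current support set (surrogate:
        # regress the labels; DirectSVM instead keeps the maximal
        # margin hyperplane of the support vectors).
        S = np.c_[X[support], np.ones(len(support))]
        w = np.linalg.lstsq(S, y[support], rcond=None)[0]
        margins = y * (X @ w[:-1] + w[-1])
        # Growth step (the paper's conjecture): the training point that
        # maximally violates the current hyperplane is a support vector.
        violator = int(np.argmin(margins))
        if margins[violator] >= 1.0 or violator in support:
            break  # no violators left: all margins are satisfied
        support.append(violator)
    return support, w
```

On a linearly separable toy set, the loop grows the support set by one maximal violator per iteration, which is consistent with the abstract's claim of convergence in M − 2 iterations once the two initial points are counted.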

About this article

Cite this article

Roobaert, D. DirectSVM: A Simple Support Vector Machine Perceptron. The Journal of VLSI Signal Processing-Systems for Signal, Image, and Video Technology 32, 147–156 (2002). https://doi.org/10.1023/A:1016327704666
