
Increasing the Training Speed of SVM, the Zoutendijk Algorithm Case

  • Conference paper
Advanced Distributed Systems (ISSADS 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3563)


Abstract

The Support Vector Machine (SVM) is a well-known method used for classification, regression and density estimation. Training an SVM consists of solving a Quadratic Programming (QP) problem. The QP problem is very resource consuming (in both computation time and memory), because the quadratic form is dense and the memory requirements grow with the square of the number of data points. The support vectors found when training an SVM represent a small subset of the training patterns. If an algorithm could identify beforehand the points likely to become support vectors, the SVM could be trained only on those data and the same results would be obtained as when training on the entire data set.
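
For reference, the problem being solved is the standard soft-margin SVM dual (a textbook formulation, stated here as an assumption since the abstract does not spell it out): for n training pairs (x_i, y_i) with y_i ∈ {−1, +1}, kernel K and regularization constant C,

\[
\max_{\alpha}\;\sum_{i=1}^{n}\alpha_i-\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j\,y_i y_j\,K(x_i,x_j)
\qquad\text{s.t.}\quad 0\le\alpha_i\le C,\;\;\sum_{i=1}^{n}\alpha_i y_i=0 .
\]

The n × n Hessian with entries \(Q_{ij}=y_i y_j K(x_i,x_j)\) is dense in general, which is why memory grows quadratically with the number of training points; the support vectors are exactly the points with \(\alpha_i>0\).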

This paper introduces an original initialization based on the Zoutendijk method, called ZQP, to train SVMs faster than classical algorithms. The ZQP method first computes a fast approximation to the solution using the Zoutendijk algorithm. This approximation yields a reduced set of training patterns. Finally, a QP algorithm performs the training on this subset of the data. Results show the improvement of the methodology in comparison with a plain QP algorithm and with chunking combined with a QP algorithm.
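
The paper itself includes no code; the following is a minimal sketch of the two-stage idea only. It assumes a cheap linear surrogate (scikit-learn's SGDClassifier) as a stand-in for the Zoutendijk feasible-direction approximation, and SVC (libsvm) as the final QP-based solver. The function name zqp_like_train and the margin threshold are illustrative choices, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier  # cheap approximate first stage
from sklearn.svm import SVC                     # exact QP-based second stage


def zqp_like_train(X, y, margin=1.5, random_state=0):
    """Two-stage sketch: fast approximation, then QP only on the reduced set."""
    # Stage 1: fast approximate separator (placeholder for the Zoutendijk step).
    approx = SGDClassifier(loss="hinge", max_iter=5, tol=None,
                           random_state=random_state).fit(X, y)

    # Keep only patterns close to the approximate decision boundary,
    # i.e. the likely support-vector candidates.
    scores = np.abs(approx.decision_function(X))
    keep = scores <= margin
    if keep.sum() < 2 or len(np.unique(y[keep])) < 2:
        keep = np.ones(len(y), dtype=bool)   # fall back if the filter is too aggressive

    # Stage 2: solve the full QP (libsvm inside SVC) on the reduced set only.
    svm = SVC(kernel="rbf", C=1.0)
    svm.fit(X[keep], y[keep])
    return svm, keep
```

A call such as `svm, keep = zqp_like_train(X_train, y_train)` would then solve the expensive QP only on the retained candidate points rather than on the whole training set.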

The ideas presented here can be extended to other problems, such as resource allocation, by treating the allocation as a combinatorial problem that could be solved with an artificial intelligence technique such as genetic algorithms or simulated annealing; a sketch of such a loop is given below. In this approach, ZQP would be used as a measure of effective fitness.
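
As a purely illustrative sketch of that direction (none of this appears in the paper), a simulated-annealing loop maximizing some effective-fitness function might look as follows; fitness here is a hypothetical stand-in for the ZQP-based measure, and initial and neighbour are user-supplied allocation routines.

```python
import math
import random


def simulated_annealing(initial, neighbour, fitness, t0=1.0, cooling=0.95, steps=1000):
    """Generic simulated-annealing skeleton that maximizes `fitness`."""
    state = best = initial
    t = t0
    for _ in range(steps):
        candidate = neighbour(state)                 # propose a nearby allocation
        delta = fitness(candidate) - fitness(state)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta >= 0 or random.random() < math.exp(delta / t):
            state = candidate
            if fitness(state) > fitness(best):
                best = state
        t *= cooling                                 # cool the temperature
    return best
```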





Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ibarra Orozco, R.E., Hernández-Gress, N., Frausto-Solís, J., Mora Vargas, J. (2005). Increasing the Training Speed of SVM, the Zoutendijk Algorithm Case. In: Ramos, F.F., Larios Rosillo, V., Unger, H. (eds) Advanced Distributed Systems. ISSADS 2005. Lecture Notes in Computer Science, vol 3563. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11533962_28


  • DOI: https://doi.org/10.1007/11533962_28

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28063-7

  • Online ISBN: 978-3-540-31674-9

  • eBook Packages: Computer Science (R0)
