A Fast Approximated Evolutionary Approach to Improve SVM Accuracy

  • Conference paper
  • First Online:
Research and Development in Intelligent Systems XXVII (SGAI 2010)

Abstract

Improving classification performance is a crucial step in any machine learning method. To achieve better classification, Support Vector Machines need their parameters tuned and relevant variables selected. Both tasks can be performed simultaneously with an embedded approach. The method consists of a two-layer algorithm in which an evolutionary approach handles candidate solutions and an approximated one evaluates them. The evolutionary search, based on approximated error measures computed on the kernel matrix, makes it possible to discover solutions with high classification accuracy. The aim of the paper is to verify whether the proposed method finds reliable solutions that enhance classification performance. The method is applied to three real-world datasets using three kernels. In the experiments it is compared against the approach that encloses SVM training within Genetic Algorithms, to demonstrate the ability of the approximated method to achieve high classification accuracy in a shorter time.
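The abstract describes a two-layer embedded search: an evolutionary algorithm proposes candidate solutions (a feature subset plus kernel parameters), and each candidate is scored with an approximated measure computed directly on the kernel matrix, so no SVM is trained inside the search loop. The paper itself gives no code, so the following is only a minimal sketch of that idea. It assumes kernel-target alignment as the approximated measure, an RBF kernel, and a chromosome made of a binary feature mask plus a log-scaled gamma; all of these are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Pairwise squared Euclidean distances, then the RBF kernel matrix.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_target_alignment(K, y):
    # Cosine-like similarity between K and the ideal kernel y y^T (y in {-1, +1}).
    Y = np.outer(y, y)
    return np.sum(K * Y) / np.sqrt(np.sum(K * K) * np.sum(Y * Y))

def fitness(chrom, X, y):
    # A chromosome is (boolean feature mask, log2(gamma)); the score is computed
    # on the kernel matrix only, so no SVM is trained during the search.
    mask, log_gamma = chrom
    if not mask.any():
        return -np.inf
    return kernel_target_alignment(rbf_kernel(X[:, mask], 2.0 ** log_gamma), y)

def evolve(X, y, pop_size=20, generations=30, seed=0):
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    pop = [(rng.random(n_feat) < 0.5, rng.uniform(-8.0, 4.0)) for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by the approximated measure and keep the best half as parents.
        pop.sort(key=lambda c: fitness(c, X, y), reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            i, j = rng.choice(len(elite), size=2, replace=False)
            cut = int(rng.integers(1, n_feat))
            mask = np.concatenate([elite[i][0][:cut], elite[j][0][cut:]])
            mask = mask ^ (rng.random(n_feat) < 0.05)  # bit-flip mutation
            log_gamma = float(np.clip((elite[i][1] + elite[j][1]) / 2.0
                                      + rng.normal(0.0, 0.5), -8.0, 4.0))
            children.append((mask, log_gamma))
        pop = elite + children
    return max(pop, key=lambda c: fitness(c, X, y))
```

In such a scheme, only the best chromosome found by the search would be handed to an actual SVM implementation (e.g., LIBSVM) for final training and tuning of the remaining parameters such as C; avoiding a full SVM training for every candidate is where the shorter runtime claimed in the abstract would come from.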

Author information

Correspondence to Alessandro Perolini.



Copyright information

© 2011 Springer-Verlag London Limited

About this paper

Cite this paper

Perolini, A. (2011). A Fast Approximated Evolutionary Approach to Improve SVM Accuracy. In: Bramer, M., Petridis, M., Hopgood, A. (eds) Research and Development in Intelligent Systems XXVII. SGAI 2010. Springer, London. https://doi.org/10.1007/978-0-85729-130-1_14

  • DOI: https://doi.org/10.1007/978-0-85729-130-1_14

  • Published:

  • Publisher Name: Springer, London

  • Print ISBN: 978-0-85729-129-5

  • Online ISBN: 978-0-85729-130-1

  • eBook Packages: Computer Science, Computer Science (R0)
