
Part of the book series: Advanced Information and Knowledge Processing (AI&KP)


Abstract

Although the support vector machine has proved a promising tool in machine learning, it does not directly provide a measure of feature importance. Identifying the subset of features that contributes most to classification is itself an important task. The benefit of feature selection is twofold: it leads to parsimonious models, which are often preferred in many scientific problems, and it is crucial for achieving good classification accuracy in the presence of redundant features. SVMs can be combined with various feature selection strategies. Some are "filters", general feature selection methods applied independently of the SVM; others are wrapper-type methods, modifications of the SVM that select important features while carrying out training and testing. The machine learning literature contains several proposals for automatic feature selection within the SVM, some of which apply the l_0-norm or l_1-norm SVM and achieve competitive performance. In this chapter we propose two models, l_p-norm C-support vector classification (l_p-SVC) and l_p-norm proximal support vector machine (l_p-PSVM), which combine C-SVC and PSVM, respectively, with a feature selection strategy by introducing the l_p-norm (0 < p < 1).
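To make the construction concrete, the following is a minimal sketch of an l_p-SVC primal problem, assuming the standard soft-margin (hinge-loss) form of C-SVC with its l_2 regularizer replaced by the non-convex l_p penalty. The notation (training pairs (x_i, y_i) with y_i in {-1, +1}, weight vector w, bias b, slacks xi_i, penalty parameter C) is the usual SVM notation rather than taken from the chapter, whose exact formulations of l_p-SVC and l_p-PSVM are developed in the text:

$$
\min_{w,\,b,\,\xi}\;\; \|w\|_p^p + C\sum_{i=1}^{l}\xi_i
\quad \text{s.t.}\quad y_i\,\big((w\cdot x_i) + b\big) \ge 1 - \xi_i,\;\; \xi_i \ge 0,\;\; i = 1,\dots,l,
$$

where \|w\|_p^p = \sum_{j=1}^{n} |w_j|^p and 0 < p < 1. Because the l_p penalty with p < 1 drives many components of w exactly to zero, the features with w_j = 0 can be discarded, so training and feature selection are performed simultaneously.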



Author information


Correspondence to Yong Shi.


Copyright information

© 2011 Springer-Verlag London Limited

About this chapter

Cite this chapter

Shi, Y., Tian, Y., Kou, G., Peng, Y., Li, J. (2011). Feature Selection via l_p-Norm Support Vector Machines. In: Optimization Based Data Mining: Theory and Applications. Advanced Information and Knowledge Processing. Springer, London. https://doi.org/10.1007/978-0-85729-504-0_6


  • DOI: https://doi.org/10.1007/978-0-85729-504-0_6

  • Publisher Name: Springer, London

  • Print ISBN: 978-0-85729-503-3

  • Online ISBN: 978-0-85729-504-0

  • eBook Packages: Computer Science, Computer Science (R0)
