Properties of the solution of L2-Support Vector Machine as a function of regularization parameter

  • Mathematical Theory of Pattern Recognition
  • Published in: Pattern Recognition and Image Analysis

Abstract

The goal of this paper is to study some mathematical properties of the so-called L2 Soft Margin Support Vector Machines (L2-SVMs) for data classification. Their dual formulations build a family of quadratic programming problems depending on one regularization parameter. The dependence of the solution on this parameter is examined. Such properties as continuity, differentiability, monotonicity, convexity, and the structure of the solution are investigated. It is shown that the solution and the objective value of the Hard Margin SVM allow the slack variables of the L2-SVMs to be estimated. Most results deal with the dual problem, but some statements about the primal problem are also formulated (e.g., the behavior and differentiability of the slack variables). An ancillary dual problem is used as an investigation tool; it is shown to be, in fact, a dual formulation of a nearly identical L2-SVM primal.
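For context, the following is a minimal sketch, in LaTeX notation, of the standard L2 soft-margin formulation the abstract refers to; the symbols C (the regularization parameter), \xi_i (the slack variables), and K (the kernel) are notational assumptions and are not taken from the paper itself:

\min_{w,\, b,\, \xi} \;\; \tfrac{1}{2}\|w\|^{2} + \tfrac{C}{2}\sum_{i=1}^{n} \xi_i^{2}
\quad \text{subject to} \quad y_i\bigl(\langle w, x_i \rangle + b\bigr) \ge 1 - \xi_i, \quad i = 1, \dots, n.

Its dual is the quadratic program

\max_{\alpha} \;\; \sum_{i=1}^{n} \alpha_i \;-\; \frac{1}{2} \sum_{i,j=1}^{n} \alpha_i \alpha_j\, y_i y_j \Bigl( K(x_i, x_j) + \frac{\delta_{ij}}{C} \Bigr)
\quad \text{subject to} \quad \alpha_i \ge 0, \quad \sum_{i=1}^{n} \alpha_i y_i = 0,

so each value of C > 0 yields one member of the family of quadratic programming problems mentioned above. Formally, this dual coincides with a Hard Margin dual over the shifted kernel K + \delta/C, which is consistent with the ancillary dual problem described in the abstract.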



Author information

Correspondence to L. Doktorski.

Additional information

The article is published in the original language.

Leo Doktorski. Born 1952. Received his diploma in Mathematics from Rostov-on-Don State University in 1974 and his Dr. rer. nat. (Kandidat Nauk) degree, also from Rostov-on-Don State University, in 1978. He works as a researcher at the Fraunhofer IOSB in Ettlingen, Germany. He has published more than 50 papers in various journals, conferences, and workshops.


About this article

Cite this article

Doktorski, L. Properties of the solution of L2-Support Vector Machine as a function of regularization parameter. Pattern Recognit. Image Anal. 22, 121–130 (2012). https://doi.org/10.1134/S1054661812010129

