
Positive definite dot product kernels in learning theory


Abstract

In classical support vector machines, linear polynomials corresponding to the reproducing kernel $K(x,y)=x\cdot y$ are used. Many models in learning theory involve polynomial kernels $K(x,y)=\sum_{l=0}^{N} a_l (x\cdot y)^l$, which generate polynomials of degree $N$, and dot product kernels $K(x,y)=\sum_{l=0}^{+\infty} a_l (x\cdot y)^l$. For the corresponding learning algorithms, the properties of these kernels need to be understood. In this paper, we consider their positive definiteness. A necessary and sufficient condition for the dot product kernel $K$ to be positive definite is given. More generally, we present a characterization of the functions $f:\mathbb{R}\to\mathbb{R}$ for which the matrix $[f(x_i\cdot x_j)]_{i,j=1}^{m}$ is positive semi-definite for all $x_1,x_2,\ldots,x_m\in\mathbb{R}^n$, $n\ge 2$.
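As a quick numerical illustration of the sufficiency side of this question (a minimal sketch, not taken from the paper; the point set and coefficient values below are invented for demonstration), one can build the Gram matrix of a polynomial dot product kernel with nonnegative coefficients and check that its eigenvalues are nonnegative:

```python
# Minimal numerical sketch (not from the paper): for a polynomial dot product
# kernel K(x, y) = sum_l a_l (x . y)^l with nonnegative coefficients a_l,
# the Gram matrix [K(x_i, x_j)] should be positive semi-definite.
import numpy as np

def dot_product_gram(X, coeffs):
    """Return the Gram matrix [K(x_i, x_j)] for K(x, y) = sum_l coeffs[l] * (x . y)**l."""
    G = X @ X.T                          # pairwise dot products x_i . x_j
    K = np.zeros_like(G)
    for l, a_l in enumerate(coeffs):
        K += a_l * G**l                  # elementwise power; l = 0 gives the all-ones matrix
    return K

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 5))         # 30 sample points in R^5 (so n >= 2)
coeffs = [1.0, 0.5, 0.25, 0.125]         # illustrative nonnegative a_0, ..., a_3

K = dot_product_gram(X, coeffs)
print("smallest eigenvalue:", np.linalg.eigvalsh(K).min())  # nonnegative up to round-off
```

That nonnegative coefficients suffice follows from the Schur product theorem: each matrix $[(x_i\cdot x_j)^l]_{i,j=1}^{m}$ is an entrywise power of a Gram matrix and hence positive semi-definite, and a nonnegative combination of positive semi-definite matrices is again positive semi-definite. The paper's necessary and sufficient condition on the coefficients is stated in the full text.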



Author information


Corresponding author

Correspondence to Fangyan Lu.

Additional information

Communicated by C.A. Micchelli

Supported by CERG Grant No. CityU 1144/01P and City University of Hong Kong Grant No. 7001342.

AMS subject classification

42A82, 41A05


About this article

Cite this article

Lu, F., Sun, H. Positive definite dot product kernels in learning theory. Adv Comput Math 22, 181–198 (2005). https://doi.org/10.1007/s10444-004-3140-6

