
Kernel Parameter Optimization for KFDA Based on the Maximum Margin Criterion

  • Conference paper
Advances in Neural Networks – ISNN 2014 (ISNN 2014)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 8866))


Abstract

Kernel parameter optimization is one of the most challenging problems in kernel Fisher discriminant analysis (KFDA). In this paper, a simple and effective criterion for optimizing the KFDA kernel parameters is proposed on the basis of the maximum margin criterion (MMC), which maximizes the distances between any two classes. This MMC-based criterion is applied to kernel parameter optimization for both KFDA and KFDA with a Locally Linear Embedding affinity matrix (KFDA-LLE). Experiments on six real-world multiclass datasets demonstrate that, in comparison with two other criteria, the MMC-based criterion detects the optimal KFDA kernel parameters more accurately for both the RBF kernel and the polynomial kernel.
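The idea behind an MMC-based selection criterion can be sketched as follows: for a candidate kernel parameter, evaluate tr(S_b − S_w) in the kernel-induced feature space (computable from kernel matrix entries alone) and pick the parameter with the largest score. The sketch below is a minimal illustration of this general technique, not the authors' exact procedure; the function names and the RBF grid are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def mmc_score(K, y):
    """Maximum margin criterion tr(S_b - S_w) in the feature space,
    evaluated using only entries of the kernel matrix K."""
    n = len(y)
    m_all = K.mean()                      # <m, m> = (1/n^2) sum_ij K_ij
    tr_sb, tr_sw = 0.0, 0.0
    for c in np.unique(y):
        idx = (y == c)
        nc = idx.sum()
        Kcc = K[np.ix_(idx, idx)]
        mc_mc = Kcc.mean()                # <m_c, m_c>
        mc_m = K[idx, :].mean()           # <m_c, m>
        # between-class: p_c * ||m_c - m||^2
        tr_sb += (nc / n) * (mc_mc - 2.0 * mc_m + m_all)
        # within-class: (1/n) sum_{i in c} ||phi(x_i) - m_c||^2
        tr_sw += (np.trace(Kcc) - nc * mc_mc) / n
    return tr_sb - tr_sw

def select_gamma(X, y, gammas):
    """Grid search: return the RBF width maximizing the MMC score."""
    scores = [mmc_score(rbf_kernel(X, g), y) for g in gammas]
    return gammas[int(np.argmax(scores))]
```

The same `mmc_score` works unchanged for a polynomial kernel, since it only reads kernel matrix entries; only the kernel construction and the parameter grid change.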




Author information


Corresponding author

Correspondence to Jinwen Ma.


Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Zhao, Y., Ma, J. (2014). Kernel Parameter Optimization for KFDA Based on the Maximum Margin Criterion. In: Zeng, Z., Li, Y., King, I. (eds) Advances in Neural Networks – ISNN 2014. ISNN 2014. Lecture Notes in Computer Science(), vol 8866. Springer, Cham. https://doi.org/10.1007/978-3-319-12436-0_37


  • DOI: https://doi.org/10.1007/978-3-319-12436-0_37

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12435-3

  • Online ISBN: 978-3-319-12436-0

  • eBook Packages: Computer Science, Computer Science (R0)
