Combining Multiple k-Nearest Neighbor Classifiers Using Different Distance Functions

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2004

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3177)

Abstract

The k-nearest neighbor (KNN) classifier is a simple and effective classification approach, but further improving its performance remains attractive. Combining multiple classifiers is an effective technique for improving accuracy. Many general combining algorithms, such as Bagging, Boosting, and Error-Correcting Output Coding, significantly improve classifiers such as decision trees, rule learners, and neural networks. Unfortunately, these combining methods do not improve nearest neighbor classifiers. In this paper we present a new approach that combines multiple KNN classifiers based on different distance functions, applying multiple distance functions to improve the performance of the k-nearest neighbor classifier. The proposed algorithm seeks to increase generalization accuracy over the basic k-nearest neighbor algorithm. Experiments were conducted on benchmark datasets from the UCI Machine Learning Repository. The results show that the proposed algorithm improves the performance of k-nearest neighbor classification.
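The paper's exact combining scheme and distance set are not shown on this page, but the idea the abstract describes can be sketched. The following is an illustrative assumption, not the authors' algorithm: each component KNN classifier uses a different distance function (Euclidean, Manhattan, and Chebyshev are assumed here for concreteness), and the component predictions are combined by simple majority vote.

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k, dist):
    """Classify x by majority vote among its k nearest training points
    under the given distance function."""
    ranked = sorted(range(len(train)), key=lambda i: dist(train[i], x))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

# Three candidate distance functions (assumed choices, for illustration).
euclidean = lambda a, b: math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
manhattan = lambda a, b: sum(abs(u - v) for u, v in zip(a, b))
chebyshev = lambda a, b: max(abs(u - v) for u, v in zip(a, b))

def combined_knn_predict(train, labels, x, k=3):
    """Combine the component KNN classifiers (one per distance function)
    by majority vote over their individual predictions."""
    votes = Counter(
        knn_predict(train, labels, x, k, d)
        for d in (euclidean, manhattan, chebyshev)
    )
    return votes.most_common(1)[0][0]
```

Each component classifier sees the same training data but ranks neighbors differently, so their errors can differ; the vote exploits that disagreement, which is the general intuition behind combining KNN classifiers built from different distance functions.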


References

  1. Bay, S.D.: Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets. Intelligent Data Analysis 3(3), 191–209 (1999)

  2. Bao, Y., Du, X., Ishii, N.: Combining Feature Selection with Feature Weighting for k-NN Classifier. In: Yin, H., Allinson, N.M., Freeman, R., Keane, J.A., Hubbard, S. (eds.) IDEAL 2002. LNCS, vol. 2412, pp. 461–468. Springer, Heidelberg (2002)

  3. Bao, Y., Ishii, N.: Combining Multiple k-Nearest Neighbor Classifiers for Text Classification by Reducts. In: Lange, S., Satoh, K., Smith, C.H. (eds.) DS 2002. LNCS, vol. 2534, pp. 361–368. Springer, Heidelberg (2002)

  4. Cover, T.M., Hart, P.E.: Nearest Neighbor Pattern Classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)

  5. Itqon, Kaneko, S., Igarashi, S.: Combining Multiple k-Nearest Neighbor Classifiers Using Feature Combinations. Journal IECI 2(3), 23–319 (2000)

  6. Merz, C.J., Murphy, P.M.: UCI Repository of Machine Learning Databases. University of California Irvine, Department of Information and Computer Science (1998), http://www.ics.uci.edu/mlearn/MLRepository.html

  7. Stanfill, C., Waltz, D.: Toward Memory-Based Reasoning. Communications of the ACM 29, 1213–1228 (1986)

  8. Tapia, R.A., Thompson, J.R.: Nonparametric Probability Density Estimation. The Johns Hopkins University Press, Baltimore (1978)

  9. Wilson, D.R., Martinez, T.R.: Improved Heterogeneous Distance Functions. Journal of Artificial Intelligence Research 6(1), 1–34 (1997)

  10. Wilson, D.R., Martinez, T.R.: An Integrated Instance-Based Learning Algorithm. Computational Intelligence 16(1), 1–28 (2000)

  11. Wilson, D.R., Martinez, T.R.: Reduction Techniques for Instance-Based Learning Algorithms. Machine Learning 38(3), 257–280 (2000)


Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bao, Y., Ishii, N., Du, X. (2004). Combining Multiple k-Nearest Neighbor Classifiers Using Different Distance Functions. In: Yang, Z.R., Yin, H., Everson, R.M. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2004. IDEAL 2004. Lecture Notes in Computer Science, vol 3177. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28651-6_93

  • DOI: https://doi.org/10.1007/978-3-540-28651-6_93

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22881-3

  • Online ISBN: 978-3-540-28651-6

  • eBook Packages: Springer Book Archive
