An Evolutionary and Attribute-Oriented Ensemble Classifier

  • Conference paper
Computational Science and Its Applications - ICCSA 2006

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3981)

Abstract

In the research area of decision trees, numerous researchers have focused on improving predictive accuracy. However, little further improvement was achieved until the introduction of ensemble classifiers. In this paper, we propose an Evolutionary Attribute-Oriented Ensemble Classifier (EAOEC) that improves the accuracy of the sub-classifiers while maintaining the diversity among them. EAOEC uses the idea of evolution to choose a proper attribute subset for building each sub-classifier. To avoid the heavy computational cost of this evolution, EAOEC uses the Gini values gained during the construction of one sub-tree as the evolution basis for building the next sub-tree. Finally, EAOEC combines all sub-classifiers by uniform-weight voting, and experiments show that EAOEC can efficiently improve predictive accuracy.
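
The abstract describes the mechanism only at a high level, so the following is a minimal illustrative sketch of the idea, not the authors' published algorithm: each decision tree is grown on an attribute subset, the Gini-based importances observed in that tree bias the subset drawn for the next tree, and predictions are combined by uniform-weight (majority) voting. It assumes scikit-learn decision trees, uses feature_importances_ as a stand-in for the per-attribute Gini gain, and the additive update rule, the subset_frac parameter, and the names eaoec_sketch and eaoec_predict are assumptions made here for illustration; they do not appear in the paper.

    # Illustrative sketch only: an approximation of the EAOEC idea, not the paper's exact algorithm.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def eaoec_sketch(X, y, n_trees=10, subset_frac=0.5, seed=0):
        rng = np.random.default_rng(seed)
        n_features = X.shape[1]
        k = max(1, int(subset_frac * n_features))
        # Start with a uniform probability of selecting each attribute.
        probs = np.full(n_features, 1.0 / n_features)
        trees, subsets = [], []
        for _ in range(n_trees):
            # "Evolve" the attribute subset: sample attributes with probability
            # proportional to the Gini importance accumulated so far.
            subset = rng.choice(n_features, size=k, replace=False, p=probs)
            tree = DecisionTreeClassifier(criterion="gini").fit(X[:, subset], y)
            trees.append(tree)
            subsets.append(subset)
            # Feed the Gini gain observed in this sub-tree back into the sampling
            # distribution for the next sub-tree (simple additive update; the
            # paper's exact evolution rule may differ).
            update = np.zeros(n_features)
            update[subset] = tree.feature_importances_
            probs = (probs + update) / (probs + update).sum()
        return trees, subsets

    def eaoec_predict(trees, subsets, X):
        # Uniform-weight (majority) voting over all sub-classifiers.
        votes = np.array([t.predict(X[:, s]) for t, s in zip(trees, subsets)])
        preds = []
        for column in votes.T:
            values, counts = np.unique(column, return_counts=True)
            preds.append(values[np.argmax(counts)])
        return np.array(preds)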




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lee, CI., Tsai, CJ., Ku, CW. (2006). An Evolutionary and Attribute-Oriented Ensemble Classifier. In: Gavrilova, M.L., et al. Computational Science and Its Applications - ICCSA 2006. ICCSA 2006. Lecture Notes in Computer Science, vol 3981. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11751588_128

  • DOI: https://doi.org/10.1007/11751588_128

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34072-0

  • Online ISBN: 978-3-540-34074-4

  • eBook Packages: Computer Science, Computer Science (R0)
