A Two-Stage Decision Tree Algorithm on Constructing Hyper-Plane

  • Conference paper
Applied Informatics and Communication (ICAIC 2011)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 225)

Abstract

How to construct an appropriate split hyper-plane at each test node is the key to building decision trees. In this paper, we re-interpret the process of building test nodes in geometric terms. On this basis, we propose a two-stage method (TSDT) for learning the hyper-plane. The first stage searches for an appropriate normal direction using unsupervised methods (e.g., PCA or ICA) or supervised methods (e.g., a neural network). The second stage determines the intercept of the hyper-plane along that normal direction according to a split criterion, such as the Gini index of CART or the information gain ratio (GainRatio) of C4.5. The experimental results confirm that TSDT improves the accuracy of univariate trees and that it is quite comparable to the functional tree proposed by Gama (2004), while requiring much less learning time.
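
To make the two stages concrete, below is a minimal Python sketch of constructing a single split: stage one takes the first principal component of the node's samples as the normal direction, and stage two scans candidate intercepts along that direction for the one minimizing weighted Gini impurity (the CART criterion named in the abstract). The function names (tsdt_split, gini) and the specific choice of PCA are illustrative assumptions, not the authors' implementation.

# A hypothetical sketch of a two-stage hyper-plane split, assuming a
# numeric feature matrix X (n samples x d features) and a label vector y.
# Not the paper's code: names and the PCA choice are illustrative.
import numpy as np

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def tsdt_split(X, y):
    """Return (w, b) defining the split w^T x <= b.

    Stage 1: normal direction w from the first principal component
    (an unsupervised choice; ICA or a trained perceptron could be
    substituted, as the abstract suggests).
    Stage 2: intercept b found by a 1-D scan along w minimizing the
    weighted Gini impurity of the two child nodes.
    """
    # Stage 1: first principal direction of the centered data.
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    w = vt[0]

    # Stage 2: sort projections and test midpoints between neighbors.
    z = X @ w
    order = np.argsort(z)
    z_sorted, y_sorted = z[order], y[order]

    n = len(y)
    best_b, best_impurity = None, np.inf
    for i in range(1, n):
        if z_sorted[i] == z_sorted[i - 1]:
            continue  # tied projections admit no threshold between them
        b = 0.5 * (z_sorted[i] + z_sorted[i - 1])
        left, right = y_sorted[:i], y_sorted[i:]
        impurity = (len(left) * gini(left) + len(right) * gini(right)) / n
        if impurity < best_impurity:
            best_impurity, best_b = impurity, b
    return w, best_b

A univariate tree would restrict w to a coordinate axis; allowing an arbitrary unit direction in stage one is what turns the split into a general hyper-plane rather than an axis-parallel cut, while stage two remains the familiar one-dimensional threshold search.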

References

  1. Su, X.G., Tsai, C.-L., Wang, C.: Tree-structured model diagnostics for linear regression. Machine Learning 74, 111–131 (2009)

  2. Vens, C., Struyf, J., Schietgat, L., Dzeroski, S., Blockeel, H.: Decision trees for hierarchical multi-label classification. Machine Learning 73, 185–214 (2008)

  3. Quinlan, J.R.: Discovering rules by induction from large collections of examples. In: Michie, D. (ed.) Expert Systems in the Micro-Electronic Age. Edinburgh University Press, Edinburgh (1979)

  4. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo (1993)

  5. Breiman, L., Friedman, J.H., Olshen, R., Stone, C.J.: Classification and Regression Trees. Chapman & Hall, New York (1984)

  6. Brodley, C.E., Utgoff, P.E.: Multivariate decision trees. Machine Learning 19, 45–77 (1995)

  7. Nilsson, N.J.: Learning Machines. McGraw-Hill, New York (1965)

  8. Duda, R.O., Hart, P.E.: Pattern Classification and Scene Analysis. Wiley & Sons, New York (1973)

  9. Utgoff, P.E., Brodley, C.E.: An incremental method for finding multivariate splits for decision trees. In: Proceedings of the Seventh International Conference on Machine Learning, pp. 58–65. Morgan Kaufmann, Austin (1990)

  10. Utgoff, P.E., Brodley, C.E.: Linear machine decision trees. COINS Technical Report 91-10, University of Massachusetts, Department of Computer and Information Science, Amherst, MA (1991)

  11. Gama, J.: Probabilistic linear tree. In: Fisher, D. (ed.) Proceedings of the 14th International Conference on Machine Learning, pp. 134–142. Morgan Kaufmann, San Francisco (1997)

  12. Gama, J.: Functional trees. Machine Learning 55, 219–250 (2004)

  13. Murthy, S., Kasif, S., Salzberg, S.: A system for induction of oblique decision trees. Journal of Artificial Intelligence Research 2, 1–32 (1994)

  14. Liu, H., Setiono, R.: Feature transformation and multivariate decision tree induction. In: Discovery Science. Springer, Heidelberg (1998)

  15. Zhong, M.Y., Georgiopoulos, M., Anagnostopoulos, G.: A k-norm pruning algorithm for decision tree classifiers based on error rate estimation. Machine Learning 71, 55–88 (2008)

  16. Blockeel, H., De Raedt, L., Ramon, J.: Top-down induction of clustering trees. In: Proceedings of the 15th International Conference on Machine Learning, pp. 55–63 (1998)

  17. Rodriguez, J., Kuncheva, L.: Rotation forest: a new classifier ensemble method. IEEE Transactions on Pattern Analysis and Machine Intelligence 28, 1613–1619 (2006)

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, S., She, W., Wang, M., Duan, Z. (2011). A Two-Stage Decision Tree Algorithm on Constructing Hyper-Plane. In: Zeng, D. (eds) Applied Informatics and Communication. ICAIC 2011. Communications in Computer and Information Science, vol 225. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23220-6_40

  • DOI: https://doi.org/10.1007/978-3-642-23220-6_40

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-23219-0

  • Online ISBN: 978-3-642-23220-6

  • eBook Packages: Computer Science, Computer Science (R0)
