Abstract
This work presents an approach for automatically inducing non-greedy decision trees constructed from a neural network architecture. This construction can be used to transfer weights when growing or pruning a decision tree, allowing non-greedy decision tree algorithms to automatically learn and adapt toward the ideal architecture. We examine the underpinning ideas from ensemble modelling and Bayesian model averaging that allow our neural network to asymptotically approach the ideal architecture through weight transfer. Experimental results demonstrate that this approach improves on decision tree and decision forest models trained with a fixed set of hyperparameters.
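The weight-transfer idea described in the abstract can be sketched concretely: represent each decision-tree split as a soft (sigmoid) routing layer, and when growing a leaf into an internal node, copy the parent's learned routing parameters into the new children so optimisation resumes from the current solution rather than from scratch. The class and function names below are illustrative assumptions for this sketch, not the paper's actual API.

```python
import numpy as np

class SoftNode:
    """A soft (oblique) split node: routes input x left with
    probability sigmoid(w @ x + b), as in neural decision trees."""
    def __init__(self, dim, rng):
        self.w = rng.normal(scale=0.1, size=dim)
        self.b = 0.0
        self.left = None
        self.right = None

def route_prob(node, x):
    """Probability of routing x to the left child."""
    return 1.0 / (1.0 + np.exp(-(node.w @ x + node.b)))

def grow(node, dim, rng):
    """Grow a node by attaching two children, transferring the
    parent's weights so the children start with identical routing
    behaviour (weight transfer when growing the tree)."""
    for attr in ("left", "right"):
        child = SoftNode(dim, rng)
        child.w = node.w.copy()  # weight transfer from parent
        child.b = node.b
        setattr(node, attr, child)
    return node

rng = np.random.default_rng(0)
root = SoftNode(4, rng)
x = rng.normal(size=4)
p_before = route_prob(root, x)
grow(root, 4, rng)
# Children inherit the parent's routing distribution exactly,
# so growing the architecture does not disturb current predictions.
print(np.isclose(route_prob(root.left, x), p_before))
```

Pruning works symmetrically under this sketch: a subtree collapses back to its root node, whose weights already summarise the learned split, so no retraining from scratch is needed in either direction.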
This is a pre-print of a contribution “Chapman Siu, Automatic Induction of Neural Network Decision Tree Algorithms”. To appear in Computing Conference 2019 Proceedings. Advances in Intelligent Systems and Computing.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Siu, C. (2019). Automatic Induction of Neural Network Decision Tree Algorithms. In: Arai, K., Bhatia, R., Kapoor, S. (eds) Intelligent Computing. CompCom 2019. Advances in Intelligent Systems and Computing, vol 997. Springer, Cham. https://doi.org/10.1007/978-3-030-22871-2_48
DOI: https://doi.org/10.1007/978-3-030-22871-2_48
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22870-5
Online ISBN: 978-3-030-22871-2
eBook Packages: Intelligent Technologies and Robotics (R0)