Abstract
In recent years, boosting [6], together with bagging [5] and the random subspace method [15], has become one of the most popular combining techniques for improving weak classifiers. Boosting is usually applied to Decision Trees (DTs). In this paper, we study boosting in Linear Discriminant Analysis (LDA). Simulation studies, carried out on one artificial data set and two real data sets, show that boosting may be useful in LDA for large training sample sizes, while bagging is useful for critical training sample sizes [11]. In contrast to a common opinion, we also demonstrate that the usefulness of boosting does not depend on the instability of the classifier.
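To make the procedure concrete, the sketch below shows discrete AdaBoost [6] with an LDA base classifier. It is a minimal illustration under stated assumptions, not the authors' exact experimental setup: it assumes scikit-learn's LinearDiscriminantAnalysis, class labels in {-1, +1}, and, since that LDA implementation does not accept sample weights, it approximates the weighted fit by resampling the training set according to the boosting weights.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def adaboost_lda(X, y, n_rounds=50, seed=0):
    """Discrete AdaBoost with LDA base classifiers; y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)                       # uniform initial sample weights
    ensemble = []                                 # list of (alpha, classifier) pairs
    for _ in range(n_rounds):
        # scikit-learn's LDA takes no sample weights, so approximate the
        # weighted fit by resampling the training set in proportion to w.
        idx = rng.choice(n, size=n, p=w)
        clf = LinearDiscriminantAnalysis().fit(X[idx], y[idx])
        pred = clf.predict(X)
        err = w[pred != y].sum()                  # weighted training error
        if err == 0.0 or err >= 0.5:              # perfect or no longer weak: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / err)   # weight of this classifier
        w *= np.exp(-alpha * y * pred)            # up-weight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, clf))
    return ensemble

def predict(ensemble, X):
    """Weighted-majority vote of the boosted LDA classifiers."""
    score = sum(alpha * clf.predict(X) for alpha, clf in ensemble)
    return np.sign(score)

Resampling in proportion to the boosting weights is one common way to boost base learners that cannot handle weighted data directly; a weighted LDA variant would be an alternative design choice.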
References
Jain, A.K., Chandrasekaran, B.: Dimensionality and Sample Size Considerations in Pattern Recognition Practice. In: Krishnaiah, P.R., Kanal, L.N. (eds.): Handbook of Statistics, Vol. 2. North-Holland, Amsterdam (1987) 835–855
Friedman, J.H.: Regularized Discriminant Analysis. Journal of the American Statistical Association 84 (1989) 165–175
An, G.: The Effects of Adding Noise During Backpropagation Training on a Generalization Performance. Neural Computation 8 (1996) 643–674
Efron, B., Tibshirani, R.: An Introduction to the Bootstrap. Chapman and Hall, New York (1993)
Breiman, L.: Bagging Predictors. Machine Learning 24(2) (1996) 123–140
Freund, Y., Schapire, R.E.: Experiments with a New Boosting Algorithm. In: Machine Learning: Proceedings of the Thirteenth International Conference (1996) 148–156
Schapire, R.E., Freund, Y., Bartlett, P., Lee, W.S.: Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods. The Annals of Statistics 26(5) (1998) 1651–1686
Breiman, L.: Arcing Classifiers. The Annals of Statistics 26(3) (1998) 801–849
Friedman, J., Hastie, T., Tibshirani, R.: Additive Logistic Regression: A Statistical View of Boosting. Technical Report, Department of Statistics, Stanford University (1999)
Dietterich, T.G.: An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, to appear
Skurichina, M., Duin, R.P.W.: Bagging for Linear Classifiers. Pattern Recognition 31(7) (1998) 909–930
Fukunaga, K.: Introduction to Statistical Pattern Recognition. Academic Press (1990) 400–407
Cortes, C., Vapnik, V.: Support-Vector Networks. Machine Learning 20 (1995) 273–297
Blake, C.L., Merz, C.J.: UCI Repository of Machine Learning Databases. University of California, Irvine, Department of Information and Computer Science (1998) http://www.ics.uci.edu/~mlearn/MLRepository.html
Ho, T.K.: The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(8) (1998) 832–844
Avnimelech, R., Intrator, N.: Boosted Mixture of Experts: An Ensemble Learning Scheme. Neural Computation 11 (1999) 483–497
Skurichina, M., Duin, R.P.W.: The Role of Combining Rules in Bagging and Boosting. Submitted to S+SSPR 2000, Alicante, Spain