Abstract
Feature selection has received considerable attention in the machine learning community, but mainly under the supervised paradigm. In this work we study the potential benefits of feature selection in hierarchical clustering tasks. In particular, we address the problem in the context of incremental clustering, following the basic ideas of Gennari [8]. Using a simple implementation, we show that a feature selection scheme running in parallel with the learning process can improve the clustering task along the dimensions of accuracy, efficiency in learning, efficiency in prediction, and comprehensibility.
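The core idea of the abstract can be sketched in code: an incremental clusterer that, after each observed instance, re-ranks features by how well they separate the current clusters and computes distances only over the top-ranked features. The following is a toy illustration under my own assumptions, not the paper's implementation; the class name, the variance-based salience score, and the distance threshold are all hypothetical simplifications.

```python
import numpy as np

class IncrementalClusterer:
    """Toy incremental flat clusterer with dynamic feature selection.

    Hypothetical sketch (not Talavera's method): after each instance
    we re-rank features by the variance of the cluster means (a crude
    salience score) and compute assignment distances on the top-k
    features only, so selection runs in parallel with learning.
    """

    def __init__(self, n_features, k_features=2, threshold=2.0):
        self.k = k_features
        self.threshold = threshold          # max L1 distance to join a cluster
        self.sums = []                      # per-cluster feature sums
        self.counts = []                    # per-cluster instance counts
        self.selected = np.arange(n_features)  # start with all features

    def _means(self):
        return np.array([s / c for s, c in zip(self.sums, self.counts)])

    def _update_selection(self):
        # Features whose cluster means are spread apart discriminate
        # between clusters; keep the k most discriminating ones.
        if len(self.sums) < 2:
            return
        salience = self._means().var(axis=0)
        self.selected = np.argsort(salience)[::-1][:self.k]

    def add(self, x):
        """Assign instance x incrementally; return its cluster index."""
        x = np.asarray(x, dtype=float)
        if self.sums:
            means = self._means()
            # Distance restricted to the currently selected features.
            d = np.abs(means[:, self.selected] - x[self.selected]).sum(axis=1)
            best = int(np.argmin(d))
            if d[best] < self.threshold:
                self.sums[best] += x
                self.counts[best] += 1
                self._update_selection()
                return best
        # No cluster close enough: create a new one.
        self.sums.append(x.copy())
        self.counts.append(1)
        self._update_selection()
        return len(self.sums) - 1
```

For example, with `k_features=1` on two-dimensional data whose clusters differ only on the first feature, an instance that is noisy on the second feature still joins the right cluster, because the irrelevant feature is excluded from the distance once two clusters exist.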
References
A. L. Blum and P. Langley. Selection of relevant features and examples in machine learning. Artificial Intelligence, 97:245–271, 1997.
M. Dash and H. Liu. Feature selection for classification. Intelligent Data Analysis, 1(3), 1997.
M. Dash, H. Liu, and J. Yao. Dimensionality reduction for unsupervised data. In Ninth IEEE International Conference on Tools with AI, ICTAI'97, 1997.
M. Devaney and A. Ram. Efficient feature selection in conceptual clustering. In Machine Learning: Proceedings of the Fourteenth International Conference, Nashville, TN, 1997.
D. Fisher, L. Xu, J. Carnes, Y. Reich, S. Fenves, J. Chen, R. Shiavi, G. Biswas, and J. Weinberg. Applying AI clustering to engineering tasks. IEEE Expert, 8:51–60, 1993.
D. H. Fisher. Knowledge acquisition via incremental conceptual clustering. Machine Learning, 2:139–172, 1987.
D. H. Fisher. Iterative optimization and simplification of hierarchical clusterings. Journal of Artificial Intelligence Research, 4:147–179, 1996.
J. H. Gennari. Concept formation and attention. In Proceedings of the Seventh Annual Conference of the Cognitive Science Society, pages 724–728, Irvine, CA, 1991. Lawrence Erlbaum Associates.
R. Kohavi and G. H. John. Wrappers for feature subset selection. Artificial Intelligence, 97:273–324, 1997.
J. L. Kolodner. Reconstructive memory: A computer model. Cognitive Science, 7:281–328, 1983.
P. Langley. Elements of Machine Learning. Morgan Kaufmann, San Francisco, CA, 1995.
M. Lebowitz. Experiments with incremental concept formation: UNIMEM. Machine Learning, 2:103–138, 1987.
L. Talavera. Dependency-based feature selection for symbolic clustering. Intelligent Data Analysis. To appear.
L. Talavera. Feature selection as a preprocessing step for hierarchical clustering. In Proceedings of the Sixteenth International Conference on Machine Learning, Bled, Slovenia, 1999. Morgan Kaufmann.
L. Talavera. Feature selection as retrospective pruning in hierarchical clustering. In Third International Symposium on Intelligent Data Analysis, IDA 99, volume 1642 of Lecture Notes in Computer Science, Amsterdam, The Netherlands, 1999. Springer-Verlag.
J. J. Furtado Vasco. Determining property relevance in concept formation by computing correlation between properties. In Proceedings of the Tenth European Conference on Machine Learning, ECML98, volume 1398 of Lecture Notes in Artificial Intelligence, pages 310–315, Chemnitz, Germany, 1998. Springer-Verlag.
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Talavera, L. (2000). Dynamic Feature Selection in Incremental Hierarchical Clustering. In: López de Mántaras, R., Plaza, E. (eds) Machine Learning: ECML 2000. Lecture Notes in Computer Science, vol 1810. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45164-1_40
Print ISBN: 978-3-540-67602-7
Online ISBN: 978-3-540-45164-8