Abstract
This paper addresses the problem of feature subset selection for classification tasks. In particular, it focuses on the initial stages of complex real-world classification tasks, when feature interaction is expected but ill-understood, and noise contaminating the actual feature vectors must be expected to further complicate the classification problem. A neural-network-based feature-ranking technique, the ‘clamping’ technique, is proposed as a robust and effective basis for feature selection that is more efficient than the established comparable techniques of sequential floating search. The efficiency gain is that of an O(n) algorithm over the O(n²) floating-search techniques. These claims are supported by an empirical study of a complex classification task.
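The abstract names the ‘clamping’ technique but does not spell out its mechanism here. The sketch below illustrates one common reading of clamping-based feature ranking, assuming (in line with the authors' related work on feature salience) that each input of an already-trained network is in turn clamped to a fixed value such as its training-set mean, and the resulting drop in classification accuracy is taken as that feature's salience. The synthetic dataset, the scikit-learn MLPClassifier, and the choice of the mean as the clamping value are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch of clamping-based feature ranking (assumptions noted above).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a noisy classification task with interacting features.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=5,
                           n_redundant=3, flip_y=0.05, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a single feedforward network once, using all features.
net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
net.fit(X_train, y_train)
baseline = accuracy_score(y_test, net.predict(X_test))

# Clamp each feature in turn (here: to its training-set mean) and measure the
# accuracy drop; a larger drop indicates a more salient feature. One pass over
# the n features costs O(n) network evaluations, versus the O(n^2) subset
# evaluations of sequential floating search.
salience = []
for j in range(X.shape[1]):
    X_clamped = X_test.copy()
    X_clamped[:, j] = X_train[:, j].mean()
    drop = baseline - accuracy_score(y_test, net.predict(X_clamped))
    salience.append((j, drop))

# Rank features by the damage done when they are clamped.
for j, drop in sorted(salience, key=lambda t: t[1], reverse=True):
    print(f"feature {j}: accuracy drop {drop:.3f}")
```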
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Partridge, D., Wang, W., Jones, P. (2001). Efficient and Effective Feature Selection in the Presence of Feature Interaction and Noise. In: Singh, S., Murshed, N., Kropatsch, W. (eds) Advances in Pattern Recognition — ICAPR 2001. ICAPR 2001. Lecture Notes in Computer Science, vol 2013. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44732-6_20
DOI: https://doi.org/10.1007/3-540-44732-6_20
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-41767-5
Online ISBN: 978-3-540-44732-0
eBook Packages: Springer Book Archive