Power System Database Feature Selection Using a Relaxed Perceptron Paradigm

  • Conference paper
MICAI 2006: Advances in Artificial Intelligence (MICAI 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4293)

Abstract

Feature selection has become a relevant and challenging problem in the area of knowledge discovery in databases. An effective feature selection strategy can significantly reduce data mining processing time and improve predictive accuracy, and it helps users understand the induced models, as they tend to be smaller and make more sense. In this paper, we investigate the use of the Perceptron paradigm as a method for feature selection. The idea is to train a Perceptron and then use its interconnection weights as indicators of which attributes are likely the most relevant. We assume that an interconnection weight close to zero indicates that the attribute associated with that weight can be eliminated, because it does not contribute relevant information to the construction of the class-separating hyperplane. The experiments were conducted on 4 real and 11 synthetic databases. The results show that the proposed algorithm achieves a good trade-off among performance (generalization accuracy), efficiency (processing time), and feature reduction. Specifically, we apply the algorithm to a Mexican Electrical Billing database with satisfactory accuracy, efficiency, and feature-reduction results.
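The core idea described in the abstract — train a perceptron, then discard attributes whose learned weights are near zero — can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm (their "relaxed" variant and its stopping criteria are not given here); the function name, threshold value, and toy data are assumptions for demonstration.

```python
import numpy as np

def perceptron_feature_ranking(X, y, epochs=50, lr=0.1, threshold=0.05):
    """Train a plain perceptron and rank features by weight magnitude.

    Features whose |weight|, normalized by the largest |weight|, falls
    below `threshold` are flagged as candidates for elimination, on the
    assumption that they contribute little to the separating hyperplane.
    Labels y are expected in {-1, +1}.
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified -> perceptron update
                w += lr * yi * xi
                b += lr * yi
    importance = np.abs(w) / max(np.abs(w).max(), 1e-12)
    keep = importance >= threshold       # True = attribute retained
    return w, keep

# Toy data (hypothetical): only the first two of five features
# carry class information; the rest are pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
w, keep = perceptron_feature_ranking(X, y)
```

On linearly separable data like this toy example, the informative features accumulate consistently signed updates while the noise features' weights random-walk near zero, so thresholding the normalized magnitudes recovers the relevant subset.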





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Mejía-Lavalle, M., Arroyo-Figueroa, G. (2006). Power System Database Feature Selection Using a Relaxed Perceptron Paradigm. In: Gelbukh, A., Reyes-Garcia, C.A. (eds) MICAI 2006: Advances in Artificial Intelligence. MICAI 2006. Lecture Notes in Computer Science(), vol 4293. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11925231_49

  • DOI: https://doi.org/10.1007/11925231_49

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-49026-5

  • Online ISBN: 978-3-540-49058-6
