
Evolutionary Design of Neural Networks for Classification and Regression

  • Conference paper
Adaptive and Natural Computing Algorithms

Abstract

Multilayer Perceptrons (MLPs) are the most popular class of Neural Networks. When applying MLPs, the search for the ideal architecture is a crucial task, since the network should be complex enough to learn the input/output mapping without overfitting the training data. In this context, Evolutionary Computation offers a promising global search approach for model selection. On the other hand, ensembles (combinations of models) have been boosting the performance of several Machine Learning (ML) algorithms. In this work, a novel evolutionary technique for MLP design is presented and combined with an ensemble-based approach. A set of real-world classification and regression tasks was used to test this strategy, comparing it with a heuristic model selection as well as with other ML algorithms. The results favour the evolutionary MLP ensemble method.

This work was supported by the FCT project POSI/ROBO/43904/2002, which is partially funded by FEDER.
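For illustration only, the sketch below shows one way an evolutionary MLP design of this kind could be wired together: a small genetic search over the number of hidden nodes, scored on a validation set, with the fittest networks averaged into an ensemble. The population size, mutation scheme, and the use of scikit-learn's MLPRegressor are assumptions made for the example, not the encoding or operators used in the paper.

```python
# Minimal sketch (not the authors' exact algorithm): genetic search over the
# hidden-layer size of an MLP, with the fittest individuals averaged into an
# ensemble. All hyperparameters here are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=400, n_features=8, noise=5.0, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def fitness(hidden):
    """Return validation RMSE and the fitted MLP with `hidden` hidden nodes."""
    mlp = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=500, random_state=0)
    mlp.fit(X_tr, y_tr)
    err = np.sqrt(np.mean((mlp.predict(X_val) - y_val) ** 2))
    return err, mlp

# Initial population: random hidden-layer sizes in [1, 20].
pop = list(rng.integers(1, 21, size=8))
for generation in range(5):
    scored = sorted((fitness(h) + (h,) for h in pop), key=lambda t: t[0])
    # Keep the best half (elitism) and mutate it by adding/removing up to 2 nodes.
    elite = [h for _, _, h in scored[: len(pop) // 2]]
    children = [max(1, h + int(rng.integers(-2, 3))) for h in elite]
    pop = elite + children

# Ensemble: average the predictions of the K fittest final individuals.
K = 3
best = sorted((fitness(h) for h in pop), key=lambda t: t[0])[:K]
ensemble_pred = np.mean([m.predict(X_val) for _, m in best], axis=0)
print("ensemble RMSE:", np.sqrt(np.mean((ensemble_pred - y_val) ** 2)))
```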

Copyright information

© 2005 Springer-Verlag/Wien

About this paper

Cite this paper

Rocha, M., Cortez, P., Neves, J. (2005). Evolutionary Design of Neural Networks for Classification and Regression. In: Ribeiro, B., Albrecht, R.F., Dobnikar, A., Pearson, D.W., Steele, N.C. (eds) Adaptive and Natural Computing Algorithms. Springer, Vienna. https://doi.org/10.1007/3-211-27389-1_73

  • DOI: https://doi.org/10.1007/3-211-27389-1_73

  • Publisher Name: Springer, Vienna

  • Print ISBN: 978-3-211-24934-5

  • Online ISBN: 978-3-211-27389-0

  • eBook Packages: Computer Science (R0)
