
An Evolutionary Multi-objective Neural Network Optimizer with Bias-Based Pruning Heuristic

  • Conference paper
Advances in Neural Networks – ISNN 2007 (ISNN 2007)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4493)


Abstract

Neural network design aims for high classification accuracy and low architectural complexity. Optimizing model accuracy and complexity simultaneously is known to improve generalization and avoid overfitting the data. We describe a neural network training procedure that uses multi-objective optimization to evolve networks that are optimal with respect to both classification accuracy and architectural complexity. The NSGA-II algorithm evolves a population of neural networks that minimize both training error and a network complexity measure based on the Minimum Description Length principle. We further propose a pruning rule based on the following heuristic: connections to or from a node may be severed if their weight magnitudes are smaller than the network's smallest bias. Experiments on benchmark datasets show that, compared with existing evolutionary optimization algorithms, the proposed multi-objective approach with the bias-based pruning heuristic yields networks with far fewer connections without seriously compromising generalization performance.
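The pruning heuristic stated in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the NumPy representation of the network, and the example weight values are all assumptions. The threshold is the magnitude of the network's smallest bias, and any connection whose weight magnitude falls below it is severed (zeroed).

```python
import numpy as np

def bias_based_prune(weight_matrices, bias_vectors):
    """Sever connections whose weight magnitude is smaller than the
    magnitude of the network's smallest bias (illustrative sketch)."""
    # Threshold: smallest bias magnitude anywhere in the network.
    threshold = min(np.abs(b).min() for b in bias_vectors)
    # Zero out every weight whose magnitude falls below the threshold.
    pruned = [W * (np.abs(W) >= threshold) for W in weight_matrices]
    return pruned, threshold

# Hypothetical two-layer network: 2 inputs, 2 hidden units, 1 output.
W1 = np.array([[0.50, 0.01],
               [0.20, 0.30]])
W2 = np.array([[0.80],
               [0.05]])
b1 = np.array([0.10, 0.40])
b2 = np.array([0.25])

pruned, t = bias_based_prune([W1, W2], [b1, b2])
# Smallest bias magnitude is 0.10, so the weights 0.01 and 0.05
# are severed, leaving 4 of the original 6 connections.
```

In the paper's setting this rule is applied to the networks evolved by NSGA-II, where the resulting connection count feeds the complexity objective; here `np.count_nonzero` over the pruned matrices would serve as that count.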





Editor information

Derong Liu, Shumin Fei, Zengguang Hou, Huaguang Zhang, Changyin Sun


Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Naval, P.C., Yusiong, J.P.T. (2007). An Evolutionary Multi-objective Neural Network Optimizer with Bias-Based Pruning Heuristic. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4493. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72395-0_23


  • DOI: https://doi.org/10.1007/978-3-540-72395-0_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72394-3

  • Online ISBN: 978-3-540-72395-0

  • eBook Packages: Computer Science (R0)
