
A Hybrid Gravitational Search Algorithm and Back-Propagation for Training Feedforward Neural Networks

  • Conference paper
Knowledge and Systems Engineering

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 326)

Abstract

Devising a satisfactory and efficient training algorithm for artificial neural networks (ANNs) has long been a challenging task. The Gravitational Search Algorithm (GSA) is a novel heuristic algorithm based on the law of gravity and mass interactions. Like most heuristic algorithms, it searches effectively for the global optimum but converges slowly. In contrast, the Back-Propagation (BP) algorithm converges quickly once it is near the global optimum. In this study, a hybrid of GSA and BP is proposed to exploit the advantages of both algorithms, and the resulting hybrid is employed as a new training method for feedforward neural networks (FNNs). To evaluate the proposed approach, two benchmark problems are used and the results are compared with those obtained from FNNs trained by the original GSA and BP algorithms. The experimental results show that the hybrid algorithm outperforms both GSA and BP in training FNNs.
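
The two-stage idea described in the abstract can be sketched in a few dozen lines: run GSA over flattened weight vectors to locate a promising region of the search space, then hand the best agent to back-propagation for fast local refinement. The Python sketch below illustrates this on the XOR problem; the network size (2-4-1), the GSA settings (population size, G0, alpha) and the BP learning rate are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a hybrid GSA + BP training scheme (assumed settings, not the paper's).
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 feedforward network on the XOR benchmark.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
sizes = [(2, 4), (1, 4), (4, 1), (1, 1)]            # shapes of W1, b1, W2, b2
dim = sum(r * c for r, c in sizes)                  # total number of trainable parameters

def unpack(theta):
    """Split a flat parameter vector into W1, b1, W2, b2."""
    parts, i = [], 0
    for r, c in sizes:
        parts.append(theta[i:i + r * c].reshape(r, c))
        i += r * c
    return parts

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(theta):
    W1, b1, W2, b2 = unpack(theta)
    h = sigmoid(X @ W1 + b1)                        # hidden layer
    out = sigmoid(h @ W2 + b2)                      # output layer
    return np.mean((out - y) ** 2)

# Phase 1: GSA over flattened weight vectors (global exploration).
N, iters, G0, alpha = 20, 100, 50.0, 10.0           # assumed GSA constants
pos = rng.uniform(-1, 1, (N, dim))
vel = np.zeros((N, dim))
for t in range(iters):
    fit = np.array([mse(p) for p in pos])
    best, worst = fit.min(), fit.max()
    m = (worst - fit) / (worst - best + 1e-12)       # better fitness -> larger mass
    M = m / (m.sum() + 1e-12)
    G = G0 * np.exp(-alpha * t / iters)              # gravitational constant decays over time
    acc = np.zeros_like(pos)
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            diff = pos[j] - pos[i]
            dist = np.linalg.norm(diff) + 1e-12
            acc[i] += rng.random() * G * M[j] * diff / dist
    vel = rng.random((N, 1)) * vel + acc
    pos = pos + vel
theta = pos[np.argmin([mse(p) for p in pos])].copy()  # best agent found by GSA

# Phase 2: back-propagation fine-tuning from the best GSA agent (local refinement).
lr = 0.5                                             # assumed learning rate
for _ in range(2000):
    W1, b1, W2, b2 = unpack(theta)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)              # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)               # error signal at the hidden layer
    grads = [X.T @ d_h, d_h.sum(0, keepdims=True),
             h.T @ d_out, d_out.sum(0, keepdims=True)]
    theta -= lr * np.concatenate([g.ravel() for g in grads])

print("final MSE:", mse(theta))
```

Seeding BP from the best GSA agent reflects the division of labour claimed in the abstract: the population-based search reduces the risk of poor local minima, while gradient descent supplies the fast final convergence.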





Author information

Corresponding author

Correspondence to Quang Hung Do.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Do, Q.H. (2015). A Hybrid Gravitational Search Algorithm and Back-Propagation for Training Feedforward Neural Networks. In: Nguyen, VH., Le, AC., Huynh, VN. (eds) Knowledge and Systems Engineering. Advances in Intelligent Systems and Computing, vol 326. Springer, Cham. https://doi.org/10.1007/978-3-319-11680-8_30


  • DOI: https://doi.org/10.1007/978-3-319-11680-8_30

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-11679-2

  • Online ISBN: 978-3-319-11680-8

  • eBook Packages: Engineering, Engineering (R0)
