Abstract
Designing a satisfactory and efficient training algorithm for artificial neural networks (ANNs) remains a challenging task. The Gravitational Search Algorithm (GSA) is a recent heuristic algorithm based on the law of gravity and mass interactions. Like most heuristic algorithms, it searches effectively for the global optimum but suffers from slow convergence. The Back-Propagation (BP) algorithm, by contrast, converges quickly in the neighbourhood of the global optimum. In this study, a hybrid of GSA and BP is proposed to exploit the advantages of both algorithms. The proposed hybrid algorithm is employed as a new training method for feedforward neural networks (FNNs). To investigate its performance, two benchmark problems are used and the results are compared with those of FNNs trained by the original GSA and BP algorithms. The experimental results show that the proposed hybrid algorithm outperforms both GSA and BP in training FNNs.
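The two-phase idea described above can be sketched in code: GSA first explores the weight space globally, then BP fine-tunes the best solution it found. The following is a minimal, self-contained Python illustration, not the paper's implementation: the 2-2-1 network, the XOR task, and the GSA parameters (agent count, `G0`, `alpha`) are illustrative assumptions, and the original GSA's shrinking `Kbest` neighbourhood is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR problem: a classic small benchmark for feedforward nets (illustrative choice)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(w):
    # A 2-2-1 network encoded as a flat 9-dimensional weight vector
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8:9]
    return W1, b1, W2, b2

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    return float(np.mean((sigmoid(h @ W2 + b2) - y) ** 2))

def gsa(n_agents=30, dim=9, iters=150, G0=1.0, alpha=8.0):
    """Global phase: agents attract each other with forces scaled by
    fitness-derived masses; better agents are heavier."""
    pos = rng.uniform(-3, 3, (n_agents, dim))
    vel = np.zeros_like(pos)
    best_w, best_f = pos[0].copy(), mse(pos[0])
    for t in range(iters):
        fit = np.array([mse(p) for p in pos])
        i_best = int(fit.argmin())
        if fit[i_best] < best_f:
            best_f, best_w = fit[i_best], pos[i_best].copy()
        m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)  # raw masses
        M = m / (m.sum() + 1e-12)                                # normalised
        G = G0 * np.exp(-alpha * t / iters)  # gravitational "constant" decays
        acc = np.zeros_like(pos)
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                diff = pos[j] - pos[i]
                acc[i] += rng.random() * G * M[j] * diff / (np.linalg.norm(diff) + 1e-12)
        vel = rng.random(pos.shape) * vel + acc  # stochastic velocity update
        pos = pos + vel
    return best_w

def bp_refine(w, lr=0.5, epochs=3000):
    """Local phase: plain batch back-propagation; the best weights seen
    so far are kept, so the result never gets worse than the start."""
    w = w.copy()
    best_w, best_f = w.copy(), mse(w)
    for _ in range(epochs):
        W1, b1, W2, b2 = unpack(w)
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out) * (2.0 / len(X))  # dMSE/d(pre-output)
        dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * h * (1 - h)
        dW1, db1 = X.T @ d_h, d_h.sum(axis=0)
        w = w - lr * np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])
        f = mse(w)
        if f < best_f:
            best_f, best_w = f, w.copy()
    return best_w

w_gsa = gsa()                 # coarse global solution from GSA
w_hybrid = bp_refine(w_gsa)   # BP fine-tuning around it
print(f"GSA-only MSE: {mse(w_gsa):.4f}")
print(f"Hybrid MSE:   {mse(w_hybrid):.4f}")
```

Because the fine-tuning phase retains the best weights seen, the hybrid result is never worse than the GSA-only result, mirroring the intended division of labour: GSA supplies a good basin, BP supplies fast local convergence within it.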
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Do, Q.H. (2015). A Hybrid Gravitational Search Algorithm and Back-Propagation for Training Feedforward Neural Networks. In: Nguyen, VH., Le, AC., Huynh, VN. (eds) Knowledge and Systems Engineering. Advances in Intelligent Systems and Computing, vol 326. Springer, Cham. https://doi.org/10.1007/978-3-319-11680-8_30
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-11679-2
Online ISBN: 978-3-319-11680-8
eBook Packages: Engineering (R0)