Abstract
This paper describes a genetic algorithm for training neural networks with automatic architecture generation, a parallelization of the proposed training method, and a modification of that method. It also presents a comparative analysis of the original, non-parallelized algorithm against the proposed parallelization, which splits the population into groups and exchanges individuals between them.
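The group-splitting scheme the abstract describes is commonly known as an island-model genetic algorithm: subpopulations evolve independently (a naturally parallelizable step) and periodically migrate their best individuals to neighboring groups. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation: it evolves the weights of a fixed 2-2-1 network on XOR, whereas the paper also generates the architecture automatically; all function names and parameters here are illustrative assumptions.

```python
import random
import math

# Hypothetical island-model GA sketch (not the paper's code).
# Fitness is the squared error of a fixed 2-2-1 sigmoid network on XOR;
# the paper's automatic architecture generation is not modeled here.

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_WEIGHTS = 9  # 2*2 input weights + 2 hidden biases + 2 output weights + 1 bias

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x):
    h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def error(w):
    return sum((forward(w, x) - y) ** 2 for x, y in XOR)

def mutate(w, rate=0.3, scale=0.5):
    return [wi + random.gauss(0, scale) if random.random() < rate else wi
            for wi in w]

def crossover(a, b):
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def evolve_island(pop, generations):
    # Standard elitist GA loop: keep the better half, refill with children.
    for _ in range(generations):
        pop.sort(key=error)
        parents = pop[: len(pop) // 2]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(len(pop) - len(parents))]
        pop = parents + children
    return pop

def island_ga(n_islands=4, island_size=20, epochs=10,
              gens_per_epoch=15, seed=1):
    random.seed(seed)
    islands = [[[random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
                for _ in range(island_size)] for _ in range(n_islands)]
    for _ in range(epochs):
        # Each island evolves independently -- this loop is the step
        # that a parallel implementation would run on separate workers.
        islands = [evolve_island(pop, gens_per_epoch) for pop in islands]
        # Migration: each island's best individual replaces the worst
        # individual of its neighbor in a ring topology.
        bests = [min(pop, key=error) for pop in islands]
        for i, pop in enumerate(islands):
            pop.sort(key=error)
            pop[-1] = list(bests[(i - 1) % n_islands])
    return min((min(pop, key=error) for pop in islands), key=error)

best = island_ga()
print("final XOR error:", error(best))
```

Migration is what distinguishes this from simply running several independent GAs: it lets a good genotype discovered on one island spread through the whole population while still keeping the islands loosely coupled, which is why the scheme parallelizes with little communication.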
Copyright information
© 2014 Springer International Publishing Switzerland
About this paper
Cite this paper
Bilgaeva, L., Burlov, N. (2014). Parallelization of the Genetic Algorithm in Training of the Neural Network Architecture with Automatic Generation. In: Dudin, A., Nazarov, A., Yakupov, R., Gortsev, A. (eds) Information Technologies and Mathematical Modelling. ITMM 2014. Communications in Computer and Information Science, vol 487. Springer, Cham. https://doi.org/10.1007/978-3-319-13671-4_6
DOI: https://doi.org/10.1007/978-3-319-13671-4_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-13670-7
Online ISBN: 978-3-319-13671-4
eBook Packages: Computer Science (R0)