A Cognitively Inspired Hybridization of Artificial Bee Colony and Dragonfly Algorithms for Training Multi-layer Perceptrons

Published in: Cognitive Computation

Abstract

The objective of this article is twofold. First, we introduce a cognitively inspired hybrid metaheuristic that combines the strengths of two existing metaheuristics: the artificial bee colony (ABC) algorithm and the dragonfly algorithm (DA). The aim of this hybridization is to mitigate slow convergence and entrapment in local optima by striking a good balance between the global and local search components of the constituent algorithms. Second, we use the proposed metaheuristic to train a multi-layer perceptron (MLP) as an alternative to existing traditional and metaheuristic-based learning algorithms, with the goal of improving overall accuracy by optimizing the MLP's weights and biases. The proposed hybrid ABC/DA (HAD) algorithm comprises three main components: the static and dynamic swarming behavior phase of DA and two search phases of ABC. The first component performs global search (the DA phase), the second performs local search (the onlooker phase), and the third performs a further global search (the modified scout bee phase). The resulting optimizer is employed to train an MLP toward a set of weights and biases that yields higher performance than traditional learning algorithms and other metaheuristic optimizers. The proposed algorithm was first evaluated on 33 benchmark functions to test its performance on numerical optimization problems. HAD was then evaluated as an MLP trainer on six standard classification datasets. In both cases, its performance was compared with that of several recent and established metaheuristics from swarm intelligence and evolutionary computing. Experimental results show that the HAD algorithm is clearly superior to the standard ABC and DA algorithms, as well as to other well-known algorithms, in terms of best value achieved, convergence speed, avoidance of local minima, and accuracy of the trained MLPs. The proposed algorithm is thus a promising metaheuristic technique for general numerical optimization and for training MLPs; specific applications and use cases remain to be explored fully, but they are supported by the encouraging results of this study.
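
To make the three-phase structure concrete, the following minimal Python sketch shows one way such a trainer could be organized. It is an illustration under stated assumptions, not the authors' implementation: the names `mlp_mse` and `had_optimize`, the simplified DA-style move (toward the best solution, away from the worst), the one-coordinate onlooker perturbation, and all parameter values (swarm size, abandonment limit, bounds, layer sizes) are choices made for this example.

```python
import numpy as np

def mlp_mse(vec, X, y, n_in, n_hidden):
    """Decode a flat parameter vector into a one-hidden-layer MLP and return
    its mean squared error on (X, y). Flattening all weights and biases into
    a single vector is the usual encoding for metaheuristic MLP training;
    the architecture here is illustrative."""
    i = 0
    W1 = vec[i:i + n_in * n_hidden].reshape(n_in, n_hidden)
    i += n_in * n_hidden
    b1 = vec[i:i + n_hidden]; i += n_hidden
    W2 = vec[i:i + n_hidden]; i += n_hidden
    b2 = vec[i]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))    # sigmoid hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output neuron
    return float(np.mean((out - y) ** 2))

def had_optimize(fitness, dim, n_food=30, max_iter=200, limit=20,
                 lb=-1.0, ub=1.0):
    """Simplified three-phase loop in the spirit of HAD: a DA-style global
    move, an ABC onlooker local search, and an ABC scout restart."""
    rng = np.random.default_rng(0)
    foods = rng.uniform(lb, ub, (n_food, dim))
    fits = np.array([fitness(f) for f in foods])
    trials = np.zeros(n_food, dtype=int)

    for _ in range(max_iter):
        # Phase 1 (DA-inspired, global): pull each solution toward the best
        # ("food") and push it away from the worst ("enemy") -- a crude
        # stand-in for the five swarming weights of the full DA.
        best_v, worst_v = foods[fits.argmin()], foods[fits.argmax()]
        for i in range(n_food):
            step = rng.random(dim) * (best_v - foods[i]) \
                 - 0.1 * rng.random(dim) * (worst_v - foods[i])
            cand = np.clip(foods[i] + step, lb, ub)
            f = fitness(cand)
            if f < fits[i]:
                foods[i], fits[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1

        # Phase 2 (ABC onlooker, local): select food sources with probability
        # proportional to quality, then perturb one coordinate toward a
        # randomly chosen neighbour.
        probs = fits.max() - fits + 1e-12
        probs /= probs.sum()
        for _ in range(n_food):
            i = rng.choice(n_food, p=probs)
            k, j = rng.integers(n_food), rng.integers(dim)
            cand = foods[i].copy()
            cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
            cand = np.clip(cand, lb, ub)
            f = fitness(cand)
            if f < fits[i]:
                foods[i], fits[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1

        # Phase 3 (ABC scout, global): abandon solutions that stagnated for
        # more than `limit` trials and re-initialize them at random.
        for i in np.where(trials > limit)[0]:
            foods[i] = rng.uniform(lb, ub, dim)
            fits[i] = fitness(foods[i])
            trials[i] = 0

    best = int(fits.argmin())
    return foods[best], float(fits[best])

# Example: train a tiny 2-4-1 MLP on XOR (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
dim = 2 * 4 + 4 + 4 + 1  # weights and biases of the 2-4-1 network (17)
weights, err = had_optimize(lambda v: mlp_mse(v, X, y, 2, 4), dim,
                            lb=-5.0, ub=5.0)
print(f"best MSE: {err:.4f}")
```

Note that the optimizer only ever evaluates the fitness function, so any error measure can stand in for `mlp_mse`; this black-box property is what lets metaheuristic trainers sidestep the gradient-based weaknesses (slow convergence, local minima) that the abstract attributes to traditional MLP learning algorithms.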

Acknowledgements

The authors sincerely thank the editors and anonymous reviewers for their helpful suggestions on improving the presentation of this article.

Funding

This study was funded by a Universiti Sains Malaysia Fellowship (grant number APEX (308/AIPS/415401)) and by an RUI grant (Account No. 1001/PKOMP/8014017), also from Universiti Sains Malaysia.

Author information

Corresponding author

Correspondence to Waheed A. H. M. Ghanem.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

About this article

Cite this article

Ghanem, W.A.H.M., Jantan, A. A Cognitively Inspired Hybridization of Artificial Bee Colony and Dragonfly Algorithms for Training Multi-layer Perceptrons. Cogn Comput 10, 1096–1134 (2018). https://doi.org/10.1007/s12559-018-9588-3
