
SOMGA for Large Scale Function Optimization and Its Application

Chapter in Self-Organizing Migrating Algorithm, part of the book series Studies in Computational Intelligence (SCI, volume 626)

Abstract

Self-Organizing Migrating Genetic Algorithm (SOMGA) is a hybridized variant of the Genetic Algorithm (GA) inspired by features of the Self-Organizing Migrating Algorithm (SOMA), presented by Deep and Singh (IEEE Congr Evol Comput, pp 2796–2803, 2007) [1]. SOMGA combines features of the binary-coded GA and the real-coded SOMA in such a way that the diversity of the solution space is maintained and thoroughly exploited while the number of function evaluations is kept low. It works with a very small population and aims to reach the global optimum faster, in fewer function evaluations. Earlier, SOMGA had been used to solve problems of up to 10 dimensions with a population size of only 10. This chapter is divided into three sections. In the first section, the possibility of using SOMGA to solve large-scale problems (up to 200 dimensions) is analyzed with the help of 13 test problems. The motivation for this extension is that SOMGA works with a very small population: to solve large-scale problems of dimension 200, a population size of only 20 is required. On the basis of the results, it is concluded that SOMGA solves large-scale global optimization problems efficiently with a small population and hence requires fewer function evaluations. In the second section, two real-life problems from the field of engineering are solved using SOMGA as an application. In the third section, two approaches to hybridizing a population-based technique are compared: incorporating a deterministic local search into it, or merging it with another population-based technique. To see the effect of both approaches on GA, the results of SOMGA on five test problems are compared with the results of a memetic algorithm (MA: GA + deterministic local search). The results clearly indicate that SOMGA is less expensive and more effective in solving these problems.
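Because the abstract only summarizes the method, a minimal sketch may help fix ideas. The Python outline below is illustrative only, not the authors' implementation: the real-coded GA operators and the SOMA parameters (path length, step, PRT) are assumptions based on standard SOMA practice and the cited 2007 paper [1]; the actual SOMGA uses a binary-coded GA phase.

```python
import numpy as np

def somga_sketch(f, lower, upper, pop_size=10, generations=50,
                 path_length=3.0, step=0.31, prt=0.1, seed=0):
    """Illustrative SOMGA-style loop: a GA variation phase followed by a
    SOMA 'AllToOne' migration toward the current best individual."""
    rng = np.random.default_rng(seed)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    n = lower.size
    pop = rng.uniform(lower, upper, size=(pop_size, n))
    fit = np.array([f(x) for x in pop])
    for _ in range(generations):
        # GA phase (assumed operators): binary tournament selection,
        # arithmetic crossover, sparse Gaussian mutation; a child replaces
        # its slot only if it improves the fitness there.
        a, b = rng.integers(pop_size, size=(2, pop_size))
        winners = np.where((fit[a] < fit[b])[:, None], pop[a], pop[b])
        mates = winners[rng.permutation(pop_size)]
        w = rng.random((pop_size, 1))
        children = w * winners + (1.0 - w) * mates
        mutate = rng.random((pop_size, n)) < 0.05
        children += mutate * rng.normal(0.0, 0.01, (pop_size, n)) * (upper - lower)
        children = np.clip(children, lower, upper)
        cfit = np.array([f(x) for x in children])
        improved = cfit < fit
        pop[improved], fit[improved] = children[improved], cfit[improved]
        # SOMA phase: every individual samples points along its path to the
        # leader; a random PRT mask perturbs the migration direction.
        leader = pop[fit.argmin()].copy()
        for i in range(pop_size):
            best_x, best_f = pop[i].copy(), fit[i]
            for t in np.arange(step, path_length + 1e-9, step):
                mask = rng.random(n) < prt
                cand = np.clip(pop[i] + (leader - pop[i]) * t * mask, lower, upper)
                cf = f(cand)
                if cf < best_f:
                    best_x, best_f = cand.copy(), cf
            pop[i], fit[i] = best_x, best_f
    return pop[fit.argmin()], fit.min()

# Example: 10-dimensional sphere function on [-5.12, 5.12]^10.
x_best, f_best = somga_sketch(lambda x: float(np.sum(x * x)),
                              -5.12 * np.ones(10), 5.12 * np.ones(10))
```

The point of this structure, as the abstract describes, is that the SOMA migration exploits the neighbourhood of the current leader while the GA phase keeps the small population diverse, so far fewer function evaluations are needed than with either method alone.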


References

  1. Deep, K., Singh, D.: A new hybrid self-organizing migrating genetic algorithm for function optimization. In: IEEE Congress on Evolutionary Computation, pp. 2796–2803 (2007)

  2. Grefenstette, J.: Lamarckian learning in multi-agent environments. In: Proceedings of the Fourth International Conference on Genetic Algorithms. Morgan Kaufmann, San Mateo, CA (1994)

  3. Kasprzyk, G.P., Jaskuła, M.: Application of hybrid genetic algorithms for deconvolution of electrochemical responses in SSLSV method. J. Electroanal. Chem. 567, 39–66 (2004)

  4. Chelouah, R., Siarry, P.: A hybrid method combining continuous Tabu search and Nelder–Mead simplex algorithm for global optimization of multiminima functions. Eur. J. Oper. Res. 161, 636–654 (2005)

  5. Wang, L., Tang, F., Wu, H.: Hybrid genetic algorithm based on quantum computing for numerical optimization and parameter estimation. Appl. Math. Comput. 171, 1141–1156 (2005)

  6. Javadi, A., Farmani, A.R., Tan, T.P.: A hybrid intelligent genetic algorithm. Adv. Eng. Inf. 19, 255–262 (2005)

  7. Fan, S.K.S., Liang, Y.C., Zahara, E.: A genetic algorithm and a particle swarm optimizer hybridized with Nelder–Mead simplex search. Comput. Ind. Eng. 50, 401–425 (2006)

  8. Hwang, F.S., Song, H.R.: A hybrid real-parameter genetic algorithm for function optimization. Adv. Eng. Inf. 20, 7–21 (2006)

  9. Zhang, G., Lu, H.: Hybrid real-coded genetic algorithm with quasi-simplex technique. Int. J. Comput. Sci. Netw. Secur. 6(10), 246–255 (2006)

  10. Wei, L., Zhao, M.: A niche hybrid genetic algorithm for global optimization of continuous multimodal functions. Appl. Math. Comput. 160, 649–661 (2005)

  11. Premalatha, K., Natarajan, A.M.: Hybrid PSO and GA for global optimization. Int. J. Open Probl. Comput. Math. 2 (2009)

  12. Khosravi, A., Lari, A., Addeh, J.: A new hybrid of evolutionary and conventional optimization algorithms. Appl. Math. Sci. 6, 815–825 (2012)

  13. Ghatei, S., et al.: A new hybrid algorithm for optimization using PSO and GDA. J. Basic Appl. Sci. Res. 2, 2336–2341 (2012)

  14. Esmin, A., Matwin, S.: A hybrid particle swarm optimization algorithm with genetic mutation. Int. J. Innov. Comput. Inf. Control 9, 1919–1934 (2013)

  15. Zelinka, I., Lampinen, J.: SOMA: Self-Organizing Migrating Algorithm. In: MENDEL 2000, 6th International Conference on Soft Computing, Brno, Czech Republic, vol. 80(2), p. 214 (2000)

  16. Oplatkova, Z., Zelinka, I.: Investigation on Shannon–Kotelnik theorem impact on SOMA algorithm performance. In: Merkuryev, Y., Zobel, R. (eds.) Proceedings of the 19th European Conference on Modelling and Simulation (2005)

  17. Zelinka, I.: Analytic programming by means of SOMA algorithm. In: Proceedings of the 8th International Conference on Soft Computing MENDEL '02, Brno, Czech Republic, pp. 93–101 (2002). ISBN 80-214-2135-5

  18. Nolle, L., Zelinka, I.: SOMA applied to optimum work roll profile selection in the hot rolling of wide steel. In: Proceedings of the 17th European Simulation Multiconference ESM 2003, Nottingham, UK, pp. 53–58 (2003). ISBN 3-936150-25-7

  19. Nolle, L., Zelinka, I., Hopgood, A.A., Goodyear, A.: Comparison of a self-organizing migration algorithm with simulated annealing and differential evolution for automated waveform tuning. Adv. Eng. Softw. 36, 645–653 (2005)

  20. Nolle, L.: SASS applied to optimum work roll profile selection in the hot rolling of wide steel. Knowl. Based Syst. 20(2), 203–208 (2007)

  21. Zelinka, I., Lampinen, J., Nolle, L.: On the theoretical proof of convergence for a class of SOMA search algorithms. In: Proceedings of the 7th International MENDEL Conference on Soft Computing, Brno, CZ, pp. 103–110, 6–8 June 2001. ISBN 80-214-1894-X

  22. Zelinka, I., Oplatkova, Z., Nolle, L.: Boolean symmetry function synthesis by means of arbitrary evolutionary algorithms: a comparative study. In: Proceedings of the 18th European Simulation Multiconference ESM 2004, Magdeburg, Germany, pp. 143–148, June 2004. ISBN 3-936150-35-4

  23. Onwubolu, G.C., Babu, B.V.: New Optimization Techniques in Engineering. Springer, Heidelberg (2004). ISBN 3-540-20167-X

  24. Prasad, B.N., Saini, J.S.: Optimal thermohydraulic performance of artificially roughened solar air heaters. Solar Energy 47, 91–96 (1991)

  25. Pant, M.: Genetic Algorithms for Global Optimization and their Applications. Ph.D. thesis, Department of Mathematics, IIT Roorkee (formerly University of Roorkee) (2003)

  26. Tsutsui, S., Fujimoto, Y.: Phenotypic forking genetic algorithm (p-fGA). In: IEEE International Conference on Evolutionary Computation (ICEC '95), vol. 2, pp. 556–572 (1995)

  27. Bazaraa, M.S., Sherali, H.D., Shetty, C.M.: Nonlinear Programming: Theory and Algorithms. Wiley, New York (1993)

  28. Ali, M.M., Khompatraporn, C., Zabinsky, Z.B.: A numerical evaluation of several global optimization algorithms on selected benchmark test problems. J. Global Optim. 31, 635–672 (2005)


Author information

Correspondence to Dipti Singh.

Appendix

This Appendix contains the list of 13 benchmark test problems taken from the literature, which are used to evaluate the performance of the algorithm. These are unconstrained nonlinear optimization problems having a number of local as well as global optimal solutions. The problems vary in difficulty and include both unimodal and multimodal functions.

Problem 1: (Cosine Mixture Problem)

This problem is the Cosine Mixture function. The global optimum of this function is at (0, 0,…, 0) with fmin = −0.1n, where n is the dimension of the problem. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = \sum\limits_{i = 1}^{n} {x_{i}^{2} } - 0.1\sum\limits_{i = 1}^{n} {\cos \left( {5\pi x_{i} } \right)} ,{\kern 1pt} \quad for\;x_{i} \in \left[ { - 1,{\kern 1pt} 1} \right]. $$
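An illustrative NumPy transcription (ours, not from the chapter) makes the stated optimum easy to check:

```python
import numpy as np

def cosine_mixture(x):
    """Cosine Mixture: f(0, ..., 0) = -0.1 * n on [-1, 1]^n."""
    x = np.asarray(x, dtype=float)
    return np.sum(x**2) - 0.1 * np.sum(np.cos(5.0 * np.pi * x))

assert np.isclose(cosine_mixture(np.zeros(4)), -0.4)  # -0.1 * n with n = 4
```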

Problem 2: (Exponential Problem)

This problem is the Exponential function. The global optimum of this function is at (0, 0,…, 0) with fmin = −1. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = - \exp \left( { - 0.5\sum\limits_{i = 1}^{n} {x_{i}^{2} } } \right),\quad for\;x_{i} \in \left[ { - 1,1} \right]. $$
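A minimal transcription (illustrative) confirms fmin = −1 at the origin:

```python
import numpy as np

def exponential(x):
    """Exponential function: f(0, ..., 0) = -1 on [-1, 1]^n."""
    return -np.exp(-0.5 * np.sum(np.asarray(x, dtype=float)**2))
```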

Problem 3: (Ackley Function)

This problem is the Ackley function. Its surface has numerous local minima due to the exponential terms. Any search algorithm based on gradient information will be trapped in a local optimum, but a search strategy that analyzes a wider region is able to cross the valleys among the optima and achieve better results. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ f\left( x \right) = 20 + e - 20e^{{ - \frac{1}{5}\sqrt {\frac{1}{n}\sum\limits_{i = 1}^{n} {x_{i}^{2} } } }} - e^{{\frac{1}{n}\sum\limits_{i = 1}^{n} {\cos \left( {2\pi x_{i} } \right)} }} ,\quad for\;x_{i} \in \left[ { - 15,30} \right]. $$
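A direct transcription (illustrative) reproduces f(0, …, 0) = 0:

```python
import numpy as np

def ackley(x):
    """Ackley function: f(0, ..., 0) = 0 on [-15, 30]^n."""
    x = np.asarray(x, dtype=float)
    return (20.0 + np.e
            - 20.0 * np.exp(-0.2 * np.sqrt(np.mean(x**2)))
            - np.exp(np.mean(np.cos(2.0 * np.pi * x))))
```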

Problem 4: (Sphere Function Problem)

The next problem is the Sphere function. It is continuous, convex, and unimodal. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = \sum\limits_{i = 1}^{n} {x_{i}^{2} } ,\quad for\;x_{i} \in \left[ { - 5.12,5.12} \right]. $$
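As a one-line illustrative transcription:

```python
import numpy as np

def sphere(x):
    """Sphere function: f(0, ..., 0) = 0 on [-5.12, 5.12]^n."""
    return np.sum(np.asarray(x, dtype=float)**2)
```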

Problem 5: (Griewank Function)

This problem is a widely employed test function for global optimization, the Griewank function. While it has an exponentially increasing number of local minima as its dimension increases, a simple multistart algorithm turns out to detect the global minimum more and more easily as the dimension grows. The optima of this function are regularly distributed. The number of local minima for arbitrary n is unknown, but in the two-dimensional case there are some 500 local minima. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ f(x) = 1 + \sum\limits_{i = 1}^{n} {\frac{{x_{i}^{2} }}{4000}} - \prod\limits_{i = 1}^{n} {\cos \left( {\frac{{x_{i} }}{\sqrt i }} \right)} ,\quad for\;x_{i} \in \left[ { - 5.12,5.12} \right]. $$
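An illustrative transcription, with the index i entering through the cosine product:

```python
import numpy as np

def griewank(x):
    """Griewank function: f(0, ..., 0) = 0."""
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return 1.0 + np.sum(x**2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))
```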

Problem 6: (Axis Parallel Hyper Ellipsoid)

This problem is the Axis Parallel Hyper-Ellipsoid function. It is similar to the sphere function and is also known as the weighted sphere model. It is continuous, convex, and unimodal. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = \sum\limits_{i = 1}^{n} {ix_{i}^{2} } ,\quad for\;x_{i} \in \left[ { - 5.12,5.12} \right]. $$
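An illustrative transcription of the weighted sum:

```python
import numpy as np

def axis_parallel_hyper_ellipsoid(x):
    """Weighted sphere model: f(0, ..., 0) = 0 on [-5.12, 5.12]^n."""
    x = np.asarray(x, dtype=float)
    return np.sum(np.arange(1, x.size + 1) * x**2)
```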

Problem 7: (Schwefel’s Double Sum)

This problem is Schwefel’s double sum function. It is an extension of the axis parallel hyper-ellipsoid function and produces a rotated hyper-ellipsoid. It is continuous, convex, and unimodal. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = \sum\limits_{i = 1}^{n} {\left( {\sum\limits_{j = 1}^{i} {x_{j} } } \right)^{2} } ,\quad for\;x_{i} \in \left[ { - 65.536,\,65.536} \right]. $$
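An illustrative transcription; a cumulative sum evaluates the nested sums in O(n):

```python
import numpy as np

def schwefel_double_sum(x):
    """Schwefel's double sum: f(0, ..., 0) = 0 on [-65.536, 65.536]^n."""
    return np.sum(np.cumsum(np.asarray(x, dtype=float))**2)
```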

Problem 8: (Rastrigin Function)

This problem is the Rastrigin function. It is an extended form of the sphere function with a cosine modulator term 10 cos(2πxi). The function has a large number of local minima (the exact number is not known) whose values increase with the distance from the global minimum. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ f\left( x \right) = 10n + \sum\limits_{i = 1}^{n} {\left( {x_{i}^{2} - 10\cos \left( {2\pi x_{i} } \right)} \right)} ,\quad for\;x_{i} \in \left[ { - 5.12,5.12} \right]. $$
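An illustrative transcription:

```python
import numpy as np

def rastrigin(x):
    """Rastrigin function: f(0, ..., 0) = 0 on [-5.12, 5.12]^n."""
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))
```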

Problem 9: (Rosenbrock Function)

This problem is the Rosenbrock function, also known as the banana function. It is a continuous, differentiable, unimodal, and non-separable function. Its difficulty arises from the nonlinear interaction between parameters: the global optimum lies inside a long, narrow, parabolic-shaped flat valley. Its global minimum is at (1, 1,…, 1) with fmin = 0. The functional form is as follows:

$$ f\left( x \right) = \sum\limits_{i = 1}^{n - 1} {(100(x_{i + 1} - x_{i}^{2} )^{2} + \left( {x_{i}^{{}} - 1} \right)}^{2} ),\quad for\;x_{i} \in \left[ { - 2.048,2.048} \right]. $$
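An illustrative vectorized transcription over consecutive coordinate pairs:

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock function: f(1, ..., 1) = 0 on [-2.048, 2.048]^n."""
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (x[:-1] - 1.0)**2)
```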

Problem 10: (Schwefel Function)

This problem is the Schwefel function. Its contour is made up of a great number of peaks and valleys. The function has a second-best minimum far from the global minimum, so it is difficult for many algorithms to locate the global optimum. Its global minimum is at approximately (420.9687,…, 420.9687) with fmin = 0. The functional form is as follows:

$$ f\left( x \right) = 418.9829n - \sum\limits_{i = 1}^{n} {\left( {x_{i} \sin \sqrt {\left| {x_{i} } \right|} } \right)} ,\quad for\;x_{i} \in \left[ { - 500,500} \right]. $$
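An illustrative transcription; evaluating it near x_i ≈ 420.9687 shows the residual is numerically close to zero:

```python
import numpy as np

def schwefel(x):
    """Schwefel function: f ~ 0 at x_i ~ 420.9687 on [-500, 500]^n."""
    x = np.asarray(x, dtype=float)
    return 418.9829 * x.size - np.sum(x * np.sin(np.sqrt(np.abs(x))))
```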

Problem 11: (Zakharov’s Problem)

This problem is the Zakharov function. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = \sum\limits_{i = 1}^{n} {x_{i}^{2} } + \left( {\sum\limits_{i = 1}^{n} {\frac{i}{2}x_{i} } } \right)^{2} + \left( {\sum\limits_{i = 1}^{n} {\frac{i}{2}x_{i} } } \right)^{4} ,\quad for\;x_{i} \in \left[ { - 5.12,5.12} \right]. $$
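An illustrative transcription that shares the weighted sum between the quadratic and quartic terms:

```python
import numpy as np

def zakharov(x):
    """Zakharov function: f(0, ..., 0) = 0 on [-5.12, 5.12]^n."""
    x = np.asarray(x, dtype=float)
    s = np.sum(0.5 * np.arange(1, x.size + 1) * x)
    return np.sum(x**2) + s**2 + s**4
```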

Problem 12: (Ellipsoidal Function)

This problem is the Ellipsoidal function. Its global minimum is at (1, 2,…, n) with fmin = 0. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = \sum\limits_{i = 1}^{n} {\left( {x_{i} - i} \right)^{2} } ,\quad for\;x_{i} \in \left[ { - n,n} \right]. $$
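An illustrative transcription with the shifted optimizer x_i = i:

```python
import numpy as np

def ellipsoidal(x):
    """Ellipsoidal function: f(1, 2, ..., n) = 0 on [-n, n]^n."""
    x = np.asarray(x, dtype=float)
    return np.sum((x - np.arange(1, x.size + 1))**2)
```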

Problem 13: (Schwefel Problem 4)

This problem is the Schwefel Problem 4 function, which minimizes the maximum coordinate modulus. Its global minimum is at (0, 0,…, 0) with fmin = 0. The functional form is as follows:

$$ \mathop {\hbox{min} }\limits_{x} f\left( x \right) = \mathop {\hbox{max} }\limits_{i} \left\{ {\left| {x_{i} } \right|,\;1 \le i \le n} \right\},\quad for\;x_{i} \in \left[ { - 100,100} \right]. $$
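An illustrative transcription of the max-modulus form:

```python
import numpy as np

def schwefel_problem4(x):
    """Schwefel Problem 4: f(0, ..., 0) = 0 on [-100, 100]^n."""
    return float(np.max(np.abs(np.asarray(x, dtype=float))))
```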


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Singh, D., Deep, K. (2016). SOMGA for Large Scale Function Optimization and Its Application. In: Davendra, D., Zelinka, I. (eds) Self-Organizing Migrating Algorithm. Studies in Computational Intelligence, vol 626. Springer, Cham. https://doi.org/10.1007/978-3-319-28161-2_9


  • DOI: https://doi.org/10.1007/978-3-319-28161-2_9


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-28159-9

  • Online ISBN: 978-3-319-28161-2

  • eBook Packages: Engineering, Engineering (R0)
