Multilevel Optimization Algorithms Based on Metamodel- and Fitness Inheritance-Assisted Evolutionary Algorithms

  • Chapter
Computational Intelligence in Expensive Optimization Problems

Part of the book series: Adaptation, Learning, and Optimization (ALO, volume 2)

Abstract

This chapter is concerned with the efficient use of metamodel-assisted evolutionary algorithms, built in multilevel or hierarchical schemes, for the solution of computationally expensive optimization problems. Existing methods developed by other researchers or by the authors' group are reviewed, and a new enhancement based on fitness inheritance is proposed. Whereas conventional evolutionary algorithms require a great number of calls to the evaluation software, the use of low-cost surrogates or metamodels, trained on the fly on previously evaluated individuals in order to pre-evaluate the evolving populations, noticeably reduces the CPU cost of an optimization. Since metamodel training requires a minimum number of previous evaluations, the starting population is evaluated on the problem-specific model. Fitness inheritance is introduced in this context so as to approximate objective function values in place of metamodels. In addition, to profit from the availability of evaluation or parameterization models of lower fidelity and CPU cost and/or of refinement methods, a multilevel search algorithm, which also relies on metamodels, is presented. The algorithm may optionally operate as hierarchical-distributed (many levels performing distributed optimization) or distributed-hierarchical (more than one sub-population undergoing its own hierarchical optimization) to further reduce the design cycle time. The proposed algorithms are generic and can be used to solve any kind of optimization problem. Here, aerodynamic shape optimization problems, including turbomachinery applications, are used to demonstrate the efficiency of the proposed methods. A new, computationally demanding application, namely the optimization of a 3D compressor blade, is also presented.
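To make the pre-evaluation idea concrete, the following is a minimal sketch of one possible generation loop of a metamodel-assisted EA with inexact pre-evaluation and fitness inheritance. It is an illustration under assumed conditions, not the authors' EASY implementation: the "expensive" solver is mimicked by an analytic function, the surrogate is a crude inverse-distance interpolant standing in for a properly trained metamodel (e.g. an RBF network trained on the fly), and all names and parameter values are hypothetical.

```python
# Minimal sketch (hypothetical, not the chapter's actual code) of a
# metamodel-assisted EA with inexact pre-evaluation (IPE) and fitness inheritance.
import math
import random

DIM, POP, EXACT_PER_GEN, MIN_ARCHIVE, GENS = 5, 20, 5, 30, 30   # arbitrary values

def expensive_eval(x):                       # stand-in for a costly CFD evaluation
    return sum(xi * xi for xi in x)          # sphere function (minimisation)

def surrogate_predict(archive, x):           # cheap approximation from past exact evaluations
    num = den = 0.0
    for xa, fa in archive:
        w = 1.0 / (math.dist(x, xa) ** 2 + 1e-12)
        num += w * fa
        den += w
    return num / den

def make_child(p1, p2):
    # Crossover + mutation; the child's provisional fitness is inherited as the
    # parental average (fitness inheritance), used only for screening.
    genes = [random.choice(g) + random.gauss(0.0, 0.05) for g in zip(p1[0], p2[0])]
    return genes, 0.5 * (p1[1] + p2[1])

random.seed(0)
population = [[random.uniform(-5.0, 5.0) for _ in range(DIM)] for _ in range(POP)]
population = [(x, expensive_eval(x)) for x in population]   # starting population: exact calls
archive = list(population)                                  # database of exactly evaluated individuals

for gen in range(GENS):
    offspring = [make_child(*random.sample(population, 2)) for _ in range(POP)]
    if len(archive) < MIN_ARCHIVE:
        # Too few exact evaluations to train a surrogate: screen the offspring
        # using their inherited fitness values instead.
        screened = offspring
    else:
        # Inexact pre-evaluation: score all offspring on the metamodel.
        screened = [(x, surrogate_predict(archive, x)) for x, _ in offspring]
    screened.sort(key=lambda xf: xf[1])
    # Only the most promising offspring are re-evaluated on the expensive model.
    promoted = [(x, expensive_eval(x)) for x, _ in screened[:EXACT_PER_GEN]]
    archive.extend(promoted)
    population = sorted(population + promoted, key=lambda xf: xf[1])[:POP]

print("best exactly evaluated objective:", population[0][1])
```

The key point is the screening step: per generation, only EXACT_PER_GEN of the POP offspring reach the expensive model, while the rest are filtered out by the inherited or surrogate-predicted fitness.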
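Along the same lines, the multilevel idea can be sketched with two levels of different evaluation fidelity and periodic migration between them. Again this is a schematic illustration under assumed conditions, not the chapter's exact scheme: both fidelity models are toy analytic functions and all names and parameters are hypothetical.

```python
# Minimal sketch (hypothetical) of a two-level hierarchical EA: a cheap
# low-fidelity level explores broadly and periodically promotes its best
# designs to a high-fidelity level, where they are re-evaluated and refined.
import random

DIM, POP, GENS, MIGRATE_EVERY = 5, 20, 40, 5    # arbitrary values

def low_fidelity(x):                    # e.g. a coarse-grid or simplified flow model
    return sum(xi * xi for xi in x)

def high_fidelity(x):                   # e.g. the full, expensive flow model
    return sum(xi * xi for xi in x) + 0.1 * sum(abs(xi) for xi in x)

def init_population(fitness):
    pop = [[random.uniform(-5.0, 5.0) for _ in range(DIM)] for _ in range(POP)]
    return sorted(((x, fitness(x)) for x in pop), key=lambda xf: xf[1])

def evolve(pop, fitness, sigma=0.1):
    children = []
    for _ in range(len(pop)):
        p1, p2 = random.sample(pop, 2)
        genes = [random.choice(g) + random.gauss(0.0, sigma) for g in zip(p1[0], p2[0])]
        children.append((genes, fitness(genes)))
    return sorted(pop + children, key=lambda xf: xf[1])[:len(pop)]

random.seed(1)
coarse = init_population(low_fidelity)   # exploration level: many cheap evaluations
fine = init_population(high_fidelity)    # refinement level: few expensive evaluations

for gen in range(GENS):
    coarse = evolve(coarse, low_fidelity)
    fine = evolve(fine, high_fidelity)
    if gen % MIGRATE_EVERY == 0:
        # Inter-level migration: the coarse level's best designs are re-evaluated
        # with the high-fidelity model and injected into the fine-level population.
        promoted = [(x, high_fidelity(x)) for x, _ in coarse[:2]]
        fine = sorted(fine + promoted, key=lambda xf: xf[1])[:POP]

print("best high-fidelity objective:", fine[0][1])
```

Each level could additionally run its own metamodel-assisted pre-evaluation (as in the previous sketch), or be split into sub-populations, giving the hierarchical-distributed and distributed-hierarchical variants described above.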

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Giannakoglou, K.C., Kampolis, I.C. (2010). Multilevel Optimization Algorithms Based on Metamodel- and Fitness Inheritance-Assisted Evolutionary Algorithms. In: Tenne, Y., Goh, C.-K. (eds) Computational Intelligence in Expensive Optimization Problems. Adaptation, Learning, and Optimization, vol 2. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10701-6_3

  • DOI: https://doi.org/10.1007/978-3-642-10701-6_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-10700-9

  • Online ISBN: 978-3-642-10701-6

  • eBook Packages: Engineering (R0)
