Abstract
The goal of this paper is to model hypothesis testing. A “real situation” is given in the form of a response surface, defined by an expensive, derivative-free, continuous objective function. An ideal hypothesis corresponds to a global minimum of this function; hypothesis testing is thus converted into optimization of the response surface. First, the objective function is evaluated at a few points. Then a hypothetical (surrogate) surface landscape is created from an ensemble of approximations of the objective function. The approximations are produced by neural networks trained on the already evaluated samples. The hypothesis landscape, adapted by a merit function, estimates the possibility of obtaining at a given point a value better than the best value achieved so far among the evaluated points. The most promising point (a minimum of the adapted function) is used as the next sample point for the true expensive objective function. Its value is then used to retrain the neural networks, creating a new hypothesis landscape. The results suggest that (1) to find a global minimum, it may be useful to estimate the whole response surface, and therefore also to explore points where maxima are predicted, and (2) an ensemble of modules predicting the next sample point from the same set of sample points can be more advantageous than a single neural network predictor.
This work was supported by grants # 1/7336/20 and # 1/5229/98 of the Slovak Republic Grant Agency.
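The surrogate-assisted loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the one-dimensional toy objective, the random-cosine-feature regressors (a cheap stand-in for the paper's neural-network approximators), and the lower-confidence-bound merit function are all assumptions made for the example. The key idea survives the simplification: where the ensemble members disagree, the surrogate landscape is uncertain, so such points are explored even when the predicted mean value is poor.

```python
import numpy as np

def expensive_objective(x):
    # Toy stand-in for the expensive response surface (hypothetical).
    return np.sin(3 * x) + 0.5 * x**2

def fit_ensemble(X, y, n_models=5, n_feats=30, rng=None):
    """Fit an ensemble of random-cosine-feature regressors on the
    evaluated samples; each member uses different random features,
    so members disagree away from the data."""
    rng = rng if rng is not None else np.random.default_rng(0)
    models = []
    for _ in range(n_models):
        W = rng.normal(scale=3.0, size=n_feats)
        b = rng.uniform(0.0, 2 * np.pi, size=n_feats)
        Phi = np.cos(np.outer(X, W) + b)              # feature matrix
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # min-norm fit
        models.append((W, b, w))
    return models

def predict(models, X):
    """Ensemble mean and spread: the surrogate 'hypothesis landscape'."""
    preds = [np.cos(np.outer(X, W) + b) @ w for W, b, w in models]
    return np.mean(preds, axis=0), np.std(preds, axis=0)

def propose_next(models, grid, kappa=2.0):
    """Merit function (one possible choice): lower confidence bound.
    Large ensemble disagreement lowers the merit value, so uncertain
    regions are sampled even where the predicted mean is high."""
    mean, std = predict(models, grid)
    return grid[np.argmin(mean - kappa * std)]

# Surrogate-assisted optimization loop.
rng = np.random.default_rng(1)
grid = np.linspace(-2.0, 2.0, 401)
X = rng.uniform(-2.0, 2.0, size=4)        # a few initial evaluations
y = expensive_objective(X)
for _ in range(10):
    models = fit_ensemble(X, y, rng=rng)
    x_next = propose_next(models, grid)   # most promising point
    X = np.append(X, x_next)              # evaluate the true objective
    y = np.append(y, expensive_objective(x_next))
```

Each iteration spends exactly one true evaluation, which is the point of the method: the cheap ensemble is queried over the whole grid, while the expensive function is called only at the single most promising point.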
© 2001 Springer-Verlag Wien
Pospíchal, J. (2001). Optimization of Expensive Functions by Surrogates Created from Neural Network Ensembles. In: Kůrková, V., Neruda, R., Kárný, M., Steele, N.C. (eds) Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6230-9_11
DOI: https://doi.org/10.1007/978-3-7091-6230-9_11
Publisher Name: Springer, Vienna
Print ISBN: 978-3-211-83651-4
Online ISBN: 978-3-7091-6230-9