Evolutionary algorithms and surrogate modelling
Title in Czech: Evoluční algoritmy a náhradní modelování
Title in English: Evolutionary algorithms and surrogate modelling
Academic year of announcement: 2024/2025
Thesis type: dissertation
Thesis language:
Department: Department of Theoretical Computer Science and Mathematical Logic (32-KTIML)
Supervisor: prof. RNDr. Ing. Martin Holeňa, CSc.
Author:
Guidelines for elaboration
Artificial neural networks of the radial basis function network kind, and partly also multilayer perceptrons, used to be a frequent choice of surrogate model some 10-20 years ago [8, 13, 21, 27]. However, that was before the advent of deep learning, and we are not aware of any research into the applicability of deep neural networks as surrogate models. In our opinion, they could be substantially better suited to this end than radial basis function networks and multilayer perceptrons due to their much higher flexibility. Particularly attractive is the possibility to define network layers with a very specific purpose, tailored to very specific functionality. In the context of surrogate modelling, this in particular suggests creating layers with functionality corresponding to successful kinds of surrogate models, such as low-degree polynomials or Gaussian processes, and using them as top layers of deep neural networks. This can be viewed as hybridizing the surrogate model corresponding to the top layer with the neural network corresponding to the lower layers. The investigation of such hybrid models and the elaboration of methods for training them is the topic of our research.
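The following minimal sketch illustrates this hybrid architecture, assuming PyTorch; the class names, the layer sizes, and the choice of a degree-2 polynomial top layer are illustrative assumptions, not the prescribed design of the thesis. The lower layers learn a representation of the input, and the top layer computes a quadratic response surface on that representation.

# A minimal sketch (assuming PyTorch) of a hybrid surrogate model: deep
# layers learn a feature representation, and the top layer computes a
# degree-2 polynomial of those features, i.e. one of the classical
# surrogate kinds placed on top of a neural network.
import torch
import torch.nn as nn

class PolynomialTopLayer(nn.Module):
    """Degree-2 polynomial of its input z: w0 + w^T z + z^T Q z."""
    def __init__(self, in_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, 1)  # w0 + w^T z
        self.quad = nn.Parameter(torch.zeros(in_features, in_features))  # Q

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        quadratic = torch.einsum("bi,ij,bj->b", z, self.quad, z)
        return self.linear(z).squeeze(-1) + quadratic

class HybridSurrogate(nn.Module):
    """Hybrid of a low-degree polynomial (top layer) with a neural
    network (lower layers), trained end to end as a regression model."""
    def __init__(self, dim: int, hidden: int = 32, latent: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, latent), nn.ReLU(),
        )
        self.top = PolynomialTopLayer(latent)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.top(self.features(x))

# The model is trained as an ordinary regressor on (x, f(x)) pairs
# collected during the evolutionary run:
model = HybridSurrogate(dim=10)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

A Gaussian-process top layer would be constructed analogously, placing the process on the learned features rather than on the raw inputs.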
References
[1] A. Auger, M. Schoenauer, and N. Vanhaecke. LS-CMA-ES: A second-order algorithm for covariance matrix adaptation. In Parallel Problem Solving from Nature - PPSN VIII, pages 182-191, 2004.
[2] L. Bajer, Z. Pitra, J. Repický, and M. Holeňa. Gaussian process surrogate models for the CMA evolution strategy. Evolutionary Computation, 27:665-697, 2019.
[3] A.J. Booker, J. Dennis, P. Frank, and D. Serafini. A rigorous framework for optimization by surrogates. Structural and Multidisciplinary Optimization, 17:1-13, 1998.
[4] D. Büche, N.N. Schraudolph, and P. Koumoutsakos. Accelerating evolutionary algorithms with Gaussian process fitness function models. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 35:183-194, 2005.
[5] K. Deb and P.K.S. Nain. An evolutionary multi-objective adaptive meta-modeling procedure using artificial neural networks. In Evolutionary Computation in Dynamic and Uncertain Environments, pages 297-322. Springer, 2007.
[6] M. Emmerich, K. Giannakoglou, and B. Naujoks. Single- and multi-objective evolutionary optimization assisted by Gaussian random field metamodels. IEEE Transactions on Evolutionary Computation, 10:421-439, 2006.
[7] D. Gorissen, I. Couckuyt, E. Laermans, and T. Dhaene. Pareto-based multi-output metamodeling with active learning. In D.P. Brown, C. Draganova, E. Pimenidis, and A. Mouratidis, editors, Engineering Applications of Neural Networks, pages 389-400. Springer, 2009.
[8] H.M. Gutmann. A radial basis function method for global optimization. Journal of Global Optimization, 19:201-227, 2001.
[9] N. Hansen. The CMA evolution strategy: A comparing review. In Towards a New Evolutionary Computation, pages 75-102. Springer, 2006.
[10] N. Hansen. A global surrogate assisted CMA-ES. In GECCO'19, pages 664-672, 2019.
[11] N. Hansen and A. Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9:159-195, 2001.
[12] S. Hosder, L. Watson, and B. Grossman. Polynomial response surface approximations for the multidisciplinary design optimization of a high speed civil transport. Optimization and Engineering, 2:431-452, 2001.
[13] Y. Jin, M. Hüsken, M. Olhofer, and B. Sendhoff. Neural networks for fitness approximation in evolutionary optimization. In Y. Jin, editor, Knowledge Incorporation in Evolutionary Computation, pages 281-306. Springer, 2005.
[14] Y. Jin, M. Olhofer, and B. Sendhoff. A framework for evolutionary optimization with approximate fitness functions. IEEE Transactions on Evolutionary Computation, 6:481-494, 2002.
[15] S. Kern, N. Hansen, and P. Koumoutsakos. Local metamodels for optimization using evolution strategies. In PPSN IX, pages 939-948, 2006.
[16] D. Lim, Y.C. Jin, Y.S. Ong, and B. Sendhoff. Generalizing surrogate-assisted evolutionary computation. IEEE Transactions on Evolutionary Computation, 14:329-355, 2010.
[17] I. Loshchilov, M. Schoenauer, and M. Sebag. Intensive surrogate model exploitation in self-adaptive surrogate-assisted CMA-ES (saACM-ES). In GECCO'13, pages 439-446, 2013.
[18] H. Mohammadi, R.L. Riche, and E. Touboul. Making EGO and CMA-ES complementary for global optimization. In Learning and Intelligent Optimization, pages 287-292. Springer, 2015.
[19] R.H. Myers, D.C. Montgomery, and C.M. Anderson-Cook. Response Surface Methodology: Process and Product Optimization Using Designed Experiments. John Wiley and Sons, Hoboken, 2009.
[20] Y.S. Ong, P.B. Nair, A.J. Keane, and K.W. Wong. Surrogate-assisted evolutionary optimization frameworks for high-fidelity engineering design problems. In Y. Jin, editor, Knowledge Incorporation in Evolutionary Computation, pages 307-331. Springer, 2005.
[21] R.C. Regis. Stochastic radial basis function algorithms for large-scale optimization involving expensive black-box objective and constraint functions. Computers and Operations Research, 38:837-853, 2011.
[22] A.K. Talukder, M. Kirley, and R. Buyya. The Pareto-following variation operator as an alternative approximation model. In CEC '09: IEEE Congress on Evolutionary Computation, pages 8-15, 2009.
[23] L. Toal and D.V. Arnold. Simple surrogate model assisted optimization with covariance matrix adaptation. In PPSN XVI, pages 184-197, 2020.
[24] H. Ulmer, F. Streichert, and A. Zell. Model assisted evolution strategies. In Y. Jin, editor, Knowledge Incorporation in Evolutionary Computation, pages 333-355. Springer, 2005.
[25] V. Volz, G. Rudolph, and B. Naujoks. Investigating uncertainty propagation in surrogate-assisted evolutionary algorithms. In GECCO'17, pages 881-888, 2017.
[26] I. Voutchkov and A. Keane. Multiobjective optimization using surrogates. In ACDM 2006, pages 167-175, 2006.
[27] Z.Z. Zhou, Y.S. Ong, P.B. Nair, A.J. Keane, and K.Y. Lum. Combining global and local surrogate models to accelerate evolutionary optimization. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 37:66-76, 2007.
Preliminary scope of work in English
Evolutionary algorithms have been, over the last 20 years, among the most successful methods for solving non-traditional optimization problems, such as the search for the most suitable documents containing required information, the discovery of the most interesting knowledge in available data, or other optimization tasks in which the values of the objective function can be obtained only empirically. Because evolutionary algorithms employ only the values of the objective function, they approach its optimum much more slowly than optimization methods for smooth functions, which also make use of information about the gradients of the objective function, and possibly about its second derivatives. This property of evolutionary algorithms is particularly disadvantageous when obtaining the values of the objective function empirically is costly and time-consuming. However, evolutionary algorithms can be substantially sped up if they evaluate the empirical objective function only occasionally, and mostly evaluate a sufficiently accurate regression model of that function, also known as its surrogate model. Surrogate modelling originated in the area of design of experiments, where it is traditionally called response-surface modelling [12, 19], and has also been used with conventional optimization [3], but it is most often combined with evolutionary optimization [2, 4, 10, 14, 27]. If the original objective function is evaluated empirically, the empirically obtained value is usually viewed as a realization of a random variable, itself governed by some unknown model. That is why surrogate models approximating objective functions are sometimes called metamodels [6, 7].

How suitable a particular surrogate model is for approximating a given objective function depends primarily on that function, but also on the employed optimization method. In traditional response-surface modelling, quadratic or at most cubic polynomials were used, and they still seem to be the most suitable kind of surrogate model if the objective function is not too involved [1, 10, 15]. Among more advanced surrogate models, the most frequently used are probably Gaussian processes [2, 4, 18, 23, 25], which approximate not only the mean value of the objective function but the whole distribution of its values. In connection with the state-of-the-art black-box optimization method, the covariance matrix adaptation evolution strategy (CMA-ES) [9, 11], ranking support vector machines have also been used successfully [17]; they share with CMA-ES invariance with respect to monotone transformations of the objective function. Finally, specifically adapted surrogate models are needed for multi-objective optimization [5, 6, 16, 22, 26].

A crucial aspect of using any surrogate model with any optimization algorithm is the control of how model predictions are incorporated into the evolution [20, 24, 27]. In the early era of surrogate modelling, simple generation control was typically used, in which several generations based on model predictions were interleaved with a generation evaluating the original objective function. Nowadays, the state of the art is to consider, when choosing points for empirical evaluation, not only the value of the empirical function at those points, but also how they contribute, during model re-learning, to making the model more accurate. Such an approach is termed active learning.
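As a concrete illustration of such a surrogate-assisted loop, the following schematic sketch, assuming NumPy and scikit-learn, combines a Gaussian process surrogate with an active-learning-style evaluation control; the toy sphere objective, the population sizes, and the uncertainty-based selection rule are illustrative assumptions, not a specific method from the cited literature.

# A schematic sketch (assuming NumPy and scikit-learn) of surrogate-assisted
# evolution: most offspring are ranked by a Gaussian process surrogate, and
# only the few points with the highest predictive uncertainty are evaluated
# on the (expensive) true objective and added to the training archive.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def sphere(x):  # toy stand-in for an expensive, empirically evaluated f
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim, pop_size, n_true_evals = 5, 20, 3

X = rng.uniform(-5, 5, size=(pop_size, dim))  # archive of true evaluations
y = np.array([sphere(x) for x in X])
mean = X[np.argmin(y)].copy()                 # crude search-distribution centre

for generation in range(30):
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    offspring = mean + rng.normal(scale=0.5, size=(pop_size, dim))
    pred, std = gp.predict(offspring, return_std=True)

    # Active-learning-style control: spend the true-evaluation budget on
    # the offspring where the surrogate is least certain.
    for i in np.argsort(std)[-n_true_evals:]:
        pred[i] = sphere(offspring[i])        # replace prediction by truth
        X = np.vstack([X, offspring[i]])
        y = np.append(y, pred[i])

    mean = offspring[np.argmin(pred)]         # move towards the best point

print("best archived value:", y.min())

Under the simpler generation control mentioned above, the uncertainty-based selection would instead be replaced by evaluating entire generations on the true objective at fixed intervals.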
 