New Approach to Adjusting the Objective Function Gaussian Surrogate Model in the Problem of Design Solution Parametric Optimization

Authors: Agasiev T.A., Gvozdev N.P., Karpenko A.P., Pivovarova N.V. Published: 28.09.2023
Published in issue: #3(144)/2023  
DOI: 10.18698/0236-3933-2023-3-62-83

Category: Informatics, Computer Engineering and Control | Chapter: System Analysis, Control, and Information Processing  
Keywords: parametric optimization, surrogate modeling, Bayesian approach to optimization, hyper-parameters


The paper considers methods for solving the problem of design solution parametric optimization based on constructing a Gaussian surrogate model of the problem's objective function. The problem of finding optimal values of the surrogate model's free parameters (hyper-parameters), called the adjustment problem, is posed. The adjustment problem is built on top of the surrogate model synthesis problem and has higher computational complexity. An approach to adjusting the surrogate model is proposed that makes the adjustment procedure acceptable in terms of computational costs. The approach includes setup and operation stages. The setup stage contains the following main steps: forming a set of test objective functions; generating a set of learning samples for each of them; determining the values of the characteristic features of the generated samples; determining the optimal hyper-parameter values for all considered test functions and learning samples; forming a set of pairs "sample characteristic features -- optimal hyper-parameter values"; and, on this basis, building a predictive model that forecasts the optimal hyper-parameter values from the characteristic features of a learning sample. At the operation stage, a learning sample is generated for the original problem, its characteristic features are determined, and the optimal hyper-parameter values of the surrogate model are predicted. Based on this learning sample, the surrogate model of the objective function is synthesized, with the predicted hyper-parameter values used in place of the optimal ones, and the original optimization problem is solved using the surrogate model. The approach can provide an increase of up to 30 % in the efficiency of the basic optimization algorithm.
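The operation stage described above can be illustrated with a minimal sketch in pure Python. All names here, and in particular the feature-to-hyper-parameter rule in `predict_length_scale`, are hypothetical illustrations rather than the authors' implementation: an RBF-kernel Gaussian surrogate is fitted to a learning sample, with its length-scale hyper-parameter supplied by a simple predictive rule computed from one characteristic feature of the sample (mean point spacing).

```python
import math

def rbf(x1, x2, length_scale):
    """Squared-exponential (Gaussian) covariance between two points."""
    return math.exp(-(x1 - x2) ** 2 / (2.0 * length_scale ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def predict_length_scale(xs):
    """Hypothetical predictive rule: map a characteristic feature of the
    learning sample (mean point spacing) to a hyper-parameter value.
    Stands in for the trained predictive model of the setup stage."""
    spacing = (max(xs) - min(xs)) / (len(xs) - 1)
    return 2.0 * spacing

def gp_predict(xs, ys, x_star, length_scale, noise=1e-6):
    """Posterior mean of a Gaussian-process surrogate at x_star."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], length_scale) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)
    return sum(a * rbf(x, x_star, length_scale) for a, x in zip(alpha, xs))

# Operation stage: learning sample -> characteristic feature ->
# predicted hyper-parameter -> surrogate prediction at a new point.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 0.8, 0.9, 0.1]
ls = predict_length_scale(xs)
y_hat = gp_predict(xs, ys, 1.5, ls)
```

In a full implementation the predictive rule would be replaced by a regression model (e.g., a random forest, cf. [16]) trained on the pairs "sample features -- optimal hyper-parameters" collected at the setup stage.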

Please cite this article in English as:

Agasiev T.A., Gvozdev N.P., Karpenko A.P., et al. New approach to adjusting the objective function Gaussian surrogate model in the problem of design solution parametric optimization. Herald of the Bauman Moscow State Technical University, Series Instrument Engineering, 2023, no. 3 (144), pp. 62--83 (in Russ.). DOI: https://doi.org/10.18698/0236-3933-2023-3-62-83


[1] Zakharova E.M., Minashina I.K. Review of multidimensional optimization techniques. Informatsionnye protsessy [Information Processes], 2014, vol. 14, no. 3, pp. 256--274 (in Russ.).

[2] Liu B., Koziel S., Zhang Q. A multi-fidelity surrogate-model-assisted evolutionary algorithm for computationally expensive optimization problems. J. Comput. Sc., 2016, vol. 12, pp. 28--37.

[3] Muller J., Shoemaker C.A. Influence of ensemble surrogate models and sampling strategy on the solution quality of algorithms for computationally expensive black-box global optimization problems. J. Glob. Optim., 2014, vol. 60, no. 2, pp. 123--144. DOI: https://doi.org/10.1007/s10898-014-0184-0

[4] Kuleshov A.P. Cognitive technologies in adaptive models of complex objects. Informatsionnye tekhnologii i vychislitelnye sistemy, 2008, no. 1, pp. 18--29 (in Russ.).

[5] Sunchalin A.M., Sunchalina A.L. Overview of methods and models for forecasting financial time series. Khronoekonomika [Hronoeconomics], 2020, no. 1 (in Russ.). Available at: http://hronoeconomics.ru/01_2020.pdf

[6] Buhmann M.D. Radial basis functions. Cambridge, Cambridge University Press, 2009.

[7] Snoek J., Rippel O., Swersky K., et al. Scalable Bayesian optimization using deep neural networks. PMLR, 2015, vol. 37, pp. 2171--2180.

[8] Terekhov S.A. [Random Gaussian processes in data approximation problems]. X Vseros. nauch.-tekh. konf. "Neyroinformatika--2008". Lektsii po neyroinformatike. Ch. 1 [X Russ. Sc.-Tech. Conf. Neuroinformatics--2008. Lectures on Neuroinformatics. P. 1]. Moscow, MEPhI, 2008, pp. 126--151 (in Russ.).

[9] Binois M., Wycoff N. A survey on high-dimensional Gaussian process modeling with application to Bayesian optimization. ACM TELO, 2022, vol. 2, no. 2, art. 8. DOI: http://dx.doi.org/10.1145/3545611

[10] Franceschi L., Donini M., Frasconi P., et al. Forward and reverse gradient-based hyperparameter optimization. Proc. 34th Int. Conf. on Machine Learning, 2017, vol. 70, pp. 1165--1173.

[11] Smirnova V.S., Shalamova V.V., Efimova V.A., et al. Hyperparameter optimization based on a priori and a posteriori knowledge about classification problem. Nauchno-tekhnicheskiy vestnik informatsionnykh tekhnologiy, mekhaniki i optiki [Sc. Tech. J. Inf. Technol. Mech. Opt.], 2020, vol. 20, no. 6, pp. 828--834 (in Russ.). DOI: https://doi.org/10.17586/2226-1494-2020-20-6-828-834

[12] Karpenko A.P., Kuzmina I.A. Structural and parametric synthesis of population algorithms for global optimization. Procedia Comput. Sc., 2021, vol. 186, no. 2, pp. 299--308. DOI: https://doi.org/10.1016/j.procs.2021.04.207

[13] Mersmann O., Bischl B., Trautmann H., et al. Exploratory landscape analysis. Proc. GECCO’11, 2011, pp. 829--836. DOI: https://doi.org/10.1145/2001576.2001690

[14] Kerschke P., Trautmann H. The R-Package FLACCO for exploratory landscape analysis with applications to multi-objective optimization problems. IEEE CEC, 2016, pp. 5262--5269. DOI: https://doi.org/10.1109/CEC.2016.7748359

[15] Fowkes J., Roberts L., Burmen A. PyCUTEst: an open source Python package of optimization test problems. J. Open Source Softw., 2022, vol. 7, no. 78, art. 4377. DOI: https://doi.org/10.21105/joss.04377

[16] Roy M.H., Larocque D. Robustness of random forests for regression. J. Nonparametr. Stat., 2012, vol. 24, no. 4, pp. 993--1006. DOI: https://doi.org/10.1080/10485252.2012.715161

[17] Nasledov A. IBM SPSS Statistics 20 i AMOS: professionalnyy statisticheskiy analiz dannykh [IBM SPSS Statistics 20 and AMOS: professional statistical data analysis]. St. Petersburg, Piter Publ., 2013.

[18] Gutmann H.M. A radial basis function method for global optimization. J. Glob. Optim., 2001, vol. 19, no. 3, pp. 201--227. DOI: https://doi.org/10.1023/A:1011255519438

[19] De Cock D.R. Kriging as an alternative to polynomial regression in response surface analysis. Ames, Iowa State University, 2003.

[20] Kolodyazhnyy M., Zaytsev A. [Heteroscedastic Gaussian processes and their application to Bayesian optimization]. Tr. 42-y Mezhdistsiplinarnoy shk.-konf. IPPI RAN "ITiS 2018" [Proc. 42nd Interdisciplinary School-Conf. IPPI RAS "ITiS 2018"]. Moscow, IPPI RAS Publ., 2018, pp. 42--51 (in Russ.).

[21] Barton R.R. Metamodeling: a state of the art review. Proc. Winter Simulation Conf., Philadelphia, PA, USA, 1994, pp. 237--244. DOI: https://doi.org/10.1109/WSC.1994.717134

[22] Jones D.R. A taxonomy of global optimization methods based on response surfaces. J. Glob. Optim., 2001, no. 21, pp. 345--383. DOI: https://doi.org/10.1023/A:1012771025575