This talk was part of: Kernel Methods in Uncertainty Quantification and Experimental Design
Building GP Surrogate Model with High-Dimensional Input
Lulu Kang, University of Massachusetts, Amherst
Friday, April 4, 2025
Abstract: Gaussian process (GP) regression is a popular surrogate modeling tool for computer simulations in engineering and scientific domains. However, it often struggles with high computational costs and low prediction accuracy when the simulation involves too many input variables. In this talk, I will present two different approaches to building GP surrogate models for experiments with high-dimensional inputs. I first introduce an optimal kernel learning approach that identifies the active variables, thereby overcoming the limitations of the GP model and enhancing system understanding. This method approximates the original GP model's covariance function by a convex combination of kernel functions, each defined on a low-dimensional subset of the input variables. The second approach is a Bayesian bridge GP regression approach, in which we impose a shrinkage penalty on the linear regression coefficients of the mean function and on the correlation coefficients in the covariance function. Under the Bayesian framework, this is equivalent to placing certain proper, informative priors on these parameters. Using Spherical Hamiltonian Monte Carlo, we can sample directly from the constrained posterior distribution without the restrictions on the prior distribution that arise in Bayesian bridge regression.
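
To make the first approach concrete, the sketch below (Python/NumPy) illustrates the general idea of a covariance built as a convex combination of kernels, each acting on a low-dimensional subset of the input variables. The kernel form, the subsets, and the weights here are hypothetical placeholders, not the construction used in the talk.

    import numpy as np

    def rbf_kernel(X1, X2, lengthscale=1.0):
        # Squared-exponential kernel evaluated on whatever input columns are passed in.
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-0.5 * d2 / lengthscale ** 2)

    def convex_combination_kernel(X1, X2, subsets, weights):
        # k(x, x') = sum_j w_j * k_j(x[S_j], x'[S_j]) with w_j >= 0 and sum_j w_j = 1.
        weights = np.asarray(weights, dtype=float)
        assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
        K = np.zeros((X1.shape[0], X2.shape[0]))
        for S, w in zip(subsets, weights):
            K += w * rbf_kernel(X1[:, S], X2[:, S])
        return K

    # Toy usage: 10-dimensional inputs, with hypothetical active subsets {0, 3} and {7}.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(50, 10))
    K = convex_combination_kernel(X, X, subsets=[[0, 3], [7]], weights=[0.7, 0.3])

A kernel whose optimized weights concentrate on a few subsets effectively flags those variables as active, which is the source of the interpretability mentioned in the abstract.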
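
For the second approach, the abstract's statement that a shrinkage penalty corresponds to a proper informative prior can be read in the usual bridge-regression sense. Written generically (the exact penalty and exponent used in the talk are not specified in the abstract):

    \hat{\beta} = \arg\min_{\beta} \Big\{ -\log L(\beta) + \lambda \sum_{j} |\beta_j|^{\alpha} \Big\}
    \quad\Longleftrightarrow\quad
    \pi(\beta_j) \propto \exp\!\big(-\lambda |\beta_j|^{\alpha}\big), \qquad 0 < \alpha \le 2,

so the penalized estimate coincides with the posterior mode under an exponential-power (bridge) prior on the coefficients.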