Building dimensionality reduction into Gaussian process regression models
Gaussian process regression (GPR) is an immensely popular surrogate model among computational scientists for a wide range of uncertainty quantification tasks. However, its applicability is limited by its poor scalability to high input dimensions. Here I demonstrate a methodology for scaling GPR to high-dimensional stochastic parameter spaces by recovering the active subspace of the quantity of interest. To do this, the active subspace projection matrix is posed as a hyperparameter of the Gaussian process covariance kernel, and its entries are learned by maximizing the likelihood of the data.
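To make the idea concrete, here is a minimal sketch, not the full methodology: it assumes a squared-exponential kernel applied to the projected inputs z = W^T x, stacks the entries of the projection matrix W into the hyperparameter vector alongside the log lengthscale and log variance, and optimizes everything jointly by minimizing the negative log marginal likelihood with SciPy's L-BFGS-B. The toy problem, the fixed noise variance, and the unconstrained parameterization of W are all illustrative assumptions, not part of the original description.

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(Z1, Z2, lengthscale, variance):
    """Squared-exponential kernel evaluated on projected inputs."""
    sq_dists = (np.sum(Z1**2, axis=1)[:, None]
                + np.sum(Z2**2, axis=1)[None, :]
                - 2.0 * Z1 @ Z2.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def neg_log_marginal_likelihood(theta, X, y, d_active, noise_var=1e-4):
    """Negative GP log marginal likelihood as a function of the projection
    matrix W (flattened into theta) and the log kernel hyperparameters."""
    D = X.shape[1]
    W = theta[:D * d_active].reshape(D, d_active)
    lengthscale = np.exp(theta[-2])
    variance = np.exp(theta[-1])
    Z = X @ W  # project inputs into the candidate active subspace
    K = rbf_kernel(Z, Z, lengthscale, variance) + noise_var * np.eye(len(y))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (0.5 * y @ alpha
            + np.sum(np.log(np.diag(L)))
            + 0.5 * len(y) * np.log(2.0 * np.pi))

# Toy problem (illustrative): f varies along a single direction in 10-D.
rng = np.random.default_rng(0)
D, d_active, n = 10, 1, 200
w_true = rng.normal(size=(D, 1))
w_true /= np.linalg.norm(w_true)
X = rng.uniform(-1.0, 1.0, size=(n, D))
y = np.sin(3.0 * (X @ w_true)).ravel() + 0.01 * rng.normal(size=n)

# Jointly learn W and the kernel hyperparameters by maximum likelihood.
theta0 = np.concatenate([rng.normal(size=D * d_active), [0.0, 0.0]])
res = minimize(neg_log_marginal_likelihood, theta0,
               args=(X, y, d_active), method="L-BFGS-B")
W_hat = res.x[:D * d_active].reshape(D, d_active)

# The learned subspace should align with w_true up to sign and scale.
cosine = abs(W_hat.ravel() @ w_true.ravel()) / (
    np.linalg.norm(W_hat) * np.linalg.norm(w_true))
print(f"alignment with true active direction: {cosine:.3f}")
```

Note on the parameterization: the sketch leaves W unconstrained and lets the kernel lengthscale absorb its scale; in practice one might instead constrain W to have orthonormal columns (e.g., by optimizing over the Stiefel manifold) or use a separate lengthscale per active dimension.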