Gradient-free active subspace recovery using deep neural networks

Leveraging information from multiple sources to construct accurate surrogate models

Abstract

A problem of considerable importance within the field of uncertainty quantification (UQ) is the development of efficient methods for the construction of accurate surrogate models. Such efforts are particularly important for applications constrained by high-dimensional uncertain parameter spaces. The difficulty of accurate surrogate modeling in such systems is further compounded by data scarcity brought about by the large cost of forward model evaluations. Traditional response surface techniques, such as Gaussian process regression (or Kriging) and polynomial chaos, are difficult to scale to high dimensions. To make surrogate modeling tractable in expensive high-dimensional systems, one must resort to dimensionality reduction of the stochastic parameter space. A recent dimensionality reduction technique that has shown great promise is the method of 'active subspaces'. The classical formulation of active subspaces, unfortunately, requires gradient information from the forward model, which is often impossible to obtain. In this work, we present a simple, scalable method for recovering active subspaces in high-dimensional stochastic systems without gradient information. Our approach relies on a reparameterization of the orthogonal active subspace projection matrix, which we couple with deep neural networks. We demonstrate our approach on synthetic and real-world datasets and show favorable predictive performance compared to classical active subspaces.
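For context, a minimal sketch of the classical, gradient-based active subspace construction that the abstract contrasts against: the active subspace is spanned by the leading eigenvectors of the expected outer product of the model gradient, estimated here by Monte Carlo. All names and the toy test function below are illustrative, not from the paper.

```python
import numpy as np

def classical_active_subspace(grad_f, dim, n_samples=1000, k=2, rng=None):
    """Estimate a k-dimensional active subspace of f: R^dim -> R.

    grad_f: callable returning the gradient of f at a point -- exactly
            the information the paper's gradient-free method avoids needing.
    Returns (W, eigvals): W is (dim, k) with orthonormal columns spanning
    the estimated active subspace; eigvals are sorted descending.
    """
    rng = np.random.default_rng(rng)
    # Monte Carlo estimate of C = E[grad f(x) grad f(x)^T]
    X = rng.uniform(-1.0, 1.0, size=(n_samples, dim))
    G = np.stack([grad_f(x) for x in X])   # (n_samples, dim)
    C = G.T @ G / n_samples                # (dim, dim), symmetric PSD
    # Eigendecomposition; the top-k eigenvectors span the active subspace.
    eigvals, eigvecs = np.linalg.eigh(C)   # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:k]], eigvals[order]

# Toy check: f(x) = sin(a^T x) varies only along the direction a,
# so the leading eigenvector should align with a / ||a||.
dim = 10
a = np.arange(1.0, dim + 1.0)
grad_f = lambda x: np.cos(a @ x) * a
W, lams = classical_active_subspace(grad_f, dim, k=1, rng=0)
alignment = abs(W[:, 0] @ (a / np.linalg.norm(a)))
```

The paper's contribution replaces the gradient samples `grad_f(x)` above, which are often unavailable for expensive forward models, with a reparameterized orthogonal projection matrix learned jointly with a deep neural network from function evaluations alone.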

Date
Aug 21, 2019
Event
ASME IDETC-CIE
Location
Anaheim, CA
