Authors: Francesco Romor, Marco Tezzele, Gianluigi Rozza
DOI:
Keywords:
Abstract: Gaussian processes are employed for non-parametric regression in a Bayesian setting. They generalize linear regression by embedding the inputs in a latent manifold inside an infinite-dimensional reproducing kernel Hilbert space. We can augment the inputs with the observations of low-fidelity models in order to learn a more expressive latent manifold and thus increase the model's accuracy. This can be realized recursively with a chain of Gaussian processes of incrementally higher fidelity. We would like to extend these multi-fidelity model realizations to case studies affected by a high-dimensional input space but with a low intrinsic dimensionality. In these cases, physically supported or purely numerical low-order models are still affected by the curse of dimensionality when queried for responses. When the model's gradient information is provided, the presence of an active subspace can be exploited to design …
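The abstract combines two ingredients: an active subspace extracted from gradient samples to reduce the input dimension, and a recursive chain of Gaussian processes in which each level is fed the prediction of the lower-fidelity level as an extra input. A minimal sketch of both steps follows, assuming Python with NumPy and scikit-learn; the toy objective, the low-fidelity surrogate `f_lo`, and the sample sizes are illustrative assumptions, not the paper's actual models or implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical high-dimensional function with a 1D active subspace:
# f(x) = sin(w @ x), so every gradient is parallel to w.
dim = 20
w = rng.standard_normal(dim)
f = lambda X: np.sin(X @ w)
grad_f = lambda X: np.cos(X @ w)[:, None] * w  # analytic gradients

# --- Active subspace from gradient samples ---
X = rng.standard_normal((200, dim))
G = grad_f(X)
C = G.T @ G / G.shape[0]               # uncentered covariance of the gradients
eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
W1 = eigvecs[:, -1:]                   # leading eigenvector spans the active subspace
Z = X @ W1                             # reduced (projected) inputs

# --- Two-fidelity Gaussian process chain on the reduced inputs ---
f_lo = lambda X: 0.8 * f(X) + 0.1      # assumed cheap low-fidelity model
gp_lo = GaussianProcessRegressor(kernel=RBF()).fit(Z, f_lo(X))

X_hi = X[:40]                          # few expensive high-fidelity samples
Z_hi = X_hi @ W1
# Augment the high-fidelity inputs with the low-fidelity prediction.
aug = np.hstack([Z_hi, gp_lo.predict(Z_hi)[:, None]])
gp_hi = GaussianProcessRegressor(kernel=RBF()).fit(aug, f(X_hi))

# Predict at new points by propagating through the chain.
X_new = rng.standard_normal((5, dim))
Z_new = X_new @ W1
y_pred = gp_hi.predict(np.hstack([Z_new, gp_lo.predict(Z_new)[:, None]]))
print(y_pred)
print(f(X_new))
```

Projecting onto `W1` before training means the Gaussian processes operate on a one-dimensional input rather than twenty, which is the point of exploiting the active subspace: the curse of dimensionality is sidestepped at every fidelity level of the chain.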