Author: Gabriele Abbati
DOI:
Keywords:
Abstract: For decades, the holy grail of many fields of science, mathematics and statistics has been the ability to predict the future or, more quantitatively, to forecast the behaviours of quantities of interest. Examples range from predicting the value of financial products to forecasting energy consumption, among many other applications. While historically practitioners have focused on designing handcrafted methods for specific prediction tasks, the tools of Machine Learning (ML) promise to automate data analyses that extract patterns and structures and leverage this information in future forecasts. The addition of the probabilistic framework of Bayesian Machine Learning enables the inclusion of useful prior knowledge and provides predictive uncertainty quantification. This thesis focuses on generative modeling, an important class of algorithms that stands at the core of Bayesian ML: by making structural assumptions about the generative processes that produced the data, generative models aim to explain the observed variability and yield a deep, interpretable understanding of the data. We present three original contributions, which use model-based machine learning to gain insights into the data-generating process and subsequently leverage them to provide reliable predictions. First, we propose a novel probabilistic framework based on constrained Gaussian Processes that incorporates derivative information in the form of ordinary differential equations. Second, we propose a novel noise model for Gaussian Processes that, in conjunction with state-of-the-art moment matching and adversarial techniques, addresses the problem of parameter inference in …