Authors: Giulio Bottegal, Aleksandr Y. Aravkin, Gianluigi Pillonetto, Håkan Hjalmarsson
DOI:
Keywords: Random variable, Outlier, Computer science, System identification, Kernel (linear algebra), Probability density function, Laplace operator, Maximum a posteriori estimation, Optimization problem, Kernel (statistics), Algorithm, Estimator, Kriging, Hyperparameter
Abstract: Recent developments in system identification have brought attention to regularized kernel-based methods. This type of approach has been proven to compare favorably with classic parametric methods. However, current formulations are not robust with respect to outliers. In this paper, we introduce a novel method to robustify kernel-based system identification methods. To this end, we model the output measurement noise using random variables with heavy-tailed probability density functions (pdfs), focusing on the Laplacian and the Student's t distributions. Exploiting the representation of these pdfs as scale mixtures of Gaussians, we cast our problem into a Gaussian process regression framework, which requires estimating a number of hyperparameters of the same order as the data size. To overcome this difficulty, we design a new maximum a posteriori (MAP) estimator of the hyperparameters, and solve the related optimization problem with an iterative scheme based on the Expectation-Maximization (EM) method. In the presence of outliers, tests on simulated and real data show a substantial performance improvement compared to currently used kernel-based methods for linear system identification.
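As a concrete illustration of the scale-mixture-of-Gaussians representation mentioned in the abstract, the following minimal Python sketch (not part of the paper's code; the scale value `b` and sample size are illustrative assumptions) checks numerically that a zero-mean Gaussian whose variance is drawn from an exponential distribution is marginally Laplacian.

```python
# Minimal sketch: scale-mixture-of-Gaussians representation of the Laplace pdf.
# If v ~ Exp(mean 2*b^2) and x | v ~ N(0, v), then marginally x ~ Laplace(0, b).
# (Analogously, Student's t arises when the inverse variance follows a Gamma(nu/2, nu/2) law.)
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
b = 1.5                      # Laplace scale parameter (illustrative value)
n = 200_000                  # number of Monte Carlo samples

# Sample latent Gaussian variances, then sample x conditionally Gaussian.
v = rng.exponential(scale=2 * b**2, size=n)
x = rng.normal(loc=0.0, scale=np.sqrt(v))

# Compare empirical quantiles of the mixture with exact Laplace quantiles.
qs = [0.05, 0.25, 0.5, 0.75, 0.95]
print("empirical:", np.round(np.quantile(x, qs), 3))
print("laplace  :", np.round(stats.laplace.ppf(qs, scale=b), 3))
```

This representation is what makes the approach tractable: conditioned on the latent variances, the noise model is again Gaussian, so the identification problem can be cast as Gaussian process regression with additional per-sample hyperparameters, which the paper then estimates via a MAP criterion solved with an EM-based iterative scheme.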