Author:
Keywords: Least trimmed squares, Regression diagnostic, Eigendecomposition of a matrix, Algorithm, Basis (linear algebra), Gaussian noise, Mathematical optimization, Total least squares, Mathematics, Ordinary least squares, Outlier
Abstract: Least squares minimization is by nature global and, hence, vulnerable to distortion by outliers. We present a novel technique to reject outliers from an m-dimensional data set when the underlying model is a hyperplane (a line in two dimensions, a plane in three dimensions). The technique has a sound statistical basis and assumes that Gaussian noise corrupts the otherwise valid data. The majority of alternative techniques available in the literature focus on ordinary least squares, where a single variable is designated as dependent on all the others - a setup that is often unsuitable in practice. The method presented here operates in the more general framework of orthogonal regression and uses a new regression diagnostic based on eigendecomposition. It subsumes the traditional residuals scheme by way of matrix perturbation theory, and it provides an error measure for the solution once the contaminants have been removed.
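To make the setting concrete, the sketch below fits a line to 2D points by orthogonal (total least squares) regression via an eigendecomposition of the 2x2 scatter matrix: the eigenvector of the smallest eigenvalue is the line's normal, and each point's orthogonal residual is its perpendicular distance to the fitted line. This is only an illustration of the orthogonal-regression framework the abstract refers to, not the paper's eigendecomposition-based rejection diagnostic; the function names and the closed-form 2x2 eigensolver are our own.

```python
import math

def fit_line_tls(points):
    """Orthogonal (total least squares) line fit in 2D.

    Returns (centroid, unit_normal): the fitted line passes through the
    centroid, and the normal is the eigenvector of the 2x2 scatter
    matrix that belongs to its smallest eigenvalue.
    """
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    syy = sum((y - my) ** 2 for _, y in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    # Smallest eigenvalue of [[sxx, sxy], [sxy, syy]], in closed form.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 - math.sqrt(max(tr * tr / 4 - det, 0.0))
    if abs(sxy) > 1e-12:
        # Eigenvector for lam: satisfies sxy*v1 + (syy - lam)*v2 = 0.
        nx, ny = lam - syy, sxy
    else:
        # Axis-aligned scatter: normal lies along the lower-variance axis.
        nx, ny = (1.0, 0.0) if sxx <= syy else (0.0, 1.0)
    norm = math.hypot(nx, ny)
    return (mx, my), (nx / norm, ny / norm)

def orthogonal_residuals(points, centroid, normal):
    """Signed perpendicular distance of each point to the fitted line."""
    (mx, my), (nx, ny) = centroid, normal
    return [nx * (x - mx) + ny * (y - my) for x, y in points]
```

Unlike ordinary least squares, no variable is singled out as dependent: the residuals are measured perpendicular to the hyperplane, so the fit is invariant to which coordinate is "y". A simple outlier screen would threshold the orthogonal residuals, though the paper's diagnostic is more refined than that.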