Authors: Lei Xu, Erkki Oja, Ching Y. Suen
DOI: 10.1016/0893-6080(92)90006-5
Keywords: Noise, Artificial neural network, Curve fitting, Learning rule, Hebbian theory, Minor (linear algebra), Outlier, Mathematics, Calculus, Principal component analysis, Algorithm
Abstract: A linear neural unit with a modified anti-Hebbian learning rule is shown to be able to optimally fit curves, surfaces, and hypersurfaces by adaptively extracting the minor component (i.e., the counterpart of the principal component) of the input data set. The learning rule is analyzed mathematically. The results of computer simulations are given to illustrate that this fitting method considerably outperforms the commonly used least squares method in resisting both normal noise and outliers.
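To make the idea concrete, the following is a minimal sketch (not the paper's exact learning rule) of how a generic anti-Hebbian update with explicit weight renormalization drives a single linear unit's weight vector toward the minor component of the input covariance; for centered 2-D data, that minor component is the normal of the total-least-squares line. All variable names and the learning-rate value are illustrative assumptions.

```python
# Illustrative sketch: anti-Hebbian minor-component extraction used for
# line fitting. This is a generic MCA-style update, not necessarily the
# modified rule analyzed in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples scattered around the line y = 0.5 * x (assumed toy data).
t = rng.uniform(-5, 5, size=2000)
X = np.column_stack([t, 0.5 * t]) + 0.1 * rng.normal(size=(2000, 2))
X -= X.mean(axis=0)                      # center the data

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01                               # learning rate (assumed value)

for x in X:
    y = w @ x                            # linear unit's output
    w -= eta * y * x                     # anti-Hebbian step (note the minus sign)
    w /= np.linalg.norm(w)               # keep the weight on the unit sphere

# w approximates the unit normal of the fitted line {x : w @ x = 0}.
# Compare with the smallest-eigenvalue eigenvector of the sample covariance.
evals, evecs = np.linalg.eigh(np.cov(X.T))
print("learned normal:     ", w)
print("minor eigenvector:  ", evecs[:, 0])
```

In expectation the renormalized update behaves like power iteration on I - eta*C, whose dominant eigenvector is the eigenvector of the covariance C with the smallest eigenvalue, which is why the weight converges to the minor component rather than the principal one.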