Author: Matthew Rabinowitz
DOI:
Keywords: Backpropagation, Finite impulse response, Infinite impulse response, Nonlinear system, Method of steepest descent, Artificial neural network, Filter (signal processing), Adaptive filter, Computer science, Control theory
Abstract: Described herein is a method and system for training nonlinear adaptive filters (or neural networks) which have embedded memory. Such memory can arise in a multi-layer finite impulse response (FIR) architecture, or in an infinite impulse response (IIR) architecture. We focus on filter architectures with separate linear dynamic components and static nonlinear components. These components can be structured so as to restrict their degrees of computational freedom based on a priori knowledge about the operation being emulated. The detailed FIR architecture consists of generalized single-layer subnets connected together. For the IIR case, we extend the methodology to general architectures that use feedback. For these architectures, we describe how one can apply optimization techniques that make updates closer to the Newton direction than those of steepest-descent methods, such as backpropagation. We detail a novel modified Gauss-Newton technique, in which a learning rate determines both the magnitude and the direction of the update steps. For a wide range of filtering applications, the new algorithm converges faster, and to a smaller value of the cost function, than steepest-descent methods such as backpropagation-through-time, and than standard quasi-Newton methods. Applications include inverse tracking as well as amplifier modeling.
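The abstract's central idea, a Gauss-Newton-style update whose damping term trades off between the Newton direction and the steepest-descent direction, can be illustrated with a small least-squares sketch. This is not the patent's actual algorithm; the model, damping value `mu`, and helper name `modified_gauss_newton_step` are illustrative assumptions.

```python
import numpy as np

def modified_gauss_newton_step(jac, residual, mu):
    """Damped Gauss-Newton update: delta = (J^T J + mu*I)^{-1} J^T r.
    Small mu gives a step near the Newton direction; large mu tilts the
    step toward (scaled) steepest descent, so mu controls both the
    magnitude and the direction of the update."""
    jtj = jac.T @ jac
    grad = jac.T @ residual
    return np.linalg.solve(jtj + mu * np.eye(jtj.shape[0]), grad)

# Toy static-nonlinearity fit: y = a * tanh(b * x), illustrative only.
x = np.linspace(-2.0, 2.0, 50)
a_true, b_true = 1.5, 0.8
y = a_true * np.tanh(b_true * x)

w = np.array([0.5, 0.5])  # initial guess for (a, b)
for _ in range(50):
    pred = w[0] * np.tanh(w[1] * x)
    r = y - pred
    # Jacobian of the predictions with respect to (a, b)
    J = np.column_stack([np.tanh(w[1] * x),
                         w[0] * x / np.cosh(w[1] * x) ** 2])
    w = w + modified_gauss_newton_step(J, r, mu=1e-3)

print(np.round(w, 3))  # should land near (1.5, 0.8)
```

With `mu` near zero this reduces to the plain Gauss-Newton step, and as `mu` grows the step direction approaches the negative gradient, which is the same interpolation that Levenberg-Marquardt-type methods exploit.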