Complex-Valued Recurrent Neural Network with IIR Neuron Model: Training and Applications

Authors: Chunguang Li, Xiaofeng Liao, Juebang Yu

DOI: 10.1007/S00034-002-0119-8

Keywords:

Abstract: In this paper, a new algorithm for training a complex-valued recurrent neural network is proposed, based on digital filter theory. Each neuron is modeled as an infinite impulse response (IIR) filter. The network weights are updated by optimizing the IIR coefficients using a layer-by-layer optimization procedure together with the recursive least-squares (RLS) method. The performance of the proposed algorithm is demonstrated with an application to complex communication channel equalization. Our approach provides a way to perform fast training of such networks.
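A minimal sketch of the two building blocks named in the abstract, not the authors' exact training procedure: a single complex-valued neuron whose synapse is an IIR filter, and a complex recursive least-squares update of its filter coefficients. The split-complex tanh activation, the class and parameter names (ComplexIIRNeuron, n_ff, n_fb, lam, delta), and the toy target are illustrative assumptions; the paper's full algorithm additionally applies a layer-by-layer optimization across the recurrent network.

```python
# Sketch only: complex IIR neuron + complex RLS coefficient update.
import numpy as np

def split_tanh(z):
    # Split-complex activation: tanh applied separately to real and imaginary parts.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

class ComplexIIRNeuron:
    def __init__(self, n_ff=3, n_fb=2, lam=0.99, delta=1e2):
        self.w = np.zeros(n_ff + n_fb, dtype=complex)        # [b; a] IIR coefficients
        self.P = delta * np.eye(n_ff + n_fb, dtype=complex)  # inverse correlation matrix
        self.x_hist = np.zeros(n_ff, dtype=complex)           # current and past inputs
        self.y_hist = np.zeros(n_fb, dtype=complex)           # past outputs (feedback taps)
        self.lam = lam                                         # RLS forgetting factor

    def forward(self, x):
        # IIR synapse: s(k) = sum_i b_i x(k-i) + sum_j a_j y(k-j), then activation.
        self.x_hist = np.roll(self.x_hist, 1); self.x_hist[0] = x
        u = np.concatenate([self.x_hist, self.y_hist])         # regressor vector
        s = np.vdot(self.w, u)                                 # w^H u (pre-activation)
        y = split_tanh(s)
        self.y_hist = np.roll(self.y_hist, 1); self.y_hist[0] = y
        return y, u, s

    def rls_update(self, u, d, s):
        # Complex RLS step on the pre-activation error, a common simplification
        # when the activation is inverted locally in layer-by-layer schemes.
        e = d - s
        Pu = self.P @ u
        k = Pu / (self.lam + np.real(np.vdot(u, Pu)))          # gain vector
        self.w += k * np.conj(e)
        self.P = (self.P - np.outer(k, np.conj(u) @ self.P)) / self.lam

# Toy usage: adapt the neuron toward a hypothetical complex linear target.
rng = np.random.default_rng(0)
neuron = ComplexIIRNeuron()
for _ in range(200):
    x = rng.normal(size=2) @ np.array([1, 1j])   # random complex input sample
    d = (0.5 + 0.2j) * x                          # hypothetical desired response
    y, u, s = neuron.forward(x)
    neuron.rls_update(u, d, s)
```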

References (7)
S. Haykin, Adaptive Filter Theory, (1986)
Ronald J. Williams, David Zipser, A learning algorithm for continually running fully recurrent neural networks, Neural Computation, vol. 1, pp. 270-280, (1989), 10.1162/NECO.1989.1.2.270
G. Kechriotis, E. S. Manolakos, Training fully recurrent neural networks with complex weights, IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, vol. 41, pp. 235-238, (1994), 10.1109/82.279210
Simon Haykin, Adaptive Filter Theory (2nd ed.), Prentice-Hall, Inc., (1991)
G. Cybenko, Approximation by superpositions of a sigmoidal function, Mathematics of Control, Signals, and Systems, vol. 2, pp. 303-314, (1989), 10.1007/BF02551274
T. W. S. Chow, Siu-Yeung Cho, An accelerated recurrent network training algorithm using IIR filter model and recursive least squares method, IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 44, pp. 1082-1086, (1997), 10.1109/81.641774
Gou-Jen Wang, Chih-Cheng Chen, A fast multilayer neural-network training algorithm based on the layer-by-layer optimizing procedures, IEEE Transactions on Neural Networks, vol. 7, pp. 768-775, (1996), 10.1109/72.501734