Authors: Gursel Serpen, Yifeng Xu
DOI: 10.1007/S00521-003-0365-0
Keywords: Feedforward neural network, Backpropagation, Recurrent neural network, Artificial neural network, Conjugate gradient method, Machine learning, Computational complexity theory, Artificial intelligence, Time delay neural network, Network dynamics, Computer science, Feed forward
Abstract: This paper explores the feasibility of employing the non-recurrent backpropagation training algorithm for a recurrent neural network, the Simultaneous Recurrent Neural network, applied to static optimisation. A simplifying observation that maps the network dynamics, which is configured to operate in relaxation mode as a static optimizer, to feedforward dynamics is leveraged to facilitate application of the standard backpropagation algorithm and its variants. A simulation study aiming to assess the feasibility, optimizing potential, and computational efficiency of the proposed approach is conducted. A comparative computational complexity analysis between the two training algorithms applied to the same network is also performed. Simulation results demonstrate that it is feasible to apply the non-recurrent backpropagation algorithm to train the network. The optimality comparison fails to show any advantage on behalf of non-recurrent backpropagation versus recurrent backpropagation for the static optimisation problem considered. However, considerable future potential yet to be explored exists, given that computationally efficient versions of the backpropagation algorithm, namely quasi-Newton and conjugate gradient descent among others, are also applicable to the network proposed in this paper.
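The core observation in the abstract, that a recurrent network run in relaxation mode can be treated as a feedforward network with tied weights so that standard (non-recurrent) backpropagation applies, can be sketched as follows. This is a hedged illustration, not the authors' code: the update rule `x <- tanh(W x + b)`, the step count, and the squared-error loss are all assumptions chosen for a minimal example.

```python
import numpy as np

def relax(W, b, x0, steps=20):
    """Run the recurrent network in relaxation mode toward a fixed point,
    keeping the whole trajectory (one entry per unrolled 'layer')."""
    xs = [x0]
    for _ in range(steps):
        xs.append(np.tanh(W @ xs[-1] + b))
    return xs

def backprop_tied(W, xs, target):
    """Standard feedforward backpropagation through the unrolled
    relaxation steps, treating them as layers with tied weights.
    Loss assumed: L = 0.5 * ||x_final - target||^2."""
    grad_W = np.zeros_like(W)
    grad_b = np.zeros(W.shape[0])
    delta = xs[-1] - target                 # dL/dx at the final state
    for k in range(len(xs) - 1, 0, -1):
        d = delta * (1.0 - xs[k] ** 2)      # tanh' via tanh(pre) == xs[k]
        grad_W += np.outer(d, xs[k - 1])    # tied weights: accumulate
        grad_b += d
        delta = W.T @ d                     # propagate to previous state
    return grad_W, grad_b
```

Because the weights are shared across the unrolled steps, the gradient contributions of each "layer" are simply accumulated, which is exactly what makes the standard algorithm (and its quasi-Newton or conjugate-gradient variants mentioned in the abstract) directly applicable.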