Acceleration of learning speed in neural networks by reducing weight oscillations

Authors: Bin-Chul Ihm, Dong-Jo Park

DOI: 10.1109/IJCNN.1999.832637

Keywords: Backpropagation, Term (time), Artificial neural network, Acceleration, Control theory, Artificial intelligence, Computer science

Abstract: We propose a novel fast learning algorithm for neural networks. Conventional backpropagation suffers from slow convergence due to weight oscillations in narrow valleys of the error surface. To overcome this difficulty, we derive a new gradient term by modifying the original gradient with an estimated downward direction along the valley. Simulation results show that the proposed method reduces weight oscillations considerably and achieves fast convergence.
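The abstract does not specify how the downward valley direction is estimated. As a rough, hypothetical illustration of the general idea (not the paper's exact update rule), the sketch below estimates the valley direction as an exponential moving average of past gradients, which cancels the component that oscillates across the valley walls, and adds it to the raw gradient. The loss function, learning rate, and averaging factor are all illustrative assumptions.

```python
def narrow_valley_loss(w):
    # Elongated quadratic "narrow valley": steep along w[0], shallow along w[1],
    # so plain gradient descent oscillates across the steep direction.
    return 50.0 * w[0] ** 2 + 0.5 * w[1] ** 2

def gradient(w):
    # Analytic gradient of narrow_valley_loss.
    return [100.0 * w[0], 1.0 * w[1]]

def train(lr=0.015, beta=0.9, steps=200, use_valley_term=True):
    """Gradient descent with an optional estimated-valley-direction term.

    Hypothetical sketch: the downward valley direction d is approximated by an
    exponential average of past gradients (the paper's estimator may differ),
    and the update uses the modified gradient g + d instead of g alone.
    """
    w = [1.0, 1.0]      # initial weights
    d = [0.0, 0.0]      # estimated downward direction along the valley
    for _ in range(steps):
        g = gradient(w)
        for i in range(2):
            if use_valley_term:
                d[i] = beta * d[i] + (1.0 - beta) * g[i]
                w[i] -= lr * (g[i] + d[i])  # modified gradient term
            else:
                w[i] -= lr * g[i]           # conventional backpropagation step
    return narrow_valley_loss(w)
```

On this toy surface the averaged term damps the oscillation across the steep axis while effectively enlarging the step along the shallow valley floor, so the modified update reaches a lower loss than the plain one in the same number of steps.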

References (1)

Shun-ichi Amari, "Natural gradient works efficiently in learning," Neural Computation, vol. 10, pp. 177-202, 1998. DOI: 10.1162/089976698300017746