Authors: Chi-Chung Cheung, Sin-Chun Ng, Andrew K. Lui
DOI: 10.1109/IJCNN.2012.6252546
Keywords:
Abstract: The backpropagation (BP) algorithm is the most popular supervised learning algorithm and is extensively applied in training feed-forward neural networks. Many modifications of BP have been proposed to increase the convergence rate of the standard algorithm, and Quickprop is one of these fast algorithms. Quickprop is very fast; however, it is easily trapped in a local minimum and thus may fail to converge to the global minimum. This paper proposes a new algorithm modified from Quickprop. By addressing the drawbacks of Quickprop with a systematic approach, the new algorithm improves its global convergence capability. The performance investigation shows that the proposed algorithm always converges, and does so faster than Quickprop; the improvement is especially large in some problems. In one problem (application), the global convergence rate increased from 4% to 100%.
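For context, a minimal sketch of the standard (unmodified) Quickprop update that the paper builds on may help. Quickprop fits a parabola through the current and previous gradient and jumps toward its estimated minimum; the function name, learning rate, and growth factor below are illustrative choices, not taken from the paper, and this is the classic Quickprop rule rather than the authors' proposed modification.

```python
def quickprop_minimize(grad_fn, w, lr=0.1, mu=1.75, steps=50):
    """Minimize a scalar function of one weight with the classic Quickprop rule.

    Quickprop's secant step:  dw(t) = dw(t-1) * g(t) / (g(t-1) - g(t)),
    with the step capped by a maximum growth factor `mu` and a plain
    gradient-descent fallback when the secant slope is unusable.
    """
    g_prev = grad_fn(w)
    dw = -lr * g_prev          # first step: ordinary gradient descent
    w += dw
    for _ in range(steps):
        g = grad_fn(w)
        denom = g_prev - g
        if abs(denom) > 1e-12:
            step = dw * g / denom          # parabolic (secant) jump
            # Cap step growth to avoid wild oscillation.
            if abs(step) > mu * abs(dw):
                step = mu * abs(dw) * (1.0 if step > 0 else -1.0)
        else:
            step = -lr * g                 # fallback: gradient step
        if step == 0.0:
            step = -lr * g
        w += step
        dw, g_prev = step, g
    return w
```

On a convex quadratic the secant jump is exact, so convergence is rapid; the local-minimum trapping the paper addresses shows up on non-convex error surfaces, where this same update can stall at a poor stationary point.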