A note on activation function in multilayer feedforward learning

Authors: J. Kamruzzaman, S.M. Aziz

DOI: 10.1109/IJCNN.2002.1005526

Keywords:

Abstract: … activation functions as the nodes approach saturated values. In this paper, we present a new activation function lo … In this paper, we present an inverse tangent activation function for …
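The abstract notes that learning slows as node activations approach saturation, and proposes an inverse tangent activation function. A minimal sketch of such an activation and its derivative (the paper's exact functional form is not given in this snippet; the scaling of atan into the sigmoid-like range (0, 1) below is an assumption for illustration):

```python
import math

def atan_activation(x):
    # Assumed form: atan rescaled from (-pi/2, pi/2) to (0, 1),
    # mirroring the output range of the logistic sigmoid.
    return 0.5 + math.atan(x) / math.pi

def atan_activation_deriv(x):
    # d/dx [0.5 + atan(x)/pi] = 1 / (pi * (1 + x^2))
    # Decays only polynomially in x, so the gradient stays nonzero
    # even for large |x|, unlike the sigmoid's exponential decay.
    return 1.0 / (math.pi * (1.0 + x * x))
```

The slower (polynomial rather than exponential) decay of the derivative is what would mitigate the saturation problem described in the abstract.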

References (10)
S.C. Ng, S.H. Leung, A. Luk, "Fast Convergent Generalized Back-Propagation Algorithm with Constant Learning Rate," Neural Processing Letters, vol. 9, pp. 13–23 (1999), DOI: 10.1023/A:1018611626332
X.-H. Yu, "Acceleration of backpropagation learning using optimised learning rate and momentum," Electronics Letters, vol. 29, pp. 1288–1290 (1993), DOI: 10.1049/EL:19930860
C. L. Blake, "UCI Repository of machine learning databases," www.ics.uci.edu/~mlearn/MLRepository.html (1998)
A. Van Ooyen, B. Nienhuis, "Improving the convergence of the back-propagation algorithm," Neural Networks, vol. 5, pp. 465–471 (1992), DOI: 10.1016/0893-6080(92)90008-7
B. Verma, "Fast training of multilayer perceptrons," IEEE Transactions on Neural Networks, vol. 8, pp. 1314–1320 (1997), DOI: 10.1109/72.641454
M. Riedmiller, H. Braun, "A direct adaptive method for faster backpropagation learning: the RPROP algorithm," IEEE International Conference on Neural Networks, vol. 1, pp. 586–591 (1993), DOI: 10.1109/ICNN.1993.298623
Robert A. Jacobs, "Increased Rates of Convergence Through Learning Rate Adaptation," Neural Networks, vol. 1, pp. 295–307 (1988), DOI: 10.1016/0893-6080(88)90003-2
J. Bilski, "The Backpropagation learning with logarithmic transfer function," Soft Computing, pp. 71–76 (2000)