Authors: L.N. De Castro, L.A. Ramirez, F. Gomide, F.J. Von Zuben
DOI: 10.1109/IJCNN.1999.830851
Keywords: Artificial neural network, Conjugate gradient method, Feedforward neural network, Supervised learning, Gradient descent, Function approximation, Artificial intelligence, Computer science, Transfer function, Backpropagation, Algorithm
Abstract: Tuning procedures for activation functions significantly increase the flexibility and nonlinear approximation capability of feedforward neural networks in supervised learning tasks. As a consequence, the learning process presents better performance, with the final network state being kept away from undesired saturation regions. Based on a hybrid architecture combining a gradient strategy with a fuzzy decision model, an auto-tuning algorithm is derived to adjust the additional parameters associated with the activation functions. The other conventional parameters, the connection weights between layers, are adjusted using a powerful second-order approach based on the conjugate gradient algorithm. To demonstrate the performance of the proposed method, we compare this technique with a standard one based solely on the gradient descent method. The three techniques are applied to several artificial and real-world benchmarks.
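The core idea described in the abstract, treating an activation-function parameter (such as the sigmoid slope) as trainable alongside the connection weights, can be sketched in a few lines. The snippet below is a minimal NumPy illustration under assumed details, not the authors' algorithm: it updates both the weights and a shared slope parameter by plain gradient descent, whereas the paper adjusts the slope with a hybrid gradient/fuzzy decision scheme and the weights with a conjugate gradient method. All names and hyperparameters (`lam`, `lr`, `lr_lam`, the toy sin(x) task) are hypothetical.

```python
# Minimal sketch of activation-slope auto-tuning in a one-hidden-layer network.
# NOT the paper's hybrid fuzzy / conjugate-gradient method; it only illustrates
# making the sigmoid slope a trainable parameter alongside the weights.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z, lam):
    # Sigmoid with tunable slope lam: large lam saturates quickly, small lam is near-linear.
    return 1.0 / (1.0 + np.exp(-lam * z))

# Toy regression task (hypothetical): approximate y = sin(x) on [-pi, pi].
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
Y = np.sin(X)

n_hidden = 10
W1 = rng.normal(0.0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
lam = 1.0                 # shared activation slope, tuned during training
lr, lr_lam = 0.05, 0.01   # assumed learning rates

for epoch in range(2000):
    # Forward pass.
    Z1 = X @ W1 + b1
    H = sigmoid(Z1, lam)
    out = H @ W2 + b2
    err = out - Y                       # dE/d(out) for a 0.5*MSE loss

    # Backward pass.
    dW2 = H.T @ err / len(X)
    db2 = err.mean(axis=0)
    dH = err @ W2.T
    dsig = H * (1.0 - H)                # sigmoid derivative factor s(1-s)
    dZ1 = dH * dsig * lam               # chain rule through lam * z
    dW1 = X.T @ dZ1 / len(X)
    db1 = dZ1.mean(axis=0)
    dlam = (dH * dsig * Z1).mean()      # gradient w.r.t. the slope parameter

    # Parameter updates: weights and slope both by gradient descent here.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    lam -= lr_lam * dlam                # auto-tune the activation slope

print("final slope:", lam, " MSE:", float((err ** 2).mean()))
```

Letting the slope adapt is one way to keep units out of the saturation regions the abstract mentions: if the error gradient pushes `lam` down, the sigmoids flatten and hidden units stay in their responsive range.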