Authors: S. Geva, J. Sitte
DOI: 10.1109/IJCNN.1991.170732
Keywords: Algorithm, Artificial intelligence, Function approximation, Pattern recognition, Mathematics, Artificial neural network, Sigmoid function, Activation function, Radial basis function network, Multilayer perceptron, Backpropagation, Feedforward neural network
Abstract: A three-layer neural network with a hidden layer of neurons having an exponential transfer function is described; it is capable of performing function approximation more accurately, and more economically, than a conventional sigmoidal multilayer perceptron (MLP). The network was trained by a variation of the standard backpropagation gradient-descent technique. Results for a difficult problem, where an MLP of similar size simply fails to perform within reasonable constraints on training time, are shown graphically.
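To make the architecture concrete, the following is a minimal sketch (not the authors' implementation) of the kind of network the abstract describes: a single hidden layer of exponential (Gaussian-like) response units with a linear output, trained by plain gradient descent on all parameters, as in backpropagation. The target function, unit count, widths, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D target function (not from the paper).
def target(x):
    return np.sin(3 * x)

X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)  # training inputs
Y = target(X)                                   # training outputs

H = 12                               # number of exponential hidden units (assumed)
c = rng.uniform(-1.0, 1.0, (H, 1))   # unit centers
b = np.full((H, 1), 4.0)             # inverse-width parameters
w = rng.normal(0.0, 0.1, (H, 1))     # output weights
w0 = 0.0                             # output bias
lr = 0.05                            # learning rate (assumed)

def forward(X):
    # Each hidden unit responds exponentially to squared distance from its center.
    d = X - c.T                      # (N, H) input-minus-center differences
    phi = np.exp(-(b.T * d) ** 2)    # (N, H) hidden activations
    return d, phi, phi @ w + w0      # network output is a linear readout

_, _, pred0 = forward(X)
mse_init = float(((pred0 - Y) ** 2).mean())

for epoch in range(2000):
    d, phi, pred = forward(X)
    err = pred - Y                   # (N, 1) output error
    N = X.shape[0]

    # Gradient-descent updates for all parameters (backpropagation through
    # the exponential units; factor-of-2 constants folded into lr scaling).
    grad_w = phi.T @ err / N
    grad_w0 = err.mean()
    common = (err @ w.T) * phi       # (N, H) shared backpropagated term
    grad_c = (common * 2 * (b.T ** 2) * d).sum(axis=0, keepdims=True).T / N
    grad_b = (common * -2 * b.T * d ** 2).sum(axis=0, keepdims=True).T / N

    w -= lr * grad_w
    w0 -= lr * grad_w0
    c -= lr * grad_c
    b -= lr * grad_b

_, _, pred = forward(X)
mse_final = float(((pred - Y) ** 2).mean())
print(mse_final < mse_init)
```

The exponential units respond locally, like radial basis functions, which is what lets a small hidden layer carve out localized features that sigmoidal MLP units of similar size struggle to represent efficiently.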