Transfer functions: hidden possibilities for better neural networks.

Authors: Wlodzislaw Duch, Norbert Jankowski

Abstract: Sigmoidal or radial transfer functions guarantee neither the best generalization nor fast learning of neural networks. Families of parameterized transfer functions provide flexible decision borders, and networks based on such functions should be small and accurate. Several ways of using different types of transfer functions in neural models are discussed, including enhancement of input features, selection from a fixed pool of functions, optimization of the parameters of general-type functions, regularization of large networks with heterogeneous nodes, and constructive approaches. A new taxonomy of transfer functions is proposed, allowing known functions to be derived by additive or multiplicative combination of activation and output functions.
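The taxonomy mentioned in the abstract treats a transfer function as the composition of an activation function (e.g. an inner product or a distance) with an output function (e.g. a logistic or Gaussian squashing function). The following minimal Python sketch illustrates that idea only; the function names, the mixing parameters alpha and beta, and the specific combination rule are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

# Sketch of the activation/output decomposition: a transfer function is an
# output function applied to an activation function. Names and the mixed
# "conic_transfer" node below are hypothetical, for illustration only.

def inner_product_activation(x, w, theta):
    """Activation I(x) = w.x - theta, used by sigmoidal (MLP-style) nodes."""
    return np.dot(w, x) - theta

def distance_activation(x, t):
    """Activation D(x) = ||x - t||, used by radial (RBF-style) nodes."""
    return np.linalg.norm(x - t)

def logistic_output(a, slope=1.0):
    """Squashing output function, giving hyperplane-like decision borders."""
    return 1.0 / (1.0 + np.exp(-slope * a))

def gaussian_output(a, width=1.0):
    """Localized output function, giving hyperspherical decision borders."""
    return np.exp(-(a / width) ** 2)

def conic_transfer(x, w, theta, t, alpha, beta, slope=1.0):
    """Hypothetical parameterized node mixing both activation types:
    beta=0 recovers a sigmoidal node, alpha=0 a distance-based one,
    intermediate values give more flexible decision borders."""
    a = alpha * inner_product_activation(x, w, theta) + beta * distance_activation(x, t)
    return logistic_output(a, slope)

if __name__ == "__main__":
    x = np.array([0.5, -1.0])
    w = np.array([1.0, 2.0])
    t = np.array([0.0, 0.0])
    print(logistic_output(inner_product_activation(x, w, theta=0.1)))  # sigmoidal node
    print(gaussian_output(distance_activation(x, t), width=1.5))       # radial node
    print(conic_transfer(x, w, 0.1, t, alpha=0.7, beta=0.3))           # mixed node
```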
