Suitable MLP Network Activation Functions for Breast Cancer and Thyroid Disease Detection

Authors: IS Isa, Z Saad, S Omar, MK Osman, KA Ahmad

DOI: 10.1109/CIMSIM.2010.93


Abstract: This paper compares various MLP activation functions for classification problems. The best-known artificial neural network (ANN) architecture is the multilayer perceptron (MLP) network, which is widely used to solve problems related to data classification. The selection of the activation function plays an essential role in network performance, and many studies have investigated special activation functions for solving different kinds of problems. This study therefore compares MLP networks with different activation functions in terms of classification accuracy. The functions under investigation are sigmoid, hyperbolic tangent, neuronal, logarithmic, sinusoidal and exponential. Medical diagnosis data from two case studies, thyroid disease and breast cancer classification, are used to test the performance of the network. The network is trained using the back-propagation learning algorithm, and performance is calculated as the percentage of correct classification. The results show that the hyperbolic tangent function was capable of producing the highest accuracy in classifying the data, while the neuronal function was also suitable and performed well.
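The abstract describes a one-hidden-layer MLP trained with back-propagation, where only the hidden-layer activation function is swapped and performance is scored as the percentage of correct classification. A minimal sketch of that comparison is below, using NumPy; the toy XOR dataset, layer sizes, learning rate and epoch count are illustrative assumptions, not the paper's experimental setup, and only two of the six studied activations (sigmoid and hyperbolic tangent) are shown.

```python
import numpy as np

# Each entry: (activation, derivative expressed in terms of the activation's output a).
ACTIVATIONS = {
    "sigmoid": (lambda z: 1.0 / (1.0 + np.exp(-z)), lambda a: a * (1.0 - a)),
    "tanh":    (np.tanh,                            lambda a: 1.0 - a ** 2),
}

def train_mlp(X, y, activation, hidden=8, lr=1.0, epochs=4000, seed=0):
    """One-hidden-layer MLP trained with plain batch back-propagation
    (squared-error loss, sigmoid output for binary classification)."""
    rng = np.random.default_rng(seed)
    act, dact = ACTIVATIONS[activation]
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # forward pass: hidden layer uses the chosen activation
        h = act(X @ W1 + b1)
        out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
        # backward pass: propagate the error through both layers
        d_out = (out - y[:, None]) * out * (1.0 - out)
        d_h = (d_out @ W2.T) * dact(h)
        W2 -= lr * h.T @ d_out / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X)
        b1 -= lr * d_h.mean(axis=0)
    return W1, b1, W2, b2, act

def percent_correct(params, X, y):
    """The paper's evaluation metric: percentage of correct classification."""
    W1, b1, W2, b2, act = params
    h = act(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return 100.0 * np.mean((out[:, 0] > 0.5) == y)

if __name__ == "__main__":
    # toy XOR-style problem standing in for the medical datasets
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0], dtype=float)
    for name in ("sigmoid", "tanh"):
        print(f"{name}: {percent_correct(train_mlp(X, y, name), X, y):.1f}% correct")
```

Because only the (activation, derivative) pair changes between runs, any of the paper's other candidates (logarithmic, sinusoidal, exponential, neuronal) could be compared by adding an entry to `ACTIVATIONS`.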

References (18)
Amir Abolfazl Suratgar, Abbas Hoseinabadi, Mohammad Bagher Tavakoli, "Modified Levenberg-Marquardt Method for Neural Networks Training," World Academy of Science, Engineering and Technology, International Journal of Computer, Electrical, Automation, Control and Information Engineering, vol. 1, pp. 1745-1747 (2007)
T. Kim, T. Adali, "Complex backpropagation neural network using elementary transcendental activation functions," International Conference on Acoustics, Speech, and Signal Processing, vol. 2, pp. 1281-1284 (2001), 10.1109/ICASSP.2001.941159
S. Xu, M. Zhang, "A novel adaptive activation function," International Joint Conference on Neural Networks, vol. 4, pp. 2779-2782 (2001), 10.1109/IJCNN.2001.938813
K.-W. Wong, C.-S. Leung, S.-J. Chang, "Use of periodic and monotonic activation functions in multilayer feedforward neural networks trained by extended Kalman filter algorithm," IEE Proceedings - Vision, Image, and Signal Processing, vol. 149, pp. 217-224 (2002), 10.1049/IP-VIS:20020515
F. Piekniewski, L. Rybicki, "Visual comparison of performance for different activation functions in MLP networks," International Joint Conference on Neural Networks, vol. 4, pp. 2947-2952 (2004), 10.1109/IJCNN.2004.1381133
X. Ying, "Role of activation function on hidden units for sample recording in three-layer neural networks," International Joint Conference on Neural Networks, pp. 69-74 (1990), 10.1109/IJCNN.1990.137548
L. Ma, K. Khorasani, "Constructive feedforward neural networks using Hermite polynomial activation functions," IEEE Transactions on Neural Networks, vol. 16, pp. 821-833 (2005), 10.1109/TNN.2005.851786
P. Campolucci, F. Capperelli, S. Guarnieri, F. Piazza, A. Uncini, "Neural networks with adaptive spline activation function," Proceedings of 8th Mediterranean Electrotechnical Conference on Industrial Applications in Power Systems, Computer Science and Telecommunications (MELECON 96), vol. 3, pp. 1442-1445 (1996), 10.1109/MELCON.1996.551220
J. Kamruzzaman, S.M. Aziz, "A note on activation function in multilayer feedforward learning," International Joint Conference on Neural Networks, vol. 1, pp. 519-523 (2002), 10.1109/IJCNN.2002.1005526
Michal Rosen-Zvi, Michael Biehl, Ido Kanter, "Learnability of periodic activation functions: General results," Physical Review E, vol. 58, pp. 3606-3609 (1998), 10.1103/PHYSREVE.58.3606