Neural networks with periodic and monotonic activation functions: a comparative study in classification problems

Authors: Enrique Romero Merino, Josep Maria Sopena, René Alquézar Mancho, Joan L Moliner

DOI:

Keywords:

Abstract: This article discusses a number of reasons why the use of non-monotonic activation functions can lead to a marked improvement in the performance of a neural network. Using a wide range of benchmarks, we show that a multilayer feed-forward network with sine activation functions (and an appropriate choice of initial parameters) learns much faster than one with sigmoid functions, by a factor of 150 to 500, when both are trained with backpropagation. Learning speed also compares favorably with speeds reported for modified versions of the backpropagation algorithm. In addition, computational and generalization capacity increase.
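The following is a minimal sketch, not the authors' implementation, of the comparison the abstract describes: a one-hidden-layer feed-forward network trained with plain backpropagation, where the hidden activation is either sine or sigmoid. The toy task (XOR), the initialization scale, the learning rate, and the number of hidden units are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(activation, d_activation, epochs=2000, lr=0.5, init_scale=1.0):
    # XOR as a small stand-in benchmark (the paper uses a wide range of benchmarks).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    n_hidden = 4  # illustrative choice, not from the paper
    W1 = rng.normal(0.0, init_scale, (2, n_hidden))   # input -> hidden weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, init_scale, (n_hidden, 1))   # hidden -> output weights
    b2 = np.zeros(1)

    for _ in range(epochs):
        # Forward pass: hidden layer uses the chosen activation, output is sigmoid.
        h_pre = X @ W1 + b1
        h = activation(h_pre)
        out = sigmoid(h @ W2 + b2)

        # Backward pass: squared-error loss, plain (unmodified) backpropagation.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * d_activation(h_pre)

        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

    return float(np.mean((out > 0.5) == y))

acc_sine = train(np.sin, np.cos)
acc_sigm = train(sigmoid, lambda z: sigmoid(z) * (1 - sigmoid(z)))
print(f"accuracy with sine hidden units:    {acc_sine:.2f}")
print(f"accuracy with sigmoid hidden units: {acc_sigm:.2f}")
```

Swapping the hidden activation is the only difference between the two runs, which mirrors the abstract's comparison; the 150-500x speed-up figure is the paper's reported result and is not reproduced by this sketch.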
