Authors: Tengfei Shen, Dingyun Zhu
DOI: 10.1109/IJCNN.2012.6252799
Keywords: Artificial neural network, Artificial intelligence, Computational complexity theory, Constructive, Feedforward neural network, Extension (predicate logic), Topology (electrical circuits), Network architecture, Computer science, Algorithm, Simple (abstract algebra)
Abstract: Previous research has demonstrated that constructive algorithms are powerful methods for training feedforward neural networks. The CasPer algorithm is a constructive algorithm that generates networks from a simple initial architecture and then expands them. A_CasPer is a modified version of CasPer which trains a pool of candidate neurons instead of a single neuron. This paper adds an extension to A_CasPer in terms of network structure: the Layered_CasPer algorithm. Hidden neurons are formed as layers, and this new structure results in less computational cost being required. Beyond the structure, other aspects remain the same as in A_CasPer. The algorithm is benchmarked on a number of classification problems and compared with other constructive algorithms, namely CasCor, CasPer, A_CasPer and AT_CasPer. It is shown that Layered_CasPer gives better performance on classification tasks whose datasets have a large number of inputs. A further advantage over cascade-style algorithms is that the resulting topology is more similar to the familiar layered structure of traditional feedforward neural networks.
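The following is a minimal sketch, not the authors' implementation, of the constructive idea the abstract describes: the network grows by repeatedly training a pool of candidate hidden units and keeping the best one, with new units connected only to the original inputs (a layered arrangement) rather than cascaded onto every previously added unit. The training scheme, pool size, and helper names such as `train_candidate` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_candidate(X, residual, epochs=200, lr=0.1):
    """Fit one sigmoid unit toward the current output residual (simplified surrogate)."""
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        h = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # candidate activation
        err = h - residual                            # mismatch with residual
        grad_w = X.T @ (err * h * (1 - h)) / len(X)
        grad_b = np.mean(err * h * (1 - h))
        w -= lr * grad_w
        b -= lr * grad_b
    loss = np.mean((1.0 / (1.0 + np.exp(-(X @ w + b))) - residual) ** 2)
    return (w, b), loss

def grow_unit(X, residual, pool_size=8):
    """Train a pool of candidates; keep the best one as a new hidden unit.
    Candidates see only the original inputs X, so added units form a layer
    instead of the deep cascade produced by CasCor/CasPer-style growth."""
    candidates = [train_candidate(X, residual) for _ in range(pool_size)]
    (w, b), _ = min(candidates, key=lambda c: c[1])
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))         # new feature column

# Toy usage: grow two hidden units for a small regression-style residual.
X = rng.normal(size=(100, 3))
y = np.tanh(X[:, 0]) + 0.1 * rng.normal(size=100)
features = [np.ones(len(X))]                          # bias column
for _ in range(2):
    F = np.column_stack(features)
    coef, *_ = np.linalg.lstsq(F, y, rcond=None)      # refit output weights
    residual = y - F @ coef                           # what is still unexplained
    features.append(grow_unit(X, residual))
```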