Diminishing the number of nodes in multi-layered neural networks

Authors: P. Nocera, R. Quelavoine

DOI: 10.1109/ICNN.1994.374981

Keywords: Feedforward neural network, Multilayer perceptron, Recurrent neural network, Echo state network, Time delay neural network, Artificial intelligence, Deep learning, Computer science, Node (networking), Probabilistic neural network

Abstract: We propose in this paper two ways of diminishing the size of a multilayered neural network trained to recognise French vowels. The first deals with the hidden layers: studying the variation of the outputs of each node gives us information on its discrimination power and thus allows us to reduce the network. The second involves the input nodes: by examining the connecting weights between these nodes and the following layer, we can determine which features are actually relevant to our classification problem and eliminate the useless ones. Through the problem of recognising the vowel /a/, we show that the reduced structure obtained can still learn.
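The abstract describes two pruning criteria: removing hidden nodes whose outputs barely vary (low discrimination power) and removing input nodes whose outgoing weights are uniformly small (irrelevant features). The sketch below is a minimal illustration of those two ideas, not the authors' exact procedure; the function names, thresholds, and synthetic data are assumptions made for the example.

```python
import numpy as np

def prunable_hidden_nodes(hidden_activations, var_threshold=1e-3):
    """Flag hidden nodes whose output barely varies over the training set.

    hidden_activations: array of shape (num_samples, num_hidden) collected
    from a trained network. A near-constant output carries little
    discrimination power, so the node is a candidate for removal.
    """
    variances = hidden_activations.var(axis=0)
    return np.where(variances < var_threshold)[0]

def prunable_input_nodes(input_to_hidden_weights, weight_threshold=1e-2):
    """Flag input nodes (features) with only weak connections to the next layer.

    input_to_hidden_weights: array of shape (num_inputs, num_hidden). If every
    outgoing weight of an input node is small in magnitude, the feature
    contributes little to the classification and may be eliminated.
    """
    max_outgoing = np.abs(input_to_hidden_weights).max(axis=1)
    return np.where(max_outgoing < weight_threshold)[0]

# Synthetic stand-in for a trained vowel classifier.
rng = np.random.default_rng(0)
acts = rng.random((500, 20))
acts[:, 3] = 0.5             # a near-constant hidden node
weights = rng.normal(size=(16, 20))
weights[7, :] *= 1e-4        # an input feature with negligible outgoing weights

print("hidden nodes to prune:", prunable_hidden_nodes(acts))
print("input nodes to prune:", prunable_input_nodes(weights))
```

In practice the flagged nodes would be removed and the reduced network retrained, which is the experiment the paper reports for the vowel /a/.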
