Authors: Tautvydas Cibas, Françoise Fogelman Soulié, Patrick Gallinari, Sarunas Raudys
DOI: 10.1007/978-1-4471-2097-1_171
Keywords: Classifier (UML), Regression, Feature selection, Artificial neural network, Algorithm, Heuristics, Computer science, Regularization (mathematics)
Abstract: Neural Networks (NN) have been used in a large variety of real-world applications. In those applications, one could potentially measure a number N of variables Xi; probably not all Xi are equally informative: if one can select the n « N "best" variables Xi, then one can reduce the amount of data to gather and process, and hence the costs. Variable selection is thus an important issue in Pattern Recognition and Regression. It is also a complex problem; one needs a criterion to value a subset of variables, which will of course depend on the predictor or classifier used afterwards. Conventional variable selection techniques are based upon statistical or heuristic tools [Fukunaga, 90]: the major difficulty comes from the intrinsic combinatorics of the problem. In this paper we show how to use NNs for variable selection, with an evaluation of variable usefulness. Various methods have been proposed to assess the usefulness of a weight (e.g. the saliency in the Optimal Brain Damage (OBD) procedure [Le Cun et al., 90]): along similar ideas, we derive a method, called Optimal Cell Damage (OCD), which evaluates the usefulness of the input cells of a Multi-Layer Network and prunes the least useful ones. Variable selection is thus achieved during the training of the classifier, ensuring that the selected variable set matches the classifier's complexity. Variable selection is viewed here as an extension of weight pruning. One can also take a regularization approach to variable selection, which we discuss elsewhere [Cibas et al., 94]. We illustrate our method on two relatively small problems: prediction of a synthetic time series and classification of waveforms [Breiman, 84], which are representative of hard problems.
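The abstract describes ranking input cells by saliencies in the spirit of Optimal Brain Damage. A minimal sketch of that idea, assuming a toy one-hidden-layer network with a squared-error loss, is to aggregate the OBD weight saliency 0.5 * (d²L/dw²) * w² over all weights leaving each input, then flag the input with the smallest total as the pruning candidate. The network shape, synthetic data, and finite-difference Hessian estimate below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_samples = 4, 5, 64

X = rng.normal(size=(n_samples, n_in))
# Target depends only on inputs 0 and 1, so inputs 2 and 3 are uninformative.
y = X[:, :1] + 0.5 * X[:, 1:2]

W1 = rng.normal(scale=0.5, size=(n_in, n_hid))  # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(n_hid, 1))     # hidden -> output weights

def loss(W1, W2):
    h = np.tanh(X @ W1)
    return 0.5 * np.mean((h @ W2 - y) ** 2)

def input_saliencies(W1, W2, eps=1e-3):
    """Sum OBD-style saliencies 0.5 * (d2L/dw2) * w^2 over each input's fan-out.

    The diagonal second derivative is estimated by a central finite
    difference (an assumption; OBD uses an analytic diagonal Hessian).
    """
    sal = np.zeros(n_in)
    L0 = loss(W1, W2)
    for i in range(n_in):
        for j in range(n_hid):
            Wp, Wm = W1.copy(), W1.copy()
            Wp[i, j] += eps
            Wm[i, j] -= eps
            d2 = (loss(Wp, W2) - 2 * L0 + loss(Wm, W2)) / eps**2
            sal[i] += 0.5 * d2 * W1[i, j] ** 2
    return sal

sal = input_saliencies(W1, W2)
print("input saliencies:", np.round(sal, 4))
print("least useful input (candidate to prune):", int(np.argmin(sal)))
```

In the paper's setting this evaluation would be interleaved with training, pruning the lowest-saliency input cell and retraining, so that the retained variable set matches the classifier's complexity.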