Nearest prototype classification of noisy data

Authors: Fernando Fernández, Pedro Isasi

DOI: 10.1007/S10462-009-9116-7

Keywords: Data mining, Well-defined, Pattern recognition, Noise level, Artificial intelligence, Class (biology), Evolutionary learning, Sort, Noisy data, Computer science, Statistical classification

Abstract: Nearest prototype approaches offer a common way to design classifiers. However, when data is noisy, the success of this sort of classifier depends on some parameters that the designer needs to tune, such as the number of prototypes. In this work, we have made a study of the ENPC technique, based on a nearest prototype approach, in noisy datasets. Previous experimentation with the algorithm had shown that it does not require any parameter tuning to obtain good solutions in problems where the class limits are well defined and not noisy. Here we show that it is also able to achieve high classification accuracy even when the data is noisy. A comparison with optimal (hand-made) solutions and other different algorithms demonstrates its performance in terms of accuracy and number of prototypes as the noise level increases. We performed experiments on four datasets, each of them with different characteristics.
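
The abstract refers to the nearest prototype classification rule: a sample is assigned the label of its closest prototype. Below is a minimal Python sketch of that rule only; the prototype coordinates and labels are illustrative assumptions, and the ENPC evolutionary procedure that finds the prototype set is not reproduced here.

```python
import numpy as np

# Illustrative prototype set (NOT taken from the paper): one prototype per class.
prototypes = np.array([[0.2, 0.3],   # prototype representing class 0
                       [0.8, 0.7]])  # prototype representing class 1
prototype_labels = np.array([0, 1])

def nearest_prototype_classify(x, prototypes, labels):
    """Assign x the label of its closest prototype (Euclidean distance)."""
    distances = np.linalg.norm(prototypes - x, axis=1)
    return labels[np.argmin(distances)]

# Example query point: closest to the second prototype, so class 1 is predicted.
print(nearest_prototype_classify(np.array([0.75, 0.65]), prototypes, prototype_labels))
```

With noisy data, the quality of such a classifier hinges on how many prototypes are used and where they are placed, which is exactly the tuning burden the paper reports ENPC avoids.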

References (28)
T. Kohonen, Self-Organization and Associative Memory, 3rd edition. Springer-Verlag New York, Inc., (1989)
Fernando Fernandez, Pedro Isasi, Automatic finding of good classifiers following a biologically inspired metaphor. Computing and Informatics / Computers and Artificial Intelligence, vol. 21, pp. 205-220, (2002)
Sergio Bermejo, Joan Cabestany, A Batch Learning Vector Quantization Algorithm for Nearest Neighbour Classification. Neural Processing Letters, vol. 11, pp. 173-184, (2000), 10.1023/A:1009634824627
Ian H. Witten, Eibe Frank, Generating Accurate Rule Sets Without Global Optimization. International Conference on Machine Learning, pp. 144-151, (1998)
Mark A. Hall, Ian H. Witten, Eibe Frank, Data Mining: Practical Machine Learning Tools and Techniques, (1999)
Kenji Kira, Larry A. Rendell, A Practical Approach to Feature Selection. International Conference on Machine Learning, pp. 249-256, (1992), 10.1016/B978-1-55860-247-2.50037-1
Richard A. Olshen, Charles J. Stone, Leo Breiman, Jerome H. Friedman, Classification and Regression Trees, (1983)
Dietrich Wettschereck, David W. Aha, Takao Mohri, A review and empirical evaluation of feature weighting methods for a class of lazy learning algorithms. Artificial Intelligence Review, vol. 11, pp. 273-314, (1997), 10.1023/A:1006593614256
George H. John, Pat Langley, Estimating continuous distributions in Bayesian classifiers. Uncertainty in Artificial Intelligence, pp. 338-345, (1995)