Authors: Zhijian Huang, Min Pei, Erik Goodman, Yong Huang, Gaoping Li
DOI: 10.1007/3-540-45110-2_108
Keywords: Dimensionality reduction, Artificial intelligence, k-nearest neighbors algorithm, Principal component analysis, Linear classifier, Feature vector, Feature selection, Classifier (UML), Linear discriminant analysis, Data mining, Mathematics, Pattern recognition
Abstract: When using a Genetic Algorithm (GA) to optimize the feature space of pattern classification problems, the performance improvement is determined not only by the data set used, but also by the classifier. This work compares the improvements achieved by GA-optimized feature transformations across several simple classifiers. Some traditional transformation techniques, such as Principal Components Analysis (PCA) and Linear Discriminant Analysis (LDA), are tested to see their effects on GA optimization. The results, based on some real-world data sets and five benchmark sets from the UCI repository, show that the improvement after GA optimization is in reverse ratio with the original rate achieved when the classifier is used alone. It is shown that performing PCA or LDA prior to GA optimization improved the final result.
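The following is a minimal sketch, not the authors' implementation, of the general idea the abstract describes: a GA that searches for feature weights to improve a simple classifier (kNN here), with an optional PCA step applied before the GA. The dataset (`load_wine`), the choice of k = 3, and all GA parameters (population size, mutation scale, number of generations) are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: GA-optimized feature weighting for a simple kNN classifier,
# optionally preceded by PCA. Dataset and all GA settings are assumptions.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

X, y = load_wine(return_X_y=True)
X = PCA(n_components=8).fit_transform(X)   # optional PCA prior to GA optimization

def fitness(weights):
    """Cross-validated accuracy of kNN on the weighted (transformed) feature space."""
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X * weights, y, cv=3).mean()

# Simple generational GA over real-valued feature weights in [0, 1].
pop_size, n_gen, n_feat = 20, 30, X.shape[1]
pop = rng.random((pop_size, n_feat))
for gen in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(n_feat) < 0.5            # uniform crossover
        child = np.where(mask, a, b)
        child += rng.normal(0.0, 0.1, n_feat)      # Gaussian mutation
        children.append(np.clip(child, 0.0, 1.0))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best cross-validated accuracy:", fitness(best))
```

Comparing the best fitness found by the GA against the classifier's accuracy on the unweighted features gives the kind of "improvement after optimization" the abstract relates to the classifier's original performance.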