Authors: Zhitong Zhao, Jiantao Zhou, Haifeng Xing
DOI: 10.1007/978-981-15-1377-0_55
Keywords: Pattern recognition, Artificial intelligence, Nonlinear dimensionality reduction, Support vector machine, Linear discriminant analysis, Curse of dimensionality, k-nearest neighbors algorithm, Feature (machine learning), Kernel principal component analysis, Dimensionality reduction, Computer science
Abstract: In the analysis and processing of image recognition, extracting useful and valuable data from the original dataset has become a problem. Since the data to be processed often presents high-dimensional nonlinear features, reasonable dimensionality reduction is a necessary method for improving analysis accuracy. One method, Kernel Principal Component Analysis (KPCA), has certain advantages in dealing with nonlinear data, but it also has defects when facing data with highly complex relationships. The other, Linear Discriminant Analysis (LDA), has supervisory characteristics and can reduce dimensionality; however, it can only handle linear data. So we propose a hybrid combination of the above two, called KPCA-LDA. The new features obtained by this step are beneficial to the next classification step. We combine KPCA-LDA with a Back Propagation Neural Network (BPNN) to achieve classification of handwritten numbers. The experimental results show that the accuracy of the proposed KPCA-LDA-BPNN model can reach 98.67%, about 3%-5% higher than using K Nearest Neighbor (KNN) or Support Vector Machine (SVM).
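The pipeline the abstract describes (nonlinear KPCA reduction, then supervised LDA reduction, then a neural-network classifier) can be sketched with scikit-learn on its bundled handwritten-digits dataset. This is a minimal illustration, not the paper's implementation: the component counts, the RBF kernel parameter, and the `MLPClassifier` standing in for the BPNN are all assumptions, and the dataset here is scikit-learn's 8x8 digits rather than the one used in the paper.

```python
# Sketch of a KPCA -> LDA -> neural-network pipeline (assumed settings,
# not the paper's exact configuration).
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

model = make_pipeline(
    StandardScaler(),
    # Unsupervised nonlinear reduction; 40 components and gamma=0.01
    # are illustrative choices.
    KernelPCA(n_components=40, kernel="rbf", gamma=0.01),
    # Supervised linear reduction; LDA allows at most n_classes - 1 = 9
    # output dimensions for 10 digit classes.
    LinearDiscriminantAnalysis(n_components=9),
    # Stand-in for the paper's BPNN classifier.
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```

Chaining the two reductions mirrors the paper's motivation: KPCA handles the nonlinearity of raw pixel data, while the subsequent LDA step uses class labels to keep the directions most useful for the classifier.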