Author: Yuan-chin Ivan Chang
DOI:
Keywords:
Abstract: The support vector machine (SVM) classifier is a linear maximum margin classifier. It performs very well in many classification applications. Although it can be extended to nonlinear cases by exploiting the idea of kernels, it might still suffer from heterogeneity in the training examples. Since there are few theories in the literature to guide us on how to choose kernel functions, the selection of a kernel is usually based on trial and error. When the training set is imbalanced, the data may not be separable in the feature space defined by the chosen kernel. In this paper, we propose a hybrid method that integrates "small" SVM classifiers with logistic regression models. By appropriately partitioning the training set, this ensemble can improve the performance of an SVM trained with all the examples at one time. With this method, we not only avoid the difficulty of heterogeneity, but also obtain probability outputs for all examples. Moreover, the results are less ambiguous than those of classifiers combined with voting schemes. From our simulation studies and some empirical results, we find that these kinds of hybrid classifiers are robust in the following sense: (1) the hybrid improves the performance (prediction accuracy) of the SVM when there is some kind of heterogeneity in the training examples; and (2) it is at least as good as the original SVM classifier when no heterogeneity is actually present. We also apply this method to multi-class problems by replacing the binary logistic regression models with a polychotomous logistic regression model, which can be constructed from the individual binary models.
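The abstract's combination scheme can be illustrated with a short sketch: partition the training set, fit a "small" SVM on each part, then feed the SVMs' decision values into a logistic regression combiner, which yields probability outputs for every example. This is a hedged illustration under assumed details (random equal-size partition, RBF kernel, synthetic data), not the paper's exact procedure.

```python
# Sketch of the hybrid idea: small SVMs on data partitions, combined by
# logistic regression. Details (partitioning, kernel, data) are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Partition the training set into k disjoint subsets; train one small SVM each.
k = 3
parts = np.array_split(np.random.default_rng(0).permutation(len(X_tr)), k)
svms = [SVC(kernel="rbf").fit(X_tr[idx], y_tr[idx]) for idx in parts]

def svm_features(X):
    # Stack each small SVM's decision value as one input feature.
    return np.column_stack([m.decision_function(X) for m in svms])

# Logistic regression combines the decision values and gives probabilities,
# avoiding the ambiguity of a plain voting scheme.
combiner = LogisticRegression().fit(svm_features(X_tr), y_tr)
proba = combiner.predict_proba(svm_features(X_te))
acc = combiner.score(svm_features(X_te), y_te)
```

For the multi-class extension described in the abstract, the binary `LogisticRegression` combiner would be replaced by a polychotomous (multinomial) logistic regression over the same SVM decision values.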