Abstract: Using multiple classifiers for increasing learning accuracy is an active research area. In this paper we present two related methods for merging classifiers. The first method, Cascade Generalization, couples classifiers loosely. It belongs to the family of stacking algorithms. The basic idea of Cascade Generalization is to use the set of classifiers sequentially, at each step performing an extension of the original data by the insertion of new attributes. The new attributes are derived from the probability class distribution given by a base classifier. This constructive step extends the representational language for the high-level classifiers, relaxing their bias. The second method exploits tight coupling by applying Cascade Generalization locally. At each iteration of a divide-and-conquer algorithm, a reconstruction of the instance space occurs through the addition of new attributes. Each new attribute represents the probability that an example belongs to a class, as given by a base classifier. We have implemented three Local Cascade Generalization algorithms: the first merges a linear discriminant with a decision tree, the second a naive Bayes with a decision tree, and the third both a linear discriminant and a naive Bayes with a decision tree. All the algorithms show an increase in performance when compared with the corresponding single models. Cascade Generalization also outperforms other combining methods, like Stacked Generalization, and competes well against Boosting at statistically significant confidence levels.
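The core mechanism described above, extending the data with the class probability distribution produced by a base classifier before training a higher-level learner, can be illustrated with a minimal sketch. This is an assumption-laden illustration using scikit-learn and the Iris dataset, with naive Bayes as the base classifier and a decision tree as the high-level classifier; it is not the authors' implementation.

```python
# Sketch of a two-level cascade (assumed setup: scikit-learn, Iris data,
# GaussianNB as base classifier, DecisionTreeClassifier on top).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: fit the base classifier.
base = GaussianNB().fit(X_tr, y_tr)

# Step 2: extend the original attributes with the class probability
# distribution output by the base classifier (one new attribute per class).
X_tr_ext = np.hstack([X_tr, base.predict_proba(X_tr)])
X_te_ext = np.hstack([X_te, base.predict_proba(X_te)])

# Step 3: train the high-level classifier on the extended representation,
# whose enlarged attribute space relaxes the tree's bias.
top = DecisionTreeClassifier(random_state=0).fit(X_tr_ext, y_tr)
print(top.score(X_te_ext, y_te))
```

Note that the same extension is applied to both training and test data, so the high-level classifier always sees the base classifier's probability attributes alongside the original ones.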