Authors: Kaizhu Huang, Irwin King, Michael R. Lyu
Keywords: Machine learning, Naive Bayes classifier, Bounded function, Statistical model, Mixture model, Bayesian network, Computer science, Algorithm, Combinatorial optimization, Artificial intelligence, Conditional independence, Classifier (UML), Time complexity
Abstract: The Semi-Naive Bayesian network (SNB) classifier, a probabilistic model with an assumption of conditional independence among combined attributes, shows good performance in classification tasks. However, traditional SNBs can only combine two attributes into a combined attribute. This inflexibility, together with the strong independency assumption, may generate inaccurate distributions for some datasets and thus greatly restricts the performance of SNBs. In this paper we develop a Bounded Semi-Naive Bayesian network (B-SNB) model based on direct combinatorial optimization. Our model can join any number of attributes within a given bound while maintaining polynomial time cost. This improvement expands the expressive ability of the SNB and thus provides the potential to increase classification accuracy. Further, aiming to relax the strong independency assumption of the SNB, we then propose an algorithm to extend the B-SNB into a finite mixture structure, named the Mixture of Bounded Semi-Naive Bayesian networks (MBSNB). We give theoretical derivations, an outline of the algorithm, an analysis of the algorithm, and a set of experiments that demonstrate the usefulness of MBSNB in classification tasks. The novel MBSNB model shows better classification performance in comparison with the other types of classifiers examined in this paper.
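To make the semi-naive idea concrete, the sketch below is a minimal, hypothetical illustration (not the authors' algorithm): attributes are partitioned into small blocks of bounded size, each block is treated as one combined attribute, and the combined attributes are assumed conditionally independent given the class. The grouping here is fixed by the caller, whereas the paper's B-SNB searches for a good grouping via combinatorial optimization; all function and variable names are invented for this example.

```python
import math
from collections import Counter, defaultdict

def train_snb(X, y, groups, alpha=1.0):
    """Fit class priors and one conditional table per attribute block.

    `groups` is a partition of attribute indices into blocks of bounded
    size (the "combined attributes"); `alpha` is Laplace smoothing.
    """
    classes = sorted(set(y))
    prior = Counter(y)
    tables = {g: defaultdict(Counter) for g in range(len(groups))}
    values = {g: set() for g in range(len(groups))}
    for xi, yi in zip(X, y):
        for g, idx in enumerate(groups):
            v = tuple(xi[i] for i in idx)  # value of the combined attribute
            tables[g][yi][v] += 1
            values[g].add(v)
    return classes, prior, tables, values, len(y), alpha

def predict_snb(model, x, groups):
    """Pick argmax_c log P(c) + sum_g log P(block_g value | c)."""
    classes, prior, tables, values, n, alpha = model
    best, best_score = None, float("-inf")
    for c in classes:
        score = math.log(prior[c] / n)
        for g, idx in enumerate(groups):
            v = tuple(x[i] for i in idx)
            num = tables[g][c][v] + alpha
            den = prior[c] + alpha * len(values[g])
            score += math.log(num / den)
        if score > best_score:
            best, best_score = c, score
    return best
```

With blocks of size at most two, e.g. `groups = [(0, 1), (2,)]`, this factors the joint distribution as P(c)·P(x0,x1|c)·P(x2|c), which is exactly the kind of bounded combined-attribute factorization the B-SNB generalizes.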