Authors: Kisang Kim, Hyung-Il Choi, Kyoungsu Oh
DOI: 10.1186/S13640-017-0189-Y
Keywords:
Abstract: The Adaboost algorithm (Freund and Schapire, Eur. Conf. Comput. Learn. Theory, 23–37, 1995) chooses a good set of weak classifiers in rounds. On each round, it selects the optimal classifier (the optimal feature with its threshold value) by minimizing the weighted classification error. It also reweights the training data so that the next round focuses on the data that are difficult to classify. When determining the threshold value, a classification process is employed, and the classification involved usually performs a hard decision (Viola and Jones, Rapid object detection using a boosted cascade of simple features, 2001; Joo et al., Sci. World J. 2014:1–17, 2014; Friedman et al., Ann. Stat. 28:337–407, 2000). In this paper, we extend the hard decision to a soft fuzzy decision. We believe this extension could allow some flexibility to the algorithm as well as better performance, especially when the size of the training data is not large enough. The Adaboost algorithm, in general, assigns the same weight to every datum in the first round of boosting (Freund and Schapire, 1995). We propose to assign different initial weights based on the statistical properties of the features. In the experimental results, we show that the proposed method yields higher performance compared to other ones.
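
The abstract describes two modifications to standard Adaboost: a soft fuzzy decision replacing the weak classifiers' hard thresholding, and non-uniform initial sample weights derived from feature statistics. The sketch below illustrates both ideas on decision stumps. It is a minimal illustration under stated assumptions: the tanh membership function, its slope parameter, and the distance-to-opposite-class-mean weighting are placeholders, since the abstract does not specify the paper's exact fuzzy membership or statistical weighting scheme.

# Minimal sketch: Adaboost with fuzzy (soft) stump decisions and
# statistics-based initial weights. The membership function and the
# initialization rule are illustrative assumptions, not the paper's
# exact formulation.
import numpy as np

def fuzzy_stump_output(x, threshold, polarity, slope=5.0):
    """Soft decision in [-1, 1]: a sigmoid-like function of the signed
    distance to the threshold, instead of the hard sign() of a stump."""
    return polarity * np.tanh(slope * (x - threshold))

def fit_fuzzy_stump(X, y, w):
    """Pick the feature/threshold/polarity minimizing the weighted soft
    classification error, mirroring the per-round optimal-classifier search."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (+1, -1):
                out = fuzzy_stump_output(X[:, j], t, pol)
                # soft error: weighted disagreement between output and label
                err = np.sum(w * (1.0 - y * out)) / 2.0
                if best is None or err < best[0]:
                    best = (err, j, t, pol)
    return best  # (error, feature index, threshold, polarity)

def initial_weights(X, y):
    """Assumed statistics-based initialization: samples lying close to the
    opposite class mean (hard cases) get more initial weight than the
    conventional uniform 1/n."""
    w = np.ones(len(y))
    for c in (-1, +1):
        other_mean = X[y == -c].mean(axis=0)
        d = np.linalg.norm(X[y == c] - other_mean, axis=1)
        w[y == c] = 1.0 / (1.0 + d)  # closer to other class -> larger weight
    return w / w.sum()

def adaboost_fuzzy(X, y, rounds=10):
    w = initial_weights(X, y)
    ensemble = []
    for _ in range(rounds):
        err, j, t, pol = fit_fuzzy_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        out = fuzzy_stump_output(X[:, j], t, pol)
        # reweight: poorly classified samples (small y*out) gain weight
        w *= np.exp(-alpha * y * out)
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * fuzzy_stump_output(X[:, j], t, p)
                for a, j, t, p in ensemble)
    return np.sign(score)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(+1, 1, (100, 2))])
    y = np.concatenate([-np.ones(100), np.ones(100)])
    model = adaboost_fuzzy(X, y, rounds=20)
    print("train accuracy:", (predict(model, X) == y).mean())

A design note on the soft decision: because each stump's vote stays in [-1, 1] rather than jumping between the two extremes, samples near the threshold contribute only weakly to both the weighted error and the reweighting step, which is one way to read the flexibility the abstract attributes to the fuzzy extension.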