Authors: Shuqiong Wu, Hiroshi Nagahashi
Keywords: Parameterized complexity, Object detection, Machine learning, AdaBoost, LogitBoost, Computer science, LPBoost, Data classification, BrownBoost, Generalization error, Artificial intelligence, Pattern recognition, Boosting (machine learning)
Abstract: As a machine learning algorithm, AdaBoost has obtained considerable success in data classification and object detection. Later, its generalized version, called Real AdaBoost, was proposed by Schapire and Singer. Real AdaBoost increases the weights of misclassified samples and decreases the weights of correctly classified samples in every iteration. This kind of weight adjustment focuses on the samples with large weights and tries to make them correctly classified in future runs. However, it may lead to the misclassification of some other samples that have been correctly classified in previous runs. If we can curb this phenomenon during the boosting process, faster training can be achieved. Based on this assumption, we propose Parameterized AdaBoost, in which a parameter is devised to penalize the misclassification of samples that have already been correctly classified. We then analyze why positive margins are enlarged more than in Real AdaBoost. Experimental results show that our approach achieves faster convergence of training error and also improves generalization to some degree when compared with Real AdaBoost.
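The reweighting idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the function name, the `penalty` parameter, and the exact update rule are assumptions made for exposition. It shows the standard exponential reweighting (weights rise where the margin is negative) plus a hypothetical extra factor penalizing misclassified samples, in the spirit of the parameter the paper introduces.

```python
import numpy as np

def parameterized_weight_update(weights, y_true, y_score, penalty=0.1):
    """One illustrative boosting-style weight update (a sketch, not the paper's method).

    weights : current sample weights (positive, summing to 1)
    y_true  : labels in {-1, +1}
    y_score : real-valued weak-learner outputs (sign = predicted class)
    penalty : hypothetical parameter that further discourages
              misclassifying samples the ensemble should keep correct
    """
    margins = y_true * y_score              # positive margin = correctly classified
    new_w = weights * np.exp(-margins)      # standard exponential reweighting
    # Hypothetical penalty: extra weight on samples that are currently
    # misclassified, to curb losing previously correct classifications.
    new_w[margins < 0] *= np.exp(penalty)
    return new_w / new_w.sum()              # renormalize to a distribution

# Usage: two correct and two misclassified samples
w = np.full(4, 0.25)
y = np.array([1, -1, 1, -1])
f = np.array([0.5, -0.5, -0.5, 0.5])        # last two have negative margins
new_w = parameterized_weight_update(w, y, f, penalty=0.2)
```

With a larger `penalty`, the weight gap between misclassified and correctly classified samples widens faster, which matches the abstract's intuition that penalizing such errors can speed up convergence of the training error.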