Authors: Qi Wang, Yingjie Tian, Dalian Liu
DOI: 10.1109/ACCESS.2019.2940983
Keywords:
Abstract: Support vector machines (SVMs) are powerful learning methods that have been popular among machine learning researchers due to their strong performance on both classification and regression problems. However, the traditional SVM, which uses the Hinge Loss, cannot deal with class imbalance problems because it applies the same loss weight to each class. Recently, Focal Loss has been widely used in deep learning to address imbalanced datasets, and its significant effectiveness has attracted attention in many fields, such as object detection and semantic segmentation. Inspired by Focal Loss, we reconstruct the Hinge Loss with a scaling factor, called FH Loss, which not only deals with class imbalance problems but also preserves the distinctive properties of the Hinge Loss. Owing to the difficulty of trading off positive and negative accuracy in imbalanced classification, FH Loss pays more attention to minority and misclassified instances to improve the accuracy of the minority class and further reduce the influence of imbalance. In addition, to solve the resulting optimization problem, we propose an improved SVM model with the modified loss, called Adaptive FH-SVM. The algorithm solves the optimization problem iteratively and adaptively updates the loss of each instance. Experimental results on 31 binary imbalanced datasets demonstrate the effectiveness of our proposed method.
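The abstract describes FH Loss as the Hinge Loss modulated by a focal-style scaling factor that down-weights easy (well-classified) instances so that minority and misclassified examples dominate training. The sketch below illustrates this idea in Python; the sigmoid-based modulating factor and the `gamma` parameter are illustrative assumptions for exposition, not the paper's exact formulation.

```python
import numpy as np

def hinge_loss(margin):
    """Standard hinge loss: max(0, 1 - y * f(x)), where margin = y * f(x)."""
    return np.maximum(0.0, 1.0 - margin)

def fh_loss(margin, gamma=2.0):
    """Hypothetical focal-style reweighting of the hinge loss.

    `gamma` controls how strongly large-margin (confidently correct)
    instances are down-weighted, mirroring the role of the focusing
    parameter in Focal Loss. The exact FH Loss in the paper may differ;
    this only shows the general shape of a confidence-dependent
    scaling factor applied to the hinge loss.
    """
    # Map the margin to a pseudo-probability of correct classification,
    # so confidently correct instances receive a small modulating factor.
    p = 1.0 / (1.0 + np.exp(-margin))
    return (1.0 - p) ** gamma * hinge_loss(margin)

# Toy comparison: a misclassified instance (margin = -1) keeps most of its
# loss, while a well-classified one (margin = 0.5) is heavily down-weighted.
for m in (-1.0, 0.0, 0.5):
    print(f"margin={m:+.1f}  hinge={hinge_loss(m):.3f}  fh={fh_loss(m):.3f}")
```

In an iterative scheme like the Adaptive FH-SVM described above, such a per-instance factor would be recomputed from the current margins at each iteration, so the effective weight of each instance adapts as training proceeds.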