Authors: Jianping Shi, Jiaying Liu, Xiaodi Hou, Yanghao Li, Naiyan Wang
DOI:
Keywords:
Abstract: Deep neural networks (DNNs) have shown unprecedented success in various computer vision applications such as image classification and object detection. However, it remains a common annoyance during the training phase that one has to prepare at least thousands of labeled images to fine-tune a network to a specific domain. A recent study (Tommasi et al. 2015) shows that a DNN has a strong dependency on its training dataset, and the learned features cannot easily be transferred to a different but relevant task without fine-tuning. In this paper, we propose a simple yet powerful remedy, called Adaptive Batch Normalization (AdaBN), to increase the generalization ability of a DNN. By modulating the Batch Normalization statistics in all layers across the network, our approach achieves a deep adaptation effect for domain adaptation tasks. In contrast to other deep learning methods, our method does not require additional components and is parameter-free. It achieves state-of-the-art performance despite its surprising simplicity. Furthermore, we demonstrate that it is complementary with existing methods: combining AdaBN with other domain adaptation treatments may further improve model performance.
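The core idea described in the abstract, replacing each Batch Normalization layer's source-domain statistics with statistics recomputed on target-domain data, can be sketched on a single feature layer. This is a minimal NumPy illustration, not the authors' implementation; the function names and the toy Gaussian "domains" are assumptions for demonstration only.

```python
import numpy as np

def batch_norm(x, mean, var, gamma, beta, eps=1e-5):
    # Standard batch normalization using the statistics supplied.
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def adabn_statistics(target_features):
    # AdaBN core step: recompute the per-channel mean and variance
    # from *target*-domain data, to replace the source-domain statistics.
    return target_features.mean(axis=0), target_features.var(axis=0)

# Toy example: the target domain is shifted and rescaled relative to the source.
rng = np.random.default_rng(0)
source = rng.normal(loc=0.0, scale=1.0, size=(1000, 4))
target = rng.normal(loc=3.0, scale=2.0, size=(1000, 4))

gamma, beta = np.ones(4), np.zeros(4)

# Normalizing target data with source statistics leaves a large offset...
src_mean, src_var = source.mean(axis=0), source.var(axis=0)
mismatched = batch_norm(target, src_mean, src_var, gamma, beta)

# ...whereas AdaBN statistics restore roughly zero mean and unit variance.
tgt_mean, tgt_var = adabn_statistics(target)
adapted = batch_norm(target, tgt_mean, tgt_var, gamma, beta)
```

Because the learned affine parameters (`gamma`, `beta`) are untouched and only running statistics change, the procedure adds no new parameters, matching the abstract's "parameter-free" claim.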