Authors: Subhransu Maji, Jong-Chyi Su, Bharath Hariharan
DOI:
Keywords:
Abstract: We investigate the role of self-supervised learning (SSL) in the context of few-shot learning. Although recent research has shown the benefits of SSL on large unlabeled datasets, its utility on small datasets is relatively unexplored. We find that SSL reduces the relative error rate of few-shot meta-learners by 4%-27%, even when the datasets are small and utilizing only images within the datasets. The improvements are greater when the training set is smaller or the task is more challenging. Although the benefits of SSL may increase with larger training sets, we observe that SSL can hurt performance when the distributions of images used for meta-learning and SSL are different. We conduct a systematic study by varying the degree of domain shift and analyzing the performance of several meta-learners on a multitude of domains. Based on this analysis, we present a technique that automatically selects images for SSL from a large, generic pool for a given dataset, providing further improvements.