Authors: Bo Wahlberg, Arun Venkitaraman, Anders Hansson
DOI:
Keywords:
Abstract: This paper investigates the use of nonparametric kernel regression to obtain a task-similarity aware meta-learning algorithm. Our hypothesis is that the use of task similarity helps when the available tasks are limited and may contain outlier/dissimilar tasks. While existing meta-learning approaches implicitly assume the tasks to be similar, it is generally unclear how this task similarity could be quantified and used in learning. As a result, most popular meta-learning approaches do not actively use the similarity/dissimilarity between tasks, but rely on the availability of a huge number of tasks for their working. Our contribution is a novel framework that explicitly uses task similarity in the form of kernels, together with an associated meta-learning algorithm. We model the task-specific parameters as belonging to a reproducing kernel Hilbert space, where the kernel function captures the similarity across tasks. The proposed algorithm iteratively learns a meta-parameter which is used to assign a descriptor to every task; the task descriptors are then used to quantify the similarity between tasks through the kernel function. We show how our approach conceptually generalizes the model-agnostic meta-learning (MAML) and Meta-stochastic gradient descent (Meta-SGD) approaches. Numerical experiments with regression and classification tasks show that our approach outperforms these methods when the number of tasks is limited, even in the presence of outlier or dissimilar tasks. This supports our hypothesis that task similarity helps improve performance in task-limited and adverse settings.
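To illustrate the kind of similarity-weighted combination the abstract describes, the following is a minimal sketch of nonparametric (Nadaraya-Watson) kernel regression over task descriptors. It is not the paper's algorithm: the RBF kernel, the names `task_descriptors` and `task_params`, and the toy dimensions are assumptions made purely for illustration of how a kernel over task descriptors can produce similarity-aware parameters for a new task.

```python
import numpy as np

def rbf_kernel(z1, z2, gamma=1.0):
    # RBF kernel measuring similarity between two task descriptors (assumed choice).
    return np.exp(-gamma * np.sum((z1 - z2) ** 2))

def kernel_regression_params(new_descriptor, task_descriptors, task_params, gamma=1.0):
    # Nadaraya-Watson kernel regression: predict parameters for a new task as a
    # similarity-weighted average of the known task-specific parameters.
    weights = np.array([rbf_kernel(new_descriptor, z, gamma) for z in task_descriptors])
    weights /= weights.sum()
    return weights @ task_params  # (num_tasks,) @ (num_tasks, dim) -> (dim,)

# Toy usage: 5 tasks with 2-dimensional descriptors and 3-dimensional parameter vectors.
rng = np.random.default_rng(0)
task_descriptors = rng.normal(size=(5, 2))
task_params = rng.normal(size=(5, 3))
new_task_descriptor = rng.normal(size=2)
print(kernel_regression_params(new_task_descriptor, task_descriptors, task_params))
```

In the paper's framework, the descriptors themselves are produced from a learned meta-parameter rather than fixed as in this toy example; the sketch only shows how a kernel over descriptors yields a similarity-aware combination.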