Authors: Charles X. Ling, Tanner A. Bohn
DOI:
Keywords:
Abstract: Humans can learn a variety of concepts and skills incrementally over the course of their lives while exhibiting an array of desirable properties, such as non-forgetting, concept rehearsal, forward and backward transfer of knowledge, few-shot learning, and selective forgetting. Previous approaches to lifelong machine learning only demonstrate subsets of these properties, often by combining multiple complex mechanisms. In this Perspective, we propose a powerful unified framework that achieves all of these properties by utilizing a small number of weight consolidation parameters in deep neural networks. In addition, we are able to draw many parallels between the behaviours and mechanisms of our proposed framework and those surrounding human memory loss or sleep deprivation. This Perspective serves as a conduit for two-way inspiration to further understand machines and humans.
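To make the notion of "weight consolidation parameters" concrete, the sketch below shows a generic consolidation penalty of the kind commonly used in lifelong learning (e.g. elastic-weight-consolidation-style regularization). It is only an illustration under assumed names (`consolidation_penalty`, `strengths`, `lam`), not the authors' specific framework, which the abstract does not describe in detail.

```python
import torch

def consolidation_penalty(params, anchor_params, strengths):
    """Quadratic penalty pulling each current parameter toward its
    previously consolidated (anchor) value, weighted per parameter
    by a consolidation strength. Larger strength = weight is more
    strongly protected against change when learning new tasks."""
    penalty = torch.zeros(())
    for p, p_old, s in zip(params, anchor_params, strengths):
        penalty = penalty + (s * (p - p_old) ** 2).sum()
    return penalty

# Hypothetical usage inside a training step:
#   total_loss = new_task_loss + lam * consolidation_penalty(
#       model.parameters(), saved_anchor_params, saved_strengths)
# Setting a strength to zero releases that weight (allowing selective
# forgetting), while large strengths approximate non-forgetting.
```

The key point the abstract hints at is that a small set of such per-weight consolidation values can be tuned to trade off retention of old knowledge against plasticity for new tasks.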