Consolidation using context-sensitive multiple task learning

Authors: Ben Fowler, Daniel L. Silver

DOI: 10.1007/978-3-642-21043-3_16

Abstract: Machine lifelong learning (ML3) is concerned with machines capable of learning and retaining knowledge over time, and exploiting this knowledge to assist new learning. An ML3 system must accurately retain knowledge of prior tasks while consolidating knowledge of new tasks, overcoming the stability-plasticity problem. An approach is presented using a context-sensitive multiple task learning (csMTL) neural network. csMTL uses a single output and additional context inputs for associating examples with tasks. The csMTL-based system is analyzed empirically on synthetic and real-world domains. The experiments focus on effective retention and consolidation of task knowledge using both functional and representational transfer. The results indicate that combining the two methods of transfer serves best for retaining prior knowledge, but at the cost of less effective consolidation.
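The abstract's central architectural idea, a single shared output with extra context inputs that identify the task, can be illustrated with a minimal forward-pass sketch. All layer sizes, weight initializations, and function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Sketch of a context-sensitive MTL (csMTL) network: instead of one output
# node per task (as in classic MTL), a single output is used and the task
# identity is supplied as extra one-hot "context" inputs alongside the
# primary feature inputs. Sizes here are arbitrary for illustration.

rng = np.random.default_rng(0)

N_FEATURES = 5   # primary inputs describing the example
N_TASKS = 3      # context inputs: one-hot task identifier
N_HIDDEN = 8

# One shared hidden layer over [features | task context], single output.
W1 = rng.normal(0, 0.1, (N_FEATURES + N_TASKS, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.1, (N_HIDDEN, 1))
b2 = np.zeros(1)

def csmtl_forward(x, task_id):
    """Predict for feature vector x under the given task context."""
    context = np.eye(N_TASKS)[task_id]          # one-hot task encoding
    z = np.concatenate([x, context]) @ W1 + b1  # shared representation
    h = np.tanh(z)                              # hidden activation
    y = h @ W2 + b2                             # single shared output
    return 1.0 / (1.0 + np.exp(-y[0]))          # sigmoid output in (0, 1)

# The same example can be evaluated under each task context, so the task
# identity modulates the shared representation rather than selecting a
# task-specific output node.
x = rng.normal(size=N_FEATURES)
outputs = [csmtl_forward(x, t) for t in range(N_TASKS)]
print(outputs)
```

Because all tasks share one output and one representation, consolidating a new task means retraining this single network (e.g., with rehearsal of prior-task examples), which is where the stability-plasticity tension discussed in the abstract arises.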
