Authors: Vipin Srivastava, Suchitra Sampath, David J. Parker
DOI: 10.1371/journal.pone.0105619
Abstract: Connectionist models of memory storage have been studied for many years, and aim to provide insight into potential mechanisms of memory storage in the brain. A problem faced by these systems is that as the number of items to be stored increases across a finite set of neurons/synapses, the cumulative changes in synaptic weights eventually lead to a sudden and dramatic loss of stored information (catastrophic interference, CI), in which previously stored patterns are effectively lost. This effect does not occur in the brain, where information loss is gradual. Various attempts have been made to overcome the effects of CI, but these generally use schemes that impose restrictions on the system or its inputs rather than allowing the system to intrinsically cope with increasing storage demands. We show here that catastrophic interference occurs as a result of interference among stored patterns when their number exceeds a critical limit. However, when Gram-Schmidt orthogonalization is combined with the Hebb-Hopfield model, the resulting model attains the ability to eliminate CI. This approach differs from the orthogonalisation used in previous connectionist networks, which essentially reflects sparse coding of the input. Here CI is avoided in a network of fixed size without setting limits on the number or rate of patterns encoded, and without separating encoding and retrieval, thus offering the advantage of allowing associations between incoming and stored patterns. PACS Nos.: 87.10.+e, 87.18.Bb, 87.18.Sn, 87.19.La
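To illustrate the general idea described in the abstract, the following is a minimal NumPy sketch (not the authors' implementation) of Gram-Schmidt orthogonalization applied to input patterns before Hebbian (outer-product) storage in a Hopfield-type network. The network size, pattern count, noise level, and the sign-update recall rule are assumptions chosen for demonstration only.

```python
# Illustrative sketch: Gram-Schmidt orthogonalization of input patterns
# followed by Hebbian storage in a Hopfield-type network.
# All parameters below are assumed for demonstration purposes.
import numpy as np

rng = np.random.default_rng(0)
N = 100   # number of neurons (assumed)
P = 20    # number of patterns to store (assumed)

# Random +/-1 patterns standing in for the incoming inputs.
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Gram-Schmidt: project each new pattern off the span of the previous ones,
# keeping only the novel (orthogonal) component.
ortho = []
for x in patterns:
    v = x.copy()
    for u in ortho:
        v -= (v @ u) * u
    norm = np.linalg.norm(v)
    if norm > 1e-10:          # skip patterns already in the span
        ortho.append(v / norm)
ortho = np.array(ortho)

# Hebbian (outer-product) weights built from the orthogonalized patterns.
W = ortho.T @ ortho
np.fill_diagonal(W, 0.0)

# Recall: start from a noisy cue (10% of bits flipped) and iterate
# the standard sign-update dynamics.
cue = patterns[0] * np.where(rng.random(N) < 0.1, -1.0, 1.0)
state = cue.copy()
for _ in range(10):
    state = np.sign(W @ state)
    state[state == 0] = 1.0

overlap = (state @ np.sign(ortho[0])) / N
print(f"Overlap with first stored pattern: {overlap:.2f}")
```

The design point this sketch tries to show is that the orthogonalization step is applied to the patterns themselves before the Hebbian weight update, rather than being imposed as a restriction on the inputs or as a separate retrieval stage.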