Authors: Amir Hesam Salavati, Amin Shokrollahi, K. Raj Kumar
DOI:
Keywords: Finite set, Algorithm, Theoretical computer science, Redundancy (information theory), Recall, Memorization, Iterative method, Computer science, Set (abstract data type), Covariance matrix, Content-addressable memory, Iterative learning control
Abstract: We consider the problem of neural association for a network of non-binary neurons. Here, the task is to first memorize a set of patterns using a network of neurons whose states assume values from a finite number of integer levels. Later, the same network should be able to recall the previously memorized patterns from their noisy versions. Prior work in this area has focused on storing purely random patterns, and has shown that the pattern retrieval capacity (the maximum number of patterns that can be memorized) scales only linearly with the size of the network. In our formulation of the problem, we concentrate on exploiting the redundancy and internal structure of the patterns in order to improve the retrieval capacity. Our first result shows that if the given patterns have a suitable linear-algebraic structure, i.e. they comprise a sub-space of the set of all possible patterns, then the capacity is in fact exponential in terms of the number of neurons. The second result extends this finding to cases where the patterns have weak minor components, i.e. the smallest eigenvalues of the correlation matrix tend toward zero. We will use these minor components (or the basis vectors of the pattern null space) to both increase the capacity and improve the error correction capabilities. An iterative algorithm is proposed for the learning phase, and two simple update algorithms are presented for the recall phase. Using analytical results and simulations, we show that the proposed methods can tolerate a fair amount of errors in the input while being able to memorize an exponentially large number of patterns.
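To make the subspace idea concrete, here is a minimal sketch in Python under assumed, illustrative choices (a hypothetical random integer generator matrix G, an SVD used as a stand-in for the paper's iterative learning rule, and a one-step feedback check as a stand-in for its recall updates). It only illustrates that patterns confined to a subspace admit null-space constraint vectors whose violations can point to a corrupted neuron; it is not the authors' algorithm.

```python
import numpy as np

# Sketch: stored patterns are integer-valued vectors lying in a low-dimensional
# subspace, so there exist "dual" (null-space) vectors w with w . x = 0 for every
# stored pattern x. All names and parameters below are illustrative assumptions.

rng = np.random.default_rng(0)

n, k = 12, 4            # n neurons; patterns drawn from a k-dimensional subspace (k < n)

# Generator of the subspace: each pattern is a small non-negative integer
# combination of its rows, keeping neural states in a finite set of integer levels.
G = rng.integers(0, 2, size=(k, n))

def random_pattern():
    coeffs = rng.integers(0, 3, size=k)
    return coeffs @ G

# "Learning" (stand-in): recover a basis of the pattern null space from examples.
# The paper proposes an iterative rule; SVD is used here only for brevity.
X = np.array([random_pattern() for _ in range(100)], dtype=float)
_, s, Vt = np.linalg.svd(X, full_matrices=True)
W = Vt[np.sum(s > 1e-8):]          # rows approximately orthogonal to every stored pattern

# "Recall" (stand-in): a noisy pattern violates W x = 0, and feeding the violation
# back through W gives per-neuron evidence of where the error sits.
x = random_pattern().astype(float)
noisy = x.copy()
noisy[3] += 1                      # inject a single integer-valued error

feedback = W.T @ (W @ noisy)       # projection of the noise onto the null space
print("suspected neuron:", int(np.argmax(np.abs(feedback))), "(error injected at 3)")
```

In this toy setting the subspace structure is what allows far more than n patterns to satisfy the same small set of constraints, which is the intuition behind the exponential capacity claim in the abstract.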