DOI: 10.1109/IJCNN.2010.5596589
Keywords:
Abstract: Neural associative networks are a promising computational paradigm, both for modeling neural circuits of the brain and for implementing Hebbian cell assemblies in parallel VLSI or nanoscale hardware. Previous works have extensively investigated synaptic learning in linear models of the Hopfield type and in simple non-linear models of the Steinbuch/Willshaw type. For example, optimized Hopfield networks of n neurons can memorize about n²/k memories of size k (or associations between them), corresponding to a capacity of 0.72 bits per real-valued synapse. Although employing much simpler synapses that are better suited for efficient hardware implementations, Willshaw networks can still store up to 0.69 bits per binary synapse. However, the number of storable memories is limited to about n²/k², which becomes comparable to Hopfield nets only for extremely small k. Here I present zip nets, an improved method that combines the advantages of the previous models. Zip nets have, up to a factor 2/π ≈ 0.64, the same high storage capacity as Hopfield networks, although they employ binary synapses. Moreover, if the binary synapses have low entropy (e.g., if most synapses are silent), the network can be compressed, storing about 1 bit per computer bit or, after synaptic pruning, on the order of log n bits per synapse. Similar results hold for a generalized zip net model with discrete synapses having an arbitrary number of states.
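To make the Steinbuch/Willshaw model mentioned in the abstract concrete, the following is a minimal sketch of a binary associative memory with Hebbian (clipped) learning and one-step threshold retrieval. It illustrates the classical Willshaw-type network and its sparse-coding regime, not the zip-net learning rule introduced in the paper (which the abstract does not specify); the parameters n, k, M and the simple threshold choice are illustrative assumptions.

```python
# Sketch of a Steinbuch/Willshaw-type binary associative memory (illustrative only;
# NOT the zip-net method of the paper). Parameters are assumptions for demonstration.
import numpy as np

n = 1000                     # neurons per layer
k = 10                       # active units per sparse memory pattern
M = 500                      # number of stored pairs, well below the ~n^2/k^2 limit

rng = np.random.default_rng(0)

def random_pattern():
    """Return a sparse binary pattern with exactly k active units."""
    p = np.zeros(n, dtype=np.uint8)
    p[rng.choice(n, size=k, replace=False)] = 1
    return p

# Address/content pattern pairs to be hetero-associated.
addresses = [random_pattern() for _ in range(M)]
contents  = [random_pattern() for _ in range(M)]

# Hebbian learning with binary (clipped) synapses: a synapse is switched on
# if its pre- and postsynaptic units were ever coactive in a stored pair.
W = np.zeros((n, n), dtype=np.uint8)
for a, c in zip(addresses, contents):
    W |= np.outer(a, c).astype(np.uint8)

def retrieve(address):
    """One-step retrieval: sum inputs through binary synapses and threshold."""
    dendritic_sum = address @ W          # integer membrane potentials
    theta = address.sum()                # simple Willshaw threshold = |address|
    return (dendritic_sum >= theta).astype(np.uint8)

# Check recall quality on the stored pairs.
errors = sum(np.any(retrieve(a) != c) for a, c in zip(addresses, contents))
print(f"stored {M} pairs, imperfect recalls: {errors}")
```

With these illustrative numbers the synaptic matrix stays sparsely loaded, so recall is essentially error-free; pushing M toward the n²/k² regime cited in the abstract fills the matrix and retrieval errors appear, which is the capacity limitation the zip-net approach is said to overcome.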