Authors: S. Ganguli, D. Huh, H. Sompolinsky
Keywords: Information theory, Feed forward, Fisher information, Dynamical systems theory, Topology, Measure (mathematics), Noise (signal processing), State (computer science), Limit (mathematics), Computer science
Abstract: To perform nontrivial, real-time computations on a sensory input stream, biological systems must retain a short-term memory trace of their recent inputs. It has been proposed that generic high-dimensional dynamical systems could store a memory trace of past inputs in their current state. This raises important questions about the fundamental limits of such memory traces and the properties of dynamical systems required to achieve these limits. We address these issues by applying Fisher information theory to dynamical systems driven by time-dependent signals corrupted by noise. We introduce the Fisher Memory Curve (FMC) as a measure of the signal-to-noise ratio (SNR) embedded in the dynamical state relative to the input SNR. The integrated FMC indicates the total memory capacity. We apply this theory to linear neuronal networks and show that the capacity of networks with normal connectivity matrices is exactly 1, and that the capacity of any network of N neurons is, at most, N. A nonnormal network achieving this bound is subject to stringent design constraints: it must have a hidden feedforward architecture that superlinearly amplifies its input for a time of order N, and the input must optimally match this architecture. The capacity of networks with saturating nonlinearities is further limited, and cannot exceed the square root of N. This limit can be realized by feedforward structures with divergent fan out that distributes the signal across neurons, thereby avoiding saturation. We illustrate the generality of the theory by showing that memory in fluid systems can be sustained by transient amplification due to convective instability or the onset of turbulence.
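The capacity result for normal connectivity can be checked numerically. The sketch below assumes the linear network model and FMC definition from the paper's framework: state dynamics x(t) = W x(t-1) + v s(t) + z(t) with white noise of variance eps, stationary noise covariance C = eps · Σ_k W^k (W^k)^T, and FMC J(k) = v^T (W^k)^T C⁻¹ W^k v. The specific matrix and input vector here are illustrative random choices, not taken from the paper.

```python
import numpy as np

# Numerical check: for a NORMAL connectivity matrix W (here symmetric,
# spectral radius < 1), a unit-norm input vector v, and noise variance
# eps = 1, the integrated Fisher Memory Curve (total capacity) is exactly 1.

rng = np.random.default_rng(0)
N = 20
eps = 1.0

# Illustrative normal (symmetric) matrix, rescaled to spectral radius 0.9.
A = rng.standard_normal((N, N))
S = A + A.T
W = 0.9 * S / np.abs(np.linalg.eigvalsh(S)).max()

# Unit-norm input vector.
v = rng.standard_normal(N)
v /= np.linalg.norm(v)

# Stationary noise covariance C = eps * sum_k W^k (W^k)^T,
# truncated once the terms become negligible.
C = eps * np.eye(N)
Wk = W.copy()
for _ in range(10000):
    term = eps * Wk @ Wk.T
    C += term
    if np.abs(term).max() < 1e-14:
        break
    Wk = Wk @ W

Cinv = np.linalg.inv(C)

# FMC: J(k) = v^T (W^k)^T C^{-1} W^k v; total capacity is sum_k J(k).
J_tot = 0.0
u = v.copy()  # u holds W^k v
for _ in range(10000):
    J_tot += u @ Cinv @ u
    u = W @ u
    if u @ u < 1e-16:
        break

print(f"total capacity J_tot = {J_tot:.6f}")  # ~1 for normal W
```

In the eigenbasis of a normal W, each mode contributes |v_i|² · Σ_k |λ_i|^{2k} (1 - |λ_i|²) = |v_i|², so the total capacity collapses to ||v||²/eps = 1, independent of the eigenvalue spectrum; nonnormal matrices escape this cancellation, which is how they can approach the capacity-N bound.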