Authors: Lars Büsing, Benjamin Schrauwen, Robert Legenstein
DOI: 10.1162/NECO.2009.01-09-947
Keywords:
Abstract: Reservoir computing (RC) systems are powerful models for online computations on input sequences. They consist of a memoryless readout neuron that is trained on top of a randomly connected recurrent neural network. RC systems are commonly used in two flavors: with analog or binary (spiking) neurons in the recurrent circuits. Previous work indicated a fundamental difference in the behavior of these two implementations of the RC idea. The performance of an RC system built from binary neurons seems to depend strongly on the network connectivity structure, whereas in networks of analog neurons no such clear dependency has been observed. In this letter, we address this apparent dichotomy by investigating the influence of the network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks. Our analyses are based on a novel estimation of the Lyapunov exponent of the network dynamics with the help of branching process theory, a rank measure that estimates the kernel quality and generalization capabilities of recurrent networks, and a novel mean field predictor for computational performance. These analyses reveal that the phase transition between ordered and chaotic network behavior of binary circuits qualitatively differs from the one in analog circuits, leading to differences in the integration of information over short and long timescales. This explains the decreased computational performance observed in binary circuits that are densely connected. The mean field predictor is also used to bound the memory function of recurrent circuits of binary neurons.
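The abstract mentions a rank-based measure of kernel quality for recurrent networks. Below is a minimal, illustrative sketch (not the authors' code) of that general idea: a random binary (threshold) reservoir with fixed in-degree K is driven by several distinct input streams, and the rank of the resulting state matrix is taken as a separation score. Network size, in-degree, input statistics, and weight choices are assumptions made only for illustration.

```python
# Minimal sketch of a rank-based kernel-quality measure for a binary reservoir.
# All parameter values and weight distributions below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 150   # number of binary threshold neurons (assumed)
K = 3     # in-degree: recurrent inputs per neuron (assumed)
T = 50    # time steps per input stream (assumed)
m = 100   # number of distinct input streams (assumed)

# Sparse recurrent weights: each neuron receives K random presynaptic partners
# with random-sign unit weights (illustrative choice).
W = np.zeros((N, N))
for i in range(N):
    pre = rng.choice(N, size=K, replace=False)
    W[i, pre] = rng.choice([-1.0, 1.0], size=K)

w_in = rng.choice([-1.0, 1.0], size=N)  # input weights (assumed)

def run(u):
    """Simulate the binary network on input stream u; return the final state."""
    x = np.zeros(N)
    for t in range(T):
        x = (W @ x + w_in * u[t] > 0).astype(float)  # threshold update
    return x

# Final network states for m i.i.d. binary input streams.
U = rng.choice([-1.0, 1.0], size=(m, T))
states = np.array([run(u) for u in U])

# Kernel quality: rank of the m x N state matrix (higher rank means the
# reservoir separates different input streams more effectively).
print("kernel quality (rank of state matrix):", np.linalg.matrix_rank(states))
```

Sweeping the assumed in-degree K in such a sketch gives a rough picture of how connectivity affects the separation capability of binary reservoirs, which is the dependency the letter analyzes.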