Author: Peter Tino
DOI:
Keywords: Phase transition, State (functional analysis), Computer science, Exponential growth, Series (mathematics), Free parameter, Operator (computer programming), Dynamical systems theory, Kernel (linear algebra), Kernel (statistics), Feature vector, State space, Feature (computer vision), Topology, Kernel method
Abstract: Parameterized state space models in the form of recurrent networks are often used in machine learning to learn from data streams exhibiting temporal dependencies. To break the black-box nature of such models, it is important to understand the dynamical features of the input-driving time series that are formed in the state space. We propose a framework for rigorous analysis of such state representations in vanishing-memory state space models such as echo state networks (ESN). In particular, we consider the state space a temporal feature space and the readout mapping from the state space a kernel machine operating in that feature space. We show that: (1) the usual ESN strategy of randomly generating the input-to-state coupling, as well as the state coupling, leads to shallow memory time series representations, corresponding to a cross-correlation operator with fast exponentially decaying coefficients; (2) imposing symmetry on the dynamic coupling yields a constrained dynamic kernel matching the input time series with straightforward exponentially decaying motifs, or exponentially decaying motifs of the highest frequency; (3) a simple cycle high-dimensional reservoir topology specified only through two free parameters can implement deep memory dynamic kernels with a rich variety of matching motifs. We quantify the richness of feature representations imposed by dynamic kernels and demonstrate that, for the dynamic kernel associated with the cycle reservoir topology, this richness undergoes a phase transition close to the edge of stability.
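To illustrate point (3), here is a minimal sketch of an echo state network whose reservoir is a simple unidirectional cycle specified by only two free parameters: the cycle weight and the input-weight magnitude. The function names, the random choice of input-weight signs, and the toy sine input are assumptions made for illustration; the paper's exact simple cycle reservoir construction fixes the sign pattern deterministically.

```python
import numpy as np

def simple_cycle_reservoir(n_units, cycle_weight, input_weight, seed=0):
    """Build a simple cycle reservoir: units connected in a single ring,
    all recurrent weights equal to `cycle_weight`, all input weights of
    magnitude `input_weight` (signs drawn at random here for illustration)."""
    rng = np.random.default_rng(seed)
    W = np.zeros((n_units, n_units))
    idx = np.arange(n_units)
    W[(idx + 1) % n_units, idx] = cycle_weight      # unidirectional ring coupling
    w_in = input_weight * rng.choice([-1.0, 1.0], size=n_units)
    return W, w_in

def esn_states(W, w_in, inputs):
    """Drive the reservoir with a scalar input series and collect the
    temporal feature-space representations x(t) = tanh(W x(t-1) + w_in u(t))."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)
        states.append(x.copy())
    return np.array(states)

# Usage: the memory/richness of the induced dynamic kernel is governed by
# just two parameters (cycle weight near the edge of stability, input scale).
W, w_in = simple_cycle_reservoir(n_units=100, cycle_weight=0.95, input_weight=0.1)
u = np.sin(0.2 * np.arange(500))     # toy input time series
X = esn_states(W, w_in, u)           # state representations of the input stream
print(X.shape)                       # (500, 100)
```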