Authors: Heshan Wang, Xuefeng Yan
DOI: 10.1016/J.NEUCOM.2014.05.024
Keywords: Recurrent neural network, System identification, Pruning (decision trees), Computer science, Generalization, Benchmark (computing), Reservoir computing, Echo state network, Algorithm, Cognitive neuroscience, Artificial intelligence, Computer Science Applications
Abstract: Reservoir Computing (RC) is an effective approach to designing and training recurrent neural networks, and it has been successfully and widely applied to real-valued time-series modeling tasks. However, RC has been criticized for not being principled enough, namely that the reservoir is unlikely to be optimal because its connectivity and weight structure are created randomly. A new Simple Cycle Reservoir Network (SCRN) with a deterministically constructed reservoir can yield performance competitive with a standard Echo State Network (ESN). In order to determine the proper size of the reservoir and improve the generalization ability of the SCRN, a Sensitive Iterated Pruning Algorithm (SIPA) is proposed to optimize the weights of the SCRN: a reservoir larger than necessary is employed first, and its size is then reduced by pruning out the least sensitive internal units. A system identification task and two time-series benchmark tasks demonstrate the feasibility and superiority of SIPA. The results show that the SIPA method significantly outperforms Least Angle Regression (LAR) and is able to improve the generalization ability of SCRNs. Besides, two well-known reservoir characterizations, i.e. the pseudo-Lyapunov exponent of the reservoir dynamics and the Memory Capacity, and the impact of pruning on these characterizations are investigated.
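The abstract describes the general scheme of the approach: a simple cycle reservoir (units connected in a ring with a fixed weight), an oversized reservoir to start, and iterative removal of the least sensitive internal units. The sketch below illustrates that idea under stated assumptions; it is not the paper's implementation. In particular, the sensitivity measure (increase in validation error when a unit is removed), the random input-weight signs, the ridge-regression readout, and the toy prediction task are all illustrative assumptions, not details taken from the abstract.

```python
# Minimal sketch of a Simple Cycle Reservoir Network (SCRN) with iterative
# sensitivity-based pruning, loosely following the idea in the abstract:
# start with a reservoir larger than necessary, then repeatedly prune the
# least sensitive internal unit. The sensitivity proxy used here (validation
# error after zeroing a unit) is an assumption, not the paper's exact criterion.
import numpy as np

def make_scrn(n_units, n_in, r=0.9, v=0.5, seed=0):
    """Simple cycle reservoir: units connected in a ring with weight r."""
    W = np.zeros((n_units, n_units))
    for i in range(n_units):
        W[(i + 1) % n_units, i] = r                       # single cycle of weight r
    rng = np.random.default_rng(seed)
    # Input weights of fixed magnitude v with random signs (assumption).
    W_in = v * rng.choice([-1.0, 1.0], size=(n_units, n_in))
    return W, W_in

def run_reservoir(W, W_in, U, washout=50):
    """Collect reservoir states for an input sequence U of shape (T, n_in)."""
    x = np.zeros(W.shape[0])
    states = []
    for u in U:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)[washout:]

def train_readout(X, y, ridge=1e-6):
    """Ridge-regression readout (illustrative choice)."""
    A = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

def nrmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.std(y_true)

def prune_least_sensitive(W, W_in, U_tr, y_tr, U_val, y_val, target_size, washout=50):
    """Iteratively remove the unit whose deletion hurts validation error least."""
    keep = list(range(W.shape[0]))
    while len(keep) > target_size:
        best_unit, best_err = None, np.inf
        for j in keep:
            trial = [k for k in keep if k != j]
            Wt, Wit = W[np.ix_(trial, trial)], W_in[trial]
            X_tr = run_reservoir(Wt, Wit, U_tr, washout)
            w_out = train_readout(X_tr, y_tr[washout:])
            X_val = run_reservoir(Wt, Wit, U_val, washout)
            err = nrmse(y_val[washout:], X_val @ w_out)
            if err < best_err:
                best_unit, best_err = j, err
        keep.remove(best_unit)                            # drop the least sensitive unit
    return keep

if __name__ == "__main__":
    # Toy task: one-step-ahead prediction of a noisy sine wave (illustrative only).
    t = np.arange(1200)
    s = np.sin(0.2 * t) + 0.05 * np.random.default_rng(1).standard_normal(len(t))
    U, y = s[:-1].reshape(-1, 1), s[1:]
    U_tr, y_tr, U_val, y_val = U[:800], y[:800], U[800:], y[800:]
    W, W_in = make_scrn(n_units=40, n_in=1)
    keep = prune_least_sensitive(W, W_in, U_tr, y_tr, U_val, y_val, target_size=20)
    print(f"kept {len(keep)} of {W.shape[0]} reservoir units")
```

The retrain-and-evaluate step inside the pruning loop makes the sketch quadratic in reservoir size; it is kept deliberately simple to show the "oversize, then prune" workflow rather than an efficient sensitivity computation.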