DOI: 10.1162/NECO.1992.4.1.131
Keywords: Net (mathematics), Computer science, Control (management), Temporal information, Class (computer programming), Storage efficiency, Sequence learning, Machine learning, Temporary variable, Artificial intelligence, Feed forward
Abstract: Previous algorithms for supervised sequence learning are based on dynamic recurrent networks. This paper describes an alternative class of gradient-based systems consisting of two feedforward nets that learn to deal with temporal sequences using fast weights: the first net learns to produce context-dependent weight changes for the second net, whose weights may vary very quickly. The method offers the potential for STM storage efficiency: a single weight (instead of a full-fledged unit) may be sufficient for storing temporal information. Various learning methods are derived. Two experiments with unknown time delays illustrate the approach. One experiment shows how the system can be used for adaptive temporary variable binding.
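To make the architecture in the abstract concrete, here is a minimal NumPy sketch of the fast-weight idea: a "slow" feedforward net emits a context-dependent (here rank-1) weight change for a "fast" net at every time step, so the fast weights themselves act as short-term memory. All dimensions, the decay/learning-rate constants, and the linear slow-net form are illustrative assumptions, not the paper's exact learning rules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper).
in_dim, fast_in, fast_out = 4, 3, 2

# Slow net: maps the current input to the two factors of a
# rank-1 update for the fast net's weight matrix.
W_slow_a = rng.normal(scale=0.1, size=(fast_out, in_dim))
W_slow_b = rng.normal(scale=0.1, size=(fast_in, in_dim))

# Fast net: its weights change at every time step and serve as STM.
W_fast = np.zeros((fast_out, fast_in))

def step(x, h, decay=0.9, lr=0.5):
    """One time step: the slow net emits a weight change for W_fast,
    then the fast net maps h through the updated weights."""
    global W_fast
    u = np.tanh(W_slow_a @ x)                      # left factor of the update
    v = np.tanh(W_slow_b @ x)                      # right factor of the update
    W_fast = decay * W_fast + lr * np.outer(u, v)  # context-dependent change
    return np.tanh(W_fast @ h)

# Run a short sequence: information from earlier inputs persists in
# W_fast (a single weight, not a dedicated unit, stores it).
outs = [step(rng.normal(size=in_dim), rng.normal(size=fast_in))
        for _ in range(5)]
print(outs[-1].shape)  # (2,)
```

The key contrast with a recurrent net is that memory lives in the fast weight matrix rather than in unit activations, which is the source of the storage-efficiency claim in the abstract.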