A Cost Function for Internal Representations

Authors: Anders Krogh, C. I. Thorbergsson, John A. Hertz

DOI:

Keywords:

Abstract: We introduce a cost function for learning in feed-forward neural networks which is an explicit function of the internal representations in addition to the weights. The learning problem can then be formulated as two simple perceptron problems together with a search over internal representations. Back-propagation is recovered as a limit. The frequency of successful solutions is better for this algorithm than for back-propagation when the weights and the hidden units are updated on the same timescale, i.e. once every learning step.
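To make the idea concrete, below is a minimal NumPy sketch of learning with an explicit cost on internal representations. It is not the authors' exact cost function: the quadratic penalty form, the tanh transfer function, the weighting lam, the learning rate eta, and the XOR toy problem are all illustrative assumptions. The point it shows is the decomposition described in the abstract: with the hidden representations H treated as free variables, minimising the cost splits into two simple perceptron-like weight updates plus a gradient search over H.

```python
# Sketch only (assumed cost, not the paper's exact formulation):
#
#   E = sum_mu ||t_mu - g(W2 h_mu)||^2 + lam * ||h_mu - g(W1 x_mu)||^2
#
# where h_mu (rows of H) are free internal representations.  E is minimised
# alternately over W2 (a simple perceptron from H to the targets), over W1
# (a simple perceptron from the inputs to H), and over H itself.

import numpy as np

rng = np.random.default_rng(0)

def g(a):                       # transfer function
    return np.tanh(a)

def dg(a):                      # its derivative
    return 1.0 - np.tanh(a) ** 2

# Toy XOR problem: X holds input patterns (rows), T the target outputs.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
T = np.array([[-1], [1], [1], [-1]], dtype=float)

n_in, n_hid, n_out = 2, 3, 1
W1 = 0.1 * rng.standard_normal((n_hid, n_in))    # input -> hidden weights
W2 = 0.1 * rng.standard_normal((n_out, n_hid))   # hidden -> output weights
H = g(X @ W1.T)                                  # free internal representations

eta, lam = 0.1, 1.0

for step in range(2000):
    # Perceptron 1: internal representations -> targets (update W2).
    out_net = H @ W2.T
    out_err = T - g(out_net)
    W2 += eta * (out_err * dg(out_net)).T @ H

    # Perceptron 2: inputs -> internal representations (update W1).
    hid_net = X @ W1.T
    hid_err = H - g(hid_net)
    W1 += eta * lam * (hid_err * dg(hid_net)).T @ X

    # Search over internal representations: gradient step on H.
    # One term pulls H toward what the output layer needs, the other
    # (weighted by lam) toward what the first layer actually produces.
    grad_H = -(out_err * dg(out_net)) @ W2 + lam * (H - g(hid_net))
    H -= eta * grad_H

print("outputs:", g(g(X @ W1.T) @ W2.T).ravel())
print("targets:", T.ravel())
```

Updating the weights and H on the same timescale, as in the loop above, corresponds to the regime the abstract compares against back-propagation; making H follow g(W1 x) exactly (lam large, H updated to convergence) recovers ordinary back-propagation as a limit.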
