Authors: Anders Krogh, C. I. Thorbergsson, John A. Hertz
DOI:
Keywords:
Abstract: We introduce a cost function for learning in feed-forward neural networks which is an explicit function of the internal representation in addition to the weights. The learning problem can then be formulated as two simple perceptrons and a search for internal representations. Back-propagation is recovered as a limit. The frequency of successful solutions is better for this algorithm than for back-propagation when the weights and hidden units are updated on the same timescale, i.e. once every learning step.
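The idea of treating the internal representation as an explicit optimization variable can be illustrated with a minimal sketch. This is not the paper's exact formulation; it is an assumed toy setup (a 2-2-1 tanh network on XOR, quadratic costs, and the variable names `W1`, `W2`, `S` are all illustrative) in which the hidden activations `S` are free variables, each layer reduces to a simple perceptron-like problem given `S`, and weights and representation are updated on the same timescale:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR with inputs and targets in {-1, +1}.
X = np.array([[-1., -1.], [-1., 1.], [1., -1.], [1., 1.]])
Y = np.array([[-1.], [1.], [1.], [-1.]])

# Weights of a 2-2-1 network, plus the internal representation S
# treated as an explicit optimization variable (hypothetical names).
W1 = 0.5 * rng.standard_normal((2, 2))
W2 = 0.5 * rng.standard_normal((2, 1))
S = np.tanh(X @ W1)  # initialize S from a forward pass

def cost(W1, W2, S):
    # Cost explicit in both weights and representation: given S,
    # each term is a separate single-layer (perceptron-like) problem.
    return (np.sum((S - np.tanh(X @ W1)) ** 2)
            + np.sum((Y - np.tanh(S @ W2)) ** 2))

eta = 0.05
history = [cost(W1, W2, S)]
for _ in range(200):
    # Gradients of the quadratic cost; tanh'(u) = 1 - tanh(u)^2.
    H1 = np.tanh(X @ W1)
    H2 = np.tanh(S @ W2)
    dW1 = -2 * X.T @ ((S - H1) * (1 - H1 ** 2))
    dW2 = -2 * S.T @ ((Y - H2) * (1 - H2 ** 2))
    dS = 2 * (S - H1) - 2 * ((Y - H2) * (1 - H2 ** 2)) @ W2.T
    # Update weights and hidden representation on the same timescale,
    # i.e. once every learning step.
    W1 -= eta * dW1
    W2 -= eta * dW2
    S -= eta * dS
    history.append(cost(W1, W2, S))

print(history[0], "->", history[-1])  # cost decreases over training
```

Driving the first term of the cost to zero (so `S` equals the forward-pass activations) recovers ordinary back-propagation as a limit, which is the sense in which the abstract says back-propagation is a special case.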