Author: K. Fukumizu
DOI: 10.1016/S1383-8121(01)80020-8
Keywords:
Abstract: Publisher Summary: This chapter discusses the geometric structure of multilayer networks and the dynamics of learning caused by that structure. A neural network has several layers, including an input layer, an output layer, and connections between neighboring layers. It realizes a function from vector to vector, as in, for example, multilayer perceptrons and radial basis function networks. The chapter explains that the statistical formulation enables the introduction of a Riemannian metric on the model. The metric is essentially an intrinsic quantity of the manifold, which is defined in the space of all probability distributions. The natural gradient direction gives the steepest descent direction for a change of small, fixed length measured by the metric. In introducing the natural gradient, the chapter emphasizes the viewpoint that the parameter space is not necessarily Euclidean, and that a more natural metric can be introduced in many problems. In the case of parametric estimation, the Fisher information metric plays this role.
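The abstract's core idea, steepest descent measured in the Fisher metric rather than the Euclidean one, can be sketched with a toy model that is not taken from the chapter: fitting the mean and log-standard-deviation of a 1-D Gaussian, where the Fisher information matrix is known in closed form, so the natural gradient is simply the Euclidean gradient premultiplied by the inverse Fisher matrix.

```python
import numpy as np

# Hypothetical illustration (not the chapter's own example): natural-gradient
# ascent on the average log-likelihood of N(mu, sigma^2), parametrized by
# (mu, s) with s = log sigma. In these coordinates the Fisher information
# matrix is diagonal: F = diag(1/sigma^2, 2), so the natural gradient is
# F^{-1} times the ordinary (Euclidean) gradient.

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=5000)

mu, log_sigma = 0.0, 0.0   # initial parameters
lr = 0.1                   # step length measured in the Fisher metric

for _ in range(200):
    sigma2 = np.exp(2.0 * log_sigma)
    # Euclidean gradient of the average log-likelihood
    g_mu = np.mean(data - mu) / sigma2
    g_ls = np.mean((data - mu) ** 2) / sigma2 - 1.0
    # Natural gradient: F^{-1} g with F = diag(1/sigma^2, 2)
    mu += lr * sigma2 * g_mu
    log_sigma += lr * g_ls / 2.0

# The parameters converge to the maximum-likelihood estimates
print(mu, np.exp(log_sigma))
```

Because the Fisher matrix rescales each coordinate by its local curvature, the step size behaves consistently regardless of how the model is parametrized, which is the "parameter space is not necessarily Euclidean" point the abstract makes.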