Generalization performance of Bayes optimal classification algorithm for learning a perceptron.

Authors: Manfred Opper, David Haussler

DOI: 10.1103/PHYSREVLETT.66.2677

Keywords:

Abstract: The generalization error of the Bayes optimal classification algorithm when learning a perceptron from noise-free random training examples is calculated exactly using methods of statistical mechanics. It is shown that, if an assumption of replica symmetry is made, then, in the thermodynamic limit, the error of the Bayes optimal algorithm is less than the error of a canonical stochastic learning algorithm, by a factor approaching √2 as the ratio of the number of training examples to the number of weights in the perceptron grows. In addition, it is shown that approximations to the Bayes optimal algorithm can be achieved by learning algorithms that use a two-layer neural net to learn a perceptron.
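As a rough numerical illustration of the two learning rules compared in the abstract, the Python sketch below contrasts Gibbs learning (a single perceptron drawn at random from the version space) with the Bayes optimal rule (a majority vote of the posterior over the version space) for a small teacher perceptron. The dimensions, sample sizes, and rejection-sampling scheme are illustrative assumptions, not the paper's analytic replica calculation, and at such small sizes and small α the error ratio will not yet reach the asymptotic √2.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 15        # input dimension (the paper works in the thermodynamic limit N -> infinity)
P = 8         # number of noise-free training examples (alpha = P/N)
M = 200_000   # candidate student vectors drawn from a spherical (Gaussian) prior
T = 5_000     # fresh test examples used to estimate generalization errors

# Teacher perceptron and its noise-free labels on random training inputs.
teacher = rng.standard_normal(N)
X = rng.standard_normal((P, N))
y = np.sign(X @ teacher)

# Rejection-sample the version space: keep students consistent with every example
# (with these sizes a few hundred of the M candidates typically survive).
W = rng.standard_normal((M, N))
consistent = np.all(np.sign(W @ X.T) == y, axis=1)
V = W[consistent]
print(f"consistent students kept: {len(V)} of {M}")

# Fresh test inputs labeled by the teacher.
Xt = rng.standard_normal((T, N))
yt = np.sign(Xt @ teacher)

# Gibbs learning: a single student drawn uniformly from the version space.
# Averaging the test error over all sampled students estimates its typical error.
preds = np.sign(V @ Xt.T)                        # each row: one student's test predictions
gibbs_error = (preds != yt).mean(axis=1).mean()

# Bayes optimal rule: majority vote of the (sampled) posterior on each test input.
bayes_pred = np.sign(preds.sum(axis=0))
bayes_error = (bayes_pred != yt).mean()

print(f"Gibbs error ~ {gibbs_error:.3f}")
print(f"Bayes error ~ {bayes_error:.3f}")
print(f"ratio       ~ {gibbs_error / bayes_error:.2f}   # approaches sqrt(2) only as alpha grows")
```

The Bayes optimal prediction here is computed test point by test point as a vote over sampled version-space members; as the paper notes, such a vote is generally not representable by a single perceptron, which is why approximating it requires a richer architecture such as a two-layer net.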

References (11)
E. Gardner, The space of interactions in neural network models, Journal of Physics A, vol. 21, pp. 257-270 (1988), 10.1088/0305-4470/21/1/030
E. Gardner, B. Derrida, Optimal storage properties of neural network models, Journal of Physics A, vol. 21, pp. 271-284 (1988), 10.1088/0305-4470/21/1/031
Eric B. Baum, David Haussler, What Size Net Gives Valid Generalization, Neural Information Processing Systems, vol. 1, pp. 81-90 (1988), 10.1162/NECO.1989.1.1.151
D. Hansel, H. Sompolinsky, Learning from Examples in a Single-Layer Neural Network, EPL, vol. 11, pp. 687-692 (1990), 10.1209/0295-5075/11/7/018
H. Sompolinsky, N. Tishby, H. S. Seung, Learning from examples in large neural networks, Physical Review Letters, vol. 65, pp. 1683-1686 (1990), 10.1103/PHYSREVLETT.65.1683
M. Opper, W. Kinzel, J. Kleinz, R. Nehl, On the ability of the optimal perceptron to generalise, Journal of Physics A, vol. 23 (1990), 10.1088/0305-4470/23/11/012
Géza Györgyi, Inference of a rule by a neural network with thermal noise, Physical Review Letters, vol. 64, pp. 2957-2960 (1990), 10.1103/PHYSREVLETT.64.2957
Tom M. Mitchell, Generalization as search, Artificial Intelligence, vol. 18, pp. 203-226 (1982), 10.1016/0004-3702(82)90040-6