On PAC learning using Winnow, Perceptron, and a Perceptron-like algorithm

Author: Rocco A. Servedio

DOI: 10.1145/307400.307474

Keywords:

Abstract: In this paper we analyze the PAC learning abilities of several simple iterative algorithms for learning linear threshold functions, obtaining both positive and negative results. We show that Littlestone's Winnow algorithm is not an efficient PAC learning algorithm for the class of positive linear threshold functions. We also prove that the Perceptron algorithm cannot efficiently learn the unrestricted class of linear threshold functions, even under the uniform distribution on boolean examples. However, the Perceptron algorithm can efficiently learn nested functions (a concept class known to be hard for Perceptron under arbitrary distributions) under the uniform distribution. Finally, we give a very simple Perceptron-like algorithm for learning origin-centered halfspaces under the uniform distribution on the unit sphere in R^n. Unlike the Perceptron algorithm, which cannot learn in the presence of classification noise, the new algorithm can learn in the presence of monotonic noise (a generalization of classification noise). The new algorithm is also significantly faster than previous algorithms in both the noisy and noise-free settings.
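For readers unfamiliar with the two standard algorithms analyzed in the paper, the sketch below shows the classic mistake-driven update rules: the Perceptron's additive updates and Winnow's multiplicative updates on boolean examples. This is background illustration only, not the paper's new Perceptron-like algorithm for halfspaces on the unit sphere; the threshold, promotion factor, epoch count, and toy target used here are illustrative assumptions.

```python
# Background sketch: classic Perceptron and Winnow update rules on {0,1}^n examples.
# Not the paper's new algorithm; constants and the toy target are illustrative.
import numpy as np

def perceptron_train(X, y, epochs=50):
    """Perceptron: additive update w <- w + y*x (and b <- b + y) on each mistake."""
    n = X.shape[1]
    w, b = np.zeros(n), 0.0
    for _ in range(epochs):
        for x, label in zip(X, y):                 # label in {-1, +1}
            if label * (np.dot(w, x) + b) <= 0:    # mistake (or on the boundary)
                w += label * x
                b += label
    return w, b

def winnow_train(X, y, theta=None, alpha=2.0, epochs=50):
    """Winnow: multiplicative updates on mistakes, with a fixed threshold theta."""
    n = X.shape[1]
    theta = n / 2.0 if theta is None else theta
    w = np.ones(n)
    for _ in range(epochs):
        for x, label in zip(X, y):                 # label in {-1, +1}
            pred = 1 if np.dot(w, x) >= theta else -1
            if pred != label:
                # Promote active weights on false negatives, demote on false positives;
                # inactive coordinates (x_i = 0) are left unchanged.
                w *= alpha ** (label * x)
    return w, theta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(200, 10)).astype(float)
    y = np.where(X[:, 0] + X[:, 1] + X[:, 2] >= 2, 1, -1)   # toy majority-of-3 target
    w, b = perceptron_train(X, y)
    wv, th = winnow_train(X, y)
    print("perceptron training errors:", int(np.sum(np.sign(X @ w + b) != y)))
    print("winnow training errors:", int(np.sum(np.where(X @ wv >= th, 1, -1) != y)))
```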

References (41)
Wolfgang Maass, György Turán. How fast can a threshold gate learn? Conference on Learning Theory, pp. 381-414 (1994).
Ian Parberry. Circuit Complexity and Neural Networks (1994).
Wolfgang Maass, Manfred K. Warmuth. Efficient Learning with Virtual Threshold Gates. Machine Learning Proceedings 1995, pp. 378-386 (1995). doi:10.1016/B978-1-55860-377-6.50054-2
Michael J. Kearns, Umesh V. Vazirani. An Introduction to Computational Learning Theory (1994).
Michael L. Dertouzos. Threshold Logic: A Synthesis Approach (1965).
Martin Anthony, Graham Brightwell, John Shawe-Taylor. On specifying Boolean functions by labelled examples. Discrete Applied Mathematics, vol. 61, pp. 1-25 (1995). doi:10.1016/0166-218X(94)00007-Z
Yoav Freund, Robert E. Schapire. Large margin classification using the perceptron algorithm. Conference on Learning Theory, vol. 37, pp. 209-217 (1998). doi:10.1145/279943.279985
Eric B. Baum. The Perceptron Algorithm Is Fast for Non-Malicious Distributions. Neural Information Processing Systems, vol. 2, pp. 676-685 (1989). doi:10.1162/NECO.1990.2.2.248