Computational Learning Theory and Neural Networks: A Survey of Selected Topics

Author: György Turán

DOI: 10.1007/978-1-4615-2696-4_7

Abstract: One of the central issues in neural computation is the learning capability of networks. For computational learning theory, which is concerned with the complexity of learning processes in general, problems related to neural networks serve both as a major motivation and as a testing ground. In this chapter we describe formal models studied in the theory and results that either deal directly with neural networks or have implications for them, providing an introduction to the topic of subsequent chapters of the book. The next chapter, by W. Maass, gives a survey of further important areas.

References (85)
Wolfgang Maass, György Turán, "How fast can a threshold gate learn?", Conference on Learning Theory, pp. 381-414 (1994)
Ian Parberry, Circuit Complexity and Neural Networks (1994)
Henk C. A. van Tilborg, An Introduction to Cryptology (1988)
Tibor Hegedűs, "On training simple neural networks and small-weight neurons", European Conference on Computational Learning Theory, pp. 69-82 (1994)
Michel Cosnard, Pascal Koiran, Hélène Paugam-Moisy, "Complexity Issues in Neural Network Computations", Latin American Symposium on Theoretical Informatics, pp. 530-543 (1992), DOI: 10.1007/BFB0023854
Avrim L. Blum, Ronald L. Rivest, "Training a 3-node neural network is NP-complete", Neural Networks, vol. 5, pp. 117-127 (1992), DOI: 10.1016/S0893-6080(05)80010-3
John H. Reif, "On threshold circuits and polynomial computation", Structure in Complexity Theory Annual Conference (1987)
David Haussler, "Probably approximately correct learning", National Conference on Artificial Intelligence, pp. 1101-1108 (1990)
D. Angluin, M. Frazier, L. Pitt, "Learning conjunctions of Horn clauses", Foundations of Computer Science, pp. 186-192 (1990), DOI: 10.1109/FSCS.1990.89537