Structural Complexity and Neural Networks

Authors: Alberto Bertoni, Beatrice Palano

DOI: 10.1007/3-540-45808-5_21

Keywords:

Abstract: We survey some relationships between computational complexity and neural network theory. Here, only networks of binary threshold neurons are considered. We begin by presenting contributions of neural networks to structural complexity theory. In parallel complexity, the class TC^0_k of problems solvable by feed-forward networks with k levels and a polynomial number of neurons is considered. Separation results are recalled, and the relation between TC^0 = ∪_k TC^0_k and NC^1 is analyzed. In particular, under the conjecture TC^0 ≠ NC^1, we characterize the regular languages accepted by feed-forward networks with a constant number of neurons. We also discuss the use of complexity theory to study aspects of learning and combinatorial optimization in the context of neural networks. We consider the PAC model of learning, emphasizing negative results based on complexity-theoretic assumptions. Finally, results in the realm of combinatorial optimization are discussed, related to the probabilistic characterization of NP.
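The abstract's basic unit is the binary threshold neuron: it outputs 1 exactly when the weighted sum of its 0/1 inputs reaches its threshold. As a minimal illustrative sketch (not taken from the paper; the weights and the MAJORITY example are chosen here for concreteness), a single such neuron already computes MAJORITY, a canonical problem solvable by a depth-1 threshold network and thus in TC^0_1:

    def threshold_neuron(inputs, weights, threshold):
        """Binary threshold neuron: fires (returns 1) iff the weighted sum
        of its 0/1 inputs is at least the threshold."""
        total = sum(w * x for w, x in zip(weights, inputs))
        return 1 if total >= threshold else 0

    def majority(bits):
        """MAJORITY on n bits, computed by one threshold neuron with unit
        weights and threshold ceil(n/2): a depth-1 example in TC^0."""
        n = len(bits)
        return threshold_neuron(bits, [1] * n, (n + 1) // 2)

    if __name__ == "__main__":
        print(majority([1, 0, 1, 1, 0]))  # 1: three of five inputs are 1
        print(majority([1, 0, 0, 1, 0]))  # 0: only two of five inputs are 1

In this notation, TC^0_k corresponds to families of such neurons arranged in k feed-forward levels with polynomially many units overall.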
