Theory of Neural Information Processing Systems

Authors: P. Sollich, R. Kuhn, A. C. C. Coolen

DOI:

Keywords: Unsupervised learning, Theoretical computer science, Artificial neural network, Artificial intelligence, Statistical inference, Types of artificial neural networks, Perceptron, Deep learning, Entropy (information theory), Mathematics, Information theory

Abstract (table of contents):

I. INTRODUCTION TO NEURAL NETWORKS
1. General introduction
2. Layered networks
3. Recurrent networks with binary neurons

II. ADVANCED NEURAL NETWORKS
4. Competitive unsupervised learning processes
5. Bayesian techniques in supervised learning
6. Gaussian processes
7. Support vector machines for classification

III. INFORMATION THEORY AND NEURAL NETWORKS
8. Measuring information
9. Identification of entropy as an information measure
10. Building blocks of Shannon's information theory
11. Information theory and statistical inference
12. Applications to neural networks

IV. MACROSCOPIC ANALYSIS OF DYNAMICS
13. Network operation: macroscopic dynamics
14. Dynamics of online learning in perceptrons
15. Online gradient descent learning

V. EQUILIBRIUM STATISTICAL MECHANICS
16. Basics of equilibrium statistical mechanics
17. Network operation: equilibrium analysis
18. Gardner theory of task realizability

APPENDICES
A. Historical and bibliographical notes
B. Probability theory in a nutshell
C. Conditions for the central limit theorem to apply
D. Some simple summation identities
E. Gaussian integrals and probability distributions
F. Matrix identities
G. The delta-distribution
H. Inequalities based on convexity
I. Metrics for parametrized probability distributions
J. Saddle-point integration

REFERENCES

References (0)