Basics of Deep Learning

Authors: Uday Kamath, John Liu, James Whitaker

DOI: 10.1007/978-3-030-14596-5_4

Keywords: Academic community; Field (Bourdieu); Artificial neural network; Computer science; Cognitive science; Artificial intelligence; Representation (systemics); Deep learning

Abstract: One of the most talked-about concepts in machine learning, in both the academic community and the media, is the evolving field of deep learning. The idea of neural networks, and subsequently deep learning, draws its inspiration from the biological representation of the human brain (or any brained creature, for that matter).
