

DEEP LEARNING
Description
Deep learning and machine learning basics, probability, distributions, information theory, artificial neural networks, autoencoders and applications, convolutional neural networks and applications, restricted Boltzmann machines, deep belief networks, recurrent neural networks and applications, generative adversarial networks, deep learning optimization.
Grading
Midterm - 25%
Homework - 15%
Final project - 30% 
Final exam - 30%
Textbook
(1) Deep Learning, I. Goodfellow, Y. Bengio and A. Courville, MIT Press, 2016.
Supplementary books
(1) Artificial Intelligence: A Modern Approach, S. Russell and P. Norvig, Prentice Hall, 2003.
(2) The Elements of Statistical Learning, T. Hastie, R. Tibshirani and J. Friedman, Springer, 2001.
(3) Machine Learning: A Probabilistic Perspective, K. P. Murphy, MIT press, 2012.
Outline
(1) Introduction to deep learning 
(2) Machine learning basics 
(3) Artificial neural networks 
(4) Deep feedforward networks 
(5) Convolutional neural networks
(6) Recurrent neural networks
(7) Autoencoders
(8) Restricted Boltzmann machines
(9) Deep belief networks
(10) Generative adversarial networks
(11) Deep learning optimization
General information about homework
- The articles to be reviewed must have been published within the last 3 years.
- The articles must be published in journals indexed by SCI.
- The full text of the article must be attached to the prepared report.
- Students giving presentations must send their presentation files by e-mail at least one day in advance.