Course Information


This course provides a thorough understanding of the fundamental concepts and recent advances in deep learning. The main objective is to provide students with the practical and theoretical foundations needed to use and develop deep neural architectures that solve challenging tasks in an end-to-end manner. The course is taught by Aykut Erdem.

The course will use Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville as the textbook (a draft is freely available online here).

Instruction style: During the semester, students are responsible for studying and keeping up with the course material outside of class time. This may involve reading particular book chapters, papers, or blog posts, and watching video lectures. After the first three lectures, each week a group of students will present a paper related to the topics of the previous week.

Time and Location

Lectures: Mondays and Wednesdays at 11:30-12:40 (Tower Second Floor)

Tutorials: Mondays at 17:30-18:40 (SCI Z24)


The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments, and important deadlines. All other course-related communication will be carried out through Piazza. Please enroll by following the link.


Prerequisites

COMP541 is open to all graduate students in the COMP department. Senior undergraduate COMP students and non-COMP graduate students, however, should ask the course instructor for approval before the end of the add/drop period. The prerequisites for this course are:

  • Programming (you should be a proficient programmer to complete the practicals and to implement your course project)
  • Calculus (differentiation, chain rule) and Linear Algebra (vectors, matrices, eigenvalues/vectors)
  • Basic Probability and Statistics (random variables, expectations, multivariate Gaussians, Bayes rule, conditional probabilities)
  • Machine Learning (supervised and unsupervised learning, linear regression, overfitting, underfitting, regularization, bias vs variance tradeoff)
  • Optimization (cost functions, taking gradients, regularization)
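As a rough illustration of the expected background level (this is a hypothetical example, not part of the course materials), the prerequisites above roughly correspond to being able to write something like gradient descent on L2-regularized linear regression from scratch:

```python
import numpy as np

# Hypothetical self-check: gradient descent on ridge (L2-regularized)
# linear regression. Assumed setup, not a course assignment.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

w = np.zeros(3)
lam, lr = 0.1, 0.05                      # regularization strength, learning rate
for _ in range(500):
    # gradient of MSE/2 + (lam/2) * ||w||^2 with respect to w
    grad = X.T @ (X @ w - y) / len(y) + lam * w
    w -= lr * grad

print(np.round(w, 2))                    # near true_w, shrunk toward zero by lam
```

If deriving and implementing the gradient step here feels unfamiliar, reviewing the calculus, linear algebra, and machine learning prerequisites before the course starts is recommended.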

Course Requirements and Grading

Grading for COMP541 will be based on:

  • Self-Assessment Quiz (2%)
  • Assignments (20%) (4 assignments x 5% each)
  • Midterm Exam (21%)
  • Course Project (presentations and reports) (32%)
  • Paper Presentations (10%)
  • Paper Reviews (5%)
  • Class Participation (10%)

Reference Books

  • Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, 2016 (draft available online)


Schedule

Date    | Topic                                  | Assignments
Oct 3   | Introduction to Deep Learning          | Self-Assessment Quiz (Theory)
Oct 10  | Machine Learning Overview              | Self-Assessment Quiz (Programming)
Oct 17  | Multi-Layer Perceptrons                | Assg 1 out: MLPs and Backpropagation
Oct 24  | Training Deep Neural Networks          |
Oct 31  | Convolutional Neural Networks          | Assg 1 due; Assg 2 out: CNNs
Nov 7   | Understanding and Visualizing CNNs     | Project proposal due
Nov 14  | Winter Break                           |
Nov 21  | Recurrent Neural Networks              | Assg 2 due; Assg 3 out: RNNs
Nov 28  | Attention and Transformers             |
TBA     | Midterm Exam                           |
Dec 5   | Graph Neural Networks                  | Assg 3 due; Assg 4 out: Transformers and GNNs
Dec 12  | Autoencoders and Autoregressive Models | Project progress report due
Dec 19  | Generative Adversarial Networks        | Assg 4 due
Dec 26  | Variational Autoencoders               |
Jan 2   | Self-supervised Learning               |
Jan 9   | Deep Neural Networks as Priors         |
Jan 16  | Final Project Presentations            |
Jan 23  | Final Project Presentations            | Final project due
Detailed Syllabus and Lectures