Course Information

About

This course provides a thorough understanding of the fundamental concepts and recent advances in deep learning. The main objective is to give students the practical and theoretical foundations needed to use and develop deep neural architectures for solving challenging tasks in an end-to-end manner. The course is taught by Aykut Erdem. The teaching assistants are Andrew Bond, Hakan Capuk, and Omer Faruk Tal.


The course will use Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville as the textbook (a draft is freely available online).

Instruction style: During the semester, students are responsible for studying and keeping up with the course material outside of class time. This may involve reading particular book chapters, papers, or blog posts and watching video lectures. After the first three lectures, each week a group of students will present a paper related to the topics of the previous week.

Time and Location

Lectures: Tuesday and Thursday at 16:00-17:10 (SNA 159)

Tutorials: Friday at 14:30-15:40 (SNA 159)

Communication

The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments, and important deadlines. All other course-related communication will be carried out through Blackboard.

Prerequisites

COMP541 is open to all graduate students in the COMP department. Prospective senior undergraduate COMP students and non-COMP graduate students, however, should ask the course instructor for approval before the add/drop period. The prerequisites for this course are:

  • Programming (you should be a proficient programmer to work through the practicals and to implement your course project)
  • Calculus (differentiation, chain rule) and Linear Algebra (vectors, matrices, eigenvalues/vectors)
  • Basic Probability and Statistics (random variables, expectations, multivariate Gaussians, Bayes rule, conditional probabilities)
  • Machine Learning (supervised and unsupervised learning, linear regression, overfitting, underfitting, regularization, bias vs variance tradeoff)
  • Optimization (cost functions, taking gradients, regularization)

Course Requirements and Grading

Grading for COMP541 will be based on:

  • Self-Assessment Quiz (2%)
  • Assignments (20%) (4 assignments x 5% each)
  • Midterm Exam (17%) (exam guide)
  • Course Project (presentations and reports) (36%)
  • Paper Presentations (10%)
  • Paper Reviews (5%)
  • Class Participation (10%)

Reference Books

  • Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, 2016 (draft available online)

Schedule

Week        Topic                               Assignments
Week 1      Introduction to Deep Learning       Self-Assessment Quiz (Theory)
Week 2      Machine Learning Overview           Self-Assessment Quiz (Programming)
Week 3      Multi-Layer Perceptrons             Assg 1 out: MLPs and Backpropagation
Week 4      Training Deep Neural Networks
Week 5      Convolutional Neural Networks       Assg 1 in, Assg 2 out: CNNs
Week 6      Understanding and Visualizing CNNs  Project proposal due
Week 7      Recurrent Neural Networks           Assg 2 in, Assg 3 out: RNNs
Week 8      Attention and Transformers
Week 9      Graph Neural Networks               Assg 3 in, Assg 4 out: Transformers and GNNs
Week 10     Language Model Pretraining
Week 11     Project Progress Presentations      Project progress report due
Week 12     Large Language Models               Assg 4 in
Week 13     Adapting LLMs
Week 14     Multimodal Pre-training
Week 15-16  Final Project Presentations         Final project due
Detailed Syllabus and Lectures