This course provides a thorough understanding of the fundamental concepts and recent advances in deep learning. The main objective is to give students the practical and theoretical foundations for using and developing deep neural architectures to solve challenging tasks in an end-to-end manner. The course is taught by Aykut Erdem. The teaching assistants are Andrew Bond, Hakan Capuk, and Omer Faruk Tal.
The course will use *Deep Learning* by Ian Goodfellow, Yoshua Bengio, and Aaron Courville as the textbook (a free draft is available online).
Instruction style: During the semester, students are responsible for studying and keeping up with the course material outside of class time. This may involve reading assigned book chapters, papers, or blog posts and watching video lectures. After the first three lectures, each week a group of students will present a paper related to the previous week's topics.
Lectures: Tuesday and Thursday at 16:00-17:10 (SNA 159)
Tutorials: Friday at 14:30-15:40 (SNA 159)
The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments, and important deadlines. All other course-related communication will be carried out through Blackboard.
COMP541 is open to all graduate students in our COMP department. Prospective senior undergraduate COMP students and non-COMP graduate students, however, should ask the course instructor for approval before the add/drop period. The prerequisites for this course are:
Grading for COMP541 will be based on:
| Week | Topic | Assignments |
| --- | --- | --- |
| Week 1 | Introduction to Deep Learning | Self-Assessment Quiz (Theory) |
| Week 2 | Machine Learning Overview | Self-Assessment Quiz (Programming) |
| Week 3 | Multi-Layer Perceptrons | Assg 1 out: MLPs and Backpropagation |
| Week 4 | Training Deep Neural Networks | |
| Week 5 | Convolutional Neural Networks | Assg 1 in, Assg 2 out: CNNs |
| Week 6 | Understanding and Visualizing CNNs | Project proposal due |
| Week 7 | Recurrent Neural Networks | Assg 2 in, Assg 3 out: RNNs |
| Week 8 | Attention and Transformers | |
| Week 9 | Graph Neural Networks | Assg 3 in, Assg 4 out: Transformers and GNNs |
| Week 10 | Language Model Pretraining | |
| Week 11 | Project Progress Presentations | Project progress report due |
| Week 12 | Large Language Models | Assg 4 in |
| Week 13 | Adapting LLMs | |
| Week 14 | Multimodal Pre-training | |
| Weeks 15-16 | Final Project Presentations | Final project due |