Over the past decade, deep learning models have achieved notable success across domains such as computer vision, natural language understanding, and speech processing, approaching or even surpassing human-level performance on many benchmark datasets. Yet deep learning keeps evolving and expanding into new frontiers. In this advanced seminar course, we’ll read and discuss a broad collection of papers on a wide variety of topics, including compositionality and systematic generalization, multimodal representation learning, memory and attention, graph neural networks, object-centric representation learning, dynamic networks, vision transformers, neural rendering, neural implicit representations, neuro-symbolic approaches, and deep implicit layers. The course also includes a semester-long project component, in which students will work alone or in pairs on a research topic covered in class.
The structure of the course follows the format used in Colin Raffel’s COMP790 course at the University of North Carolina and Alec Jacobson's CSC2521 course at the University of Toronto. In each lecture, students will play different roles that define how they read the assigned paper and which points of view they focus on while preparing for the in-class discussion. See the Presentations page for details. This format is designed to provide multiple perspectives, a thorough understanding of the concepts, and, more importantly, a lot more fun. The aim of this seminar is to bring students to the state of the art in this exciting field.
This class is taught by Aykut Erdem and is intended for graduate students and ambitious undergraduates with research experience. To get the most out of the class, students should have a strong background in deep learning (e.g., COMP541 or COMP547) and good programming skills. If you have doubts about whether you meet these requirements, please consult the instructor in advance.
Instruction style: During the semester, students are responsible for studying and keeping up with the course material outside of class time. This may involve reading particular book chapters, papers, or blog posts, and watching video lectures.
Lectures: Monday and Wednesday at 11:30-12:40 (CASE Z25)
The course webpage will be updated regularly throughout the semester with lecture notes, presentations, and important deadlines. All other course-related communications will be carried out through Blackboard.
Grading for COMP550 will be based on:
| Week | Topic | Assignments |
| --- | --- | --- |
| Sep 27-29 | Introduction to the course | |
| Oct 4-6 | Compositionality and systematic generalization | |
| Oct 11-13 | Multimodal representation learning | |
| Oct 18-20 | Graph neural networks | Project proposals due |
| Oct 25-27 | Object-centric representation learning | |
| Nov 1-3 | Neuro-symbolic approaches | |
| Nov 8-10 | Neural implicit representations | |
| Nov 15-17 | Winter break | |
| Nov 22-24 | Project progress presentations | Project progress reports due |
| Nov 29-Dec 1 | Dynamic networks | |
| Dec 6-8 | Neural rendering | |
| Dec 13-15 | Memory and attention | |
| Dec 20-22 | Vision transformers | |
| Dec 27-29 | Deep implicit layers | |
| Jan 3-5 | Final project presentations | Final project reports due |