This is an undergraduate-level introductory course in machine learning (ML) that gives a broad overview of core concepts and algorithms, ranging from supervised learning methods such as support vector machines and decision trees to unsupervised learning (clustering and factor analysis). The goal is to provide students with a solid understanding of the subject matter and the skills to apply these concepts to real-world problems. The course is taught by Aykut Erdem; the teaching assistant is Burcak Asal.
Lectures: Wednesdays at 09:00-09:50 and Fridays 09:00-10:50 (Room D4)
Tutorials: Mondays at 15:00-17:00 (Room D8)
Policies: All work on assignments must be done individually unless stated otherwise. You are encouraged to discuss the assignments with your classmates, but these discussions must remain at an abstract level. That is, discussions of a particular solution to a specific problem (whether in actual code or in pseudocode) will not be tolerated.
In short, turning in someone else’s work, in whole or in part, as your own will be considered a violation of academic integrity. Please note that this also applies to material found on the web, since everything on the web has been written by someone else.
The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments, and important deadlines. All other course-related communication will be carried out through Piazza. Please enroll by following the link https://piazza.com/hacettepe.edu.tr/fall2019/bbm406.
BBM406 is open to third/fourth-year undergraduate students. Non-CENG graduate students should ask the course instructor for approval before the add/drop period. The prerequisites for this course are:
Grading for BBM406 will be based on
Grading for BBM409 will be based on
Date | Topic | Notes |
--- | --- | --- |
Oct 9 | Course outline and logistics, An overview of Machine Learning [slides] | Reading: The Discipline of Machine Learning, Tom Mitchell; Video 1: The Master Algorithm, Pedro Domingos; Video 2: The Thinking Machine; Tutorial: Python/numpy |
Oct 11 | Machine Learning by Examples, Nearest Neighbor Classifier [slides] | Reading: Barber 1, 14.1-14.2; Demo: k-Nearest Neighbors |
Oct 16 | Kernel Regression, Distance Functions, Curse of Dimensionality [slides] | Reading: Bishop 1.4, 2.5 |
Oct 18 | Linear Regression, Generalization, Model Complexity, Regularization [slides] | Assg1 out; Reading: Bishop 1.1, 3.1, Stanford CS229 note; Demo: Linear regression |
Oct 23 | Machine Learning Methodology [slides] | Reading: P. Domingos, A few useful things to know about machine learning |
Oct 25 | Learning Theory, Basic Probability Review [slides] | Reading: Daume III 12, Barber 1.1-1.4, CIS 520 note, E. Simoncelli, A Geometric Review of Linear Algebra; Video: Probability Primer; Demo: Seeing Theory: A visual introduction to probability and statistics |
Oct 30 | Statistical Estimation: MLE [slides] | Reading: Murphy 2.1-2.3.2; Video: Daphne Koller, Probabilistic Graphical Models, MLE Lecture, MAP Lecture |
Nov 1 | Statistical Estimation: MAP, Naïve Bayes Classifier [slides] | Assg1 due; Reading: Daume III 7, Naïve Bayes, Tom M. Mitchell; Optional Reading: Learning to Decode Cognitive States from Brain Images, Tom M. Mitchell et al.; Demo: Bayes Theorem |
Nov 6 | Logistic Regression, Discriminant vs. Generative Classification [slides] | Reading: SLP3 5; Optional Reading: On Discriminative vs. Generative classifiers: A comparison of logistic regression and naive Bayes, Andrew Y. Ng and Michael I. Jordan |
Nov 8 | Linear Discriminant Functions, Perceptron [slides] | Assg2 out; Reading: Bishop 4.1.1-4.1.2, 4.5, Daume III 3 |
Nov 13 | Multi-layer Perceptron [slides] | |
Nov 15 | Training Neural Networks: Computational Graph, Back-propagation [slides] | Course project proposal due; Reading: CS 231 Backpropagation notes; Demo: A Neural Network Playground |
Nov 20 | Introduction to Deep Learning [slides] | Reading: Deep Learning, Yann LeCun, Yoshua Bengio, Geoffrey Hinton |
Nov 22 | Deep Convolutional Networks [slides] | Assg2 due; Reading: Conv Nets: A Modular Perspective, Understanding Convolutions, Christopher Olah |
Nov 27 | Support Vector Machines (SVMs) [slides] | Reading: Alpaydin 13.1-13.2; Video: Patrick Winston, Support Vector Machines |
Nov 29 | Soft margin SVM, Multi-class SVM [slides] | Assg3 out; Reading: Alpaydin 13.3, 13.9, M.A. Hearst, Support Vector Machines, CS229 Notes 3.7; Demo: Multi-class SVM demo |
Dec 4 | Midterm review | |
Dec 6 | Midterm exam | |
Dec 11 | Kernels, Kernel Trick for SVMs, Support Vector Regression [slides] | Reading: Alpaydin 13.5-13.7, 13.10 |
Dec 13 | Decision Tree Learning [slides] | Assg3 due; Reading: Mitchell 3, Bishop 14.4; Demo: A Visual Introduction to Machine Learning |
Dec 18 | Ensemble Methods: Bagging, Random Forests [slides] | Reading: Bishop 14.1-14.2, Understanding the Bias-Variance Tradeoff, Scott Fortmann-Roe, Random Forests, Leo Breiman and Adele Cutler; Optional Reading: Real-Time Human Pose Recognition in Parts from Single Depth Images, Jamie Shotton et al.; Demo: Bootstrapping |
Dec 20 | Ensemble Methods: Boosting [slides] | Project progress reports due; Reading: Bishop 14.3; Optional Reading: Rapid Object Detection using a Boosted Cascade of Simple Features, Paul Viola and Michael Jones; Video: A Boosting Tutorial, Robert Schapire |
Dec 25 | Clustering: K-Means [slides] | Reading: Bishop 9.1, Cluster Analysis: Basic Concepts and Algorithms, Pang-Ning Tan, Michael Steinbach and Vipin Kumar; Demo: Visualizing K-Means equilibria |
Dec 27 | Clustering: Spectral Clustering, Agglomerative Clustering [slides] | |
Jan 1 | No class | |
Jan 3 | Dimensionality Reduction: PCA, SVD, ICA, Autoencoders [slides] | Reading: Bishop 12.1, Stanford CS229 note; Optional Reading: Eigenfaces for Recognition, Matthew Turk and Alex Pentland; Video: PCA, Andrew Ng; Demo: Principal Component Analysis Explained Visually |
Jan 8 | Project presentations | |
Jan 10 | Project presentations (cont'd.), Course wrap-up | Final project reports due |