Week 2: Data & Representations

  • Lecture 1: The geometry of linear equations
  • Condensed version of linear algebra applied to deep learning
  • Lecture 2: Elimination with matrices
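
These lectures become concrete once you solve a small system yourself. Below is a minimal sketch (assuming NumPy; the 2×2 system is invented for illustration) of the elimination the second lecture performs by hand:

    import numpy as np

    # Row picture: two equations in two unknowns.
    #    2x -  y = 0
    #    -x + 2y = 3
    A = np.array([[ 2.0, -1.0],
                  [-1.0,  2.0]])
    b = np.array([0.0, 3.0])

    # np.linalg.solve carries out the elimination done by hand in the lecture.
    x = np.linalg.solve(A, b)
    print(x)  # [1. 2.]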

Week 3: Transformations

  • A 4-page overview of linear algebra (PDF)

  • [ST]: Sections 8.1, 8.2, and 10.6 (more theoretical details)

  • [CO]: Chapter 3 (skip sections 3.2.8 and 3.3.3; ignore the comments on eigenvalues for now – we will discuss them later in the course)

  • Lecture 30: Linear transformations and their matrices

  • Linear Transformation between dimensions

  • Matrix Transpose: Visual intuition
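
To tie these links to code: a matrix is a linear transformation of coordinates, and transposition is just another matrix operation. A minimal sketch (assuming NumPy; the rotation example is ours, not from the linked videos):

    import numpy as np

    # A 2x2 matrix acts on points in the plane; here, rotation by 90 degrees.
    theta = np.pi / 2
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    v = np.array([1.0, 0.0])
    print(np.round(R @ v, 10))  # [0. 1.] – the x-axis lands on the y-axis

    # For a rotation the transpose is the inverse: R^T R = I.
    print(np.allclose(R.T @ R, np.eye(2)))  # True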


Week 4: Least Squares

The Strang book contains numerous worked examples of the topics covered in the lecture notes ([DW]); refer to these whenever a concept is unclear. A short least-squares sketch follows the list below.

  • The Art of Linear Algebra (PDF)
  • Lecture 5 – Transposes, Permutations, Spaces Rⁿ
  • The Column Space of a Matrix (MIT, A 2020 Vision of Linear Algebra)
  • Lecture 15 – Projections onto Subspaces
  • Essence of Linear Algebra – Projections
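
The projection lectures reduce to one computation: the least-squares coefficients solve the normal equations AᵀA x̂ = Aᵀb, and A x̂ is the projection of b onto the column space of A. A minimal sketch (assuming NumPy; the three data points are invented for illustration):

    import numpy as np

    # Fit the line y = c0 + c1*t to the points (0, 6), (1, 0), (2, 0).
    t = np.array([0.0, 1.0, 2.0])
    b = np.array([6.0, 0.0, 0.0])
    A = np.column_stack([np.ones_like(t), t])  # columns span the subspace

    # Normal equations: (A^T A) x = A^T b.
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)
    print(x_hat)  # [ 5. -3.]

    # The library routine agrees.
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.allclose(x_hat, x_lstsq))  # True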

Week 5: Model Complexity and Generalization

(Coming soon...)

  • Kernels
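
Until the notes appear, the core idea behind the kernel link: a kernel evaluates an inner product in a feature space without ever constructing the features. A minimal sketch of the RBF (Gaussian) kernel (assuming NumPy; the points and the gamma value are arbitrary):

    import numpy as np

    def rbf_kernel(x, y, gamma=1.0):
        """RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
        return np.exp(-gamma * np.sum((x - y) ** 2))

    x = np.array([0.0, 0.0])
    y = np.array([1.0, 1.0])
    print(rbf_kernel(x, y))  # exp(-2) ≈ 0.135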

Week 6: Understanding Data and Uncertainty

  • EliteDataScience Primer
  • RANSAC
  • Covariance Explanation
  • Multivariate Normal Distribution (Towards Data Science)
  • Regression (curve fitting) with noise
  • Tutorials and PDFs
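
The RANSAC links are easier to digest with the loop in front of you: repeatedly fit a model to a random minimal sample and keep the candidate with the most inliers. A minimal line-fitting sketch (assuming NumPy; the data, inlier threshold, and iteration count are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: the line y = 2x + 1 with noise, plus gross outliers.
    x = rng.uniform(0, 10, 100)
    y = 2 * x + 1 + rng.normal(0, 0.2, 100)
    y[:10] += rng.uniform(20, 40, 10)  # corrupt the first ten points

    best_inliers, best_model = 0, None
    for _ in range(200):
        i, j = rng.choice(len(x), size=2, replace=False)  # minimal sample
        slope = (y[j] - y[i]) / (x[j] - x[i])
        intercept = y[i] - slope * x[i]
        residuals = np.abs(y - (slope * x + intercept))
        inliers = np.sum(residuals < 0.5)
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (slope, intercept)

    print(best_model)  # close to (2, 1)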

Lecture 7: Noise and Filtering

  • Bias-variance video on Kaltura
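
As a companion to the video: bias and variance can be estimated empirically by refitting the same model on many freshly drawn training sets. A minimal simulation sketch (assuming NumPy; the sine target, noise level, and polynomial degree are our own choices, not from the video):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 30)
    f = np.sin(np.pi * x)   # true function
    degree = 1              # degree 1: high bias; try 9 for high variance

    preds = []
    for _ in range(500):
        y = f + rng.normal(0, 0.3, x.size)  # fresh noisy training set
        preds.append(np.polyval(np.polyfit(x, y, degree), x))
    preds = np.array(preds)

    bias_sq = np.mean((preds.mean(axis=0) - f) ** 2)
    variance = np.mean(preds.var(axis=0))
    print(f"bias^2 = {bias_sq:.3f}, variance = {variance:.3f}")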


Lecture 8: Classification

  • Machine Learning – Week 3 (Coursera)
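
That module covers logistic regression, which is a few lines in scikit-learn. A minimal sketch (assuming scikit-learn; the synthetic dataset is invented for illustration):

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Toy two-class problem.
    X, y = make_classification(n_samples=200, n_features=4, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = LogisticRegression().fit(X_tr, y_tr)
    print(clf.score(X_te, y_te))        # accuracy on held-out data
    print(clf.predict_proba(X_te[:3]))  # class probabilities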

Lecture 9: Advanced Classification and Evaluation

  • Sections 4.3 and 4.4 – Measures of Distance and Measures of Similarity
  • Tutorial on evaluation metrics – Part 1
  • Tutorial on evaluation metrics – Part 2
  • Tutorial on evaluation metrics – Part 3
  • Matthews Correlation Coefficient – When to Use and When to Avoid
  • Precision-Recall Example – scikit-learn
  • Precision vs. Recall explained visually
  • F1 Score – clearly explained
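
All of these metrics are single calls in scikit-learn. A minimal sketch (assuming scikit-learn; the labels are invented for illustration):

    from sklearn.metrics import (confusion_matrix, f1_score,
                                 matthews_corrcoef, precision_score,
                                 recall_score)

    y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
    y_pred = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]

    print(confusion_matrix(y_true, y_pred))
    print("precision:", precision_score(y_true, y_pred))  # TP/(TP+FP) = 0.75
    print("recall:   ", recall_score(y_true, y_pred))     # TP/(TP+FN) = 0.75
    print("F1:       ", f1_score(y_true, y_pred))         # 0.75
    print("MCC:      ", matthews_corrcoef(y_true, y_pred))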

Supplementary

  • Sections 3.2.4, 3.4, 3.5
  • Evaluation of models with class-imbalanced data

Lecture 10: Variability and PCA

  • Machine Learning: SVD & PCA (Jonathan Hui)
  • A Brief Introduction to Statistical Shape Analysis
  • Eigenvectors and Eigenvalues: A Deeper Understanding (Abdullah Bilal)
  • Sections 7.1 and 7.3
  • PCA Clearly Explained – When, Why, and How (TDS)
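
The SVD/PCA links boil down to a short recipe: center the data, take the SVD, and the right singular vectors are the principal components. A minimal sketch (assuming NumPy; the random data are for illustration only):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))  # 100 samples, 3 features
    Xc = X - X.mean(axis=0)        # center each feature

    # Rows of Vt are the principal components
    # (eigenvectors of the sample covariance matrix).
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained_var = S**2 / (len(X) - 1)

    Z = Xc @ Vt[:2].T              # project onto the first two components
    print(explained_var, Z.shape)  # component variances and (100, 2)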

Extra

  • OpenCV Eigenfaces for Face Recognition (PyImageSearch)

Missing PDF: "PCA clearly explained — When, Why, How to use it and feature importance" by Serafeim Loukas (Towards Data Science)


Lecture 11: PCA II and Clustering

Lecture 12: Regularization & Non-linear models

Lecture 13: Neural networks

Lecture 14: Neural architectures and Exam