Experimenting with the tutorial

The following tasks relate to the week 4 tutorial. Each task asks you either to reflect on parts of the tutorial or to modify specific code cells. In particular, Tasks 2 and 3 require changes to the code in your copy of the tutorial notebook.

List of tasks
  • Task 1: Copy notebook
  • Task 2: Projection experiments
  • Task 3: Linear Least Squares Experiments
  • Task 4: Second-order polynomial
  • Task 5: Projection matrix

Experiments for the tutorial notebook

Task 1: Copy notebook

Make a copy of the tutorial notebook in the repository. This makes it easy to go back to the original in case something goes wrong.

Task 2: Projection experiments

This task builds on the $\textbf{Projections}$ section in the tutorial.

  1. Find the comment ##1 in the notebook.
  2. Change the values of the matrix $A$ (below comment ##1) to modify the line. Experiment with different values and observe how the projection changes in the plot.
  3. Change the matrix $A$ such that $PX \approx X$ (that is, the projection leaves $X$ almost unchanged).
  4. Find the comment ##2.
  5. Set the matrix $A = \begin{bmatrix} 1 \\ 0.5 \end{bmatrix}$, then apply the projection matrix $P$ twice, i.e. calculate $PPX$ (just below the comment). How does this affect the projected points?
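As a starting point, the snippet below sketches how such a projection experiment can be set up with NumPy. The variable names and the layout of $X$ (points stored as columns) are assumptions and may differ from the tutorial's actual code.

```python
import numpy as np

# Minimal sketch (not the tutorial's exact code): project 2D points onto the
# line spanned by the single column of A, using P = A (A^T A)^(-1) A^T.
A = np.array([[1.0], [0.5]])              # 2x1 design matrix spanning the line
P = A @ np.linalg.inv(A.T @ A) @ A.T      # 2x2 projection matrix

X = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.5, 2.5, 1.0]])      # 2xN points, one point per column (assumed layout)

PX = P @ X          # points projected onto the line
PPX = P @ (P @ X)   # applying P twice; P is idempotent, so PPX equals PX

print(np.allclose(PX, PPX))               # True: projecting a second time changes nothing
```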
# Write your solution here
# Write your solution here
Task 3: Linear Least Squares Experiments

This task builds on the $\textbf{Linear Least Squares}$ section in the tutorial.

  1. Find the comment ##3 in the notebook.
  2. Change the values of the first point in the matrix $X$ so that it gradually moves further and further away from the line. Observe how this affects the RMS error.
  3. Add two points to $X$ and observe how they affect the fitted line and the error.
    • How can you change the two additional points so that the fitted line does not move?
  4. What happens to the error when you remove all but two points from $X$?
  5. What happens when you remove all but one point from $X$?
  6. Reflect on how the quality of the data affects the projection and thus the solution.
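The sketch below illustrates the general pattern of a linear least-squares line fit with NumPy and how the RMS error can be computed. The data points and variable names are made up for illustration and are not the tutorial's actual values.

```python
import numpy as np

# Minimal sketch (assumed setup): fit a line y = w0 + w1*x with linear
# least squares and report the RMS error of the fit.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 2.2, 2.8, 4.1])

A = np.column_stack([np.ones_like(x), x])     # design matrix with columns [1, x]
w, *_ = np.linalg.lstsq(A, y, rcond=None)     # least-squares coefficients [w0, w1]

residuals = y - A @ w
rms = np.sqrt(np.mean(residuals**2))          # RMS error of the fitted line
print(w, rms)
```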
# Write your solution here
# Write your solution here

Pen and paper exercises

A second-order polynomial is given by $$ f(x) = w_0 + w_1x + w_2x^2 = \sum^2_{i=0} w_ix^i. $$

In general, an $N$-th order polynomial is given by

$$ f(x) = \sum^N_{i=0} w_ix^i, $$

where $\mathbf{w} = (w_0, \dots, w_N)$ is the vector of coefficients.

Task 4: Second-order polynomial
  1. Identify the knowns and unknowns in the polynomial above.
  2. Is the function linear or non-linear in $\mathbf{w}$?
  3. Is the function linear or non-linear in $x$?
  4. Provide the outline of an algorithm for fitting a second-order polynomial using linear least squares.
  5. Generalize this algorithm to $N$-th order polynomials.
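For reference, one possible outline of such an algorithm is sketched below in NumPy. The function `fit_polynomial`, the generated data, and all variable names are illustrative assumptions, not part of the exercise material.

```python
import numpy as np

# Illustrative sketch of the algorithm outline: fit an N-th order polynomial
# f(x) = sum_i w_i x^i with linear least squares. The model is linear in the
# coefficients w, so the same machinery as line fitting applies.
def fit_polynomial(x, y, order):
    # Build the design matrix with columns x^0, x^1, ..., x^order.
    A = np.vander(x, N=order + 1, increasing=True)
    # Solve the linear least-squares problem A w ≈ y for the coefficients w.
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

x = np.linspace(-1, 1, 20)
y = 1.0 + 2.0 * x - 3.0 * x**2 + 0.1 * np.random.randn(x.size)
print(fit_polynomial(x, y, order=2))          # roughly [1, 2, -3]
```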
# Write your solution here
# Write your solution here
Task 5: Projection matrix

The projection matrix $P = A(A^\top A)^{-1}A^\top$ is, under certain conditions, equal to the identity matrix.

  1. Give an example of a design matrix $A$ for which $P=I$.
  2. Explain why projection matrices are usually not identity matrices.
  3. (optional) Prove a condition for which $P=I$. Hint: when is $A^\top A=I$?
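If you want to check your pen-and-paper answer numerically, the sketch below compares $P$ for a square invertible $A$ and for a tall $A$ with fewer columns than rows. The example matrices are chosen purely for illustration.

```python
import numpy as np

# Illustrative check: compute P = A (A^T A)^(-1) A^T for two example matrices
# and test whether the result equals the identity.
def projection_matrix(A):
    return A @ np.linalg.inv(A.T @ A) @ A.T

A_square = np.array([[2.0, 1.0],
                     [0.0, 3.0]])   # invertible 2x2: columns span all of R^2
A_tall = np.array([[1.0],
                   [0.5]])          # 2x1: columns span only a line in R^2

print(np.allclose(projection_matrix(A_square), np.eye(2)))  # True
print(np.allclose(projection_matrix(A_tall), np.eye(2)))    # False
```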
# Write your solution here
# Write your solution here