Polynomial models

In this exercise you will implement a method to estimate the model parameters of single-variate (single-input) 2nd- and 3rd-order polynomials and use these models to predict labels for new data points. Next week you will extend this method to $n$-th order, single-variate polynomials in a fairly straightforward manner.

List of individual tasks
  • Task 1: Finishing in-class
  • Task 2: Data preparation (data wrangling)
  • Task 3: Constructing the design matrix
  • Task 4: Inverting the design matrix
  • Task 5: Plotting
  • Task 6: Making predictions
  • Task 7: Third order polynomials
  • Task 8: Plotting
  • Task 9: Making predictions
Task 1: Finishing in-class
  1. Ensure that you have finished all of the tasks from the W3 in-class exercise, as you will need them for the following tasks.

2nd-order Polynomial model

A 2nd-order polynomial model, also known as a quadratic model, can be expressed as:

$$ y = w_0 + w_1 x + w_2 x^2 $$

This model assumes that the relationship between the input $x$ and the label $y$ is non-linear, specifically quadratic. It can therefore model a curved relationship and capture patterns that a simple linear model would miss.

By fitting this model, the aim is to find the parameters $w_0$, $w_1$, and $w_2$ that minimize the difference between the predicted values and the actual labels, providing a more flexible fit than a linear model.
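For the three-point datasets used below, fitting reduces to solving a small linear system. With the powers ordered from highest to lowest (the column ordering used in the design matrices and weight vectors later in this exercise), the system for three points $(x_1, y_1)$, $(x_2, y_2)$, $(x_3, y_3)$ reads:

$$ \begin{bmatrix} x_1^2 & x_1 & 1 \\ x_2^2 & x_2 & 1 \\ x_3^2 & x_3 & 1 \end{bmatrix} \begin{bmatrix} w_2 \\ w_1 \\ w_0 \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ y_3 \end{bmatrix} $$

Because the design matrix is square here (three points, three unknowns), the parameters can be computed as $\mathbf{w} = X^{-1} \mathbf{y}$, provided the inverse exists.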

Run the cell below to load libraries and construct the datasets:

import numpy as np
import matplotlib.pyplot as plt

quadratic_dataset_1 = np.array([[1, 2], 
                                [2, 3], 
                                [3, 6]])

quadratic_dataset_2 = np.array([[9, 3], 
                                [7, 5], 
                                [1, 9]])

quadratic_dataset_3 = np.array([[8, 4], 
                                [10, 5], 
                                [3, 1]])

Throughout the next tasks, the aim is to construct the design matrix from the known variables, then learn the model parameters by solving the system using the inverse of the design matrix. The subsequent tasks use the learned parameters to predict labels for new, unseen inputs and to plot the results.

Task 2: Data preparation (data wrangling)
  1. For each dataset identify which values correspond to inputs and which are the labels.
  2. Complete the separate_inputs_labels function below. The function takes a dataset as input and returns the inputs as the X_quadratic matrix and the labels as the y_quadratic vector.
def separate_inputs_labels(dataset): """ This function takes a dataset as input and returns the inputs and labels. Parameters: dataset (numpy array): The dataset to be separated. Returns: X_quadratic (3x1 numpy array): The input matrix. y_quadratic (1x3 numpy array): The labels vector. """ ... return X_quadratic, y_quadratic # Applying the function to each quadratic dataset X1_quadratic, y1_quadratic = separate_inputs_labels(quadratic_dataset_1) X2_quadratic, y2_quadratic = separate_inputs_labels(quadratic_dataset_2) X3_quadratic, y3_quadratic = separate_inputs_labels(quadratic_dataset_3) print("X1_quadratic: \n", X1_quadratic) print("y1_quadratic: \n", y1_quadratic)
def separate_inputs_labels(dataset):
    """
    This function takes a dataset as input and returns the inputs and labels.
    
    Parameters:
    dataset (numpy array): The dataset to be separated.
    
    Returns:
    X_quadratic (3x1 numpy array): The input matrix.
    y_quadratic (length-3 numpy array): The label vector.
    """
    
    ...
    return X_quadratic, y_quadratic

# Applying the function to each quadratic dataset
X1_quadratic, y1_quadratic = separate_inputs_labels(quadratic_dataset_1)
X2_quadratic, y2_quadratic = separate_inputs_labels(quadratic_dataset_2)
X3_quadratic, y3_quadratic = separate_inputs_labels(quadratic_dataset_3)

print("X1_quadratic: \n", X1_quadratic)
print("y1_quadratic: \n", y1_quadratic)
X1_quadratic: 
 [[1]
 [2]
 [3]]
y1_quadratic: 
 [2 3 6]
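If you are unsure how to split the columns, the sketch below shows one common NumPy slicing idiom on a hypothetical toy array (not one of the course datasets):

# Hypothetical toy array: first column inputs, second column labels
toy = np.array([[1, 10],
                [2, 20],
                [3, 30]])

toy_inputs = toy[:, 0:1]  # all rows, first column, kept as a 3x1 matrix
toy_labels = toy[:, 1]    # all rows, second column, as a flat length-3 vector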
Task 3: Constructing the design matrix

Reuse the code from the in-class exercise:

  1. Add the 2nd-order polynomial term and construct the design matrix for each dataset.
  2. Print the results.
print("Design Matrix for Dataset 1:\n", X1_quadratic_design)
print("Design Matrix for Dataset 1:\n", X1_quadratic_design)
Design Matrix for Dataset 1:
 [[1. 1. 1.]
 [4. 2. 1.]
 [9. 3. 1.]]
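As a hint, NumPy can build such a matrix in one call: np.vander orders the powers from highest to lowest by default, which matches the expected output above. The sketch below uses the inputs of quadratic_dataset_1:

# np.vander builds the columns x^2, x, 1 (decreasing powers) by default
toy_x = np.array([1, 2, 3])
toy_design = np.vander(toy_x, 3)
print(toy_design)
# [[1 1 1]
#  [4 2 1]
#  [9 3 1]]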
Task 4: Inverting the design matrix
  1. Compute the inverse of each design matrix.
  2. Compute the model parameters (referred to as weights in the cells below), then print the results.
print("Model parameters for Model 1:", weights1_quadratic)
print("Model parameters for Model 1:", weights1_quadratic)
Model parameters for Model 1: [ 1. -2.  3.]
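A minimal sketch of this step, reusing the toy values from above (three points, three unknowns, so the design matrix is square and, in this case, invertible):

# Invert the 3x3 design matrix and multiply by the labels: w = X^{-1} y
toy_design = np.vander(np.array([1, 2, 3]), 3)
toy_y = np.array([2, 3, 6])
toy_weights = np.linalg.inv(toy_design) @ toy_y
print(toy_weights)  # [ 1. -2.  3.], matching the expected output above
# np.linalg.solve(toy_design, toy_y) is equivalent and numerically preferable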
Task 5: Plotting
  1. Use the plot_quadratic_model function to plot the results.
  2. Visually inspect the plots and discuss how the model parameters influence the shape and position of the fitted curve.
  3. Compare the results of the polynomial model to the outcome of the linear model implemented in the in-class exercise.
# Function to plot data points and fitted quadratic model
def plot_quadratic_model(X, y, weights):
    # Plot the data points
    plt.scatter(X, y, color='blue', label='Given Points')
    
    # Extend x_vals range to include zero for correct visualization
    x_vals = np.linspace(0, X.max() + 1, 100)
    y_vals = weights[0] * x_vals**2 + weights[1] * x_vals + weights[2]
    
    # Plot the fitted polynomial
    plt.plot(x_vals, 
             y_vals, 
             color='red', 
             label=f'Poly: y = {weights[0]:.2f}x^2 + {weights[1]:.2f}x + {weights[2]:.2f}')
    
    # Plot the y-intercept
    plt.scatter(0, weights[2], color='green', zorder=5, label=f'Y-intercept (0, {weights[2]:.2f})')
    
    # Add title and labels
    plt.title('Quadratic Model')
    plt.xlabel('X')
    plt.ylabel('y')
    plt.legend()
    plt.grid(True)
    plt.show()


plot_quadratic_model(X1_quadratic, y1_quadratic, weights1_quadratic)

To make predictions with a polynomial model, you apply the polynomial equation to the new input using the estimated model parameters. The model then calculates the predicted output based on the learned relationship between the variables.
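A minimal sketch of this evaluation, using a made-up weight vector (highest power first, the same ordering as in the plotting functions) rather than one of your trained models:

# Hypothetical weights, highest power first: y = 2x^2 + 0x + 1
toy_weights = np.array([2.0, 0.0, 1.0])
toy_x_new = 3.0
toy_prediction = toy_weights[0] * toy_x_new**2 + toy_weights[1] * toy_x_new + toy_weights[2]
print(toy_prediction)  # 19.0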

Task 6: Making predictions

The cell below contains an array of new inputs. Follow these steps to predict a label for the new input:

  1. For each trained model, use the model parameters to predict a label for the new input. The model parameters for the different models should lead to different predictions. Store the predicted label as a separate variable.
  2. Plot the results using the plot_quadratic_model_with_predictions function.
  3. Compare the plots and the predicted labels obtained with the different models.
  4. Extend the new_input array with the points 1.5 and -1. Follow the same steps as above to obtain labels for each input.
  5. Following the above steps should result in 3 predicted labels per model. The total of 9 predicted values should be submitted as part of the mandatory activity on Grasple.
# New array of inputs for prediction, currently containing a single element
new_input = np.array([14])

def plot_quadratic_model_with_predictions(X, y, weights, new_input, predicted_labels):
    # Plot the original data points
    plt.scatter(X, y, color='blue', label='Given Points')

    # Plot the fitted quadratic model
    x_vals = np.linspace(min(X.min(), new_input.min()), max(X.max(), new_input.max()), 100)
    y_vals = weights[0] * x_vals**2 + weights[1] * x_vals + weights[2]
    plt.plot(x_vals,
             y_vals,
             color='red',
             label=(f'Poly: y = {weights[0]:.2f}x^2 + {weights[1]:.2f}x + '
                    f'{weights[2]:.2f}'))

    # Plot the new inputs and their predicted labels
    plt.scatter(new_input, predicted_labels, color='orange', label='Predicted Points')

    # Add title, labels, and legend
    plt.title('Quadratic Model with Predictions')
    plt.xlabel('X')
    plt.ylabel('y')
    plt.legend()
    plt.grid(True)
    plt.show()

3rd-order Polynomial models

This exercise is about third-order polynomials, which offer increased flexibility in modelling data compared to 1st- and 2nd-order polynomials by introducing a cubic term $x^3$.

A 3rd-order polynomial model is given by

$$ y = w_0 + w_1 x + w_2 x^2 + w_3 x^3 $$

In the next tasks you will follow steps similar to those for the 1st- and 2nd-order polynomials. This includes preparing the dataset, constructing the design matrix, calculating the model weights, plotting the results, and using the model to make predictions for new inputs.
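For the four-point dataset below, the analogous linear system (again with powers in decreasing order) is:

$$ \begin{bmatrix} x_1^3 & x_1^2 & x_1 & 1 \\ x_2^3 & x_2^2 & x_2 & 1 \\ x_3^3 & x_3^2 & x_3 & 1 \\ x_4^3 & x_4^2 & x_4 & 1 \end{bmatrix} \begin{bmatrix} w_3 \\ w_2 \\ w_1 \\ w_0 \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \end{bmatrix} $$

so $\mathbf{w} = X^{-1} \mathbf{y}$ whenever the $4 \times 4$ design matrix is invertible.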

Run the cell below to construct a new dataset:

cubic_dataset = np.array([[7, 6], 
                          [5, 24], 
                          [8, 60], 
                          [1, 120]])
Task 7: Third order polynomials
  1. Identify which values in the dataset correspond to inputs and which are the labels.
  2. Create a $4 \times 1$ matrix called X_cubic that contains the inputs, and a vector called y_cubic containing the corresponding labels.
  3. Create the design matrix; remember to add the cubic and quadratic terms.
  4. Calculate the inverse of the design matrix.
  5. Follow the same procedure as before to compute the model parameters.
# Write your solution here
Task 8: Plotting
  1. Use the plot_cubic_model function to plot the results.
  2. Visually inspect the plots and discuss how the model parameters influence the shape and position of the fitted curve.
  3. Compare the current results of the 3rd-order polynomial to the previous 1st- and 2nd-order models. Based on the plots, which model shows the best fit?
# Function to plot data points and fitted cubic model
def plot_cubic_model(X, y, weights):
    # Plot the data points
    plt.scatter(X, y, color='blue', label='Given Points')
    
    # Extend x_vals range to include zero for correct visualization
    x_vals = np.linspace(0, X.max() + 1, 100)
    y_vals = weights[0] * x_vals**3 + weights[1] * x_vals**2 + weights[2] * x_vals + weights[3]
    
    # Plot the fitted polynomial
    plt.plot(x_vals,
             y_vals,
             color='red',
             label=(f'Poly: y = {weights[0]:.2f}x^3 + {weights[1]:.2f}x^2 + '
                    f'{weights[2]:.2f}x + {weights[3]:.2f}'))
    
    # Plot the y-intercept
    plt.scatter(0, weights[3], color='green', zorder=5, label=f'Y-intercept (0, {weights[3]:.2f})')
    
    # Add title and labels
    plt.title('Cubic Model')
    plt.xlabel('X')
    plt.ylabel('y')
    plt.legend()
    plt.grid(True)
    plt.xlim([-0.25, X.max() + 1])  # Start the x-axis just left of zero so the intercept is visible
    plt.show()
Task 9: Making predictions

In this task, the learned 3rd order polynomial model is used to predict new labels for the new_inputs array defined below.

  1. Use the previously obtained cubic model parameters to predict labels for the new inputs.
  2. Plot the results using the plot_cubic_model_with_predictions function.
  3. Compare the predictions of the cubic model with the predictions of the quadratic models.
  4. Which model is preferred for representing the relationship between inputs and outputs, and why?
new_inputs = np.array([14, 1.5, -1])

# Function to plot data points, fitted cubic model, and predictions
def plot_cubic_model_with_predictions(X, y, weights, new_inputs, predicted_labels):
    # Plot the original data points
    plt.scatter(X, y, color='blue', label='Given Points')

    # Plot the fitted cubic model
    x_vals = np.linspace(min(X.min(), new_inputs.min()), max(X.max(), new_inputs.max()), 100)
    y_vals = weights[0] * x_vals**3 + weights[1] * x_vals**2 + weights[2] * x_vals + weights[3]
    plt.plot(x_vals,
             y_vals,
             color='red',
             label=(f'Poly: y = {weights[0]:.2f}x^3 + {weights[1]:.2f}x^2 + '
                    f'{weights[2]:.2f}x + {weights[3]:.2f}'))

    # Plot the new inputs and their predicted labels
    plt.scatter(new_inputs, predicted_labels, color='orange', label='Predicted Points')

    # Add title, labels, and legend
    plt.title('Learned Cubic Model with Predictions')
    plt.xlabel('X')
    plt.ylabel('y')
    plt.legend()
    plt.grid(True)
    plt.show()
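For reference, np.polyval evaluates a polynomial whose coefficients are ordered from highest power to lowest, which matches the weight ordering used throughout this exercise. The sketch below assumes your Task 7 parameters are stored in a hypothetical variable named weights_cubic, alongside the X_cubic and y_cubic arrays from Task 7:

# Sketch: evaluate the cubic at the new inputs, then plot
# (weights_cubic is assumed to hold your Task 7 parameters, highest power first)
predicted_labels_cubic = np.polyval(weights_cubic, new_inputs)
plot_cubic_model_with_predictions(X_cubic, y_cubic, weights_cubic, new_inputs, predicted_labels_cubic)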
# Write your reflection here...