Lesson 1 of 15

Linear Regression

Linear regression is the simplest supervised learning model. It learns a linear mapping from an input x to an output y:

\hat{y} = wx + b

where w is the weight (slope) and b is the bias (intercept).
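As a sketch, this model is a single expression in plain Python (the `predict` name and signature match the task at the end of this lesson; scalar inputs are assumed):

```python
def predict(x, w, b):
    # Linear model: y_hat = w*x + b
    return w * x + b
```

For example, with w = 3 and b = 1, an input of 2 predicts 3*2 + 1 = 7.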

Mean Squared Error

To measure how well the model fits the data, we use the Mean Squared Error (MSE):

\text{MSE} = \frac{1}{n} \sum_{i=1}^{n} (\hat{y}_i - y_i)^2

A perfect model has MSE = 0. Higher MSE means worse predictions.
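A direct translation of the formula, assuming the predictions and targets are plain Python lists of equal length (the `mse_loss` name matches the task below):

```python
def mse_loss(y_pred, y_true):
    # Mean of squared differences between predictions and targets
    n = len(y_pred)
    return sum((yp - yt) ** 2 for yp, yt in zip(y_pred, y_true)) / n
```

When every prediction matches its target exactly, each squared difference is 0 and the loss is 0.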

R-Squared

The coefficient of determination R^2 measures how much of the variance in y is explained by the model:

R^2 = 1 - \frac{SS_{\text{res}}}{SS_{\text{tot}}}

where:

  • SS_{\text{res}} = \sum (\hat{y}_i - y_i)^2 — residual sum of squares
  • SS_{\text{tot}} = \sum (y_i - \bar{y})^2 — total sum of squares

R^2 = 1 means a perfect fit; R^2 = 0 means the model is no better than always predicting the mean \bar{y}.
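Combining the two sums above gives the full computation; a sketch assuming list inputs (the `r_squared` name matches the task below):

```python
def r_squared(y_pred, y_true):
    # Coefficient of determination: 1 - SS_res / SS_tot
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((yp - yt) ** 2 for yp, yt in zip(y_pred, y_true))
    ss_tot = sum((yt - mean_y) ** 2 for yt in y_true)
    return 1 - ss_res / ss_tot
```

Note the two edge cases mentioned above: a perfect fit makes SS_res = 0, so R^2 = 1; predicting the mean for every point makes SS_res equal to SS_tot, so R^2 = 0.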

Your Task

Implement:

  • predict(x, w, b) — returns wx + b
  • mse_loss(y_pred, y_true) — mean of squared differences
  • r_squared(y_pred, y_true) — coefficient of determination