Lesson 3 of 15

Multivariate Regression

Real datasets have multiple features. Multivariate linear regression extends the single-variable model to $d$ input features:

$$\hat{y} = \mathbf{w} \cdot \mathbf{x} + b = \sum_{j=1}^{d} w_j x_j + b$$

where $\mathbf{w} = [w_1, \ldots, w_d]$ is the weight vector and $\mathbf{x} = [x_1, \ldots, x_d]$ is the feature vector.
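As a quick sketch in plain Python (no NumPy), the prediction is a sum over the features plus the bias; the function name matches the `predict_multi` signature from the task below:

```python
def predict_multi(x, w, b):
    """Predict y-hat for one example: w . x + b."""
    return sum(w_j * x_j for w_j, x_j in zip(w, x)) + b

# d = 2 features: 0.5*1.0 + (-1.0)*2.0 + 3.0 = 1.5
print(predict_multi([1.0, 2.0], [0.5, -1.0], 3.0))  # 1.5
```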

Dot Product

The core operation is the dot product:

$$\mathbf{w} \cdot \mathbf{x} = \sum_{j=1}^{d} w_j x_j$$

Multivariate MSE

For a dataset $\{(\mathbf{x}^{(i)}, y^{(i)})\}_{i=1}^{n}$ of $n$ examples:

$$\text{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left( \hat{y}^{(i)} - y^{(i)} \right)^2$$

where each $\hat{y}^{(i)} = \mathbf{w} \cdot \mathbf{x}^{(i)} + b$.
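Putting the two formulas together, one possible shape for the dataset-level loss (matching the `mse_multi` signature from the task below) is a loop that predicts each example, squares its error, and averages:

```python
def mse_multi(X, w, b, y_true):
    """Mean squared error over a dataset of feature vectors.

    X is a list of feature vectors; w is the weight vector; b the bias.
    """
    total = 0.0
    for x, y in zip(X, y_true):
        # y-hat for this example: w . x + b
        y_hat = sum(w_j * x_j for w_j, x_j in zip(w, x)) + b
        total += (y_hat - y) ** 2
    return total / len(X)

X = [[1.0, 2.0], [3.0, 4.0]]
# Predictions with w=[1,1], b=0 are 3.0 and 7.0;
# errors against y_true=[2.0, 8.0] are 1.0 and -1.0, so MSE = 1.0.
print(mse_multi(X, [1.0, 1.0], 0.0, [2.0, 8.0]))  # 1.0
```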

Your Task

Implement:

  • dot_product(x, w) — sum of element-wise products
  • predict_multi(x, w, b) — dot product plus bias
  • mse_multi(X, w, b, y_true) — MSE over a dataset of feature vectors