Lesson 2 of 15
Gradient Descent
Gradient descent is the optimization algorithm that trains linear regression. It iteratively adjusts the weight $w$ and the bias $b$ to minimise the MSE loss.
The Gradients
Given predictions $\hat{y}_i = w x_i + b$ and true labels $y_i$, the gradients of MSE with respect to $w$ and $b$ are:

$$\frac{\partial L}{\partial w} = \frac{2}{n}\sum_{i=1}^{n}(\hat{y}_i - y_i)\,x_i \qquad\qquad \frac{\partial L}{\partial b} = \frac{2}{n}\sum_{i=1}^{n}(\hat{y}_i - y_i)$$
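These formulas are easy to check numerically. A minimal sketch with NumPy — the data points and starting parameters here are illustrative, not from the lesson:

```python
import numpy as np

# Illustrative data and parameters for this sketch
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
w, b = 0.5, 0.0

y_pred = w * x + b                  # predictions
error = y_pred - y                  # residuals (y_hat - y)
grad_w = 2.0 * np.mean(error * x)   # dL/dw = (2/n) * sum(error * x)
grad_b = 2.0 * np.mean(error)       # dL/db = (2/n) * sum(error)
```

Both gradients come out negative here, which tells us $w$ and $b$ are too small and the update step should increase them.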
Parameter Update
After computing the gradients, each parameter is updated by a small step in the negative gradient direction:

$$w \leftarrow w - \alpha\,\frac{\partial L}{\partial w} \qquad\qquad b \leftarrow b - \alpha\,\frac{\partial L}{\partial b}$$

where $\alpha$ is the learning rate — a small positive number like $0.01$.
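A single update step worked through with hypothetical numbers (the current weight, its gradient, and the learning rate are made up for illustration):

```python
# Hypothetical values: current weight, its gradient, and a learning rate
w, grad_w, lr = 0.5, -14.0, 0.01

w = w - lr * grad_w   # step in the negative gradient direction
print(w)              # 0.64
```

Note that a negative gradient produces an *increase* in the parameter — subtracting the gradient always moves the parameter downhill on the loss.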
Training Loop
Repeat for many iterations:
- Compute predictions $\hat{y}_i = w x_i + b$
- Compute the gradients $\frac{\partial L}{\partial w}$ and $\frac{\partial L}{\partial b}$
- Update $w$ and $b$
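The loop above can be sketched end to end. This is a minimal NumPy version, assuming synthetic data drawn from the line $y = 2x + 1$; the iteration count and learning rate are illustrative choices:

```python
import numpy as np

# Synthetic data on the line y = 2x + 1 (an assumption for this sketch)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])

w, b = 0.0, 0.0   # start from zero
lr = 0.05         # illustrative learning rate

for _ in range(2000):
    y_pred = w * x + b                  # 1. predictions
    error = y_pred - y
    grad_w = 2.0 * np.mean(error * x)   # 2. gradients
    grad_b = 2.0 * np.mean(error)
    w -= lr * grad_w                    # 3. update
    b -= lr * grad_b
```

After enough iterations, `w` and `b` converge close to the true values 2 and 1.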
Your Task
Implement:
- `gradient_w(x, y_pred, y_true)` — gradient of MSE w.r.t. $w$
- `gradient_b(y_pred, y_true)` — gradient of MSE w.r.t. $b$
- `update_param(param, grad, lr)` — one gradient descent step
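One possible shape for these three functions, assuming the inputs are NumPy arrays (for the gradients) and scalars (for the update) — try writing your own version before reading this sketch:

```python
import numpy as np

def gradient_w(x, y_pred, y_true):
    # dMSE/dw = (2/n) * sum((y_pred - y_true) * x)
    return 2.0 * np.mean((y_pred - y_true) * x)

def gradient_b(y_pred, y_true):
    # dMSE/db = (2/n) * sum(y_pred - y_true)
    return 2.0 * np.mean(y_pred - y_true)

def update_param(param, grad, lr):
    # One gradient descent step: move opposite the gradient
    return param - lr * grad
```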