Lesson 7 of 18

Gradient Magnitude

The Gradient

The gradient of f(x, y) is the vector of its partial derivatives:

\nabla f = \left(\frac{\partial f}{\partial x},\; \frac{\partial f}{\partial y}\right)

The Gradient Points Uphill

The gradient vector always points in the direction of steepest ascent. Its magnitude tells you how steep that ascent is.

|\nabla f| = \sqrt{\left(\frac{\partial f}{\partial x}\right)^2 + \left(\frac{\partial f}{\partial y}\right)^2}

Key Facts

  • \nabla f = \mathbf{0} at critical points (local minima, maxima, saddle points)
  • The gradient is perpendicular to level curves (contour lines)
  • Moving against the gradient is steepest descent — the basis of gradient descent in machine learning

Examples

For f(x, y) = x^2 + y^2:

  • \nabla f = (2x,\; 2y)
  • At (3, 4): |\nabla f| = \sqrt{36 + 64} = \sqrt{100} = 10

For f(x, y) = xy:

  • \nabla f = (y,\; x)
  • At (1, 1): |\nabla f| = \sqrt{1 + 1} = \sqrt{2} \approx 1.4142

Your Task

Implement double gradient_magnitude(double (*f)(double, double), double x, double y, double h), which computes |\nabla f| at (x, y) using central differences for the partial derivatives.
