Lesson 5 of 15


Decision Boundary & Classification Metrics

A logistic regression model outputs a probability $\hat{p} \in (0,1)$. To make a hard prediction (0 or 1), we apply a threshold $\tau$ (usually 0.5):

$$\hat{y} = \begin{cases} 1 & \text{if } \hat{p} \geq \tau \\ 0 & \text{otherwise} \end{cases}$$
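The threshold rule above can be sketched in plain Python. This is a minimal illustration, assuming `x` and `w` are equal-length sequences of floats and the model is $\hat{p} = \sigma(w \cdot x + b)$:

```python
import math

def sigmoid(z):
    """Logistic function: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def classify(x, w, b, threshold=0.5):
    """Hard 0/1 prediction: compute p_hat = sigmoid(w . x + b),
    then apply the decision threshold tau."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if sigmoid(z) >= threshold else 0
```

For example, with `w = [1.0]`, `b = 0.0`, and `x = [0.0]`, the model outputs exactly 0.5, which meets the default threshold and yields class 1.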

Evaluation Metrics

Once we have hard predictions we can measure quality with:

| Metric | Formula | Meaning |
| --- | --- | --- |
| Accuracy | $\frac{TP + TN}{n}$ | Fraction of predictions that are correct |
| Precision | $\frac{TP}{TP + FP}$ | Of predicted positives, how many are real? |
| Recall | $\frac{TP}{TP + FN}$ | Of real positives, how many did we catch? |

where TP = true positives, TN = true negatives, FP = false positives, FN = false negatives.

Zero-division: if the denominator is zero, return 0.0.
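The three formulas, including the zero-division rule, can be sketched directly from the TP/TN/FP/FN counts. This assumes `y_pred` and `y_true` are equal-length sequences of 0/1 labels:

```python
def _counts(y_pred, y_true):
    """Count true positives, false positives, and false negatives."""
    tp = sum(1 for p, t in zip(y_pred, y_true) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(y_pred, y_true) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(y_pred, y_true) if p == 0 and t == 1)
    return tp, fp, fn

def accuracy(y_pred, y_true):
    """Fraction of predictions that match the true labels."""
    correct = sum(1 for p, t in zip(y_pred, y_true) if p == t)
    return correct / len(y_true)

def precision(y_pred, y_true):
    """TP / (TP + FP); 0.0 when nothing was predicted positive."""
    tp, fp, _ = _counts(y_pred, y_true)
    return tp / (tp + fp) if (tp + fp) > 0 else 0.0

def recall(y_pred, y_true):
    """TP / (TP + FN); 0.0 when there are no real positives."""
    tp, _, fn = _counts(y_pred, y_true)
    return tp / (tp + fn) if (tp + fn) > 0 else 0.0
```

For instance, with `y_pred = [1, 1, 0, 0]` and `y_true = [1, 0, 0, 1]` there is one TP, one FP, and one FN, so accuracy, precision, and recall are all 0.5.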

Your Task

Implement:

  • classify(x, w, b, threshold=0.5) → 0 or 1
  • accuracy(y_pred, y_true) → fraction of correct predictions
  • precision(y_pred, y_true) → TP / (TP + FP), 0.0 on zero-division
  • recall(y_pred, y_true) → TP / (TP + FN), 0.0 on zero-division