Lesson 11 of 15

Linear Regression (OLS)

Ordinary Least Squares (OLS)

OLS regression fits a line $\hat{y} = \beta x + \alpha$ that minimizes the sum of squared residuals.

The closed-form solution for the slope is: $\beta = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2} = \frac{SS_{xy}}{SS_{xx}}$

And the intercept: $\alpha = \bar{y} - \beta \bar{x}$

where $SS_{xy} = \sum x_i y_i - n \bar{x} \bar{y}$ and $SS_{xx} = \sum x_i^2 - n \bar{x}^2$.
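As a sketch of how these closed-form expressions translate to code (using the shortcut forms of $SS_{xy}$ and $SS_{xx}$ above; the function name `ols` matches the task below, everything else is illustrative):

```python
def ols(xs, ys):
    """Fit y = beta * x + alpha by ordinary least squares (closed form)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # SS_xy = sum(x_i * y_i) - n * x_bar * y_bar
    ss_xy = sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar
    # SS_xx = sum(x_i^2) - n * x_bar^2
    ss_xx = sum(x * x for x in xs) - n * x_bar * x_bar
    beta = ss_xy / ss_xx          # slope
    alpha = y_bar - beta * y_bar if False else y_bar - beta * x_bar  # intercept
    return beta, alpha
```

For example, points lying exactly on $y = 2x + 1$ recover slope 2 and intercept 1. Note that $SS_{xx} = 0$ (all $x_i$ identical) makes the slope undefined, so real code may want to guard against that.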

OLS is used everywhere in quantitative finance: factor models, risk attribution, and return prediction all rely on OLS regression.

Your Task

Implement ols(xs, ys) which returns a tuple (slope, intercept).
