# Least Squares Regression

Published on Friday, 10 March, 2017
Tags: regression

# Ordinary Least Squares (OLS)

OLS estimates the parameters of a regression model by minimizing the sum of squared residuals.

OLS Assumptions:

- Model is linear in the coefficients and the error term
- The error term has a population mean of zero
- All independent variables are uncorrelated with error term
- Observations of the error term are uncorrelated with each other
- The error term has a constant variance (no heteroskedasticity)
- No independent variable is a perfect linear function of other explanatory variables
- The error term is normally distributed
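The no-perfect-collinearity assumption can be illustrated with a small NumPy sketch (the column values here are made up): when one explanatory variable is an exact linear function of the others, the design matrix is rank deficient and $X^TX$ cannot be inverted, so OLS has no unique solution.

```python
import numpy as np

# Hypothetical data where x2 is an exact linear function of x1
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = 2.0 * x1 + 1.0  # perfect multicollinearity

# Design matrix: column of ones (intercept), x1, x2
X = np.column_stack([np.ones_like(x1), x1, x2])

# X has 3 columns but only rank 2, so X^T X is singular
print(np.linalg.matrix_rank(X))        # 2, not 3
print(np.linalg.matrix_rank(X.T @ X))  # also 2: not invertible
```

In practice this shows up as a `LinAlgError` (or wildly unstable coefficients) when solving the normal equations.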

## Mathematical Form

Given a vector of inputs, we predict the output (the endogenous variable) via the following model:

$$\hat{Y} = \hat{\beta}_0 + \sum^p_{j=1}X_j\hat{\beta}_j$$

where $\hat{\beta}_0$ is the intercept, the constant bias term of the model.

Let's rewrite the linear model in vector form, including the constant 1 in $X$ so the intercept is absorbed into $\hat{\beta}$:

$$\hat{Y}=X^T\hat{\beta}$$

To fit the model, we pick the $\beta$ that minimizes the residual sum of squares:

$$RSS(\beta)=\sum^N_{i=1}(y_i-x^T_i\beta)^2$$
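Minimizing $RSS(\beta)$ has the well-known closed-form solution $\hat{\beta} = (X^TX)^{-1}X^Ty$ when $X^TX$ is nonsingular. A minimal NumPy sketch, using made-up data from a known linear model:

```python
import numpy as np

# Simulated data: y = 2 + 3x + Gaussian noise (illustrative values)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 3.0 * x + rng.normal(0, 1, size=50)

# Design matrix with a leading column of ones for the intercept
X = np.column_stack([np.ones_like(x), x])

# Normal equations: solve (X^T X) beta = X^T y instead of
# explicitly inverting X^T X, for numerical stability
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Residual sum of squares RSS(beta) = sum (y_i - x_i^T beta)^2
residuals = y - X @ beta_hat
rss = float(residuals @ residuals)

print(beta_hat)  # estimates should be near the true (2, 3)
print(rss)
```

For larger or ill-conditioned problems, `np.linalg.lstsq(X, y, rcond=None)` solves the same minimization via a more stable decomposition.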
