
Least Squares Regression

Published on Friday, 10 March, 2017 · Tags: regression

Ordinary Least Squares (OLS)

Ordinary least squares estimates the parameters of a regression model by minimizing the sum of squared residuals.

OLS Assumptions:

  1. Model is linear in the coefficients and the error term
  2. The error term has a population mean of zero
  3. All independent variables are uncorrelated with error term
  4. Observations of the error term are uncorrelated with each other
  5. The error term has a constant variance (no heteroskedasticity)
  6. No independent variable is a perfect linear function of other explanatory variables
  7. The error term is normally distributed
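As a quick sanity check on assumption 2, we can fit an OLS model and confirm that the residuals average to (numerically) zero. This is a minimal sketch on synthetic data; the data-generating values are assumptions for illustration.

```python
import numpy as np

# Hypothetical synthetic data: y = 2 + 3x + Gaussian noise (values are illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 + 3.0 * x + rng.normal(0, 1, 100)

# Design matrix [1, x]; the column of ones carries the intercept.
X = np.column_stack([np.ones_like(x), x])

# Least-squares fit.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

residuals = y - X @ beta
print(beta)              # estimated intercept and slope
print(residuals.mean())  # ~0: with an intercept, OLS residuals sum to zero
```

Note that whenever the model includes an intercept, the residuals sum to exactly zero by construction, so this check really tests the *population* error term only indirectly.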

Mathematical Form

Given a vector of inputs $X^T = (X_1, X_2, \ldots, X_p)$, we predict the output (the endogenous, or dependent, variable) via the following model.

$$\hat{Y} = \hat{\beta}_0 + \sum^p_{j=1}X_j\hat{\beta}_j$$

The term $\hat{\beta}_0$ is the intercept, also known as the bias in machine learning.
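Concretely, a prediction is just the intercept plus a weighted sum of the inputs. A minimal sketch, where the coefficient and input values are made-up numbers for illustration:

```python
import numpy as np

# Illustrative fitted coefficients and one observation's inputs (values are assumptions).
beta_hat_0 = 1.5                        # intercept (bias)
beta_hat = np.array([2.0, -0.5, 0.25])  # coefficients for p = 3 inputs
x = np.array([4.0, 10.0, 8.0])          # input vector X

# y_hat = beta_0 + sum_j x_j * beta_j
y_hat = beta_hat_0 + x @ beta_hat
print(y_hat)  # 1.5 + 8.0 - 5.0 + 2.0 = 6.5
```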

Let's rewrite the linear model in vector form, including the constant 1 in $X$ so that the intercept is absorbed into the coefficient vector $\hat{\beta}$.

$$\hat{Y}=X^T\hat{\beta}$$

To fit the model, we choose the coefficients $\beta$ that minimize the residual sum of squares over the $N$ training observations:

$$RSS(\beta)=\sum^N_{i=1}(y_i-x^T_i\beta)^2$$
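Setting the gradient of $RSS(\beta)$ to zero yields the normal equations $X^TX\hat{\beta} = X^Ty$, which we can solve directly. A minimal sketch on synthetic data (the data-generating coefficients are assumptions for illustration):

```python
import numpy as np

# Synthetic data: N observations, p inputs, plus a leading column of ones
# that absorbs the intercept (values below are illustrative assumptions).
rng = np.random.default_rng(1)
N, p = 200, 2
X = np.column_stack([np.ones(N), rng.normal(size=(N, p))])
true_beta = np.array([0.5, 1.0, -2.0])
y = X @ true_beta + rng.normal(0, 0.1, N)

# Solve the normal equations X^T X beta = X^T y.
# np.linalg.solve is preferred over forming an explicit inverse.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Residual sum of squares at the minimizer.
rss = np.sum((y - X @ beta_hat) ** 2)
print(beta_hat, rss)
```

In practice `np.linalg.lstsq` (which uses a more numerically stable factorization) is usually preferred to solving the normal equations directly, but the normal equations make the closed-form solution explicit.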
