# Assumptions of Classical Linear Regression Models (CLRM)

This post gives a short introduction to the underlying assumptions of the classical linear regression model (the OLS assumptions). The Gauss-Markov Theorem tells us that in a regression model where the expected value of the error terms is zero, $E(\epsilon_{i}) = 0$, the variance of the error terms is constant and finite, $\sigma^{2}(\epsilon_{i}) = \sigma^{2} < \infty$, and $\epsilon_{i}$ and $\epsilon_{j}$ are uncorrelated for all $i \neq j$, the least squares estimators $b_{0}$ and $b_{1}$ are unbiased and have minimum variance among all unbiased linear estimators. (A detailed proof of the Gauss-Markov Theorem can be found here.)
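The unbiasedness claimed by the Gauss-Markov Theorem can be illustrated with a small Monte Carlo sketch (not from the original post; the parameter values below are chosen purely for illustration): when the errors have mean zero, constant variance, and are uncorrelated, the OLS estimates $b_{0}$ and $b_{1}$ averaged over many simulated samples approach the true $\beta_{0}$ and $\beta_{1}$.

```python
import numpy as np

# Illustrative simulation under the Gauss-Markov conditions:
# errors with mean 0, constant finite variance, uncorrelated draws.
rng = np.random.default_rng(0)
beta0, beta1 = 2.0, 0.5        # true parameters (hypothetical values)
n, reps = 100, 2000            # sample size and number of replications
x = rng.uniform(0, 10, n)      # regressor held fixed across replications
X = np.column_stack([np.ones(n), x])  # design matrix with intercept

estimates = np.empty((reps, 2))
for r in range(reps):
    eps = rng.normal(0, 1, n)          # i.i.d. errors: E(eps) = 0, var = 1
    y = beta0 + beta1 * x + eps
    b, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit: (b0, b1)
    estimates[r] = b

# The average of the estimates over many replications should be
# close to the true (beta0, beta1), illustrating unbiasedness.
print(estimates.mean(axis=0))
```

Each individual estimate varies from sample to sample, but the mean across replications settles near the true coefficients, which is exactly what unbiasedness states.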
