The following post gives a short introduction to the underlying assumptions of the classical linear regression model (OLS assumptions), which are derived in the following post. Given the Gauss-Markov Theorem we know that the least squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ are unbiased and have minimum variance among all unbiased linear estimators. The Gauss-Markov Theorem tells us that in a regression model where the expected value of the error terms is zero, $E(\epsilon_i) = 0$, the variance of the error terms is constant and finite, $Var(\epsilon_i) = \sigma^2 < \infty$, and $\epsilon_i$ and $\epsilon_j$ are uncorrelated for all $i \neq j$, the least squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ are unbiased and have minimum variance among all unbiased linear estimators. (A detailed proof of the Gauss-Markov Theorem can be found here.)
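To make the unbiasedness claim concrete, here is a minimal Monte Carlo sketch (my own illustration, not part of the original post): it repeatedly draws samples from a data-generating process that satisfies the assumptions above, computes the least squares estimates, and checks that they average out to the true parameters. All parameter values are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

# True parameters of the data-generating process (illustrative values)
beta0, beta1 = 2.0, 0.5
n, n_sims = 100, 10_000

x = rng.uniform(0, 10, size=n)  # fixed regressor across simulations
estimates = np.empty((n_sims, 2))

for s in range(n_sims):
    # Errors satisfy the Gauss-Markov assumptions:
    # zero mean, constant finite variance, uncorrelated across observations
    eps = rng.normal(0, 1, size=n)
    y = beta0 + beta1 * x + eps

    # Least squares estimates via the closed-form formulas
    b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    b0 = y.mean() - b1 * x.mean()
    estimates[s] = (b0, b1)

# Averaged over many samples, the estimates should be close to the truth
print("mean of b0 estimates:", estimates[:, 0].mean())  # ~2.0
print("mean of b1 estimates:", estimates[:, 1].mean())  # ~0.5
```

Unbiasedness is a statement about the average over repeated samples, which is exactly what the loop approximates; any single sample's estimates will of course deviate from the true values.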
The Gauss Markov Theorem
When studying the classical linear regression model, one necessarily comes across the Gauss-Markov Theorem. The Gauss-Markov Theorem is a central theorem for linear regression models. It states conditions that, when met, ensure that the least squares estimator has the lowest variance among all unbiased linear estimators. More formally, … Continue reading The Gauss Markov Theorem
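The formal statement sits behind the link above; for reference, a standard textbook statement of the theorem (my paraphrase, not necessarily the post's exact wording) can be written as:

```latex
% Standard statement of the Gauss-Markov Theorem (textbook paraphrase)
\textbf{Theorem (Gauss-Markov).} In the linear model
\[
  y_i = \beta_0 + \beta_1 x_i + \epsilon_i, \qquad i = 1, \dots, n,
\]
suppose that $E(\epsilon_i) = 0$, $Var(\epsilon_i) = \sigma^2 < \infty$,
and $Cov(\epsilon_i, \epsilon_j) = 0$ for all $i \neq j$. Then the least
squares estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ are the best linear
unbiased estimators (BLUE): among all linear unbiased estimators, they
have the smallest variance.
```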
Proof of Unbiasedness of Sample Variance Estimator
(As I received some remarks about the unnecessary length of this proof, I provide a shorter version here.)
In many applications of statistics and econometrics, but also in many other settings, it is necessary to estimate the variance of a sample. The estimator of the variance, $s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2$, see equation (1), is common knowledge, and most people simply apply it without any further concern. The question that arose for me was: why do we actually divide by $n-1$ and not simply by $n$? In the following lines we are going to see the proof that the sample variance estimator is indeed unbiased.
Continue reading Proof of Unbiasedness of Sample Variance Estimator
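Before working through the algebraic proof in the linked post, one can already see the effect numerically. The following short sketch (my own illustration, with arbitrary parameter values) draws many samples from a population with known variance and compares the average of the divide-by-$n$ estimator with the divide-by-$n-1$ estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Population with known variance (illustrative choice: sigma^2 = 4)
sigma2 = 4.0
n, n_sims = 10, 100_000

biased, unbiased = np.empty(n_sims), np.empty(n_sims)
for s in range(n_sims):
    x = rng.normal(0, np.sqrt(sigma2), size=n)
    dev2 = np.sum((x - x.mean()) ** 2)  # sum of squared deviations
    biased[s] = dev2 / n                # divide by n
    unbiased[s] = dev2 / (n - 1)        # divide by n - 1

print("mean of divide-by-n estimates:  ", biased.mean())    # ~ sigma^2*(n-1)/n = 3.6
print("mean of divide-by-(n-1) estimates:", unbiased.mean())  # ~ sigma^2 = 4.0
```

Dividing by $n$ systematically underestimates the variance by the factor $(n-1)/n$, because the deviations are measured from the sample mean $\bar{x}$ rather than the true mean; dividing by $n-1$ corrects exactly for this, which is what the proof in the linked post establishes.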