From previous posts on the Gauss-Markov theorem and OLS we know that any linear estimator $\tilde{\beta}_1 = \sum_i a_i y_i$ must fulfill the following condition in order to be unbiased:

(1) $E(\tilde{\beta}_1) = \beta_1$,

which means that $\sum_i a_i = 0$ and $\sum_i a_i x_i = 1$.
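As a quick numerical sanity check of condition (1), the sketch below (assuming NumPy; the regressor values are made up for illustration) verifies that the OLS weights $c_i = \frac{x_i - \bar{x}}{\sum_j (x_j - \bar{x})^2}$ satisfy both unbiasedness conditions:

```python
import numpy as np

# Illustrative regressor values (purely made up).
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])

# OLS weights: c_i = (x_i - xbar) / sum_j (x_j - xbar)^2
c = (x - x.mean()) / ((x - x.mean()) ** 2).sum()

# Both unbiasedness conditions hold for the OLS weights:
print(c.sum())        # sum_i c_i      ~ 0
print((c * x).sum())  # sum_i c_i x_i  ~ 1
```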
Looking at the estimator of the variance for $\tilde{\beta}_1$,

(2) $\mathrm{Var}(\tilde{\beta}_1) = \sigma^2 \sum_i a_i^2$,

tells us that the estimator puts additional restrictions on the $a_i$'s.
To continue the proof we define $d_i = a_i - c_i$, where the $c_i = \frac{x_i - \bar{x}}{\sum_j (x_j - \bar{x})^2}$ are the constants (the OLS weights) we already defined above. The $d_i$, on the other hand, are arbitrary constants. What we are basically saying is that each $a_i$ is composed of a constant $c_i$ (see above for the definition) and an arbitrary constant $d_i$. Let's continue by plugging the definition $a_i = c_i + d_i$ into the equation for the variance estimator for $\tilde{\beta}_1$.
(3) $\mathrm{Var}(\tilde{\beta}_1) = \sigma^2 \sum_i a_i^2$

(4) $= \sigma^2 \sum_i (c_i + d_i)^2$

(5) $= \sigma^2 \sum_i \left( c_i^2 + 2 c_i d_i + d_i^2 \right)$

(6) $= \sigma^2 \sum_i c_i^2 + 2 \sigma^2 \sum_i c_i d_i + \sigma^2 \sum_i d_i^2$
Note that equation 6 is much more than just a transformation; it tells us that

(7) $\mathrm{Var}(\tilde{\beta}_1) = \mathrm{Var}(\hat{\beta}_1) + 2 \sigma^2 \sum_i c_i d_i + \sigma^2 \sum_i d_i^2$,

recalling from the previous posts that $\mathrm{Var}(\hat{\beta}_1) = \sigma^2 \sum_i c_i^2$.
Now, why do we need equation 7? Well, it shows us the relationship between $\mathrm{Var}(\tilde{\beta}_1)$ and $\mathrm{Var}(\hat{\beta}_1)$. It basically shows that $\mathrm{Var}(\tilde{\beta}_1)$ equals $\mathrm{Var}(\hat{\beta}_1)$ plus some extra “stuff”. Finally, what we need to do to complete the proof is to show that part of this extra stuff ($\sum_i c_i d_i$) is actually zero.
(8) $\sum_i c_i d_i = \sum_i c_i (a_i - c_i)$

(9) $= \sum_i c_i a_i - \sum_i c_i^2$

(10) $= \sum_i \frac{x_i - \bar{x}}{\sum_j (x_j - \bar{x})^2} a_i - \sum_i \left( \frac{x_i - \bar{x}}{\sum_j (x_j - \bar{x})^2} \right)^2$

(11) $= \frac{\sum_i a_i x_i - \bar{x} \sum_i a_i}{\sum_j (x_j - \bar{x})^2} - \frac{\sum_i (x_i - \bar{x})^2}{\left( \sum_j (x_j - \bar{x})^2 \right)^2}$

(12) $= \frac{\sum_i a_i x_i - \bar{x} \sum_i a_i}{\sum_j (x_j - \bar{x})^2} - \frac{1}{\sum_j (x_j - \bar{x})^2}$
And as we know, $\sum_i a_i x_i = 1$ and $\sum_i a_i = 0$, which, when plugged into equation 12, shows us that this part of the extra “stuff” is actually zero. For the sake of completeness:

(13) $\sum_i c_i d_i = \frac{1 - \bar{x} \cdot 0}{\sum_j (x_j - \bar{x})^2} - \frac{1}{\sum_j (x_j - \bar{x})^2}$

(14) $= \frac{1}{\sum_j (x_j - \bar{x})^2} - \frac{1}{\sum_j (x_j - \bar{x})^2}$

(15) $= 0$
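The vanishing cross term can also be checked numerically. The sketch below (assuming NumPy; the data and the random perturbation are purely illustrative) builds alternative unbiased weights $a_i = c_i + d_i$ and confirms $\sum_i c_i d_i = 0$:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])  # illustrative regressor values
c = (x - x.mean()) / ((x - x.mean()) ** 2).sum()  # OLS weights

# Unbiasedness of a_i = c_i + d_i forces sum_i d_i = 0 and sum_i d_i x_i = 0,
# i.e. d must be orthogonal to both the constant vector and x. We build such
# a d by removing from a random vector its projection onto span{1, x}.
X = np.column_stack([np.ones_like(x), x])
z = rng.normal(size=x.size)
d = z - X @ np.linalg.lstsq(X, z, rcond=None)[0]
a = c + d

# a still satisfies the unbiasedness conditions ...
print(a.sum(), (a * x).sum())  # ~ 0, ~ 1
# ... and the cross term vanishes, as equations (8)-(15) claim:
print((c * d).sum())           # ~ 0
```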
Having proved that $\sum_i c_i d_i = 0$ reduces equation 6 to

(16) $\mathrm{Var}(\tilde{\beta}_1) = \sigma^2 \sum_i c_i^2 + 2 \sigma^2 \cdot 0 + \sigma^2 \sum_i d_i^2$

(17) $= \sigma^2 \sum_i c_i^2 + \sigma^2 \sum_i d_i^2$
Now we will show explicitly that $\sigma^2 \sum_i c_i^2 = \mathrm{Var}(\hat{\beta}_1)$. Note that from the definition of $c_i$ we know that

(18) $\sum_i c_i^2 = \sum_i \left( \frac{x_i - \bar{x}}{\sum_j (x_j - \bar{x})^2} \right)^2 = \frac{\sum_i (x_i - \bar{x})^2}{\left( \sum_j (x_j - \bar{x})^2 \right)^2}$

and therefore

(19) $\sum_i c_i^2 = \frac{1}{\sum_j (x_j - \bar{x})^2}$.

This leads us to

(20) $\sigma^2 \sum_i c_i^2 = \frac{\sigma^2}{\sum_j (x_j - \bar{x})^2}$

(21) $\sigma^2 \sum_i c_i^2 = \mathrm{Var}(\hat{\beta}_1)$,

since $\frac{\sigma^2}{\sum_j (x_j - \bar{x})^2}$ is exactly the variance of the OLS estimator derived in the previous posts.
Finally, plugging equation 21 into equation 17 gives us

(22) $\mathrm{Var}(\tilde{\beta}_1) = \mathrm{Var}(\hat{\beta}_1) + \sigma^2 \sum_i d_i^2$.
So we are left with equation 22, which is minimized when $d_i = 0$ for all $i$. If the $d_i$ are all zero, then $\tilde{\beta}_1 = \hat{\beta}_1$. This means that the least squares estimator $\hat{\beta}_1$ has minimum variance among all unbiased linear estimators.
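The conclusion can be checked end to end with the same kind of numerical sketch as before (assuming NumPy; the regressor values, seed, and error variance are made-up assumptions): any alternative unbiased linear estimator has variance $\mathrm{Var}(\hat{\beta}_1) + \sigma^2 \sum_i d_i^2 \geq \mathrm{Var}(\hat{\beta}_1)$.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])  # illustrative regressor values
sigma2 = 2.5                               # assumed error variance

c = (x - x.mean()) / ((x - x.mean()) ** 2).sum()  # OLS weights
# Build d orthogonal to span{1, x}, so a = c + d stays unbiased.
X = np.column_stack([np.ones_like(x), x])
z = rng.normal(size=x.size)
d = z - X @ np.linalg.lstsq(X, z, rcond=None)[0]
a = c + d

var_hat = sigma2 * (c ** 2).sum()    # Var(beta_hat) = sigma^2 / sum (x_i - xbar)^2
var_tilde = sigma2 * (a ** 2).sum()  # Var(beta_tilde) for the alternative estimator

print(np.isclose(var_hat, sigma2 / ((x - x.mean()) ** 2).sum()))  # True
print(np.isclose(var_tilde, var_hat + sigma2 * (d ** 2).sum()))   # True: equation (22)
print(var_tilde >= var_hat)                                       # True: OLS is BLUE
```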