which means that $\sum_i c_i = 0$ and $\sum_i c_i x_i = 1$.

Looking at the estimator of the variance for $\tilde{\beta}_1$,

$$\text{Var}(\tilde{\beta}_1) = \sigma^2 \sum_i c_i^2,$$

tells us that the estimator puts additional restrictions on the $c_i$'s.
To continue the proof we define $c_i = w_i + d_i$, where the $w_i$ are the constants we already defined above. The $d_i$, on the other hand, are arbitrary constants. What we are basically saying is that each $c_i$ is composed of a fixed constant $w_i$ (see above for the definition) and an arbitrary constant $d_i$. Let's continue by plugging this definition into the equation for the variance estimator for $\tilde{\beta}_1$:

$$\text{Var}(\tilde{\beta}_1) = \sigma^2 \sum_i (w_i + d_i)^2 = \sigma^2 \left( \sum_i w_i^2 + 2 \sum_i w_i d_i + \sum_i d_i^2 \right)$$
Note that Equation 6 is much more than just a transformation; it tells us that

$$\text{Var}(\tilde{\beta}_1) = \sigma^2 \sum_i w_i^2 + 2 \sigma^2 \sum_i w_i d_i + \sigma^2 \sum_i d_i^2.$$

Now, why do we need equation 7? Well, it shows us the relationship between $\text{Var}(\tilde{\beta}_1)$ and $\text{Var}(\hat{\beta}_1)$. It basically shows that $\text{Var}(\tilde{\beta}_1)$ is related to $\text{Var}(\hat{\beta}_1)$ plus some extra "stuff".
Finally, what we need to do to finish the proof is to show that some of this extra "stuff" ($2\sigma^2 \sum_i w_i d_i$) is actually zero.
And as we know, $\sum_i c_i$ is $0$ and $\sum_i c_i x_i$ is $1$, which, when plugged into equation 21, shows us that this sum of extra "stuff" is actually zero. For the sake of completeness:

$$\sum_i w_i d_i = \frac{\sum_i (x_i - \bar{x}) d_i}{\sum_j (x_j - \bar{x})^2} = \frac{\sum_i d_i x_i - \bar{x} \sum_i d_i}{\sum_j (x_j - \bar{x})^2} = 0,$$

since $\sum_i d_i = \sum_i c_i - \sum_i w_i = 0 - 0 = 0$ and $\sum_i d_i x_i = \sum_i c_i x_i - \sum_i w_i x_i = 1 - 1 = 0$.
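As a quick numerical sanity check (a sketch, not part of the original proof), we can construct an arbitrary set of weights $c_i$ that satisfies the two unbiasedness constraints and verify that the cross term $\sum_i w_i d_i$ vanishes. The variable names and the projection trick used to build the $d_i$ are illustrative choices, not from the original text:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10)

# OLS weights: w_i = (x_i - xbar) / sum_j (x_j - xbar)^2
w = (x - x.mean()) / ((x - x.mean()) ** 2).sum()

# Build arbitrary deviations d_i so that c = w + d satisfies the
# unbiasedness constraints sum(c_i) = 0 and sum(c_i * x_i) = 1:
# start from random noise and project out the violations.
d = rng.normal(size=10)
d -= d.mean()            # now sum(d) = 0
d -= (d @ x) * w         # now sum(d * x) = 0 as well (uses sum(w) = 0, sum(w*x) = 1)
c = w + d

print(c.sum())           # ~0: first unbiasedness constraint holds
print(c @ x)             # ~1: second unbiasedness constraint holds
print(w @ (c - w))       # ~0: the cross term sum(w_i * d_i) vanishes
```

Any other valid choice of the $d_i$ gives the same result, which is exactly what the algebra above claims.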
Having proved that $\sum_i w_i d_i = 0$ reduces equation 15 to

$$\text{Var}(\tilde{\beta}_1) = \sigma^2 \sum_i w_i^2 + \sigma^2 \sum_i d_i^2.$$
Now we will show that $\sigma^2 \sum_i w_i^2 = \text{Var}(\hat{\beta}_1)$. Note that from equation 3 we know that

$$w_i = \frac{x_i - \bar{x}}{\sum_j (x_j - \bar{x})^2}$$

and from equation 1 we know that

$$\text{Var}(\hat{\beta}_1) = \frac{\sigma^2}{\sum_i (x_i - \bar{x})^2}.$$

This leads us to

$$\sigma^2 \sum_i w_i^2 = \sigma^2 \frac{\sum_i (x_i - \bar{x})^2}{\left( \sum_j (x_j - \bar{x})^2 \right)^2} = \frac{\sigma^2}{\sum_i (x_i - \bar{x})^2} = \text{Var}(\hat{\beta}_1).$$

Finally, plugging equation 21 into equation 17 gives us

$$\text{Var}(\tilde{\beta}_1) = \text{Var}(\hat{\beta}_1) + \sigma^2 \sum_i d_i^2.$$
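The variance decomposition can also be checked numerically. The sketch below (with an assumed error variance `sigma2 = 2.0` and illustrative variable names) confirms that the variance of an arbitrary linear unbiased estimator equals the OLS variance plus $\sigma^2 \sum_i d_i^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=10)
sst = ((x - x.mean()) ** 2).sum()
w = (x - x.mean()) / sst   # OLS weights
sigma2 = 2.0               # assumed error variance (illustrative)

# any d with sum(d) = 0 and sum(d * x) = 0 yields a valid
# alternative linear unbiased estimator with weights c = w + d
d = rng.normal(size=10)
d -= d.mean()
d -= (d @ x) * w
c = w + d

var_tilde = sigma2 * (c ** 2).sum()   # variance of the alternative estimator
var_hat = sigma2 / sst                # variance of the OLS estimator
print(np.isclose(var_tilde, var_hat + sigma2 * (d ** 2).sum()))  # True
```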
So we are left with equation 22, which is minimized when $d_i = 0$ for all $i$.
If the $d_i$ are all zero, then $c_i = w_i$ and hence $\tilde{\beta}_1 = \hat{\beta}_1$. This means that the least squares estimator has minimum variance among all unbiased linear estimators.
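To see the theorem in action, we can run a small Monte Carlo experiment: simulate many data sets from a linear model, estimate the slope with the OLS weights and with a hypothetical alternative set of linear unbiased weights, and compare the empirical spread of the two estimators. All parameter values below (`n`, `reps`, `beta0`, `beta1`, `sigma`) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n, reps = 20, 20000
beta0, beta1, sigma = 1.0, 2.0, 1.5

x = rng.uniform(0, 10, size=n)                      # fixed design across replications
w = (x - x.mean()) / ((x - x.mean()) ** 2).sum()    # OLS weights

# a hypothetical alternative linear unbiased estimator: c = w + d
# with sum(d) = 0 and sum(d * x) = 0
d = rng.normal(size=n)
d -= d.mean()
d -= (d @ x) * w
c = w + d

ols_estimates, alt_estimates = [], []
for _ in range(reps):
    y = beta0 + beta1 * x + rng.normal(scale=sigma, size=n)
    ols_estimates.append(w @ y)   # OLS slope estimate
    alt_estimates.append(c @ y)   # alternative linear unbiased estimate

# both are centered near the true slope, but OLS has the smaller spread
print(np.mean(ols_estimates), np.var(ols_estimates))
print(np.mean(alt_estimates), np.var(alt_estimates))
```

Both estimators average out near the true slope $\beta_1$, but the empirical variance of the OLS estimates is smaller, just as the proof predicts.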