Proof of the Gauss-Markov Theorem

From a previous post on the Gauss-Markov Theorem and OLS we know that an arbitrary linear estimator \hat{\beta}_{1} = \sum c_{i} \textbf{Y}_{i} is unbiased only if it fulfills the following condition

(1) E(\hat{\beta}_{1}) = \beta_{0} \sum c_{i} + \beta_{1} \sum c_{i} \textbf{X}_{i} = \beta_{1}

which means that \sum c_{i} = 0 and \sum c_{i} \textbf{X}_{i} = 1.
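As a quick sanity check, here is a minimal numerical sketch (variable names are my own) showing that the familiar OLS weights k_{i} = \frac{X_{i}-\bar{X}}{\sum(X_{i}-\bar{X})^{2}}, which reappear in equation 19 below, satisfy both conditions:

```python
import numpy as np

# Check that the OLS weights k_i = (X_i - X_bar) / sum((X_j - X_bar)^2)
# satisfy the unbiasedness conditions sum(k_i) = 0 and sum(k_i * X_i) = 1.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=20)                  # arbitrary regressor values
k = (x - x.mean()) / np.sum((x - x.mean())**2)

print(np.isclose(k.sum(), 0.0))                  # sum(k_i) = 0
print(np.isclose(np.sum(k * x), 1.0))            # sum(k_i * X_i) = 1
```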

Looking at the variance of the estimator \hat{\beta}_{1}

(2) \sigma^{2}(\hat{\beta}_{1})= \sum c_{i}^{2}\sigma^{2}(\textbf{Y}_{i})=\sigma^{2} \sum c_{i}^{2}

tells us that the variance depends on the choice of the c_{i}‘s, so minimizing it puts additional restrictions on them.
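Equation 2 rests on the \textbf{Y}_{i} being uncorrelated with constant variance \sigma^{2}. A short simulation sketch (all names and parameter values are illustrative) makes the identity \sigma^{2}(\sum c_{i}\textbf{Y}_{i}) = \sigma^{2}\sum c_{i}^{2} tangible:

```python
import numpy as np

# Monte Carlo check of Var(sum(c_i * Y_i)) = sigma^2 * sum(c_i^2) for
# uncorrelated, homoskedastic errors; the c_i here are arbitrary constants.
rng = np.random.default_rng(1)
beta0, beta1, sigma = 2.0, 0.5, 1.5
x = rng.uniform(0, 10, size=30)
c = rng.normal(size=x.size)                      # any fixed weights will do

estimates = [np.sum(c * (beta0 + beta1 * x + rng.normal(0, sigma, size=x.size)))
             for _ in range(20000)]

print(np.var(estimates))                         # empirical variance
print(sigma**2 * np.sum(c**2))                   # sigma^2 * sum(c_i^2), close
```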

To continue the proof we define c_{i} = k_{i}+d_{i}, where the k_{i} are the OLS weights from the previous post (written out in equation 19 below) and the d_{i} are arbitrary constants. What we are basically saying is that each c_{i} is composed of the OLS weight k_{i} and an arbitrary constant d_{i}. Let’s continue by plugging the definition c_{i} = k_{i}+d_{i} into the equation for the variance of \hat{\beta}_{1}.

(3) \sigma^{2}(\hat{\beta}_{1})= \sum c_{i}^{2}\sigma^{2}(\textbf{Y}_{i})

(4) \sigma^{2}(\hat{\beta}_{1})=\sigma^{2} \sum c_{i}^{2}

(5) \sigma^{2}(\hat{\beta}_{1})=\sigma^{2} \sum (k_{i}+d_{i})^{2}

(6) \sigma^{2}(\hat{\beta}_{1})=\sigma^{2} (\sum k_{i}^{2}+2\sum k_{i}d_{i}+\sum d_{i}^{2})

Note that equation 6 is much more than just a transformation; it tells us that

(7) \sigma^{2} \sum k_{i}^{2} = \sigma^{2}(b_{1})

Now, why do we need equation 7? Well, it shows us the relationship between \sigma^{2}(\hat{\beta}_{1}) and \sigma^{2}(b_{1}): \sigma^{2}(\hat{\beta}_{1}) is \sigma^{2}(b_{1}) plus some extra “stuff”.

To complete the proof we need to show that part of this extra “stuff”, namely \sum k_{i}d_{i}, is actually zero.

(8) \sum k_{i}d_{i} = \sum k_{i}(c_{i}-k_{i})

(9) \sum k_{i}d_{i} = \sum k_{i}c_{i} - \sum k_{i}^{2}

(10) \sum k_{i}d_{i} = \sum c_{i} \frac{X_{i} -\bar{X}}{\sum (X_{i} -\bar{X})^{2}} - \frac{1}{\sum(X_{i}-\bar{X})^{2}}

(11) \sum k_{i}d_{i} = \frac{\sum c_{i}X_{i} - \sum c_{i}\bar{X}}{\sum (X_{i} -\bar{X})^{2}} - \frac{1}{\sum(X_{i}-\bar{X})^{2}}

(12) \sum k_{i}d_{i} = \frac{\sum c_{i}X_{i} - \bar{X}\sum c_{i}}{\sum (X_{i} -\bar{X})^{2}} - \frac{1}{\sum(X_{i}-\bar{X})^{2}}

As we know, \sum c_{i} = 0 and \sum c_{i} \textbf{X}_{i} = 1, which, when plugged into equation 12, shows us that this part of the extra “stuff” is indeed zero. For the sake of completeness:

(13) \sum k_{i}d_{i} = \frac{1 - \bar{X}0}{\sum (X_{i} -\bar{X})^{2}} - \frac{1}{\sum(X_{i}-\bar{X})^{2}}

(14) \sum k_{i}d_{i} = \frac{1}{\sum (X_{i} -\bar{X})^{2}} - \frac{1}{\sum(X_{i}-\bar{X})^{2}}=0

(15) \sum k_{i}d_{i} = 0
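This orthogonality can also be checked numerically. The sketch below (names are my own) builds an alternative set of weights c_{i} that still satisfies \sum c_{i} = 0 and \sum c_{i}\textbf{X}_{i} = 1 and confirms that the cross term \sum k_{i}d_{i} vanishes:

```python
import numpy as np

# Numerical check that sum(k_i * d_i) = 0 for any weights c_i satisfying
# the unbiasedness constraints sum(c_i) = 0 and sum(c_i * X_i) = 1.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=25)
k = (x - x.mean()) / np.sum((x - x.mean())**2)   # OLS weights

# Build an alternative valid c: add to k the residual of a random vector
# regressed on an intercept and x. The residual r satisfies sum(r_i) = 0
# and sum(r_i * X_i) = 0, so c = k + r still meets both constraints.
Z = np.column_stack([np.ones_like(x), x])
v = rng.normal(size=x.size)
r = v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0]
c = k + r

print(np.isclose(c.sum(), 0.0), np.isclose(np.sum(c * x), 1.0))
d = c - k
print(np.isclose(np.sum(k * d), 0.0))            # the cross term vanishes
```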

Having proved that \sum k_{i}d_{i} = 0 reduces equation 6 to

(16) \sigma^{2}(\hat{\beta}_{1})=\sigma^{2} (\sum k_{i}^{2}+\sum d_{i}^{2})

(17) \sigma^{2}(\hat{\beta}_{1})=\sigma^{2} (\sum k_{i}^{2}) + \sigma^{2} (\sum d_{i}^{2})

Now we will show that \sigma^{2} \sum k_{i}^{2} = \sigma^{2}(b_{1}). Note that from the previous post we know that

(18) \sigma^{2}(b_{1}) = \frac{\sigma^{2}}{\sum(\textbf{X}_{i}-\bar{\textbf{X}})^{2}}

and from the definition of the OLS weights we know that

(19) \textbf{k}_{i}=\frac{(\textbf{X}_{i}-\bar{\textbf{X}})}{\sum(\textbf{X}_{i}-\bar{\textbf{X}})^{2}}

this leads us to

(20) \textbf{k}_{i}^{2}=\frac{(\textbf{X}_{i}-\bar{\textbf{X}})}{\sum(\textbf{X}_{i}-\bar{\textbf{X}})^{2}} \frac{(\textbf{X}_{i}-\bar{\textbf{X}})}{\sum(\textbf{X}_{i}-\bar{\textbf{X}})^{2}}

(21) \sum \textbf{k}_{i}^{2}= \sum \frac{(\textbf{X}_{i}-\bar{\textbf{X}})}{\sum(\textbf{X}_{i}-\bar{\textbf{X}})^{2}} \frac{(\textbf{X}_{i}-\bar{\textbf{X}})}{\sum(\textbf{X}_{i}-\bar{\textbf{X}})^{2}} = \frac{1}{\sum(\textbf{X}_{i}-\bar{\textbf{X}})^{2}}

so that \sigma^{2} \sum \textbf{k}_{i}^{2} = \sigma^{2}(b_{1}).
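A one-line numerical check of this identity (again, a sketch with illustrative names):

```python
import numpy as np

# Check equation 21: sum(k_i^2) equals 1 / sum((X_i - X_bar)^2).
rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=15)
k = (x - x.mean()) / np.sum((x - x.mean())**2)
print(np.isclose(np.sum(k**2), 1.0 / np.sum((x - x.mean())**2)))
```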

Finally, plugging equation 21 into equation 17 gives us

(22) \sigma^{2}(\hat{\beta}_{1})=\sigma^{2} (b_{1}) + \sigma^{2} (\sum d_{i}^{2})

So we are left with equation 22, which is minimized when d_{i} = 0 \forall i.

If d_{i} = 0 then c_{i} = k_{i}. This means that the least squares estimator b_{1} has minimum variance among all unbiased linear estimators.
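To round things off, here is a small simulation sketch (parameter values and names are purely illustrative) that contrasts the OLS estimator with an alternative linear unbiased estimator (d_{i} \neq 0): both are unbiased, but only OLS attains the minimum variance \sigma^{2}\sum k_{i}^{2}.

```python
import numpy as np

# Simulation of the main result: any other linear unbiased estimator
# (d_i != 0) remains unbiased but has a larger variance than OLS.
rng = np.random.default_rng(3)
beta0, beta1, sigma = 1.0, 2.0, 1.0
x = rng.uniform(0, 10, size=30)
k = (x - x.mean()) / np.sum((x - x.mean())**2)   # OLS weights

# An alternative set of unbiased weights c = k + d (see the check above);
# d is scaled down so the two variances stay on a comparable scale.
Z = np.column_stack([np.ones_like(x), x])
v = rng.normal(size=x.size)
d = 0.02 * (v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0])
c = k + d

b1_ols, b1_alt = [], []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=x.size)
    b1_ols.append(np.sum(k * y))
    b1_alt.append(np.sum(c * y))

print(np.mean(b1_ols), np.mean(b1_alt))          # both close to beta1
print(np.var(b1_ols), np.var(b1_alt))            # OLS variance is smaller
print(sigma**2 * np.sum(k**2),                   # theoretical variances:
      sigma**2 * (np.sum(k**2) + np.sum(d**2)))  # sigma^2*sum(k^2) vs sigma^2*(sum(k^2)+sum(d^2))
```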
