# Unbiased Estimator of Sample Variance – Vol. 2

Lately I received some criticism saying that my proof (link to proof) of the unbiasedness of the estimator for the sample variance suffers from its unnecessary length. Well, as I am an economist and love proofs which read like a book, I never really saw the benefit of boiling a proof down to a couple of lines. Actually, I hate it if I have to pore over a proof for an hour before I clearly understand what’s going on. However, in order to satisfy the need for mathematical beauty, I looked around and found the following proof, which is way shorter than my original version.

In order to prove that the estimator of the sample variance is unbiased we have to show the following:

(1) $\mathbb E(S^2)= \mathbb E\left(\frac{\sum_{i=1}^n (X_i - \bar X)^2}{n-1}\right) = \sigma^{2}$

However, before really getting to it, let’s start with the usual notation. For this proof it is important to know that

(2) $X_1, X_2 , ..., X_n$ are independent observations from a population with mean $\mu$ and variance $\sigma^{2}$

(3) $\mathbb E(X_i) = \mu$

(4) $\mathrm{Var}(X_i)= \sigma^{2}$

(5) $\mathbb E(X^2) = \sigma^{2} + \mu^{2}$

(6) $\mathrm{Var}(X)=\mathbb E(X^2)-[\mathbb E(X)]^2$

(7) $\mathbb E(\bar{X}^2) = \frac{\sigma^2}{n} + \mu^2$
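Identities (5)–(7) are easy to verify numerically. Below is a minimal NumPy sketch of (7), checking that the simulated $\mathbb E(\bar X^2)$ matches $\sigma^2/n + \mu^2$; the values of $\mu$, $\sigma$, and $n$ are illustrative choices, not anything from the proof itself.

```python
import numpy as np

# Monte Carlo sanity check of identity (7): E(Xbar^2) = sigma^2/n + mu^2.
# mu, sigma, n, and the number of trials are illustrative assumptions.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 2.0, 3.0, 10, 200_000

# Draw `trials` independent samples of size n and take each sample mean
xbar = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)

print((xbar**2).mean())       # simulated E(Xbar^2)
print(sigma**2 / n + mu**2)   # theoretical value from (7)
```

With 200,000 simulated samples the two printed numbers should agree to roughly two decimal places.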

Note that rearranging equation (6) immediately gives

(8) $\mathbb E(X^2) = \mathrm{Var}(X) + [\mathbb E(X)]^2$

To keep the notation consistent, all sums below run from $i = 1$ to $n$.

(9) $\mathbb E\left(\sum_{i=1}^{n} (X_i - \bar X)^2 \right) = \mathbb E\left(\sum_{i=1}^{n} X_{i}^2 - 2 \bar X \sum_{i=1}^{n} X_i + n \bar X^2 \right)$

Since $\sum_{i=1}^{n} X_i = n \bar X$, the middle term equals $-2n\bar X^2$ and combines with $+n\bar X^2$:

(10) $\mathbb E\left(\sum_{i=1}^{n} (X_i - \bar X)^2 \right) = \sum_{i=1}^{n} \mathbb E(X_{i}^2) - \mathbb E\left(n \bar X^2 \right)$

Applying (5) to each $\mathbb E(X_{i}^2)$ and (7) to $\mathbb E\left(n \bar X^2\right) = n\,\mathbb E(\bar X^2)$ yields

(11) $\mathbb E\left(\sum_{i=1}^{n} (X_i - \bar X)^2 \right) = n \sigma^2 + n \mu^2 - \sigma^2 -n \mu^2$

(12) $\mathbb E\left(\sum_{i=1}^{n} (X_i - \bar X)^2 \right) = (n-1)\sigma^2$
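Equation (12) lends itself to the same kind of numerical check. The sketch below (again with made-up parameters) estimates $\mathbb E\left(\sum_{i=1}^{n} (X_i - \bar X)^2\right)$ by simulation and compares it with $(n-1)\sigma^2$:

```python
import numpy as np

# Monte Carlo sketch of equation (12): E(sum (X_i - Xbar)^2) = (n-1) sigma^2.
# Parameters are illustrative assumptions, not taken from the post.
rng = np.random.default_rng(1)
mu, sigma, n, trials = 5.0, 2.0, 8, 200_000

samples = rng.normal(mu, sigma, size=(trials, n))
# Sum of squared deviations from the sample mean, for each of the trials
ss = ((samples - samples.mean(axis=1, keepdims=True))**2).sum(axis=1)

print(ss.mean())            # simulated expectation
print((n - 1) * sigma**2)   # (n-1) * sigma^2 from equation (12)
```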

Combining equation (1) with equation (12) brings us to:

(13) $\mathbb E(S^2)= \mathbb E\left(\frac{\sum_{i=1}^{n} (X_i - \bar X)^2}{n-1}\right) = \frac{1}{n-1} \mathbb E\left(\sum_{i=1}^{n} (X_i - \bar X)^2 \right)$

(14) $\mathbb E(S^2) = \frac {(n-1)\sigma^2}{n-1} = \sigma^2$

Finally, we have shown that the estimator for the population variance is indeed unbiased. If you are mathematically adept, you probably had no problem following every single step of this proof. However, if you are like me and want to be taken by the hand through every single step, you can find the exhaustive proof here.
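If you want to see the bias appear and disappear in practice: NumPy’s `np.var` implements both versions of the estimator through its `ddof` argument (`ddof=0` divides by $n$, `ddof=1` by $n-1$). A small sketch with illustrative parameters:

```python
import numpy as np

# Contrast the biased (divide by n) and unbiased (divide by n-1) estimators.
# sigma2, n, and the trial count are illustrative assumptions.
rng = np.random.default_rng(2)
sigma2, n, trials = 9.0, 5, 200_000

samples = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
biased   = np.var(samples, axis=1, ddof=0).mean()  # ≈ (n-1)/n * sigma^2
unbiased = np.var(samples, axis=1, ddof=1).mean()  # ≈ sigma^2

print(biased, unbiased)
```

On average the `ddof=0` version underestimates $\sigma^2$ by the factor $(n-1)/n$, exactly as the proof predicts, while the `ddof=1` version centers on $\sigma^2$.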

## 11 thoughts on “Unbiased Estimator of Sample Variance – Vol. 2”

1. Arash Sioofy says:

steps 8 and 10 are the same terms.

1. isidorebeautrelet says:

You are right! I am going to fix it!

2. isidorebeautrelet says:

It’s done 🙂

2. sigmaoverrootn says:

“Finally, we showed that the estimator for the sample variance is indeed unbiased.”

we are trying to estimate an unknown population parameter namely ‘sigma^2’: population variance, with a known quantity that is ‘s^2’: sample variance
therefore, ‘s^2’ is an estimator for ‘sigma^2’
the conclusion should be:
“the estimator for population variance is indeed unbiased”

1. ad says:

Hello and thank you for your very useful comment. I will definitely consider it. Cheers!

3. Maurice says:

Hi! I don’t get how the assumptions (5) and (7) are justified. Can someone help?

1. ad says:

Hi. These are just some basic variance properties. You can find them on Wikipedia under https://en.wikipedia.org/wiki/Variance. Hope it helps.
