Assumptions of Classical Linear Regression Models (CLRM)

The following post gives a short introduction to the underlying assumptions of the classical linear regression model (OLS assumptions), which we derived in an earlier post. From the Gauss-Markov Theorem we know that the least squares estimators b_{0} and b_{1} are unbiased and have minimum variance among all unbiased linear estimators. The Gauss-Markov Theorem tells us that in a regression model in which the expected value of the error terms is zero, E(\epsilon_{i}) = 0, the variance of the error terms is constant and finite, \sigma^{2}(\epsilon_{i}) = \sigma^{2} < \infty, and \epsilon_{i} and \epsilon_{j} are uncorrelated for all i \neq j, the least squares estimators b_{0} and b_{1} are unbiased and have minimum variance among all unbiased linear estimators. (A detailed proof of the Gauss-Markov Theorem can be found here.)

Continue reading Assumptions of Classical Linear Regression Models (CLRM)
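As a minimal sketch of these ideas, the snippet below simulates data that satisfies the stated assumptions (zero-mean, constant-variance, uncorrelated errors) and computes the least squares estimators b_{0} and b_{1} by hand; the simulated numbers and true coefficients are illustrative assumptions, not from the post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data satisfying the Gauss-Markov assumptions:
# errors have mean zero, constant finite variance, and are uncorrelated.
n = 200
x = rng.uniform(0, 10, n)
eps = rng.normal(0, 2, n)           # E(eps) = 0, Var(eps) = 4 < infinity
beta0, beta1 = 1.5, 0.8             # hypothetical true coefficients
y = beta0 + beta1 * x + eps

# Least squares estimators in the simple regression model
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

print(b0, b1)  # close to the true values 1.5 and 0.8
```

The hand-computed estimators agree with `np.polyfit(x, y, 1)`, which fits the same line by least squares.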

The Gauss Markov Theorem

When studying the classical linear regression model, one necessarily comes across the Gauss-Markov Theorem. The Gauss-Markov Theorem is a central theorem for linear regression models. It states the conditions under which the least squares estimator has the lowest variance among all unbiased linear estimators. More formally, Continue reading The Gauss Markov Theorem

What is an indirect proof?

In economics, especially in theoretical economics, it is often necessary to formally prove your statements, that is, to show in a logically rigorous way that they are correct. One possible way of doing so is to provide an indirect proof. The following couple of lines try to explain the concept of an indirect proof in a simple way.

Continue reading What is an indirect proof?
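To make the structure concrete, here is arguably the best-known indirect proof, sketched in LaTeX. It is not taken from the post itself, but it illustrates the pattern: assume the negation of the claim, then derive a contradiction.

```latex
% A classic indirect proof: the square root of two is irrational.
\begin{proof}
Assume, for contradiction, that $\sqrt{2}$ is rational, i.e.
$\sqrt{2} = p/q$ with integers $p, q$ sharing no common factor.
Squaring gives $2q^{2} = p^{2}$, so $p^{2}$ is even and hence $p$ is even,
say $p = 2k$. Substituting yields $2q^{2} = 4k^{2}$, i.e. $q^{2} = 2k^{2}$,
so $q$ is even as well. Then $p$ and $q$ share the factor $2$, contradicting
the assumption that they have no common factor. Hence $\sqrt{2}$ is irrational.
\end{proof}
```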

Relationship between Coefficient of Determination & Squared Pearson Correlation Coefficient

The usual way of interpreting the coefficient of determination R^{2} is to see it as the percentage of the variation of the dependent variable y (Var(y)) that can be explained by our model. The exact interpretation and derivation of the coefficient of determination R^{2} can be found here.

Another way of interpreting the coefficient of determination R^{2} is to look at it as the squared Pearson correlation coefficient between the observed values y_{i} and the fitted values \hat{y}_{i}. Continue reading Relationship between Coefficient of Determination & Squared Pearson Correlation Coefficient
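The equivalence can be checked numerically in a few lines: fit a simple regression by least squares, compute R^{2} as explained variation over total variation, and compare it with the squared Pearson correlation between y_{i} and \hat{y}_{i}. The simulated data below is an illustrative assumption, not from the post.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated simple linear regression data (hypothetical coefficients)
x = rng.uniform(0, 10, 100)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 100)

# Fit by least squares and compute fitted values
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

# Coefficient of determination: explained variation over total variation
r_squared = np.sum((y_hat - y.mean()) ** 2) / np.sum((y - y.mean()) ** 2)

# Squared Pearson correlation between observed and fitted values
r = np.corrcoef(y, y_hat)[0, 1]

print(r_squared, r ** 2)  # identical up to floating-point error
```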

The Coefficient Of Determination or R2

The coefficient of determination R^{2} shows how much of the variation of the dependent variable y (Var(y)) can be explained by our model. Another way of interpreting the coefficient of determination R^{2}, which will not be discussed in this post, is to look at it as the squared Pearson correlation coefficient between the observed values y_{i} and the fitted values \hat{y}_{i}. Why exactly this is the case is shown in another post.

Continue reading The Coefficient Of Determination or R2

Balance Statistic

The following article tries to explain the Balance Statistic, sometimes referred to as the Saldo or Saldo Statistic. It is used as a quantification method for qualitative survey questions. The benefit of applying the Balance Statistic arises when the survey is repeated over time, as it tracks changes in respondents' answers in a comprehensible way. The Balance Statistic is common in Business Tendency Surveys.

Continue reading Balance Statistic
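As a sketch of the idea, the balance statistic is commonly computed as the share of "positive" answers minus the share of "negative" answers, with "unchanged" answers dropping out; the response counts below are hypothetical, not from the post.

```python
# Balance (Saldo) statistic: share of positive answers minus share of
# negative answers, in percentage points; neutral answers cancel out.
def balance(up, unchanged, down):
    total = up + unchanged + down
    return 100.0 * (up - down) / total

# Two waves of a hypothetical business tendency question:
print(balance(50, 30, 20))  # 30.0  -> optimists outweigh pessimists
print(balance(25, 30, 45))  # -20.0 -> sentiment has deteriorated
```

Tracking this single number across survey waves is what makes changes in qualitative answers comparable over time.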


“In God we trust; all others must bring data.” W. Edwards Deming
