The following post gives a short introduction to the underlying assumptions of the classical linear regression model (OLS assumptions). The Gauss-Markov Theorem tells us that in a regression model where the expected value of the error terms is zero, the variance of the error terms is constant and finite, and the error terms ε_i and ε_j are uncorrelated for all i ≠ j, the least squares estimators of the intercept and the slope are unbiased and have minimum variance among all unbiased linear estimators. (A detailed proof of the Gauss-Markov Theorem can be found here.)

Continue reading Assumptions of Classical Linear Regression Models (CLRM)

# The Gauss Markov Theorem

When studying the classical linear regression model, one necessarily comes across the Gauss-Markov Theorem, a central theorem for linear regression models. It states conditions that, when met, ensure that your least squares estimator has the lowest variance among all unbiased linear estimators. Continue reading The Gauss Markov Theorem
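Sketched compactly (in generic notation, since the post's own symbols are not reproduced here), the simple linear model and the Gauss-Markov conditions summarized above read:

```latex
% Simple linear regression model
y_i = \alpha + \beta x_i + \varepsilon_i, \qquad i = 1, \dots, n

% Gauss-Markov conditions on the error terms
\mathbb{E}[\varepsilon_i] = 0, \qquad
\operatorname{Var}(\varepsilon_i) = \sigma^2 < \infty, \qquad
\operatorname{Cov}(\varepsilon_i, \varepsilon_j) = 0 \ \text{ for all } i \neq j
```

Under these conditions the least squares estimators of α and β are BLUE: the best (minimum-variance) linear unbiased estimators.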

# What is an indirect proof?

In economics, especially in theoretical economics, it is often necessary to formally prove your statements, that is, to show in a logically rigorous way that they are correct. One possible way of doing so is by providing an indirect proof. The following couple of lines try to explain the concept of indirect proof in a simple way.
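As a brief illustration, consider the classic textbook example of an indirect proof (not necessarily the example used in the linked post): to show that √2 is irrational, assume the opposite and derive a contradiction.

```latex
% Claim: \sqrt{2} is irrational.
% Indirect proof: assume the opposite, \sqrt{2} = p/q with p, q coprime integers.
2 = \frac{p^2}{q^2} \;\Rightarrow\; p^2 = 2q^2
% Hence p^2 is even, so p is even: write p = 2k.
(2k)^2 = 2q^2 \;\Rightarrow\; q^2 = 2k^2
% Hence q is even as well, contradicting that p and q are coprime.
% The assumption must be false, so \sqrt{2} is irrational.
```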

# Derivation of the Least Squares Estimator for Beta in Matrix Notation

The following post is going to derive the least squares estimator for β, which we will denote as β̂. In general, we start by mathematically formalizing relationships we think are present in the real world and writing them down in a formula.

Continue reading Derivation of the Least Squares Estimator for Beta in Matrix Notation
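The derivation in the linked post leads to the familiar closed form β̂ = (X′X)⁻¹X′y. A minimal NumPy sketch (with simulated data and an assumed two-parameter model, purely for illustration) shows the formula recovering the true coefficients:

```python
import numpy as np

# Simulate data from a known linear model: y = X @ beta + noise
rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta = np.array([2.0, 0.5])                            # true coefficients
y = X @ beta + rng.normal(scale=0.1, size=n)

# Closed-form least squares estimator: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta_hat)  # should be close to the true values [2.0, 0.5]
```

In practice one would use `np.linalg.lstsq` or a QR decomposition rather than an explicit matrix inverse, but the inverse mirrors the formula derived in the post.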

# Relationship between Coefficient of Determination & Squared Pearson Correlation Coefficient

The usual way of interpreting the coefficient of determination is to see it as the percentage of the variation of the dependent variable (y) that can be explained by our model. The exact interpretation and derivation of the coefficient of determination can be found here.

Another way of interpreting the coefficient of determination is to look at it as the squared Pearson correlation coefficient between the observed values y and the fitted values ŷ. Continue reading Relationship between Coefficient of Determination & Squared Pearson Correlation Coefficient
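The equivalence is easy to check numerically. A short NumPy sketch (simulated data and model assumed for illustration) computes R² from the sums of squares and compares it with the squared Pearson correlation between observed and fitted values; for OLS with an intercept the two coincide:

```python
import numpy as np

# Simulated data and a simple OLS fit (illustrative assumptions)
rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 1.0 + 2.0 * x + rng.normal(size=50)

X = np.column_stack([np.ones_like(x), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_fit = X @ beta_hat

# R^2 = 1 - SSR/SST (residual over total sum of squares)
r_squared = 1 - np.sum((y - y_fit) ** 2) / np.sum((y - y.mean()) ** 2)

# Squared Pearson correlation between observed and fitted values
r = np.corrcoef(y, y_fit)[0, 1]
print(r_squared, r ** 2)  # the two quantities agree
```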

# The Coefficient of Determination or R²

The coefficient of determination shows how much of the variation of the dependent variable (y) can be explained by our model. Another way of interpreting the coefficient of determination, which will not be discussed in this post, is to look at it as the squared Pearson correlation coefficient between the observed values y and the fitted values ŷ. Why exactly this is the case can be found in another post.

# Balance Statistic

The following article tries to explain the *Balance Statistic*, sometimes referred to as *Saldo* or *Saldo Statistic*. It is used as a quantification method for qualitative survey questions. The benefit of applying the *Balance Statistic* arises when the survey is repeated over time, as it tracks changes in respondents' answers in a comprehensible way. The *Balance Statistic* is common in Business Tendency Surveys.
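For a typical three-category tendency question ("increase" / "no change" / "decrease"), the balance is commonly defined as the share of positive answers minus the share of negative answers; neutral answers drop out. A small sketch with hypothetical survey data (the response labels and two survey waves are assumptions for illustration) shows how the statistic tracks change between waves:

```python
# Balance statistic: share of "increase" answers minus share of "decrease"
# answers, in percentage points. "No change" answers drop out of the balance.
def balance(responses):
    n = len(responses)
    up = sum(r == "increase" for r in responses) / n
    down = sum(r == "decrease" for r in responses) / n
    return 100 * (up - down)

# Two hypothetical survey waves of 100 respondents each
wave_1 = ["increase"] * 40 + ["no change"] * 40 + ["decrease"] * 20
wave_2 = ["increase"] * 30 + ["no change"] * 40 + ["decrease"] * 30

print(balance(wave_1))  # positive balance: optimists outweigh pessimists
print(balance(wave_2))  # balance falls to zero as sentiment evens out
```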