In economics, especially in theoretical economics, it is often necessary to formally prove your statements, that is, to show in a logically rigorous way that they are correct. One possible way of showing that a statement is correct is to provide an indirect proof. The following lines try to explain the concept of an indirect proof in a simple way.
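A classic illustration of an indirect proof, sketched here for concreteness, is the standard argument that the square root of two is irrational: one assumes the opposite of the claim and derives a contradiction.

```latex
\textbf{Claim.} $\sqrt{2}$ is irrational.

\textbf{Indirect proof.} Suppose, for contradiction, that $\sqrt{2} = p/q$
for integers $p, q$ with no common factor. Squaring gives $p^2 = 2q^2$,
so $p^2$ is even, hence $p$ is even, say $p = 2k$. Substituting yields
$4k^2 = 2q^2$, i.e.\ $q^2 = 2k^2$, so $q$ is even as well. This contradicts
the assumption that $p$ and $q$ share no common factor, so $\sqrt{2}$
cannot be rational. $\square$
```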
The following post is going to derive the least squares estimator for β, which we will denote as β̂. In general, we start by mathematically formalizing the relationships we think are present in the real world and writing them down in a formula.
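As a numerical sanity check of the estimator the post derives, the closed-form solution β̂ = (X′X)⁻¹X′y can be computed directly. The data below is made up purely for illustration (a linear model with intercept 2 and slope 3 plus small noise):

```python
import numpy as np

# Hypothetical data generated from y = 2 + 3*x + noise
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2 + 3 * x + rng.normal(scale=0.1, size=100)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Least squares estimator: beta_hat = (X'X)^(-1) X'y,
# computed by solving the normal equations (X'X) beta = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to [2, 3]
```

Solving the normal equations with `np.linalg.solve` is numerically preferable to explicitly inverting X′X.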
The usual way of interpreting the coefficient of determination R² is to see it as the percentage of the variation of the dependent variable y that can be explained by our model. The exact interpretation and derivation of the coefficient of determination can be found here.
Another way of interpreting the coefficient of determination is to look at it as the squared Pearson correlation coefficient between the observed values and the fitted values.
The coefficient of determination R² shows how much of the variation of the dependent variable y can be explained by our model. Another way of interpreting the coefficient of determination, which will not be discussed in this post, is to look at it as the squared Pearson correlation coefficient between the observed values y and the fitted values ŷ. Why exactly this is the case can be found in another post.
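The equivalence between R² and the squared Pearson correlation of observed and fitted values can be verified numerically. This is a minimal sketch on made-up data, assuming an OLS regression with an intercept (the identity holds exactly in that case):

```python
import numpy as np

# Hypothetical data from a simple linear model
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 1 + 2 * x + rng.normal(size=200)

# Fit OLS with an intercept and compute fitted values
X = np.column_stack([np.ones_like(x), x])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta_hat

# Coefficient of determination: R^2 = 1 - SSR/SST
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

# Squared Pearson correlation between observed and fitted values
r = np.corrcoef(y, y_hat)[0, 1]
print(r_squared, r ** 2)  # the two values coincide
```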
The following article tries to explain the Balance Statistic, sometimes referred to as the Saldo or Saldo Statistic. It is used as a quantification method for qualitative survey questions. The benefit of applying the Balance Statistic arises when the survey is repeated over time, as it tracks changes in respondents' answers in a comprehensible way. The Balance Statistic is common in Business Tendency Surveys.
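In its simplest form, the Balance Statistic is the share of positive answers minus the share of negative answers. The function name and the survey data below are hypothetical, used only to illustrate the calculation:

```python
from collections import Counter

def balance_statistic(answers):
    """Balance (saldo): % of positive answers minus % of negative answers."""
    counts = Counter(answers)
    n = len(answers)
    return 100 * (counts["up"] - counts["down"]) / n

# Hypothetical survey wave: 50 firms report their business expectations
wave = ["up"] * 20 + ["same"] * 20 + ["down"] * 10
print(balance_statistic(wave))  # 40% up - 20% down = 20.0
```

Computed for each survey wave, the statistic condenses a three-category answer into a single number that can be tracked over time.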
Stochastic Independence versus Stochastic Dependence
In order to fully understand Bayes' rule, it is important to be familiar with some concepts of standard probability theory. Assume we have two events, let's call them A and B. The probability that event A occurs is P(A) and the probability that event B occurs is P(B). If event A and event B are independent of each other, the probability that both events occur at the same time, also known as the joint probability P(A ∩ B), is simply the product P(A) · P(B).
All cases in which a characteristic has exactly two possible manifestations follow a Bernoulli distribution.
Typical examples are a coin flip or a medical treatment that either works or does not.
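A Bernoulli(p) variable takes the value 1 ("success") with probability p and 0 otherwise. A minimal sketch, with a hypothetical helper function and an arbitrary p = 0.3:

```python
import random

def bernoulli(p, rng=random):
    """Draw from a Bernoulli(p): 1 with probability p, else 0."""
    return 1 if rng.random() < p else 0

random.seed(0)
draws = [bernoulli(0.3) for _ in range(100_000)]
print(sum(draws) / len(draws))  # sample mean close to p = 0.3
```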
A common problem to solve in various university courses concerns gambling games. In these games we are usually interested in the probability that we actually win. One of these games consists of throwing dice, where the higher number wins. A more complicated, and from a statistical point of view more interesting, game is one where each player throws their die twice and the person with the higher overall sum wins. In the following I go through an example to calculate the winning probability step by step, and at the end I attach an Excel file where the game can be simulated using various kinds of dice.
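For two fair six-sided dice per player, the winning probability can also be computed exactly by enumerating all 6⁴ = 1296 outcomes. This sketch counts the cases where player A's two-roll sum strictly exceeds player B's (note that ties are draws, so each player wins with probability below one half):

```python
from itertools import product
from fractions import Fraction

# Enumerate all outcomes (a1, a2, b1, b2) of four fair die rolls:
# player A rolls a1, a2 and player B rolls b1, b2.
outcomes = list(product(range(1, 7), repeat=4))
wins = sum(1 for a1, a2, b1, b2 in outcomes if a1 + a2 > b1 + b2)

p_win = Fraction(wins, len(outcomes))
print(p_win)  # 575/1296, roughly 0.4437; the remaining mass is B wins or ties
```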
Proof of Unbiasedness of the Sample Variance Estimator
(As I received some remarks about the unnecessary length of this proof, I provide a shorter version here.)
In different applications of statistics or econometrics, but also in many other examples, it is necessary to estimate the variance of a sample. The estimator of the variance, see equation (1), is common knowledge and most people simply apply it without any further concern. The question that arose for me was: why do we actually divide by n-1 and not simply by n? In the following lines we are going to see the proof that the sample variance estimator is indeed unbiased.
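Before the formal proof, the effect of dividing by n versus n-1 can be seen in a simulation. Drawing many small samples from a distribution with known variance (here, a hypothetical normal with σ² = 4 and sample size n = 5), the n-1 estimator averages to the true variance while the n version is biased downward by the factor (n-1)/n:

```python
import random

random.seed(123)
n, true_var = 5, 4.0  # small samples from N(0, sd=2)

biased, unbiased = [], []
for _ in range(200_000):
    sample = [random.gauss(0, 2) for _ in range(n)]
    m = sum(sample) / n
    ss = sum((x - m) ** 2 for x in sample)  # sum of squared deviations
    biased.append(ss / n)          # divides by n
    unbiased.append(ss / (n - 1))  # divides by n - 1

print(sum(unbiased) / len(unbiased))  # close to 4.0 (unbiased)
print(sum(biased) / len(biased))      # close to 3.2 = (n-1)/n * 4 (biased)
```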