Stochastic Independence versus Stochastic Dependence
In order to fully understand Bayes' rule it is important to be familiar with some concepts of standard probability theory. Assume we have two events, call them A and B. The probability that event A occurs is P(A) and the probability that event B occurs is P(B). If events A and B are independent of each other, the probability that both events occur at the same time, also known as the joint probability, is simply the product P(A ∩ B) = P(A) · P(B).
Continue reading Stochastic Independence versus Stochastic Dependence
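The product rule for independent events can be checked directly with a small example. The following sketch uses two assumed events on two independent rolls of a fair die (these events are illustrative, not taken from the post itself) and verifies the product formula against brute-force enumeration:

```python
from fractions import Fraction
from itertools import product

# Two independent events on two rolls of a fair six-sided die:
# A = "first roll is even", B = "second roll is at least 5".
p_a = Fraction(3, 6)   # outcomes {2, 4, 6}
p_b = Fraction(2, 6)   # outcomes {5, 6}

# For independent events the joint probability is the product:
# P(A and B) = P(A) * P(B)
p_joint = p_a * p_b

# Check against brute-force enumeration of all 36 equally likely outcomes.
count = sum(1 for a, b in product(range(1, 7), repeat=2)
            if a % 2 == 0 and b >= 5)
assert p_joint == Fraction(count, 36)
print(p_joint)  # 1/6
```

For dependent events (say, two draws without replacement) the product formula no longer holds, which is exactly the distinction the post explores.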
The binomial distribution is closely related to the Bernoulli distribution. In order to understand it better, assume that X₁, …, Xₙ are i.i.d. (independent, identically distributed) variables following a Bernoulli distribution with P(Xᵢ = 1) = p and P(Xᵢ = 0) = 1 − p.
Continue reading Binomial Distribution
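The relationship can be made concrete in a few lines: summing n i.i.d. Bernoulli(p) variables gives a Binomial(n, p) variable. The sketch below (with assumed example values n = 4, p = 0.5) checks the binomial probability mass function against a brute-force enumeration of all Bernoulli outcome sequences:

```python
import math
from itertools import product

# Assumed example parameters for illustration.
n, p = 4, 0.5

# Binomial pmf: P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Brute force: enumerate all 2^n Bernoulli sequences, keep those whose
# sum is k, and add up each sequence's probability.
def brute_force(k, n, p):
    total = 0.0
    for seq in product([0, 1], repeat=n):
        if sum(seq) == k:
            total += math.prod(p if x else 1 - p for x in seq)
    return total

for k in range(n + 1):
    assert abs(binom_pmf(k, n, p) - brute_force(k, n, p)) < 1e-12
```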
Any random experiment whose outcome has exactly two possible characteristics follows a Bernoulli distribution.
Typical examples are a coin flip, or a medical treatment that either works or does not.
Continue reading Bernoulli Distribution
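A Bernoulli trial is easy to simulate. The following sketch uses the fair-coin case p = 0.5 as an assumed example and checks that the sample mean of many trials approaches p:

```python
import random

# Bernoulli(p): a single trial with exactly two outcomes,
# "success" (1) with probability p and "failure" (0) with 1 - p.
def bernoulli_trial(p, rng):
    return 1 if rng.random() < p else 0

# p = 0.5 models a fair coin flip (assumed example value).
rng = random.Random(42)
flips = [bernoulli_trial(0.5, rng) for _ in range(100_000)]

# The sample mean of the trials should be close to p.
mean = sum(flips) / len(flips)
assert abs(mean - 0.5) < 0.01
```

The theoretical mean of a Bernoulli(p) variable is p and its variance is p(1 − p), which for the fair coin gives 0.5 and 0.25 respectively.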
Whatever TeX document you are creating, it has to contain three specifications. The first is to specify the class of the document, the second is to define where the document begins, and the third is to define where it ends.
Continue reading Necessary Conditions to Create a Latex File
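Put together, the three specifications above already form a minimal compilable file, which might look like this:

```latex
% The three required pieces: a document class,
% the beginning of the document, and its end.
\documentclass{article}

\begin{document}
Hello, world!
\end{document}
```

Everything between `\documentclass` and `\begin{document}` is the preamble, where packages and settings are added later.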
A common problem to solve in various university courses is gambling games. In these games we are usually interested in the probability that we actually win. One of these games consists in throwing dice, where the higher number wins; a more complicated and, from a statistical point of view, more interesting game is one where each player throws their die twice and the player with the higher overall sum wins. In the following I go through an example to calculate the winning probability step by step, and at the end I attach an Excel file where the game can be simulated using various kinds of dice.
Continue reading Throwing The Dice
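For the two-throw variant of the game, the winning probability can also be obtained by exhaustive enumeration. The sketch below (a simplified stand-in for the step-by-step calculation in the post, assuming standard six-sided dice) exploits the symmetry P(win) = P(lose), so P(win) = (1 − P(tie)) / 2, and confirms the result by simulation:

```python
import random
from itertools import product

# Each player throws a six-sided die twice; the higher sum wins.
# Enumerate all 6^4 = 1296 equally likely combinations to count ties.
ties = sum(1 for a1, a2, b1, b2 in product(range(1, 7), repeat=4)
           if a1 + a2 == b1 + b2)
p_tie = ties / 6**4

# By symmetry, both players win equally often among non-tied games.
p_win = (1 - p_tie) / 2

# Monte Carlo check of the exact result.
rng = random.Random(0)
trials = 200_000
wins = sum(
    rng.randint(1, 6) + rng.randint(1, 6)
    > rng.randint(1, 6) + rng.randint(1, 6)
    for _ in range(trials)
)
assert abs(wins / trials - p_win) < 0.01
```

Swapping `range(1, 7)` for another face set lets the same enumeration handle non-standard dice, which is what the attached Excel file does interactively.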
Proof of Unbiasedness of Sample Variance Estimator
(As I received some remarks about the unnecessary length of this proof, I provide shorter version here)
In different applications of statistics or econometrics, but also in many other examples, it is necessary to estimate the variance of a sample. The estimator of the variance, see equation (1), is common knowledge and most people simply apply it without any further concern. The question that arose for me was: why do we actually divide by n − 1 and not simply by n? In the following lines we are going to see the proof that the sample variance estimator is indeed unbiased. Continue reading Proof of Unbiasedness of Sample Variance Estimator
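Before working through the algebra, the claim can be checked numerically. The sketch below (assumed example: samples of size n = 5 from a standard normal, so the true variance is 1) averages both estimators over many repetitions; only the division by n − 1 hits the true variance, while dividing by n is biased downward by the factor (n − 1)/n:

```python
import random

# Compare dividing by n and by n - 1 when estimating the variance.
rng = random.Random(1)
n, true_var, reps = 5, 1.0, 200_000

biased_sum = unbiased_sum = 0.0
for _ in range(reps):
    x = [rng.gauss(0, 1) for _ in range(n)]   # true variance = 1
    m = sum(x) / n
    ss = sum((xi - m) ** 2 for xi in x)
    biased_sum += ss / n          # divide by n:   E = (n-1)/n * sigma^2
    unbiased_sum += ss / (n - 1)  # divide by n-1: E = sigma^2

biased = biased_sum / reps
unbiased = unbiased_sum / reps
assert abs(unbiased - true_var) < 0.02
assert abs(biased - (n - 1) / n * true_var) < 0.02
```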
What exactly is happening when we linearize a model? Well, the answer is simple: we basically approximate non-linear equations with linear ones. In the context of macroeconomics we may have models which are non-linear. Thus, in order to solve them, we need to put them in a linear form. In the following we are going to see how to log-linearize a model by means of a (very) simple example. Continue reading Log-Linearizing
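To give a flavour of the technique, here is a minimal illustration (an assumed example, not necessarily the one used in the post): log-linearizing the equation y_t = x_t^α around a steady state (ȳ, x̄), where hats denote log-deviations from the steady state:

```latex
% Write each variable as a log-deviation from its steady state,
% \hat{x}_t = \ln(x_t / \bar{x}), so x_t \approx \bar{x}(1 + \hat{x}_t)
% for small deviations.
\begin{align}
  y_t &= x_t^{\alpha} \\
  \ln y_t &= \alpha \ln x_t \\
  \ln \bar{y} + \hat{y}_t &= \alpha \left( \ln \bar{x} + \hat{x}_t \right) \\
  \hat{y}_t &= \alpha \, \hat{x}_t
\end{align}
```

The last step uses the steady-state relation ln ȳ = α ln x̄, which cancels the constant terms and leaves an equation that is linear in the log-deviations.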