## Stochastic Independence versus Stochastic Dependence

In order to fully understand Bayes' rule, it is important to be familiar with some concepts of standard probability theory. Assume we have two events, A and B. The probability that event A occurs is P(A) and the probability that event B occurs is P(B). If events A and B are independent of each other, the probability that both events occur at the same time, also known as the joint probability $P(A\cap B)$, is simply the product $P(A)P(B)$.

For two events to be independent, the occurrence of event A must not have any influence on the occurrence of event B, and vice versa. As an example, let's say event A is defined as earning more than 10.000 per month and event B is defined as rolling a six with a die. These two events do not interfere with each other, at least not as far as I know. The probability of rolling a six does not change if the person throwing the die earns more than 10.000 per month, nor does the probability of earning more than 10.000 per month change if that person rolls a six.

In Bayesian statistics such cases are less relevant; it becomes much more interesting when events are not independent. To create an example where two events are dependent, we simply have to redefine event B from the example above. Event B is no longer rolling a six with a die; instead, it is defined as being the CEO of a company with more than 10.000 employees. It should be intuitive that the probability of event A, earning more than 10.000 per month, changes with the occurrence of event B: the probability of earning more than 10.000 should be higher when a person is the CEO of a company with more than 10.000 employees. Conversely, the probability of event B, a person being the CEO of a company with more than 10.000 employees, should be higher when this person earns more than 10.000 per month.

The following two equations represent the cases where events are stochastically independent and stochastically dependent, respectively:

$P(A\cap B)=P(A)P(B)$
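The independence relation can be checked with a short Monte Carlo simulation. The probabilities below (e.g. a 5% chance of earning more than 10.000 per month) are illustrative assumptions, not real statistics; the point is only that for independently sampled events, the estimated joint probability converges to the product $P(A)P(B)$:

```python
import random

random.seed(42)

# Hypothetical probabilities (illustrative assumptions):
# A: a person earns more than 10.000 per month
# B: the same person rolls a six with a fair die
p_a = 0.05
p_b = 1 / 6

n = 100_000
joint_count = 0
for _ in range(n):
    a = random.random() < p_a  # sample event A
    b = random.random() < p_b  # sample event B, independently of A
    if a and b:
        joint_count += 1

estimated = joint_count / n  # Monte Carlo estimate of P(A and B)
product = p_a * p_b          # P(A) * P(B)
print(f"estimated P(A and B) = {estimated:.4f}")
print(f"P(A) * P(B)          = {product:.4f}")
```

With 100,000 samples the two numbers agree to roughly three decimal places; the residual gap is ordinary sampling noise.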

$P(A\cap B)=P(A|B)P(B)$

Here $P(A|B)$ is the probability of event A under the condition that event B has occurred; we therefore refer to it as the conditional probability.
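The dependent case can be made concrete with a small toy population. The counts below are invented for illustration: out of 1,000 people, 10 are CEOs of large companies, 50 earn more than 10.000 per month, and 9 people fall into both groups. From such a table the conditional probability follows as $P(A|B)=P(A\cap B)/P(B)$, and multiplying back by $P(B)$ recovers the joint probability:

```python
# Hypothetical counts over a toy population of 1,000 people
# (illustrative numbers only):
n_total = 1000
n_b = 10         # B: is the CEO of a company with more than 10.000 employees
n_a = 50         # A: earns more than 10.000 per month
n_a_and_b = 9    # people in both groups

p_b = n_b / n_total
p_a = n_a / n_total
p_a_and_b = n_a_and_b / n_total

# Conditional probability of A given B
p_a_given_b = p_a_and_b / p_b

# The product rule recovers the joint probability:
# P(A and B) = P(A|B) * P(B)
print(f"P(A|B) = {p_a_given_b:.2f}, P(A) = {p_a:.2f}")
print(f"P(A|B) * P(B) = {p_a_given_b * p_b:.4f}")
print(f"P(A and B)    = {p_a_and_b:.4f}")
```

Note that $P(A|B)$ comes out far larger than $P(A)$ here, which is exactly what stochastic dependence means: knowing that B occurred changes the probability of A.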