The binomial distribution is closely related to the Bernoulli distribution. To understand it better, assume that $X_1, \dots, X_n$ are i.i.d. (independent, identically distributed) random variables following a Bernoulli distribution with $P(X_i = 1) = \pi$ and $P(X_i = 0) = 1 - \pi$.

For a better understanding, think of $X_1, \dots, X_n$ as the outcomes of $n$ coin flips, where 1 means heads and 0 means tails. In this case the total number of heads is

$$X = \sum_{i=1}^{n} X_i \sim B(n, \pi).$$
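As a quick sanity check, the following Python sketch (the function name and parameters are my own, not from the post) simulates the sum of $n$ i.i.d. Bernoulli coin flips many times and tabulates the relative frequencies, which should approximate the binomial probabilities:

```python
import random
from collections import Counter

def simulate_binomial(n, pi, trials=100_000, seed=0):
    """Simulate X = sum of n i.i.d. Bernoulli(pi) flips, `trials` times,
    and return the relative frequency of each outcome x = 0..n."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        x = sum(1 for _ in range(n) if rng.random() < pi)
        counts[x] += 1
    return {x: c / trials for x, c in sorted(counts.items())}

freqs = simulate_binomial(n=10, pi=0.5)
```

For $n = 10$ and $\pi = 0.5$ the frequency of $x = 5$ should come out near $\binom{10}{5} 0.5^{10} \approx 0.246$.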

The probability function of the binomial distribution looks as follows:

$$P(X = x) = \binom{n}{x} \pi^x (1-\pi)^{n-x}, \qquad x = 0, 1, \dots, n.$$
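This probability function translates directly into code; here is a minimal Python sketch (the function name is my own) using the standard library's `math.comb` for the binomial coefficient:

```python
from math import comb

def binom_pmf(x, n, pi):
    """P(X = x) for X ~ B(n, pi): C(n, x) * pi^x * (1 - pi)^(n - x)."""
    return comb(n, x) * pi**x * (1 - pi)**(n - x)

# The probabilities over all possible outcomes x = 0..n sum to one.
total = sum(binom_pmf(x, 10, 0.3) for x in range(11))
```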

**Properties of the Binomial Distribution**

**a) Probability-generating function**

The probability-generating function is defined as

$$G_X(t) = E\left[t^X\right] = \sum_{x=0}^{n} \binom{n}{x} \pi^x (1-\pi)^{n-x}\, t^x = \sum_{x=0}^{n} \binom{n}{x} (\pi t)^x (1-\pi)^{n-x}.$$

To compute the next transformation we use the binomial theorem, which says:

$$(a + b)^n = \sum_{x=0}^{n} \binom{n}{x} a^x b^{n-x}.$$

Looking at this, we see that our expression above already contains $\binom{n}{x}$. Now we take $b = (1-\pi)$ for the factor $(1-\pi)^{n-x}$, and for $a$ we take $\pi t$. Given all this, we can proceed with the expression derived above and write:

$$G_X(t) = \sum_{x=0}^{n} \binom{n}{x} (\pi t)^x (1-\pi)^{n-x} = (\pi t + 1 - \pi)^n.$$

By this we have obtained our probability-generating function:

$$G_X(t) = (1 - \pi + \pi t)^n.$$
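The derivation above can be verified numerically: evaluating $E[t^X]$ directly as a sum over the pmf should agree with the closed form $(1 - \pi + \pi t)^n$ for every $t$. A small Python sketch (function names are my own):

```python
from math import comb

def pgf_from_pmf(t, n, pi):
    """E[t^X] computed directly from the binomial pmf."""
    return sum(comb(n, x) * pi**x * (1 - pi)**(n - x) * t**x
               for x in range(n + 1))

def pgf_closed_form(t, n, pi):
    """The closed form (1 - pi + pi*t)^n derived via the binomial theorem."""
    return (1 - pi + pi * t) ** n

value = pgf_closed_form(0.5, 8, 0.4)
```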

**b) Expected Value and Variance**

For computing the expected value and the variance we can use the probability-generating function. The expected value is the first derivative of the probability-generating function evaluated at one. The variance is the second derivative at one, plus the first derivative at one, minus the square of the first derivative at one. Putting this description into formulas, we get

$$E(X) = G_X'(1),$$

$$Var(X) = G_X''(1) + G_X'(1) - \left(G_X'(1)\right)^2.$$

As seen, the computation of the expected value and the variance needs the first and second derivatives of the probability-generating function:

$$G_X'(t) = n\pi (1 - \pi + \pi t)^{n-1},$$

$$G_X''(t) = n(n-1)\pi^2 (1 - \pi + \pi t)^{n-2}.$$

By plugging in $t = 1$ it follows that:

$$E(X) = G_X'(1) = n\pi,$$

$$Var(X) = G_X''(1) + G_X'(1) - \left(G_X'(1)\right)^2 = n(n-1)\pi^2 + n\pi - n^2\pi^2 = n\pi(1-\pi).$$
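These two results can be checked by computing the mean and variance directly from the pmf and comparing them with $n\pi$ and $n\pi(1-\pi)$; a short Python sketch (function names are my own):

```python
from math import comb

def binom_pmf(x, n, pi):
    return comb(n, x) * pi**x * (1 - pi)**(n - x)

def moments_from_pmf(n, pi):
    """E(X) and Var(X) computed by summing over the pmf."""
    mean = sum(x * binom_pmf(x, n, pi) for x in range(n + 1))
    var = sum((x - mean) ** 2 * binom_pmf(x, n, pi) for x in range(n + 1))
    return mean, var

mean, var = moments_from_pmf(20, 0.3)
# mean should match n*pi, var should match n*pi*(1 - pi)
```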

**c) Independence**

Assume $X$ and $Y$ are independently distributed with the same success probability $\pi$:

$$X \sim B(n, \pi), \qquad Y \sim B(m, \pi).$$

Then the probability-generating function of the sum $X + Y$ is

$$G_{X+Y}(t) = G_X(t) \cdot G_Y(t) = (1 - \pi + \pi t)^n \, (1 - \pi + \pi t)^m = (1 - \pi + \pi t)^{n+m}.$$

Thereby $(1 - \pi + \pi t)^{n+m}$ is the probability-generating function of a variable which is distributed as $B(n+m, \pi)$, so $X + Y \sim B(n+m, \pi)$.
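This additivity property can also be verified without generating functions: convolving the pmfs of two independent binomials with the same $\pi$ should reproduce the pmf of $B(n+m, \pi)$ exactly. A Python sketch (function names are my own):

```python
from math import comb

def binom_pmf(x, n, pi):
    return comb(n, x) * pi**x * (1 - pi)**(n - x)

def convolve_pmfs(n, m, pi):
    """pmf of X + Y for independent X ~ B(n, pi), Y ~ B(m, pi),
    computed by discrete convolution: P(X + Y = s) = sum_x P(X=x) P(Y=s-x)."""
    return [
        sum(binom_pmf(x, n, pi) * binom_pmf(s - x, m, pi)
            for x in range(max(0, s - m), min(n, s) + 1))
        for s in range(n + m + 1)
    ]

pmf_sum = convolve_pmfs(4, 6, 0.35)
```

Each entry of `pmf_sum` should equal the corresponding $B(10, 0.35)$ probability.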
