**Problem 1**

Let $X$ be the value of one roll of a fair die. If the value of the die is $x$, we are given that $Y$ has a binomial distribution with $n=x$ and $p=\frac{1}{4}$ (we use the notation $Y \mid X=x \sim \text{binomial}(x,\frac{1}{4})$).

- Compute the mean $E[X]$ and the variance $Var(X)$ of $X$.
- Compute the mean $E[Y]$ and the variance $Var(Y)$ of $Y$.
- Compute the covariance $Cov(X,Y)$ and the correlation coefficient $\rho$.

**Problem 2**

Let $X$ be the value of one roll of a fair die. If the value of the die is $x$, we are given that $Y$ has a binomial distribution with $n=x$ and a general success probability $p$, where $0<p<1$ (we use the notation $Y \mid X=x \sim \text{binomial}(x,p)$).

- Compute the mean $E[X]$ and the variance $Var(X)$ of $X$.
- Compute the mean $E[Y]$ and the variance $Var(Y)$ of $Y$.
- Compute the covariance $Cov(X,Y)$ and the correlation coefficient $\rho$.

Problem 2 is left as an exercise.

_________________________________________________________

**Discussion of Problem 1**

The joint variables $X$ and $Y$ are identical to the ones in this previous post. However, we do not follow the approach taken there, which is to first find the probability function of the joint distribution and then the marginal distribution of $Y$. The calculation of the covariance in Problem 1.3 would be very tedious by that approach.

**Problem 1.1**

We start with the easiest part, the random variable $X$ (the roll of the die). Since each face is equally likely, the mean is

$\displaystyle \mu_X=E[X]=\frac{1}{6}(1+2+3+4+5+6)=\frac{7}{2}$

The second moment is $E[X^2]=\frac{1}{6}(1^2+2^2+\cdots+6^2)=\frac{91}{6}$. The variance is computed by

$\displaystyle Var(X)=E[X^2]-E[X]^2=\frac{91}{6}-\left(\frac{7}{2}\right)^2=\frac{35}{12}$
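As a quick sanity check on the arithmetic (not part of the original solution), the moments of the die can be computed exactly with Python's `fractions` module:

```python
from fractions import Fraction

# Fair die: X takes the values 1 through 6, each with probability 1/6.
faces = range(1, 7)
p = Fraction(1, 6)

mean_X = sum(p * x for x in faces)        # E[X]
second_X = sum(p * x * x for x in faces)  # E[X^2]
var_X = second_X - mean_X ** 2            # Var(X) = E[X^2] - E[X]^2

print(mean_X, second_X, var_X)  # prints: 7/2 91/6 35/12
```

Using exact rationals rather than floats means the output matches the hand computation digit for digit.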

**Problem 1.2**

We now compute the mean and variance of $Y$. The calculation of finding the joint distribution and then the marginal distribution of $Y$ is tedious and has been done in this previous post. We do not take that approach here. Instead, we find the unconditional mean $E[Y]$ by weighting the conditional means $E[Y \mid X=x]$. The weights are the probabilities $P[X=x]$. The following is the idea:

$\displaystyle E[Y]=\sum \limits_{x=1}^{6} E[Y \mid X=x] \ P[X=x]$

We have $P[X=x]=\frac{1}{6}$ for each $x=1,2,\ldots,6$. Before we do the weighting, we need some items about the conditional distribution $Y \mid X=x$. Since $Y \mid X=x$ has a binomial distribution with $n=x$ and $p=\frac{1}{4}$, we have:

$\displaystyle E[Y \mid X=x]=\frac{x}{4} \ \ \ \ \ \ \ \ Var(Y \mid X=x)=x \cdot \frac{1}{4} \cdot \frac{3}{4}=\frac{3x}{16}$

For any random variable $W$, $Var(W)=E[W^2]-E[W]^2$, and thus $E[W^2]=Var(W)+E[W]^2$. The following is the second moment of $Y \mid X=x$, which is needed in calculating the unconditional variance $Var(Y)$:

$\displaystyle E[Y^2 \mid X=x]=\frac{3x}{16}+\left(\frac{x}{4}\right)^2=\frac{3x}{16}+\frac{x^2}{16}$

We can now do the weighting to get the items of the variable $Y$:

$\displaystyle E[Y]=\sum \limits_{x=1}^{6} \frac{x}{4} \cdot \frac{1}{6}=\frac{1}{4} \ E[X]=\frac{7}{8}$

$\displaystyle E[Y^2]=\sum \limits_{x=1}^{6} \left(\frac{3x}{16}+\frac{x^2}{16}\right) \frac{1}{6}=\frac{3}{16} \ E[X]+\frac{1}{16} \ E[X^2]=\frac{21}{32}+\frac{91}{96}=\frac{77}{48}$

$\displaystyle Var(Y)=E[Y^2]-E[Y]^2=\frac{77}{48}-\left(\frac{7}{8}\right)^2=\frac{161}{192}$
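The weighting above can be checked mechanically. The sketch below assumes, as in this discussion of Problem 1, that $Y \mid X=x \sim \text{binomial}(x,\frac{1}{4})$:

```python
from fractions import Fraction

# Assumption carried over from Problem 1: Y | X = x ~ binomial(x, 1/4).
p_die = Fraction(1, 6)   # P[X = x] for each face x
p_succ = Fraction(1, 4)  # binomial success probability

mean_Y = Fraction(0)
second_Y = Fraction(0)
for x in range(1, 7):
    cond_mean = p_succ * x                    # E[Y | X = x] = x/4
    cond_var = x * p_succ * (1 - p_succ)      # Var(Y | X = x) = 3x/16
    cond_second = cond_var + cond_mean ** 2   # E[Y^2 | X = x]
    mean_Y += p_die * cond_mean               # weight by P[X = x]
    second_Y += p_die * cond_second

var_Y = second_Y - mean_Y ** 2
print(mean_Y, second_Y, var_Y)  # prints: 7/8 77/48 161/192
```

The loop is exactly the weighting argument: conditional moments, each multiplied by $P[X=x]=\frac{1}{6}$ and summed.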

**Problem 1.3**

The following is the definition of the covariance of $X$ and $Y$:

$\displaystyle Cov(X,Y)=E[(X-\mu_X)(Y-\mu_Y)]$

where $\mu_X=E[X]$ and $\mu_Y=E[Y]$.

The definition can be simplified as:

$\displaystyle Cov(X,Y)=E[XY]-\mu_X \ \mu_Y$

To compute $E[XY]$, we can use the joint probability function of $X$ and $Y$. But this is tedious. Anyone who wants to try can go to this previous post to obtain the joint distribution.

Note that the conditional mean $E[Y \mid X=x]=\frac{x}{4}$ is a linear function of $x$. It is a well known result in probability and statistics that whenever the conditional mean $E[Y \mid X=x]$ is a linear function of $x$, the conditional mean can be written as:

$\displaystyle E[Y \mid X=x]=\mu_Y+\rho \ \frac{\sigma_Y}{\sigma_X} \ (x-\mu_X) \ \ \ \ \ \ \ \ (1)$

where $\mu$ is the mean of the respective variable, $\sigma$ is the standard deviation of the respective variable and $\rho$ is the correlation coefficient. The following relates the correlation coefficient with the covariance:

$\displaystyle \rho=\frac{Cov(X,Y)}{\sigma_X \ \sigma_Y} \ \ \ \ \ \ \ \ (2)$
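The linearity claim can be verified with exact arithmetic. The sketch below takes the moments computed in this discussion ($\mu_X=7/2$, $Var(X)=35/12$, $\mu_Y=7/8$) and the covariance $Cov(X,Y)=35/48$ as given, and checks that the linear form $\mu_Y+\rho \frac{\sigma_Y}{\sigma_X}(x-\mu_X)$ reproduces $E[Y \mid X=x]=\frac{x}{4}$:

```python
from fractions import Fraction

# Moments assumed from the discussion (Y | X = x ~ binomial(x, 1/4)):
mu_X, var_X = Fraction(7, 2), Fraction(35, 12)
mu_Y = Fraction(7, 8)
cov = Fraction(35, 48)

# The slope rho * sigma_Y / sigma_X equals Cov(X,Y) / Var(X),
# so no square roots are needed for an exact check.
slope = cov / var_X
for x in range(1, 7):
    assert mu_Y + slope * (x - mu_X) == Fraction(x, 4)  # E[Y | X = x] = x/4
print(slope)  # prints: 1/4
```

Working with the slope $Cov(X,Y)/Var(X)$ instead of $\rho \sigma_Y/\sigma_X$ keeps everything rational, so the equality test is exact rather than approximate.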

Comparing $E[Y \mid X=x]=\frac{x}{4}$ and $(1)$, we have $\rho \ \frac{\sigma_Y}{\sigma_X}=\frac{1}{4}$ and $\mu_Y=\frac{1}{4} \ \mu_X$.

Equating $(2)$ and $\rho \ \frac{\sigma_Y}{\sigma_X}=\frac{1}{4}$, we have $Cov(X,Y)=\rho \ \sigma_X \ \sigma_Y=\frac{1}{4} \ \sigma_X^2$. Thus we deduce that $Cov(X,Y)$ is one-fourth of the variance of $X$. Using $Var(X)=\frac{35}{12}$, we have:

$\displaystyle Cov(X,Y)=\frac{1}{4} \cdot \frac{35}{12}=\frac{35}{48}$

Plugging the items $Cov(X,Y)=\frac{35}{48}$, $\sigma_X=\sqrt{35/12}$ and $\sigma_Y=\sqrt{161/192}$ into $(2)$, we obtain $\rho=\frac{5}{\sqrt{115}} \approx 0.466$. Both $Cov(X,Y)$ and $\rho$ are positive, an indication that both variables move together: when one increases, the other tends to increase as well. This makes sense based on the definition of the variables. For example, when the value of the die is large, the number of trials for $Y$ is greater (hence a larger mean).
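The "tedious" computation through the joint probability function can be delegated to a short brute-force script. The sketch below (again assuming $Y \mid X=x \sim \text{binomial}(x,\frac{1}{4})$, as in this discussion) enumerates the joint distribution and confirms the covariance and correlation obtained by the shortcut:

```python
from fractions import Fraction
from math import comb, sqrt

# Assumption carried over from Problem 1: Y | X = x ~ binomial(x, 1/4).
p_die, p_succ = Fraction(1, 6), Fraction(1, 4)

E_X = E_Y = E_XY = E_X2 = E_Y2 = Fraction(0)
for x in range(1, 7):
    for y in range(0, x + 1):
        # P[X = x, Y = y] = P[X = x] * C(x, y) * p^y * (1 - p)^(x - y)
        pr = p_die * comb(x, y) * p_succ**y * (1 - p_succ)**(x - y)
        E_X += pr * x
        E_Y += pr * y
        E_XY += pr * x * y
        E_X2 += pr * x * x
        E_Y2 += pr * y * y

cov = E_XY - E_X * E_Y                      # Cov(X,Y) = E[XY] - E[X]E[Y]
var_X = E_X2 - E_X**2
var_Y = E_Y2 - E_Y**2
rho = float(cov) / sqrt(float(var_X * var_Y))
print(cov, rho)  # prints: 35/48 and approximately 0.466
```

The exact covariance $\frac{35}{48}$ agrees with the one-fourth-of-$Var(X)$ shortcut, which is a useful cross-check that the linear-conditional-mean argument was applied correctly.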

