This is an excellent problem on the joint distribution of the random variables $X$ and $Y$ where both variables are discrete. The focus is on calculation as well as on the intuitive understanding of joint distributions.

Usually a joint distribution is defined by specifying the joint probability function. That is, the joint distribution is defined by specifying $P(X=x,Y=y)$ for all possible values of $x$ and $y$. The joint distribution presented here is defined by the distribution of $X$ (the value of a roll of a die) and the conditional distribution $Y \mid X=x$, which is declared to be a binomial distribution with $n=x$ and $p=\frac{1}{2}$. From this definition, the joint probability function is derived. Once the joint probability function is known, the marginal distribution of $Y$ is obtained by summing out the $x$. The inverted conditional distribution $P(X=x \mid Y=y)$ is made possible by way of Bayes' theorem.

**Problem 1**

Let $X$ be the value of one roll of a fair die. If the value of the die is $x$, we are given that $Y$ has a binomial distribution with $n=x$ and $p=\frac{1}{2}$ (we use the notation binomial($x$, $\frac{1}{2}$) to denote this binomial distribution).

- Compute the conditional binomial distributions $P(Y=y \mid X=x)$ where $x=1,2,3,4,5,6$.
- Discuss how the joint probability function $P(X=x,Y=y)$ is computed for $x=1,\dots,6$ and $y=0,1,\dots,x$.
- Compute the marginal probability function of $Y$ and the mean and variance of $Y$.
- Compute $P(X=x \mid Y=y)$ for all applicable $x$ and $y$.

Readers are encouraged to take out pencil and paper and work Problem 1. Problem 2 is found at the end of the post for additional practice.

A similar problem is also found in this post.

**Discussion of Problem 1**

This is an example of a joint distribution that is constructed by taking the product of a conditional distribution and a marginal distribution. The marginal distribution of $X$ is a uniform distribution on the set $\{1,2,3,4,5,6\}$ (rolling a fair die). Conditional on $X=x$, $Y$ has a binomial($x$, $\frac{1}{2}$) distribution. Think of the conditional variable $Y \mid X=x$ as tossing a coin $x$ times where the probability of a head is $\frac{1}{2}$. The following is the sample space of the joint distribution of $X$ and $Y$.

**Figure 1**

There are 27 points in Figure 1. Each column of points in Figure 1 is a binomial distribution conditional on that x-value. It will be helpful to first nail down the conditional distributions.

**Problem 1.1 – conditional binomial distributions**

The following shows the calculation of the conditional binomial distributions $P(Y=y \mid X=x)$ for each value of $x$.

**(1)**….. $\displaystyle P(Y=y \mid X=1)=\binom{1}{y} \biggl(\frac{1}{2}\biggr)^{1} \ \ \ \ \ y=0,1$

**(2)**….. $\displaystyle P(Y=y \mid X=2)=\binom{2}{y} \biggl(\frac{1}{2}\biggr)^{2} \ \ \ \ \ y=0,1,2$

**(3)**….. $\displaystyle P(Y=y \mid X=3)=\binom{3}{y} \biggl(\frac{1}{2}\biggr)^{3} \ \ \ \ \ y=0,1,2,3$

**(4)**….. $\displaystyle P(Y=y \mid X=4)=\binom{4}{y} \biggl(\frac{1}{2}\biggr)^{4} \ \ \ \ \ y=0,1,\dots,4$

**(5)**….. $\displaystyle P(Y=y \mid X=5)=\binom{5}{y} \biggl(\frac{1}{2}\biggr)^{5} \ \ \ \ \ y=0,1,\dots,5$

**(6)**….. $\displaystyle P(Y=y \mid X=6)=\binom{6}{y} \biggl(\frac{1}{2}\biggr)^{6} \ \ \ \ \ y=0,1,\dots,6$
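As a sanity check, the six conditional distributions can be tabulated with exact fractions. The following is a quick sketch (in Python, assuming the success probability $p=\frac{1}{2}$ used throughout Problem 1):

```python
from fractions import Fraction
from math import comb

def cond_pmf(x, y, p=Fraction(1, 2)):
    """P(Y = y | X = x), the binomial(x, p) probability."""
    return comb(x, y) * p**y * (1 - p)**(x - y)

# each conditional distribution sums to 1 over y = 0, 1, ..., x
for x in range(1, 7):
    assert sum(cond_pmf(x, y) for y in range(x + 1)) == 1
```

For instance, `cond_pmf(4, 2)` returns `Fraction(3, 8)`, matching $\binom{4}{2}(\frac{1}{2})^4=\frac{6}{16}$.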

**Problem 1.2 – joint probability function**

The joint probability function of $X$ and $Y$ may be written as:

**(7)**….. $\displaystyle P(X=x,Y=y)=P(X=x) \cdot P(Y=y \mid X=x)=\frac{1}{6} \binom{x}{y} \biggl(\frac{1}{2}\biggr)^{x} \ \ \ \ \ x=1,\dots,6; \ y=0,1,\dots,x$

Thus the probability at each point in Figure 1 is the product of $P(X=x)$, which is $\frac{1}{6}$, and the conditional probability $P(Y=y \mid X=x)$, which is binomial. In other words, $P(X=x,Y=y)$ is derived by multiplying the binomial probability (calculated above) by $\frac{1}{6}$. For example, the following diagram and equation demonstrate the calculation at one such point, say $P(X=4,Y=2)$.

**Figure 2**

**(7a)**….. $\displaystyle P(X=4,Y=2)=\frac{1}{6} \cdot \binom{4}{2} \biggl(\frac{1}{2}\biggr)^{4}=\frac{1}{6} \cdot \frac{6}{16}=\frac{1}{16}$
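The product rule in (7) is easy to verify numerically. The sketch below (Python, again assuming $p=\frac{1}{2}$) builds all 27 joint probabilities and checks that they sum to 1:

```python
from fractions import Fraction
from math import comb

P = Fraction(1, 2)  # assumed coin probability for Problem 1

def joint_pmf(x, y):
    """P(X = x, Y = y) = (1/6) times the binomial(x, P) pmf at y."""
    return Fraction(1, 6) * comb(x, y) * P**y * (1 - P)**(x - y)

# the 27 points of the sample space carry total probability 1
assert sum(joint_pmf(x, y) for x in range(1, 7) for y in range(x + 1)) == 1
```

Here `joint_pmf(4, 2)` returns `Fraction(1, 16)`, in agreement with the worked example above.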

**Problem 1.3 – marginal distribution**

To find the marginal probability $P(Y=y)$, we sum the joint probabilities over all $x$ to sum out the $x$. For example, $P(Y=2)$ is the sum of $P(X=x,Y=2)$ for all $x=2,3,4,5,6$.

**Figure 3**

As indicated in (7), each $P(X=x,Y=y)$ is the product of a conditional probability and $\frac{1}{6}$. Thus the probability indicated in Figure 3 can be translated as:

**(8)**….. $\displaystyle P(Y=2)=\frac{1}{6} \bigl[P(Y=2 \mid X=2)+P(Y=2 \mid X=3)+P(Y=2 \mid X=4)+P(Y=2 \mid X=5)+P(Y=2 \mid X=6)\bigr]$

We now begin the calculation.

**(9)**….. $\displaystyle P(Y=0)=\frac{1}{6} \biggl[\frac{1}{2}+\frac{1}{4}+\frac{1}{8}+\frac{1}{16}+\frac{1}{32}+\frac{1}{64}\biggr]=\frac{63}{384}$

**(10)**….. $\displaystyle P(Y=1)=\frac{1}{6} \biggl[\frac{1}{2}+\frac{2}{4}+\frac{3}{8}+\frac{4}{16}+\frac{5}{32}+\frac{6}{64}\biggr]=\frac{120}{384}$

**(11)**….. $\displaystyle P(Y=2)=\frac{1}{6} \biggl[\frac{1}{4}+\frac{3}{8}+\frac{6}{16}+\frac{10}{32}+\frac{15}{64}\biggr]=\frac{99}{384}$

**(12)**….. $\displaystyle P(Y=3)=\frac{1}{6} \biggl[\frac{1}{8}+\frac{4}{16}+\frac{10}{32}+\frac{20}{64}\biggr]=\frac{64}{384}$

**(13)**….. $\displaystyle P(Y=4)=\frac{1}{6} \biggl[\frac{1}{16}+\frac{5}{32}+\frac{15}{64}\biggr]=\frac{29}{384}$

**(14)**….. $\displaystyle P(Y=5)=\frac{1}{6} \biggl[\frac{1}{32}+\frac{6}{64}\biggr]=\frac{8}{384}$

**(15)**….. $\displaystyle P(Y=6)=\frac{1}{6} \cdot \frac{1}{64}=\frac{1}{384}$
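These seven marginal probabilities can be reproduced mechanically by summing the joint probabilities. A short sketch (Python, under the same $p=\frac{1}{2}$ assumption):

```python
from fractions import Fraction
from math import comb

def joint_pmf(x, y):
    # (1/6) times the binomial(x, 1/2) probability of y
    return Fraction(1, 6) * comb(x, y) * Fraction(1, 2)**x

# marginal of Y: sum the joint probabilities over the applicable x
marginal = {y: sum(joint_pmf(x, y) for x in range(1, 7) if x >= y)
            for y in range(7)}

assert marginal[2] == Fraction(99, 384)   # matches (11)
assert sum(marginal.values()) == 1        # the marginal pmf sums to 1
```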

The following is the calculation of the mean and variance of .

**(16)**….. $\displaystyle E(Y)=\sum \limits_{y=0}^{6} y \ P(Y=y)=\frac{672}{384}=\frac{7}{4}$

**(17)**….. $\displaystyle E(Y^2)=\sum \limits_{y=0}^{6} y^2 \ P(Y=y)=\frac{1792}{384}=\frac{14}{3}$

**(18)**….. $\displaystyle Var(Y)=E(Y^2)-E(Y)^2=\frac{14}{3}-\frac{49}{16}=\frac{77}{48}$
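The mean and variance can also be cross-checked against the law of total expectation and the law of total variance, since $E(Y)=E(X) \cdot p$ and $Var(Y)=E(X) \, p(1-p)+p^2 \, Var(X)$ for this construction. A Python sketch ($p=\frac{1}{2}$ assumed):

```python
from fractions import Fraction
from math import comb

def marg(y):
    # marginal pmf of Y, using P(Y = y | X = x) = C(x, y) (1/2)^x
    return sum(Fraction(1, 6) * comb(x, y) * Fraction(1, 2)**x
               for x in range(1, 7))

mean = sum(y * marg(y) for y in range(7))
var = sum(y * y * marg(y) for y in range(7)) - mean**2

# for a fair die, E(X) = 7/2 and Var(X) = 35/12
assert mean == Fraction(7, 2) / 2                        # E(Y) = 7/4
assert var == Fraction(7, 2) / 4 + Fraction(35, 12) / 4  # Var(Y) = 77/48
```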

**Problem 1.4 – the backward conditional distribution**

The conditional probability $P(Y=y \mid X=x)$ is easy to compute since it is given that $Y$ is a binomial variable conditional on a value of $X$. Now we want to find the backward probability $P(X=x \mid Y=y)$. Given that the binomial observation is $Y=y$, what is the probability that the roll of the die is $X=x$? This is an application of Bayes' theorem. We can start by looking at Figure 3 once more.

Consider $P(X=x \mid Y=2)$. In calculating this conditional probability, we only consider the 5 sample points encircled in Figure 3 and disregard all the other points. These 5 points become a new sample space, if you will (this is the essence of conditional probability and conditional distributions). The sum of the joint probabilities for these 5 points is $P(Y=2)=\frac{99}{384}$, calculated in the previous step. The conditional probability $P(X=x \mid Y=2)$ is simply the probability of one of these 5 points as a fraction of the total probability $\frac{99}{384}$. Thus we have:

**(19)**….. $\displaystyle P(X=x \mid Y=2)=\frac{P(X=x,Y=2)}{P(Y=2)}=\frac{\frac{1}{6} \ P(Y=2 \mid X=x)}{\sum \limits_{t=2}^{6} \frac{1}{6} \ P(Y=2 \mid X=t)}$

We do not have to evaluate the components that go into (19). As a practical matter, to find $P(X=x \mid Y=2)$ we take each of the 5 probabilities shown in (11) and evaluate it as a fraction of the total probability $\frac{99}{384}$. Thus we have:

**Calculation of $P(X=x \mid Y=2)$**

**(20a)**….. $\displaystyle P(X=2 \mid Y=2)=\frac{16/384}{99/384}=\frac{16}{99}$

**(20b)**….. $\displaystyle P(X=3 \mid Y=2)=\frac{24/384}{99/384}=\frac{24}{99}$

**(20c)**….. $\displaystyle P(X=4 \mid Y=2)=\frac{24/384}{99/384}=\frac{24}{99}$

**(20d)**….. $\displaystyle P(X=5 \mid Y=2)=\frac{20/384}{99/384}=\frac{20}{99}$

**(20e)**….. $\displaystyle P(X=6 \mid Y=2)=\frac{15/384}{99/384}=\frac{15}{99}$
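The same five posterior probabilities fall out of a direct Bayes computation: take the joint probabilities along the row $y=2$ and divide each by their sum. A Python sketch ($p=\frac{1}{2}$ assumed):

```python
from fractions import Fraction
from math import comb

# joint probabilities along the row y = 2
row = {x: Fraction(1, 6) * comb(x, 2) * Fraction(1, 2)**x
       for x in range(2, 7)}
p_y2 = sum(row.values())                 # P(Y = 2) = 99/384

# Bayes: each joint probability as a fraction of the total
posterior = {x: row[x] / p_y2 for x in row}

assert posterior[2] == Fraction(16, 99)
assert sum(posterior.values()) == 1      # the posterior sums to 1
```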

Here's the rest of the Bayes calculations:

**Calculation of $P(X=x \mid Y=1)$**

**(21a)**….. $\displaystyle P(X=1 \mid Y=1)=\frac{32/384}{120/384}=\frac{32}{120}$

**(21b)**….. $\displaystyle P(X=2 \mid Y=1)=\frac{32/384}{120/384}=\frac{32}{120}$

**(21c)**….. $\displaystyle P(X=3 \mid Y=1)=\frac{24/384}{120/384}=\frac{24}{120}$

**(21d)**….. $\displaystyle P(X=4 \mid Y=1)=\frac{16/384}{120/384}=\frac{16}{120}$

**(21e)**….. $\displaystyle P(X=5 \mid Y=1)=\frac{10/384}{120/384}=\frac{10}{120}$

**(21f)**….. $\displaystyle P(X=6 \mid Y=1)=\frac{6/384}{120/384}=\frac{6}{120}$

**Calculation of $P(X=x \mid Y=0)$**

**(22a)**….. $\displaystyle P(X=1 \mid Y=0)=\frac{32/384}{63/384}=\frac{32}{63}$

**(22b)**….. $\displaystyle P(X=2 \mid Y=0)=\frac{16/384}{63/384}=\frac{16}{63}$

**(22c)**….. $\displaystyle P(X=3 \mid Y=0)=\frac{8/384}{63/384}=\frac{8}{63}$

**(22d)**….. $\displaystyle P(X=4 \mid Y=0)=\frac{4/384}{63/384}=\frac{4}{63}$

**(22e)**….. $\displaystyle P(X=5 \mid Y=0)=\frac{2/384}{63/384}=\frac{2}{63}$

**(22f)**….. $\displaystyle P(X=6 \mid Y=0)=\frac{1/384}{63/384}=\frac{1}{63}$

**Calculation of $P(X=x \mid Y=2)$ – done earlier**

**Calculation of $P(X=x \mid Y=3)$**

**(23a)**….. $\displaystyle P(X=3 \mid Y=3)=\frac{8/384}{64/384}=\frac{8}{64}$

**(23b)**….. $\displaystyle P(X=4 \mid Y=3)=\frac{16/384}{64/384}=\frac{16}{64}$

**(23c)**….. $\displaystyle P(X=5 \mid Y=3)=\frac{20/384}{64/384}=\frac{20}{64}$

**(23d)**….. $\displaystyle P(X=6 \mid Y=3)=\frac{20/384}{64/384}=\frac{20}{64}$

**Calculation of $P(X=x \mid Y=4)$**

**(24a)**….. $\displaystyle P(X=4 \mid Y=4)=\frac{4/384}{29/384}=\frac{4}{29}$

**(24b)**….. $\displaystyle P(X=5 \mid Y=4)=\frac{10/384}{29/384}=\frac{10}{29}$

**(24c)**….. $\displaystyle P(X=6 \mid Y=4)=\frac{15/384}{29/384}=\frac{15}{29}$

**Calculation of $P(X=x \mid Y=5)$**

**(25a)**….. $\displaystyle P(X=5 \mid Y=5)=\frac{2/384}{8/384}=\frac{2}{8}$

**(25b)**….. $\displaystyle P(X=6 \mid Y=5)=\frac{6/384}{8/384}=\frac{6}{8}$

**Calculation of $P(X=x \mid Y=6)$**

**(26)**….. $\displaystyle P(X=6 \mid Y=6)=\frac{1/384}{1/384}=1$
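All seven backward distributions can be generated in one pass, which also confirms that each posterior distribution sums to 1. A Python sketch ($p=\frac{1}{2}$ assumed):

```python
from fractions import Fraction
from math import comb

def joint(x, y):
    # joint pmf from (7): (1/6) * C(x, y) * (1/2)^x
    return Fraction(1, 6) * comb(x, y) * Fraction(1, 2)**x

# P(X = x | Y = y) for every attainable y, via Bayes' theorem
posterior = {}
for y in range(7):
    xs = list(range(max(y, 1), 7))       # x must be at least max(y, 1)
    p_y = sum(joint(x, y) for x in xs)   # marginal P(Y = y)
    posterior[y] = {x: joint(x, y) / p_y for x in xs}

assert posterior[6][6] == 1              # observing Y = 6 forces X = 6
assert all(sum(row.values()) == 1 for row in posterior.values())
```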

**Problem 2**

Let $X$ be the value of one roll of a fair die. If the value of the die is $x$, we are given that $Y$ has a binomial distribution with $n=x$ and a given success probability $p$ (we use the notation binomial($x$, $p$) to denote this binomial distribution).

- Compute the conditional binomial distributions $P(Y=y \mid X=x)$ where $x=1,2,3,4,5,6$.
- Discuss how the joint probability function $P(X=x,Y=y)$ is computed for $x=1,\dots,6$ and $y=0,1,\dots,x$.
- Compute the marginal probability function of $Y$ and the mean and variance of $Y$.
- Compute $P(X=x \mid Y=y)$ for all applicable $x$ and $y$.

**Continuations**

The practice problems presented here are continued in the next post – calculating covariance and correlation coefficient.

A similar problem is also found in this post.

**Answers to Problem 2**

**Problem 2.3**

**Problem 2.4**


2012-2019 – Dan Ma