Monthly Archives: January 2012

An Example on Calculating Covariance

Problem 1
Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{4} (we use the notation Y \lvert X=x \sim \text{binom}(x,\frac{1}{4})).

  1. Compute the mean and variance of X.
  2. Compute the mean and variance of Y.
  3. Compute the covariance Cov(X,Y) and the correlation coefficient \rho.

Problem 2
Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{2} (we use the notation Y \lvert X=x \sim \text{binom}(x,\frac{1}{2})).

  1. Compute the mean and variance of X.
  2. Compute the mean and variance of Y.
  3. Compute the covariance Cov(X,Y) and the correlation coefficient \rho.

Problem 2 is left as an exercise.

_________________________________________________________
Discussion of Problem 1

The joint variables X and Y are identical to the ones in this previous post. However, we do not follow the approach in that post, which is to first find the probability function of the joint distribution and then the marginal distribution of Y. Calculating the covariance in Problem 1.3 by that approach would be very tedious.

Problem 1.1
We start with the easiest part, which is the random variable X (the roll of the die). The variance is computed by Var(X)=E(X^2)-E(X)^2.

\displaystyle (1) \ \ \ \ \ E(X)=\frac{1}{6} \biggl[1+2+3+4+5+6 \biggr]=\frac{21}{6}=3.5

\displaystyle (2) \ \ \ \ \ E(X^2)=\frac{1}{6} \biggl[1^2+2^2+3^2+4^2+5^2+6^2 \biggr]=\frac{91}{6}

\displaystyle (3) \ \ \ \ \ Var(X)=\frac{91}{6}-\biggl[\frac{21}{6}\biggr]^2=\frac{105}{36}=\frac{35}{12}

Problem 1.2

We now compute the mean and variance of Y. Finding the joint distribution and then the marginal distribution of Y is tedious and has been done in this previous post. We do not take that approach here. Instead, we find the unconditional mean E(Y) by weighting the conditional means E(Y \lvert X=x), using the probabilities P(X=x) as weights. The following is the idea.

\displaystyle \begin{aligned}(4) \ \ \ \ \  E(Y)&=E_X[E(Y \lvert X=x)] \\&= E(Y \lvert X=1) \times P(X=1) \\&+ E(Y \lvert X=2) \times P(X=2)\\&+ E(Y \lvert X=3)  \times P(X=3) \\&+ E(Y \lvert X=4)  \times P(X=4) \\&+E(Y \lvert X=5)  \times P(X=5) \\&+E(Y \lvert X=6)  \times P(X=6) \end{aligned}

We have P(X=x)=\frac{1}{6} for each x. Before we do the weighting, we need a few facts about the conditional distribution Y \lvert X=x. Since Y \lvert X=x has a binomial distribution, we have:

\displaystyle (5) \ \ \ \ \ E(Y \lvert X=x)=\frac{1}{4} \ x

\displaystyle (6) \ \ \ \ \ Var(Y \lvert X=x)=\frac{1}{4} \ \frac{3}{4} \ x=\frac{3}{16} \ x

For any random variable W, Var(W)=E(W^2)-E(W)^2 and E(W^2)=Var(W)+E(W)^2. The following is the second moment of Y \lvert X=x, which is needed in calculating the unconditional variance Var(Y).

\displaystyle \begin{aligned}(7) \ \ \ \ \ E(Y^2 \lvert X=x)&=\frac{3}{16} \ x+\biggl[\frac{1}{4} \ x \biggr]^2 \\&=\frac{3x}{16}+\frac{x^2}{16} \\&=\frac{3x+x^2}{16}  \end{aligned}

We can now do the weighting to obtain the mean, second moment, and variance of Y.

\displaystyle \begin{aligned}(8) \ \ \ \ \  E(Y)&=\frac{1}{6} \biggl[\frac{1}{4} +\frac{2}{4}+\frac{3}{4}+ \frac{4}{4}+\frac{5}{4}+\frac{6}{4}\biggr] \\&=\frac{7}{8} \\&=0.875  \end{aligned}

\displaystyle \begin{aligned}(9) \ \ \ \ \  E(Y^2)&=\frac{1}{6} \biggl[\frac{3(1)+1^2}{16} +\frac{3(2)+2^2}{16}+\frac{3(3)+3^2}{16} \\&+ \frac{3(4)+4^2}{16}+\frac{3(5)+5^2}{16}+\frac{3(6)+6^2}{16}\biggr] \\&=\frac{154}{96} \\&=\frac{77}{48}  \end{aligned}

\displaystyle \begin{aligned}(10) \ \ \ \ \  Var(Y)&=E(Y^2)-E(Y)^2 \\&=\frac{77}{48}-\biggl[\frac{7}{8}\biggr]^2 \\&=\frac{161}{192} \\&=0.8385 \end{aligned}
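As a quick sanity check (the code and names below are ours, not part of the original solution), the weighting in (8)-(10) can be reproduced exactly with Python's fractions module, using the binomial facts in (5)-(7):

```python
from fractions import Fraction

# Condition on the die roll X, where Y | X=x ~ binomial(x, 1/4), so that
# E(Y|X=x) = x/4 and E(Y^2|X=x) = x*p*(1-p) + (x*p)^2 = (3x + x^2)/16.
p = Fraction(1, 4)
E_Y  = sum(Fraction(1, 6) * x * p for x in range(1, 7))
E_Y2 = sum(Fraction(1, 6) * (x * p * (1 - p) + (x * p) ** 2) for x in range(1, 7))
Var_Y = E_Y2 - E_Y ** 2

print(E_Y, E_Y2, Var_Y)   # 7/8 77/48 161/192
```

Using exact fractions avoids any rounding and confirms the values in (8), (9), and (10).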

Problem 1.3

The following is the definition of covariance of X and Y:

\displaystyle (11) \ \ \ \ \ Cov(X,Y)=E[(X-\mu_X)(Y-\mu_Y)]

where \mu_X=E(X) and \mu_Y=E(Y).

The definition (11) can be simplified as:

\displaystyle (12) \ \ \ \ \ Cov(X,Y)=E[XY]-E[X] E[Y]

To compute E[XY], we could use the joint probability function of X and Y, but this is tedious. Anyone who wants to try can go to this previous post to obtain the joint distribution.

Note that the conditional mean E(Y \lvert X=x)=\frac{x}{4} is a linear function of x. It is a well known result in probability and statistics that whenever a conditional mean E(Y \lvert X=x) is a linear function of x, the conditional mean can be written as:

\displaystyle (13) \ \ \ \ \ E(Y \lvert X=x)=\mu_Y+\rho \ \frac{\sigma_Y}{\sigma_X} \ (x-\mu_X)

where \mu is the mean of the respective variable, \sigma is the standard deviation of the respective variable and \rho is the correlation coefficient. The following relates the correlation coefficient with the covariance.

\displaystyle (14) \ \ \ \ \ \rho=\frac{Cov(X,Y)}{\sigma_X \ \sigma_Y}

Comparing (5) and (13), we have \displaystyle \rho \frac{\sigma_Y}{\sigma_X}=\frac{1}{4} and

\displaystyle (15) \ \ \ \ \  \rho = \frac{\sigma_X}{4 \ \sigma_Y}

Equating (14) and (15), we have Cov(X,Y)=\frac{\sigma_X^2}{4}. Thus we deduce that Cov(X,Y) is one-fourth of the variance of X. Using (3), we have:

\displaystyle (16) \ \ \ \ \  Cov(X,Y) = \frac{1}{4} \times \frac{35}{12}=\frac{35}{48}=0.72917

Plugging (3), (10), and (16) into (14), we obtain \rho=0.46625. Both \rho and Cov(X,Y) are positive, an indication that the two variables move together: when one increases, the other tends to increase. This makes sense given the definition of the variables. For example, when the value of the die is large, the number of trials for Y is greater (hence a larger mean).
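The shortcut behind (16) can also be checked directly: by conditioning, E[XY]=E[X \cdot E(Y \lvert X)]=E[X^2]/4, so the covariance follows without the joint distribution. A small sketch (variable names are ours):

```python
from fractions import Fraction
import math

# E[XY] = E[ X * E(Y|X) ] = E[X^2]/4, since E(Y|X=x) = x/4.
E_X   = Fraction(21, 6)
E_X2  = Fraction(91, 6)
Var_X = E_X2 - E_X ** 2            # 35/12, as in (3)
E_Y   = Fraction(7, 8)             # from (8)
Var_Y = Fraction(161, 192)         # from (10)

E_XY = E_X2 / 4                    # 91/24
cov  = E_XY - E_X * E_Y            # should equal Var_X / 4 = 35/48
rho  = float(cov) / math.sqrt(float(Var_X) * float(Var_Y))

print(cov)                         # 35/48
print(round(rho, 5))               # 0.46625
```

Note that cov equals Var_X / 4 exactly, confirming the deduction from (14) and (15).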


An Example of a Joint Distribution

Problem 1
Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{4} (we use the notation Y \lvert X=x \sim \text{binom}(x,\frac{1}{4})).

  1. Discuss how the joint probability function P[X=x,Y=y] is computed for x=1,2,3,4,5,6 and y=0,1, \cdots, x.
  2. Compute the conditional binomial distributions Y \lvert X=x where x=1,2,3,4,5,6.
  3. Compute the marginal probability function of Y and the mean and variance of Y.
  4. Compute P(X=x \lvert Y=y) for all applicable x and y.

_____________________________________________________________
Discussion of Problem 1

Problem 2 is found at the end of the post.

Problem 1.1
This is an example of a joint distribution constructed by taking the product of conditional distributions and a marginal distribution. The marginal distribution of X is a uniform distribution on the set \left\{1,2,3,4,5,6 \right\} (rolling a fair die). Conditional on X=x, Y has a binomial distribution \text{binom}(x,\frac{1}{4}). Think of Y \lvert X=x as tossing a coin x times where the probability of a head is p=\frac{1}{4}. The following is the sample space of the joint distribution of X and Y.

Figure 1

The joint probability function of X and Y may be written as:

\displaystyle (1) \ \ \ \ \ P(X=x,Y=y)=P(Y=y \lvert X=x) \times P(X=x)

Thus the probability at each point in Figure 1 is the product of P(X=x), which is \frac{1}{6}, and the conditional probability P(Y=y \lvert X=x), which is binomial. For example, the following diagram and equation demonstrate the calculation of P(X=4,Y=3).

Figure 2

\displaystyle \begin{aligned}(1a) \ \ \ \ \ P(X=4,Y=3)&=P(Y=3 \lvert X=4) \times P(X=4) \\&=\binom{4}{3} \biggl[\frac{1}{4}\biggr]^3 \biggl[\frac{3}{4}\biggr]^1 \times \frac{1}{6} \\&=\frac{12}{256} \times \frac{1}{6}=\frac{1}{128}  \end{aligned}
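The joint probability function (1) can be tabulated exactly in a few lines of Python (the function name is ours, purely illustrative):

```python
from fractions import Fraction
from math import comb

# P(X=x, Y=y) = P(Y=y | X=x) * P(X=x), a binomial pmf times 1/6.
def joint(x, y, p=Fraction(1, 4)):
    return comb(x, y) * p**y * (1 - p)**(x - y) * Fraction(1, 6)

# The binomial factor is 12/256; multiplying by 1/6 gives the joint probability.
print(joint(4, 3))   # 1/128
```

Summing joint(x, y) over all points in the sample space of Figure 1 gives 1, a useful check that the joint probabilities are assembled correctly.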

Problem 1.2
The following shows the calculation of the binomial distributions.

\displaystyle \begin{aligned} (2) \ \ \ Y \lvert X=1 \ \ \ \ \ &P(Y=0 \lvert X=1)=\frac{3}{4} \\&P(Y=1 \lvert X=1)=\frac{1}{4} \end{aligned}

\displaystyle \begin{aligned} (3) \ \ \ Y \lvert X=2 \ \ \ \ \ &P(Y=0 \lvert X=2)=\binom{2}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^2=\frac{9}{16} \\&P(Y=1 \lvert X=2)=\binom{2}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^1=\frac{6}{16} \\&P(Y=2 \lvert X=2)=\binom{2}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{16} \end{aligned}

\displaystyle \begin{aligned} (4) \ \ \ Y \lvert X=3 \ \ \ \ \ &P(Y=0 \lvert X=3)=\binom{3}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^3=\frac{27}{64} \\&P(Y=1 \lvert X=3)=\binom{3}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^2=\frac{27}{64} \\&P(Y=2 \lvert X=3)=\binom{3}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^1=\frac{9}{64} \\&P(Y=3 \lvert X=3)=\binom{3}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{64} \end{aligned}

\displaystyle \begin{aligned} (5) \ \ \ Y \lvert X=4 \ \ \ \ \ &P(Y=0 \lvert X=4)=\binom{4}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^4=\frac{81}{256} \\&P(Y=1 \lvert X=4)=\binom{4}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^3=\frac{108}{256} \\&P(Y=2 \lvert X=4)=\binom{4}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^2=\frac{54}{256} \\&P(Y=3 \lvert X=4)=\binom{4}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^1=\frac{12}{256} \\&P(Y=4 \lvert X=4)=\binom{4}{4} \biggl(\frac{1}{4}\biggr)^4 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{256} \end{aligned}

\displaystyle \begin{aligned} (6) \ \ \ Y \lvert X=5 \ \ \ \ \ &P(Y=0 \lvert X=5)=\binom{5}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^5=\frac{243}{1024} \\&P(Y=1 \lvert X=5)=\binom{5}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^4=\frac{405}{1024} \\&P(Y=2 \lvert X=5)=\binom{5}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^3=\frac{270}{1024} \\&P(Y=3 \lvert X=5)=\binom{5}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^2=\frac{90}{1024} \\&P(Y=4 \lvert X=5)=\binom{5}{4} \biggl(\frac{1}{4}\biggr)^4 \biggl(\frac{3}{4}\biggr)^1=\frac{15}{1024} \\&P(Y=5 \lvert X=5)=\binom{5}{5} \biggl(\frac{1}{4}\biggr)^5 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{1024} \end{aligned}

\displaystyle \begin{aligned} (7) \ \ \ Y \lvert X=6 \ \ \ \ \ &P(Y=0 \lvert X=6)=\binom{6}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^6=\frac{729}{4096} \\&P(Y=1 \lvert X=6)=\binom{6}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^5=\frac{1458}{4096} \\&P(Y=2 \lvert X=6)=\binom{6}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^4=\frac{1215}{4096} \\&P(Y=3 \lvert X=6)=\binom{6}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^3=\frac{540}{4096} \\&P(Y=4 \lvert X=6)=\binom{6}{4} \biggl(\frac{1}{4}\biggr)^4 \biggl(\frac{3}{4}\biggr)^2=\frac{135}{4096} \\&P(Y=5 \lvert X=6)=\binom{6}{5} \biggl(\frac{1}{4}\biggr)^5 \biggl(\frac{3}{4}\biggr)^1=\frac{18}{4096} \\&P(Y=6 \lvert X=6)=\binom{6}{6} \biggl(\frac{1}{4}\biggr)^6 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{4096} \end{aligned}

Problem 1.3
To find the marginal probability P(Y=y), we sum P(X=x,Y=y) over all x. For example, P(Y=2) is the sum of P(X=x,Y=2) for x=2,3,4,5,6. See the following diagram.

Figure 3

As indicated in (1), each P(X=x,Y=2) is the product of a conditional probability P(Y=y \lvert X=x) and P(X=x)=\frac{1}{6}. Thus the probability indicated in Figure 3 can be translated as:

\displaystyle \begin{aligned}(8) \ \ \ \ \ P(Y=2)&=\sum \limits_{x=2}^6 P(Y=2 \lvert X=x) P(X=x)  \end{aligned}

We now begin the calculation.

\displaystyle \begin{aligned}(9) \ \ \ \ \ P(Y=0)&=\sum \limits_{x=1}^6 P(Y=0 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{3}{4}+\frac{9}{16}+\frac{27}{64} \\&+ \ \ \ \frac{81}{256}+\frac{243}{1024}+\frac{729}{4096} \biggr] \\&=\frac{10101}{24576} \end{aligned}

\displaystyle \begin{aligned}(10) \ \ \ \ \ P(Y=1)&=\sum \limits_{x=1}^6 P(Y=1 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{4}+\frac{6}{16}+\frac{27}{64} \\&+ \ \ \ \frac{108}{256}+\frac{405}{1024}+\frac{1458}{4096} \biggr] \\&=\frac{9094}{24576} \end{aligned}

\displaystyle \begin{aligned}(11) \ \ \ \ \ P(Y=2)&=\sum \limits_{x=2}^6 P(Y=2 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{16}+\frac{9}{64} \\&+ \ \ \ \frac{54}{256}+\frac{270}{1024}+\frac{1215}{4096} \biggr] \\&=\frac{3991}{24576} \end{aligned}

\displaystyle \begin{aligned}(12) \ \ \ \ \ P(Y=3)&=\sum \limits_{x=3}^6 P(Y=3 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{64} \\&+ \ \ \ \frac{12}{256}+\frac{90}{1024}+\frac{540}{4096} \biggr] \\&=\frac{1156}{24576} \end{aligned}

\displaystyle \begin{aligned}(13) \ \ \ \ \ P(Y=4)&=\sum \limits_{x=4}^6 P(Y=4 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{256}+\frac{15}{1024}+\frac{135}{4096} \biggr] \\&=\frac{211}{24576} \end{aligned}

\displaystyle \begin{aligned}(14) \ \ \ \ \ P(Y=5)&=\sum \limits_{x=5}^6 P(Y=5 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{1024}+\frac{18}{4096} \biggr] \\&=\frac{22}{24576} \end{aligned}

\displaystyle \begin{aligned}(15) \ \ \ \ \ P(Y=6)&=\sum \limits_{x=6}^6 P(Y=6 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{4096} \biggr] \\&=\frac{1}{24576} \end{aligned}

The following is the calculation of the mean and variance of Y.

\displaystyle \begin{aligned}(16) \ \ \ \ \ E(Y)&=\frac{10101}{24576} \times 0+\frac{9094}{24576} \times 1+\frac{3991}{24576} \times 2  \\&+ \ \ \ \ \frac{1156}{24576} \times 3+\frac{211}{24576} \times 4+\frac{22}{24576} \times 5 \\&+ \ \ \ \ \frac{1}{24576} \times 6  \\&=\frac{21504}{24576}\\&=0.875 \end{aligned}

\displaystyle \begin{aligned}(17) \ \ \ \ \ E(Y^2)&=\frac{10101}{24576} \times 0+\frac{9094}{24576} \times 1+\frac{3991}{24576} \times 2^2  \\&+ \ \ \ \ \frac{1156}{24576} \times 3^2+\frac{211}{24576} \times 4^2+\frac{22}{24576} \times 5^2 \\&+ \ \ \ \ \frac{1}{24576} \times 6^2  \\&=\frac{39424}{24576}\\&=\frac{77}{48} \end{aligned}

\displaystyle (18) \ \ \ \ \ Var(Y)=\frac{77}{48}-0.875^2=\frac{161}{192}=0.8385
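The marginal distribution in (9)-(15) and the moments in (16)-(18) can be verified exactly with a short script (names are ours):

```python
from fractions import Fraction
from math import comb

# P(Y=y | X=x): binomial pmf with n=x, p=1/4 (math.comb returns 0 when y > x).
def cond(x, y, p=Fraction(1, 4)):
    return comb(x, y) * p**y * (1 - p)**(x - y)

# Marginal: P(Y=y) = sum over x of P(Y=y | X=x) * 1/6, as in (8).
marginal = {y: sum(Fraction(1, 6) * cond(x, y) for x in range(1, 7))
            for y in range(7)}

print(marginal[2])                                   # 3991/24576
E_Y   = sum(y * q for y, q in marginal.items())
Var_Y = sum(y * y * q for y, q in marginal.items()) - E_Y ** 2
print(E_Y, Var_Y)                                    # 7/8 161/192
```

The marginal probabilities sum to 1, and the mean and variance agree with the conditioning shortcut used in the covariance post.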

Problem 1.4
The conditional probability P(Y=y \lvert X=x) is easy to compute since it is given that Y is a binomial variable conditional on the value of X. Now we want to find the backward probability P(X=x \lvert Y=y): given the binomial observation Y=y, what is the probability that the roll of the die is X=x? This is an application of Bayes' theorem. We can start by looking at Figure 3 once more.

Consider P(X=x \lvert Y=2). In calculating this conditional probability, we only consider the 5 sample points encircled in Figure 3 and disregard all the other points. These 5 points become a new sample space if you will (this is the essence of conditional probability and conditional distribution). The sum of the joint probability P(X=x,Y=y) for these 5 points is P(Y=2), calculated in the previous step. The conditional probability P(X=x \lvert Y=2) is simply the probability of one of these 5 points as a fraction of the total probability P(Y=2). Thus we have:

\displaystyle \begin{aligned}(19) \ \ \ \ \ P(X=x \lvert Y=2)&=\frac{P(X=x,Y=2)}{P(Y=2)} \end{aligned}

We do not have to evaluate the components that go into (19). As a practical matter, to find P(X=x \lvert Y=2), take each of the 5 probabilities shown in (11) and evaluate it as a fraction of the total probability P(Y=2). Thus we have:

Calculation of \mathbf{P(X=x \lvert Y=2)}
\displaystyle \begin{aligned}(20a) \ \ \ \ \ P(X=2 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{16}}{\displaystyle \frac{3991}{24576}} =\frac{256}{3991} \end{aligned}

\displaystyle \begin{aligned}(20b) \ \ \ \ \ P(X=3 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{9}{64}}{\displaystyle \frac{3991}{24576}} =\frac{576}{3991} \end{aligned}

\displaystyle \begin{aligned}(20c) \ \ \ \ \ P(X=4 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{54}{256}}{\displaystyle \frac{3991}{24576}} =\frac{864}{3991} \end{aligned}

\displaystyle \begin{aligned}(20d) \ \ \ \ \ P(X=5 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{270}{1024}}{\displaystyle \frac{3991}{24576}} =\frac{1080}{3991} \end{aligned}

\displaystyle \begin{aligned}(20e) \ \ \ \ \ P(X=6 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{1215}{4096}}{\displaystyle \frac{3991}{24576}} =\frac{1215}{3991} \end{aligned}

Here’s the rest of the Bayes’ calculation:

Calculation of \mathbf{P(X=x \lvert Y=0)}
\displaystyle \begin{aligned}(21a) \ \ \ \ \ P(X=1 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{3}{4}}{\displaystyle \frac{10101}{24576}} =\frac{3072}{10101} \end{aligned}

\displaystyle \begin{aligned}(21b) \ \ \ \ \ P(X=2 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{9}{16}}{\displaystyle \frac{10101}{24576}} =\frac{2304}{10101} \end{aligned}

\displaystyle \begin{aligned}(21c) \ \ \ \ \ P(X=3 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{27}{64}}{\displaystyle \frac{10101}{24576}} =\frac{1728}{10101} \end{aligned}

\displaystyle \begin{aligned}(21d) \ \ \ \ \ P(X=4 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{81}{256}}{\displaystyle \frac{10101}{24576}} =\frac{1296}{10101} \end{aligned}

\displaystyle \begin{aligned}(21e) \ \ \ \ \ P(X=5 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{243}{1024}}{\displaystyle \frac{10101}{24576}} =\frac{972}{10101} \end{aligned}

\displaystyle \begin{aligned}(21f) \ \ \ \ \ P(X=6 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{729}{4096}}{\displaystyle \frac{10101}{24576}} =\frac{729}{10101} \end{aligned}

Calculation of \mathbf{P(X=x \lvert Y=1)}
\displaystyle \begin{aligned}(22a) \ \ \ \ \ P(X=1 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{4}}{\displaystyle \frac{9094}{24576}} =\frac{1024}{9094} \end{aligned}

\displaystyle \begin{aligned}(22b) \ \ \ \ \ P(X=2 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{6}{16}}{\displaystyle \frac{9094}{24576}} =\frac{1536}{9094} \end{aligned}

\displaystyle \begin{aligned}(22c) \ \ \ \ \ P(X=3 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{27}{64}}{\displaystyle \frac{9094}{24576}} =\frac{1728}{9094} \end{aligned}

\displaystyle \begin{aligned}(22d) \ \ \ \ \ P(X=4 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{108}{256}}{\displaystyle \frac{9094}{24576}} =\frac{1728}{9094} \end{aligned}

\displaystyle \begin{aligned}(22e) \ \ \ \ \ P(X=5 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{405}{1024}}{\displaystyle \frac{9094}{24576}} =\frac{1620}{9094} \end{aligned}

\displaystyle \begin{aligned}(22f) \ \ \ \ \ P(X=6 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{1458}{4096}}{\displaystyle \frac{9094}{24576}} =\frac{1458}{9094} \end{aligned}

Calculation of \mathbf{P(X=x \lvert Y=2)} done earlier

Calculation of \mathbf{P(X=x \lvert Y=3)}
\displaystyle \begin{aligned}(23a) \ \ \ \ \ P(X=3 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{64}}{\displaystyle \frac{1156}{24576}} =\frac{64}{1156} \end{aligned}

\displaystyle \begin{aligned}(23b) \ \ \ \ \ P(X=4 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{12}{256}}{\displaystyle \frac{1156}{24576}} =\frac{192}{1156} \end{aligned}

\displaystyle \begin{aligned}(23c) \ \ \ \ \ P(X=5 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{90}{1024}}{\displaystyle \frac{1156}{24576}} =\frac{360}{1156} \end{aligned}

\displaystyle \begin{aligned}(23d) \ \ \ \ \ P(X=6 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{540}{4096}}{\displaystyle \frac{1156}{24576}} =\frac{540}{1156} \end{aligned}

Calculation of \mathbf{P(X=x \lvert Y=4)}
\displaystyle \begin{aligned}(24a) \ \ \ \ \ P(X=4 \lvert Y=4)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{256}}{\displaystyle \frac{211}{24576}} =\frac{16}{211} \end{aligned}

\displaystyle \begin{aligned}(24b) \ \ \ \ \ P(X=5 \lvert Y=4)&=\frac{\displaystyle \frac{1}{6} \times \frac{15}{1024}}{\displaystyle \frac{211}{24576}} =\frac{60}{211} \end{aligned}

\displaystyle \begin{aligned}(24c) \ \ \ \ \ P(X=6 \lvert Y=4)&=\frac{\displaystyle \frac{1}{6} \times \frac{135}{4096}}{\displaystyle \frac{211}{24576}} =\frac{135}{211} \end{aligned}

Calculation of \mathbf{P(X=x \lvert Y=5)}
\displaystyle \begin{aligned}(25a) \ \ \ \ \ P(X=5 \lvert Y=5)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{1024}}{\displaystyle \frac{22}{24576}} =\frac{4}{22} \end{aligned}

\displaystyle \begin{aligned}(25b) \ \ \ \ \ P(X=6 \lvert Y=5)&=\frac{\displaystyle \frac{1}{6} \times \frac{18}{1024}}{\displaystyle \frac{22}{24576}} =\frac{18}{22} \end{aligned}

Calculation of \mathbf{P(X=x \lvert Y=6)}
\displaystyle \begin{aligned}(26) \ \ \ \ \ P(X=6 \lvert Y=6)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{4096}}{\displaystyle \frac{1}{24576}} =1 \end{aligned}
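All of the Bayes calculations (20a) through (26) follow the same pattern: divide each joint probability by the marginal total. A compact sketch of the whole posterior computation (function names are ours):

```python
from fractions import Fraction
from math import comb

# P(Y=y | X=x): binomial pmf with n=x, p=1/4.
def cond(x, y, p=Fraction(1, 4)):
    return comb(x, y) * p**y * (1 - p)**(x - y)

# Bayes' theorem with a uniform prior on X = 1,...,6:
# P(X=x | Y=y) = joint(x, y) / P(Y=y), keeping only the possible x.
def posterior(y):
    joint = {x: Fraction(1, 6) * cond(x, y) for x in range(1, 7)}
    total = sum(joint.values())
    return {x: q / total for x, q in joint.items() if q > 0}

print(posterior(2)[2])   # 256/3991, matching (20a)
print(posterior(6)[6])   # 1, matching (26)
```

Each posterior sums to 1 by construction, which is a convenient check on equations (20a)-(26).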

_____________________________________________________________
Problem 2
Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{2} (we use the notation Y \lvert X=x \sim \text{binom}(x,\frac{1}{2})).

  1. Discuss how the joint probability function P[X=x,Y=y] is computed for x=1,2,3,4,5,6 and y=0,1, \cdots, x.
  2. Compute the conditional binomial distributions Y \lvert X=x where x=1,2,3,4,5,6.
  3. Compute the marginal probability function of Y and the mean and variance of Y.
  4. Compute P(X=x \lvert Y=y) for all applicable x and y.


_____________________________________________________________
Answers to Problem 2

Problem 2.3

\displaystyle \begin{aligned} P(Y=y): \ \ \ \ &P(Y=0)=\frac{63}{384} \\&\text{ }  \\&P(Y=1)=\frac{120}{384} \\&\text{ } \\&P(Y=2)=\frac{99}{384} \\&\text{ } \\&P(Y=3)=\frac{64}{384} \\&\text{ } \\&P(Y=4)=\frac{29}{384} \\&\text{ } \\&P(Y=5)=\frac{8}{384} \\&\text{ } \\&P(Y=6)=\frac{1}{384} \end{aligned}

\displaystyle E(Y)=\frac{7}{4}=1.75

\displaystyle Var(Y)=\frac{77}{48}

Problem 2.4

\displaystyle \begin{aligned} P(X=x \lvert Y=0): \ \ \ \ &P(X=1 \lvert Y=0)=\frac{32}{63} \\&\text{ }  \\&P(X=2 \lvert Y=0)=\frac{16}{63} \\&\text{ } \\&P(X=3 \lvert Y=0)=\frac{8}{63} \\&\text{ } \\&P(X=4 \lvert Y=0)=\frac{4}{63} \\&\text{ } \\&P(X=5 \lvert Y=0)=\frac{2}{63} \\&\text{ } \\&P(X=6 \lvert Y=0)=\frac{1}{63}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=1): \ \ \ \ &P(X=1 \lvert Y=1)=\frac{32}{120} \\&\text{ }  \\&P(X=2 \lvert Y=1)=\frac{32}{120} \\&\text{ } \\&P(X=3 \lvert Y=1)=\frac{24}{120} \\&\text{ } \\&P(X=4 \lvert Y=1)=\frac{16}{120} \\&\text{ } \\&P(X=5 \lvert Y=1)=\frac{10}{120} \\&\text{ } \\&P(X=6 \lvert Y=1)=\frac{6}{120}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=2): \ \ \ \ &P(X=2 \lvert Y=2)=\frac{16}{99} \\&\text{ } \\&P(X=3 \lvert Y=2)=\frac{24}{99} \\&\text{ } \\&P(X=4 \lvert Y=2)=\frac{24}{99} \\&\text{ } \\&P(X=5 \lvert Y=2)=\frac{20}{99} \\&\text{ } \\&P(X=6 \lvert Y=2)=\frac{15}{99}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=3): \ \ \ \ &P(X=3 \lvert Y=3)=\frac{8}{64} \\&\text{ } \\&P(X=4 \lvert Y=3)=\frac{16}{64} \\&\text{ } \\&P(X=5 \lvert Y=3)=\frac{20}{64} \\&\text{ } \\&P(X=6 \lvert Y=3)=\frac{20}{64}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=4): \ \ \ \ &P(X=4 \lvert Y=4)=\frac{4}{29} \\&\text{ } \\&P(X=5 \lvert Y=4)=\frac{10}{29} \\&\text{ } \\&P(X=6 \lvert Y=4)=\frac{15}{29}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=5): \ \ \ \ &P(X=5 \lvert Y=5)=\frac{2}{8} \\&\text{ } \\&P(X=6 \lvert Y=5)=\frac{6}{8}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=6): \ \ \ \ &P(X=6 \lvert Y=6)=1  \end{aligned}
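The Problem 2 answers come from the same machinery with p=\frac{1}{2}. A quick check (names are ours; note that Python reduces fractions, so 63/384 displays as 21/128, but equality comparisons still hold):

```python
from fractions import Fraction
from math import comb

# Same setup as Problem 1, but Y | X=x ~ binomial(x, 1/2).
p = Fraction(1, 2)

def cond(x, y):
    return comb(x, y) * p**y * (1 - p)**(x - y)

marginal = {y: sum(Fraction(1, 6) * cond(x, y) for x in range(1, 7))
            for y in range(7)}

E_Y   = sum(y * q for y, q in marginal.items())
Var_Y = sum(y * y * q for y, q in marginal.items()) - E_Y ** 2
print(E_Y, Var_Y)   # 7/4 77/48
```

Interestingly, Var(Y)=\frac{77}{48} for both p=\frac{1}{4} and p=\frac{1}{2}, even though the means differ.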

How to Pick Binomial Trials

This post provides additional practice for the ideas discussed in this blog post Picking Two Types of Binomial Trials.

Problem 1
Suppose there are two basketball players, each makes 50% of her free throws. In one game, player A attempted 10 free throws and player B attempted 15 free throws. Assume that the free throws of each player are independent of each other. Suppose you are told that in this game, 8 of their free throws were hits. Given this information:

  1. What is the probability that player A made 4 of the hits?
  2. What is the mean number of hits made by player A?
  3. What is the variance of the number of hits made by player A?
  4. What is the probability that player B made 5 of the hits?
  5. What is the mean number of hits made by player B?
  6. What is the variance of the number of hits made by player B?

Problem 2
A student took two multiple choice statistics quizzes that were independent of each other, i.e., results of one quiz did not affect the results on the other. One quiz had 8 questions and the other quiz had 10 questions. Each question had 5 choices and only one of the choices was correct. The student did not study. So she answered each question by random guessing. If the student was told that she had 5 correct answers in the two quizzes:

  1. What is the probability that the student answered 3 or more questions correctly in the first quiz?
  2. What is the mean number of correct answers in the first quiz?
  3. What is the variance of the number of correct answers in the first quiz?
  4. What is the probability that the student answered at most 3 questions correctly in the second quiz?
  5. What is the mean number of correct answers in the second quiz?
  6. What is the variance of the number of correct answers in the second quiz?

Refer to this post to find the background information.


__________________________________________________________
Answers

Problem 1

\displaystyle (1) \ \ \ \ \frac{286650}{1081575}=0.265

\displaystyle (2) \ \ \ \ \frac{3461040}{1081575}=3.2

\displaystyle (3) \ \ \ \ 1.36

\displaystyle (4) \ \ \ \ \frac{360360}{1081575}=0.33318

\displaystyle (5) \ \ \ \ 4.8

\displaystyle (6) \ \ \ \ 1.36

Problem 2

\displaystyle (1) \ \ \ \ \frac{3276}{8568}=0.3824

\displaystyle (2) \ \ \ \ \frac{19040}{8568}=2.22

\displaystyle (3) \ \ \ \ \frac{1300}{1377}=0.944

\displaystyle (4) \ \ \ \ \frac{6636}{8568}=0.7745

\displaystyle (5) \ \ \ \ \frac{23800}{8568}=2.78

\displaystyle (6) \ \ \ \ \frac{1300}{1377}=0.944
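The key fact behind both problems (discussed in the linked post) is that when two independent binomials share the same success probability p, conditioning on the total number of successes yields a hypergeometric distribution that does not depend on p. A sketch (function name is ours):

```python
from fractions import Fraction
from math import comb

# P(A made k successes | t total) = C(nA, k) C(nB, t-k) / C(nA+nB, t),
# the hypergeometric pmf; the common p cancels out.
def hyper_pmf(k, nA, nB, t):
    return Fraction(comb(nA, k) * comb(nB, t - k), comb(nA + nB, t))

# Problem 1: player A attempted 10, player B attempted 15, 8 total hits.
print(float(hyper_pmf(4, 10, 15, 8)))                        # ≈ 0.265
mean = sum(k * hyper_pmf(k, 10, 15, 8) for k in range(9))    # 8 * 10/25 = 3.2
print(float(mean))

# Problem 2: quizzes of 8 and 10 questions, 5 correct answers in total.
p1 = sum(hyper_pmf(k, 8, 10, 5) for k in range(3, 6))        # ≈ 0.3824
```

The mean 8 \times \frac{10}{25}=3.2 matches answer (2), illustrating the standard hypergeometric mean formula.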

Mixing Binomial Distributions

Consider the following problems. We work Problem 1. Problem 2 is left as an exercise.

Problem 1
Suppose that the following is the probability function of the random variable X.

\displaystyle (1) \ \ \ \ \ P(X=x)=0.5 \binom{4}{x} \biggl[\frac{1}{5}\biggr]^x \biggl[\frac{4}{5}\biggr]^{4-x}+0.5 \binom{4}{x} \biggl[\frac{4}{5}\biggr]^x \biggl[\frac{1}{5}\biggr]^{4-x}

where x=0,1,2,3,4.

Evaluate the mean and variance of X.

_____________________________________________________________
Problem 2
Suppose that the following is the probability function of the random variable X.

\displaystyle (2) \ \ \ \ \ P(X=x)=0.6 \times f_1(x)+0.3 \times f_2(x)+0.1 \times f_3(x)

where f_1(x) is the probability function of the binomial distribution with n=5 trials and probability of success p=0.1, f_2(x) is the probability function of the binomial distribution with n=5 trials and probability of success p=0.4, and f_3(x) is the probability function of the binomial distribution with n=5 trials and probability of success p=0.6 .

Evaluate the mean and variance of X.

Answers for Problem 2 are found at the end of the post.
_____________________________________________________________
Discussion of Problem 1
The probability function (1) is the weighted average of two binomial probability functions. The weights are 0.5 and 0.5. Thus the probability distribution of X is said to be the mixture of two binomial distributions with equal mixing weights.

One interpretation of the probability function (1) is that the underlying phenomenon can be one of two phenomena. As an actuarial science example, suppose that a block of insurance policies is divided into two groups, roughly equal in size. One group is a low risk group. It has a low claim frequency (the probability of a policyholder in this group having a claim in a given year is 0.2=\frac{1}{5}). The other group is a high risk group. It has a high claim frequency (the probability of a policyholder having a claim is 0.8=\frac{4}{5}). Suppose you pick a policyholder at random from this block. What is the expected number of claims in a year from this randomly chosen insured? What is the variance of the number of claims?

Since the probability function (1) is a weighted average of binomial distributions, the mean and other higher moments are the weighted average of the binomial means and higher moments. However, as you will see below, the variance of the mixture is not the weighted average of the binomial variances.

Before we work the problem, we need some preliminary facts about binomial distributions. Suppose Y has a binomial distribution with parameters n and p. This fact is denoted by the notation Y \sim \text{binom}(n,p). Then the mean and variance of Y are E(Y)=n p and Var(Y)=n p (1-p), respectively. Since Var(Y)=E(Y^2)-E(Y)^2, it follows that:

\displaystyle \begin{aligned}(3) \ \ \ \ \ E(Y^2)&=Var(Y) + E(Y)^2 \\&=n p (1-p) + (n p)^2  \end{aligned}

Now the calculation:

\displaystyle (4) \ \ \ \ \ E(X)=0.5 \biggl(4 \cdot \frac{1}{5}\biggr) + 0.5 \biggl(4 \cdot \frac{4}{5}\biggr)=2

\displaystyle \begin{aligned}(5) \ \ \ \ \ E(X^2)&=0.5 \biggl[4 \cdot \frac{1}{5} \cdot \frac{4}{5} + \biggl(4 \cdot \frac{1}{5} \biggr)^2 \biggr]\\&+ \ \ \ \ \ 0.5 \biggl[4 \cdot \frac{4}{5} \cdot \frac{1}{5}+\biggl(4 \cdot \frac{4}{5} \biggr)^2 \biggr] \\&=\frac{152}{25}=6.08  \end{aligned}

The idea for (4) and (5) is that the mean and the second moment of X are the weighted averages of the means and second moments of the binomial distributions. The following is the variance of X:

\displaystyle \begin{aligned}(6) \ \ \ \ \ Var(X)&=\frac{152}{25}-2^2=\frac{52}{25}=2.08  \end{aligned}

We use the insurance example indicated earlier to interpret the unconditional variance in (6). The two binomial distributions in the probability function (1) are conditional distributions (conditional on which group of insureds the randomly chosen policyholder comes from, high risk or low risk). To put the result (6) into perspective, note that both of these binomial distributions have the same variance, namely 4 \cdot \frac{1}{5} \cdot \frac{4}{5}=0.64. Yet the unconditional variance Var(X) is much higher than 0.64. The difference 2.08-0.64=1.44 is the additional variance due to the uncertainty in the risk parameter of the insured (the uncertainty about which group the randomly chosen policyholder comes from).

The two binomial distributions in the probability function (1) are conditional distributions indexed by a parameter variable that is implicit in (1). For example, when the randomly chosen policyholder is from the low risk group, the parameter is \theta=1 and the number of claims follows \text{binom}(4,\frac{1}{5}). When the randomly chosen policyholder is from the high risk group, the parameter is \theta=2 and the number of claims follows \text{binom}(4,\frac{4}{5}). The uncertainty in the risk parameter \theta has the effect of increasing the unconditional variance of the mixture.

The increase in variance is a key characteristic of mixture distributions. Whenever a probability distribution is the mixture of conditional distributions, the uncertainty in the parameter variable always has the effect of increasing the unconditional variance of the mixture. In the insurance example, the uncertainty of the risk characteristics of the insureds across the entire block is reflected in the higher unconditional variance (as demonstrated in Problem 1).
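The decomposition described above is the law of total variance: the unconditional variance splits into the average within-group variance plus the variance between the group means. A sketch for Problem 1 (variable names are ours):

```python
from fractions import Fraction

# Equal-weight mixture of binom(4, 1/5) and binom(4, 4/5), as in (1).
components = [(Fraction(1, 2), 4, Fraction(1, 5)),
              (Fraction(1, 2), 4, Fraction(4, 5))]

E_X   = sum(w * n * p for w, n, p in components)                       # (4)
E_X2  = sum(w * (n * p * (1 - p) + (n * p) ** 2) for w, n, p in components)  # (5)
Var_X = E_X2 - E_X ** 2                                                # (6)
print(E_X, Var_X)   # 2 52/25

# Law of total variance: Var(X) = E[Var(X|group)] + Var(E[X|group]).
within  = sum(w * n * p * (1 - p) for w, n, p in components)      # 0.64
between = sum(w * (n * p - E_X) ** 2 for w, n, p in components)   # 1.44
assert within + between == Var_X
```

Here within is the 0.64 shared by both binomial components and between is the extra 1.44 contributed by the uncertainty in the risk parameter, exactly the split discussed above.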

_____________________________________________________________
See the following blog posts for more detailed discussion of mixture distributions.

An example of a mixture

The variance of a mixture

_____________________________________________________________
Answers for Problem 2
E(X)=1.2
Var(X)=1.56

Two Practice Problems on the Standard Normal Distribution

This post presents two practice problems with calculation involving the standard normal distribution.

Problems
Let Z be a standard normal random variable.

  1. Evaluate \displaystyle E(\lvert Z \lvert)
  2. Evaluate \displaystyle E(Z^2)


We show that \displaystyle E(\lvert Z \lvert)=\sqrt{\frac{2}{\pi}}. Problem 2 is left as an exercise.

Let X=\lvert Z \lvert. The cumulative distribution function of X is F(x)=P(X \le x). We have the following.

\displaystyle \begin{aligned}(1) \ \ \ \ \  F(x)&= P(X \le x) \\&\text{ } \\&= P(\lvert Z \lvert \le x) \\&\text{ } \\&=P(-x \le Z \le x) \\&\text{ } \\&=\int_{-x}^x \frac{1}{\sqrt{2 \pi}} \ e^{-\frac{t^2}{2}} \ dt \\&\text{ } \\&=2\int_{0}^x \frac{1}{\sqrt{2 \pi}} \ e^{-\frac{t^2}{2}} \ dt \\&\text{ } \end{aligned}

Upon differentiation of this cdf, we have the probability density function (pdf) of X.

\displaystyle (2) \ \ \ \ \ f(x)=\sqrt{\frac{2}{\pi}} \ e^{-\frac{x^2}{2}}

The following is the calculation for E(X).

\displaystyle (3) \ \ \ \ \ E(X)=\sqrt{\frac{2}{\pi}} \ \int_0^\infty x \ e^{-\frac{x^2}{2}} \ dx=\sqrt{\frac{2}{\pi}}
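A quick Monte Carlo check of the result (our sketch, with an arbitrary seed): the sample mean of \lvert Z \lvert should be close to \sqrt{2/\pi} \approx 0.7979.

```python
import math
import random

# Estimate E|Z| by averaging |z| over many standard normal draws.
random.seed(0)
n = 200_000
estimate = sum(abs(random.gauss(0, 1)) for _ in range(n)) / n

print(round(estimate, 2), round(math.sqrt(2 / math.pi), 4))
```

With 200,000 draws the standard error is on the order of 0.001, so the estimate lands within about 0.01 of the exact value.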

A Problem of Rolling Six Dice

Problem
Suppose that we roll 6 fair dice (or equivalently, roll a fair die 6 times). Let X be the number of distinct faces that appear. Find the probability function P(X=k) where k=1,2,3,4,5,6.

Equivalent Problem
Suppose that we randomly assign 6 candies to 6 children (imagine that each candy is to be thrown at random to the children and is received by one of the children). What is the probability that exactly k children have been given candies, where k=1,2,3,4,5,6?

___________________________________________________________________________

Discussion
Note that both descriptions are equivalent; both are referred to as the occupancy problem in [1]. The essential fact here is that n objects are randomly assigned to m cells. The problem then asks: what is the probability that exactly k of the cells are occupied? See the following posts for more detailed discussions of the occupancy problem.

Each of these posts presents a different way of solving the occupancy problem. The first post uses a counting approach based on multinomial coefficients. The second post develops a formula for the probability that exactly k of the cells are empty.

The first approach, using multinomial coefficients, is preferred when the number of objects n and the number of cells m are relatively small (as in the problem indicated here). Otherwise, use the formula approach.

___________________________________________________________________________

Using the approach of multinomial coefficients as shown in this post (the first post indicated above), we have the following answers:

    \displaystyle P(X=1)=\frac{6}{6^6}=\frac{6}{46656}

    \displaystyle P(X=2)=\frac{930}{6^6}=\frac{930}{46656}

    \displaystyle P(X=3)=\frac{10800}{6^6}=\frac{10800}{46656}

    \displaystyle P(X=4)=\frac{23400}{6^6}=\frac{23400}{46656}

    \displaystyle P(X=5)=\frac{10800}{6^6}=\frac{10800}{46656}

    \displaystyle P(X=6)=\frac{720}{6^6}=\frac{720}{46656}
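Since the sample space has only 6^6=46656 outcomes, the counts above can also be verified by brute-force enumeration (a few lines of Python; names are ours):

```python
from itertools import product
from collections import Counter

# Enumerate all 6^6 rolls of six dice and count distinct faces per outcome.
counts = Counter(len(set(roll)) for roll in product(range(1, 7), repeat=6))

print(dict(sorted(counts.items())))
# {1: 6, 2: 930, 3: 10800, 4: 23400, 5: 10800, 6: 720}
```

Dividing each count by 46656 reproduces the probabilities listed above; this exhaustive check is only practical because n and m are small, which echoes the remark about when the multinomial-coefficient approach is preferred.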

For more practice problems on calculating the occupancy problem, see this post.
___________________________________________________________________________

Reference

  1. Feller, W., An Introduction to Probability Theory and its Applications, Vol. I, 3rd ed., John Wiley & Sons, Inc., New York, 1968