Tag Archives: Joint Distribution

Practice Problem Set 7 – a discrete joint distribution

The practice problems presented here deal with a discrete joint distribution that is defined by multiplying a marginal distribution and a conditional distribution – similar to the joint distribution found here and here. Thus this post provides additional practice opportunities.

Practice Problems

Let X be the value of a roll of a fair die. For X=x, suppose that Y \lvert X=x has a binomial distribution with n=4 and p=x / 10.

Practice Problem 7-A
Compute the conditional binomial distributions Y \lvert X=x where x=1,2,3,4,5,6.

Practice Problem 7-B
Calculate the joint probability function P[X=x,Y=y] for x=1,2,3,4,5,6 and y=0,1,2,3,4.

Practice Problem 7-C
Determine the probability function for the marginal distribution of Y. Calculate the mean and variance of Y.

Practice Problem 7-D
Calculate the backward conditional probabilities P[X=x \lvert Y=y] for all applicable x and y.

Problems 7-A to 7-D are similar to the ones in this previous post.

Practice Problem 7-E
Calculate the mean and variance of X.

Practice Problem 7-F
Calculate the mean and variance of Y (use the methods discussed here).

Practice Problem 7-G
Calculate the covariance \text{Cov}(X,Y) and the correlation coefficient \rho.

Problems 7-E to 7-G are similar to the ones in this previous post.

.

.

.

.

.

.

.

.

Answers

Practice Problem 7-A

    \displaystyle \begin{aligned} &P[Y=0 \lvert X=1]=0.6561 \\&P[Y=1 \lvert X=1]=0.2916 \\&P[Y=2 \lvert X=1]=0.0486 \\&P[Y=3 \lvert X=1]=0.0036 \\&P[Y=4 \lvert X=1]=0.0001 \end{aligned}

    \displaystyle \begin{aligned} &P[Y=0 \lvert X=2]=0.4096 \\&P[Y=1 \lvert X=2]=0.4096 \\&P[Y=2 \lvert X=2]=0.1536 \\&P[Y=3 \lvert X=2]=0.0256 \\&P[Y=4 \lvert X=2]=0.0016 \end{aligned}

    \displaystyle \begin{aligned} &P[Y=0 \lvert X=3]=0.2401 \\&P[Y=1 \lvert X=3]=0.4116 \\&P[Y=2 \lvert X=3]=0.2646 \\&P[Y=3 \lvert X=3]=0.0756 \\&P[Y=4 \lvert X=3]=0.0081 \end{aligned}

    \displaystyle \begin{aligned} &P[Y=0 \lvert X=4]=0.1296 \\&P[Y=1 \lvert X=4]=0.3456 \\&P[Y=2 \lvert X=4]=0.3456 \\&P[Y=3 \lvert X=4]=0.1536 \\&P[Y=4 \lvert X=4]=0.0256 \end{aligned}

    \displaystyle \begin{aligned} &P[Y=0 \lvert X=5]=0.0625 \\&P[Y=1 \lvert X=5]=0.25 \\&P[Y=2 \lvert X=5]=0.375 \\&P[Y=3 \lvert X=5]=0.25 \\&P[Y=4 \lvert X=5]=0.0625 \end{aligned}

    \displaystyle \begin{aligned} &P[Y=0 \lvert X=6]=0.0256 \\&P[Y=1 \lvert X=6]=0.1536 \\&P[Y=2 \lvert X=6]=0.3456 \\&P[Y=3 \lvert X=6]=0.3456 \\&P[Y=4 \lvert X=6]=0.1296 \end{aligned}

Practice Problem 7-B

    \displaystyle \begin{aligned} &P[Y=4,X=1]=\frac{0.0001}{6} \\&P[Y=4,X=2]=\frac{0.0016}{6} \\&P[Y=4,X=3]=\frac{0.0081}{6} \\&P[Y=4,X=4]=\frac{0.0256}{6} \\&P[Y=4,X=5]=\frac{0.0625}{6} \\&P[Y=4,X=6]=\frac{0.1296}{6} \end{aligned}

    \displaystyle \begin{aligned} &P[Y=3,X=1]=\frac{0.0036}{6} \\&P[Y=3,X=2]=\frac{0.0256}{6} \\&P[Y=3,X=3]=\frac{0.0756}{6} \\&P[Y=3,X=4]=\frac{0.1536}{6} \\&P[Y=3,X=5]=\frac{0.25}{6} \\&P[Y=3,X=6]=\frac{0.3456}{6} \end{aligned}

    \displaystyle \begin{aligned} &P[Y=2,X=1]=\frac{0.0486}{6} \\&P[Y=2,X=2]=\frac{0.1536}{6} \\&P[Y=2,X=3]=\frac{0.2646}{6} \\&P[Y=2,X=4]=\frac{0.3456}{6} \\&P[Y=2,X=5]=\frac{0.375}{6} \\&P[Y=2,X=6]=\frac{0.3456}{6} \end{aligned}

    \displaystyle \begin{aligned} &P[Y=1,X=1]=\frac{0.2916}{6} \\&P[Y=1,X=2]=\frac{0.4096}{6} \\&P[Y=1,X=3]=\frac{0.4116}{6} \\&P[Y=1,X=4]=\frac{0.3456}{6} \\&P[Y=1,X=5]=\frac{0.25}{6} \\&P[Y=1,X=6]=\frac{0.1536}{6} \end{aligned}

    \displaystyle \begin{aligned} &P[Y=0,X=1]=\frac{0.6561}{6} \\&P[Y=0,X=2]=\frac{0.4096}{6} \\&P[Y=0,X=3]=\frac{0.2401}{6} \\&P[Y=0,X=4]=\frac{0.1296}{6} \\&P[Y=0,X=5]=\frac{0.0625}{6} \\&P[Y=0,X=6]=\frac{0.0256}{6} \end{aligned}

Practice Problem 7-C

    \displaystyle \begin{aligned} &P[Y=4]=\frac{0.2275}{6} \\&P[Y=3]=\frac{0.854}{6} \\&P[Y=2]=\frac{1.533}{6} \\&P[Y=1]=\frac{1.862}{6} \\&P[Y=0]=\frac{1.5235}{6} \end{aligned}

    \displaystyle E[Y]=1.4

    \displaystyle E[Y^2]=3.22

    \displaystyle Var[Y]=1.26

Practice Problem 7-D

    \displaystyle \begin{aligned} &P[X=1 \lvert Y=0]=\frac{0.6561}{1.5235}=0.4307 \\&P[X=2 \lvert Y=0]=\frac{0.4096}{1.5235}=0.2689 \\&P[X=3 \lvert Y=0]=\frac{0.2401}{1.5235}=0.1576 \\&P[X=4 \lvert Y=0]=\frac{0.1296}{1.5235}=0.0851 \\&P[X=5 \lvert Y=0]=\frac{0.0625}{1.5235}=0.0410 \\&P[X=6 \lvert Y=0]=\frac{0.0256}{1.5235}=0.0168 \end{aligned}

    \displaystyle \begin{aligned} &P[X=1 \lvert Y=1]=\frac{0.2916}{1.862}=0.1566 \\&P[X=2 \lvert Y=1]=\frac{0.4096}{1.862}=0.2200 \\&P[X=3 \lvert Y=1]=\frac{0.4116}{1.862}=0.2211 \\&P[X=4 \lvert Y=1]=\frac{0.3456}{1.862}=0.1856 \\&P[X=5 \lvert Y=1]=\frac{0.25}{1.862}=0.1343 \\&P[X=6 \lvert Y=1]=\frac{0.1536}{1.862}=0.0825 \end{aligned}

    \displaystyle \begin{aligned} &P[X=1 \lvert Y=2]=\frac{0.0486}{1.533}=0.0317 \\&P[X=2 \lvert Y=2]=\frac{0.1536}{1.533}=0.1002 \\&P[X=3 \lvert Y=2]=\frac{0.2646}{1.533}=0.1726 \\&P[X=4 \lvert Y=2]=\frac{0.3456}{1.533}=0.2254 \\&P[X=5 \lvert Y=2]=\frac{0.375}{1.533}=0.2446 \\&P[X=6 \lvert Y=2]=\frac{0.3456}{1.533}=0.2254 \end{aligned}

    \displaystyle \begin{aligned} &P[X=1 \lvert Y=3]=\frac{0.0036}{0.854}=0.0042 \\&P[X=2 \lvert Y=3]=\frac{0.0256}{0.854}=0.0300 \\&P[X=3 \lvert Y=3]=\frac{0.0756}{0.854}=0.0885 \\&P[X=4 \lvert Y=3]=\frac{0.1536}{0.854}=0.1799 \\&P[X=5 \lvert Y=3]=\frac{0.25}{0.854}=0.2927 \\&P[X=6 \lvert Y=3]=\frac{0.3456}{0.854}=0.4047 \end{aligned}

    \displaystyle \begin{aligned} &P[X=1 \lvert Y=4]=\frac{0.0001}{0.2275}=0.0004 \\&P[X=2 \lvert Y=4]=\frac{0.0016}{0.2275}=0.0070 \\&P[X=3 \lvert Y=4]=\frac{0.0081}{0.2275}=0.0356 \\&P[X=4 \lvert Y=4]=\frac{0.0256}{0.2275}=0.1125 \\&P[X=5 \lvert Y=4]=\frac{0.0625}{0.2275}=0.2747 \\&P[X=6 \lvert Y=4]=\frac{0.1296}{0.2275}=0.5697 \end{aligned}

Practice Problem 7-E

    \displaystyle E[X]=\frac{7}{2}=3.5

    \displaystyle E[X^2]=\frac{91}{6}

    \displaystyle Var[X]=\frac{35}{12}

Practice Problem 7-F

    \displaystyle E[Y]=1.4

    \displaystyle E[Y^2]=3.22

    \displaystyle Var[Y]=1.26

Practice Problem 7-G

    \displaystyle \text{Cov}(X,Y)=\frac{7}{6}

    \displaystyle \rho=\frac{7}{6 \sqrt{3.675}}=0.60858
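The answers for 7-E, 7-F and 7-G can be reproduced by enumerating the joint probability function exactly. The following is a sketch in plain Python using exact fractions (the variable names are mine):

```python
from fractions import Fraction
from math import comb, sqrt

# Joint pmf P[X=x, Y=y] = (1/6) * C(4,y) p^y (1-p)^(4-y) with p = x/10.
def joint(x, y):
    p = Fraction(x, 10)
    return Fraction(1, 6) * comb(4, y) * p**y * (1 - p)**(4 - y)

support = [(x, y) for x in range(1, 7) for y in range(5)]

EX  = sum(x * joint(x, y) for x, y in support)        # 7/2
EY  = sum(y * joint(x, y) for x, y in support)        # 7/5 = 1.4
EX2 = sum(x * x * joint(x, y) for x, y in support)    # 91/6
EY2 = sum(y * y * joint(x, y) for x, y in support)    # 3.22
EXY = sum(x * y * joint(x, y) for x, y in support)

var_x = EX2 - EX**2          # 35/12
var_y = EY2 - EY**2          # 1.26
cov   = EXY - EX * EY        # 7/6
rho   = float(cov) / sqrt(float(var_x) * float(var_y))
```

Because every quantity is a `Fraction`, the computed moments match the stated answers exactly rather than up to rounding.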


\copyright 2019 – Dan Ma


Practice Problem Set 4 – Correlation Coefficient

This post provides practice problems to reinforce the concept of correlation coefficient discussed in this post in a companion blog. The post in the companion blog shows how to evaluate the covariance \text{Cov}(X,Y) and the correlation coefficient \rho of two continuous random variables X and Y. It also discusses the connection between \rho, the regression curve E[Y \lvert X=x] and the least squares regression line.

The structure of the practice problems found here is quite simple. Given a joint density function for a pair of random variables X and Y (with an appropriate region in the xy-plane as support), determine the following four pieces of information.

  • The covariance \text{Cov}(X,Y)
  • The correlation coefficient \rho
  • The regression curve E[Y \lvert X=x]
  • The least squares regression line y=a+b x

The least squares regression line y=a+bx has slope b and y-intercept a given by:

    \displaystyle b=\rho \ \frac{\sigma_Y}{\sigma_X}

    \displaystyle a=\mu_Y-b \ \mu_X

where \mu_X=E[X], \sigma_X^2=Var[X], \mu_Y=E[Y] and \sigma_Y^2=Var[Y].
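The two formulas can be coded directly. The following is a small sketch; the check uses the moments of Problem 4-B, which I computed separately from its density f(x,y)=\frac{1}{2} on 0<x<y<2 (they are \mu_X=2/3, \mu_Y=4/3, \sigma_X^2=\sigma_Y^2=2/9).

```python
from math import sqrt

# Slope and intercept of the least squares regression line y = a + b x,
# computed from the five moments in the formulas above.
def least_squares_line(mu_x, mu_y, var_x, var_y, rho):
    b = rho * sqrt(var_y / var_x)   # b = rho * sigma_Y / sigma_X
    a = mu_y - b * mu_x             # a = mu_Y - b * mu_X
    return a, b

# Check against Problem 4-B: rho = 1/2 and the line is y = 1 + x/2.
a, b = least_squares_line(2/3, 4/3, 2/9, 2/9, 1/2)
```

The helper recovers a=1 and b=\frac{1}{2}, matching the answer for 4-B below.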

.

For some of the problems, the regression curve E[Y \lvert X=x] coincides with the least squares regression line; this happens precisely when the regression curve is linear in x.

As mentioned, the practice problems are to reinforce the concepts discussed in this post.

.

Practice Problem 4-A
    \displaystyle f(x,y)=\frac{3}{4} \ (2-y) \ \ \ \ \ \ \ 0<x<y<2

\text{ }

Practice Problem 4-B
    \displaystyle f(x,y)=\frac{1}{2} \ \ \ \ \ \ \ \ \ \ \ \ 0<x<y<2

\text{ }

Practice Problem 4-C
    \displaystyle f(x,y)=\frac{1}{8} \ (x+y) \ \ \ \ \ \ \ \ \ 0<x<2, \ 0<y<2

\text{ }

Practice Problem 4-D
    \displaystyle f(x,y)=\frac{1}{2 \ x^2} \ \ \ \  \ \ \ \ \ \ \ \ \ 0<x<2, \ 0<y<x^2

\text{ }

Practice Problem 4-E
    \displaystyle f(x,y)=\frac{1}{2} \ (x+y) \ e^{-x-y} \ \ \ \  \ \ \ \ \ \ \ \ \ x>0, \ y>0

\text{ }

Practice Problem 4-F
    \displaystyle f(x,y)=\frac{3}{8} \ x \ \ \ \ \ \ \  \ \ \ \ \ \ \ \ \ 0<y<x<2

\text{ }

Practice Problem 4-G
    \displaystyle f(x,y)=\frac{1}{2} \ xy \ \ \ \ \ \ \  \ \ \ \ \ \ \ \ \ 0<y<x<2

\text{ }

Practice Problem 4-H
    \displaystyle f(x,y)=\frac{3}{14} \ (xy +x) \ \ \ \ \ \ \  \ \ \ \ \ \ \ \ \ 0<y<x<2

\text{ }

Practice Problem 4-I
    \displaystyle f(x,y)=\frac{3}{32} \ (x+y) \ xy \ \ \ \ \ \ \  \ \ \ \ \ \ \ \ \ 0<x<2, \ 0<y<2

\text{ }

Practice Problem 4-J
    \displaystyle f(x,y)=\frac{3y}{(x+1)^6} \ \ e^{-y/(x+1)} \ \ \ \ \ \ \  \ \ \ \ \ \ \ \ \ x>0, \ y>0

\text{ }

Practice Problem 4-K
    \displaystyle f(x,y)=\frac{y}{(x+1)^4} \ \ e^{-y/(x+1)} \ \ \ \ \ \ \  \ \ \ \ \ \ \ \ \ x>0, \ y>0

For this problem, only work on the regression curve E[Y \lvert X=x]. Note that E[X] and Var[X] do not exist.

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

Answers
4-A
  • \displaystyle \text{Cov}(X,Y)=\frac{1}{10}
  • \displaystyle \rho=\sqrt{\frac{1}{3}}=0.57735
  • \displaystyle E[Y \lvert X=x]=\frac{2 (4-3 x^2+x^3)}{3 (4- 4x+x^2)}=\frac{2 (2+x-x^2)}{3 (2-x)} \ \ \ \ \ 0<x<2
  • \displaystyle y=\frac{2}{3} \ (x+1)
4-B
  • \displaystyle \text{Cov}(X,Y)=\frac{1}{9}
  • \displaystyle \rho=\frac{1}{2}
  • \displaystyle E[Y \lvert X=x]=1+\frac{1}{2} x \ \ \ \ \ 0<x<2
  • \displaystyle y=1+\frac{1}{2} x
4-C
  • \displaystyle \text{Cov}(X,Y)=-\frac{1}{36}
  • \displaystyle \rho=-\frac{1}{11}
  • \displaystyle E[Y \lvert X=x]=\frac{x+\frac{4}{3}}{x+1} \ \ \ \ \ 0<x<2
  • \displaystyle y=\frac{14}{11}-\frac{1}{11} x
4-D
  • \displaystyle \text{Cov}(X,Y)=\frac{1}{3}
  • \displaystyle \rho=\frac{1}{2} \ \sqrt{\frac{15}{7}}=0.7319
  • \displaystyle E[Y \lvert X=x]=\frac{x^2}{2} \ \ \ \ \ 0<x<2
  • \displaystyle y=-\frac{1}{3}+ x
4-E
  • \displaystyle \text{Cov}(X,Y)=-\frac{1}{4}
  • \displaystyle \rho=-\frac{1}{7}=-0.1429
  • \displaystyle E[Y \lvert X=x]=\frac{x+2}{x+1} \ \ \ \ \ x>0
  • \displaystyle y=\frac{12}{7}-\frac{1}{7} x
4-F
  • \displaystyle \text{Cov}(X,Y)=\frac{3}{40}
  • \displaystyle \rho=\frac{3}{\sqrt{57}}=0.3974
  • \displaystyle E[Y \lvert X=x]=\frac{x}{2} \ \ \ \ \ 0<x<2
  • \displaystyle y=\frac{x}{2}
4-G
  • \displaystyle \text{Cov}(X,Y)=\frac{16}{225}
  • \displaystyle \rho=\frac{4}{\sqrt{66}}=0.4924
  • \displaystyle E[Y \lvert X=x]=\frac{2}{3} x \ \ \ \ \ 0<x<2
  • \displaystyle y=\frac{2}{3} x
4-H
  • \displaystyle \text{Cov}(X,Y)=\frac{298}{3675}
  • \displaystyle \rho=\frac{149}{3 \sqrt{12259}}=0.4486
  • \displaystyle E[Y \lvert X=x]=\frac{x (2x+3)}{3x+6}  \ \ \ \ \ 0<x<2
  • \displaystyle y=-\frac{2}{41}+\frac{149}{246} x
4-I
  • \displaystyle \text{Cov}(X,Y)=-\frac{1}{144}
  • \displaystyle \rho=-\frac{5}{139}=-0.03597
  • \displaystyle E[Y \lvert X=x]=\frac{4x+6}{3x+4}  \ \ \ \ \ 0<x<2
  • \displaystyle y=\frac{204}{139}-\frac{5}{139} x
4-J
  • \displaystyle \text{Cov}(X,Y)=\frac{3}{2}
  • \displaystyle \rho=\frac{1}{\sqrt{3}}=0.57735
  • \displaystyle E[Y \lvert X=x]=2 (x+1) \ \ \ \ \ x>0
  • \displaystyle y=2 (x+1)
4-K
  • \displaystyle E[Y \lvert X=x]=2 (x+1) \ \ \ \ \ x>0
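As a spot check on one of the problems, the moments behind Problem 4-B can be approximated numerically with a midpoint Riemann sum over the triangular support. This is a sketch only; the grid size is arbitrary and the answers are recovered up to discretization error.

```python
from math import sqrt

# Midpoint Riemann sum of g(x,y) * f(x,y) over 0 < x < y < 2, with f = 1/2.
def integrate(g, n=400):
    h = 2.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            if x < y:                       # triangular support 0 < x < y < 2
                total += g(x, y) * 0.5 * h * h
    return total

ex  = integrate(lambda x, y: x)                      # ~ 2/3
ey  = integrate(lambda x, y: y)                      # ~ 4/3
cov = integrate(lambda x, y: x * y) - ex * ey        # ~ 1/9
vx  = integrate(lambda x, y: x * x) - ex * ex
vy  = integrate(lambda x, y: y * y) - ey * ey
rho = cov / sqrt(vx * vy)                            # ~ 1/2
```

The same loop, with the density and support swapped out, can be used to spot-check any of the bounded-support problems above.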


\copyright 2018 – Dan Ma

Practice Problems for Conditional Distributions, Part 1

The following are practice problems on conditional distributions. The thought process of how to work with these practice problems can be found in the blog post Conditional Distributions, Part 1.

_____________________________________________________________________________________

Description of Problems

Suppose X and Y are independent random variables, each with a binomial distribution, with the following parameters.

    For X, number of trials n=5, success probability \displaystyle p=\frac{1}{2}

    For Y, number of trials n=5, success probability \displaystyle p=\frac{3}{4}

We can think of these random variables as the results of two students taking a multiple choice test with 5 questions. For example, let X be the number of correct answers for one student and Y be the number of correct answers for the other student. For the practice problems below, passing the test means having 3 or more correct answers.

Suppose we have some new information about the results of the test. The problems below are to derive the conditional distributions of X or Y based on the new information and to compare the conditional distributions with the unconditional distributions.

Practice Problem 1

  • New information: X<Y.
  • Derive the conditional distribution for X \lvert X<Y.
  • Derive the conditional distribution for Y \lvert X<Y.
  • Compare these conditional distributions with the unconditional ones with respect to mean and probability of passing.
  • What is the effect of the new information on the test performance of each of the students?
  • Explain why the new information has this effect on the test performance.

Practice Problem 2

  • New information: X>Y.
  • Derive the conditional distribution for X \lvert X>Y.
  • Derive the conditional distribution for Y \lvert X>Y.
  • Compare these conditional distributions with the unconditional ones with respect to mean and probability of passing.
  • What is the effect of the new information on the test performance of each of the students?
  • Explain why the new information has this effect on the test performance.

Practice Problem 3

  • New information: Y=X+1.
  • Derive the conditional distribution for X \lvert Y=X+1.
  • Derive the conditional distribution for Y \lvert Y=X+1.
  • Compare these conditional distributions with the unconditional ones with respect to mean and probability of passing.
  • What is the effect of the new information on the test performance of each of the students?
  • Explain why the new information has this effect on the test performance.

_____________________________________________________________________________________

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }
_____________________________________________________________________________________

Partial Answers

To let you know that you are on the right track, the conditional distributions are given below.

The thought process of how to work with these practice problems can be found in the blog post Conditional Distributions, Part 1.

Practice Problem 1

    \displaystyle P(X=0 \lvert X<Y)=\frac{1023}{22938}=0.0446

    \displaystyle P(X=1 \lvert X<Y)=\frac{5040}{22938}=0.2197

    \displaystyle P(X=2 \lvert X<Y)=\frac{9180}{22938}=0.4

    \displaystyle P(X=3 \lvert X<Y)=\frac{6480}{22938}=0.2825

    \displaystyle P(X=4 \lvert X<Y)=\frac{1215}{22938}=0.053

    ____________________

    \displaystyle P(Y=1 \lvert X<Y)=\frac{15}{22938}=0.0007

    \displaystyle P(Y=2 \lvert X<Y)=\frac{540}{22938}=0.0235

    \displaystyle P(Y=3 \lvert X<Y)=\frac{4320}{22938}=0.1883

    \displaystyle P(Y=4 \lvert X<Y)=\frac{10530}{22938}=0.4591

    \displaystyle P(Y=5 \lvert X<Y)=\frac{7533}{22938}=0.3284
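These fractions can be checked by enumerating the joint distribution exactly. The following sketch (plain Python with exact fractions) computes P(X<Y) and the conditional distribution of Y given X<Y; the other conditional distributions are obtained the same way.

```python
from fractions import Fraction
from math import comb

# Exact pmfs of X ~ binom(5, 1/2) and Y ~ binom(5, 3/4).
def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

px = {k: binom_pmf(5, Fraction(1, 2), k) for k in range(6)}
py = {k: binom_pmf(5, Fraction(3, 4), k) for k in range(6)}

# P(X < Y): sum the independent joint pmf over the event.
p_event = sum(px[x] * py[y] for x in range(6) for y in range(6) if x < y)

# Conditional pmf of Y given X < Y: P(Y=y, X<y) / P(X<Y).
cond_y = {y: sum(px[x] for x in range(y)) * py[y] / p_event
          for y in range(1, 6)}
```

Every numerator and the common denominator 22938 (out of 32 \times 1024 = 32768 equally weighted outcomes) fall out of this enumeration.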

Practice Problem 2

    \displaystyle P(X=1 \lvert X>Y)=\frac{5}{3886}=0.0013

    \displaystyle P(X=2 \lvert X>Y)=\frac{160}{3886}=0.0412

    \displaystyle P(X=3 \lvert X>Y)=\frac{1060}{3886}=0.2728

    \displaystyle P(X=4 \lvert X>Y)=\frac{1880}{3886}=0.4838

    \displaystyle P(X=5 \lvert X>Y)=\frac{781}{3886}=0.201

    ____________________

    \displaystyle P(Y=0 \lvert X>Y)=\frac{31}{3886}=0.008

    \displaystyle P(Y=1 \lvert X>Y)=\frac{390}{3886}=0.1004

    \displaystyle P(Y=2 \lvert X>Y)=\frac{1440}{3886}=0.3706

    \displaystyle P(Y=3 \lvert X>Y)=\frac{1620}{3886}=0.4169

    \displaystyle P(Y=4 \lvert X>Y)=\frac{405}{3886}=0.1042

Practice Problem 3

    \displaystyle P(X=0 \lvert Y=X+1)=\frac{15}{8430}=0.002

    \displaystyle P(X=1 \lvert Y=X+1)=\frac{450}{8430}=0.053

    \displaystyle P(X=2 \lvert Y=X+1)=\frac{2700}{8430}=0.32

    \displaystyle P(X=3 \lvert Y=X+1)=\frac{4050}{8430}=0.48

    \displaystyle P(X=4 \lvert Y=X+1)=\frac{1215}{8430}=0.144

    ____________________

    \displaystyle P(Y=1 \lvert Y=X+1)=\frac{15}{8430}=0.002

    \displaystyle P(Y=2 \lvert Y=X+1)=\frac{450}{8430}=0.053

    \displaystyle P(Y=3 \lvert Y=X+1)=\frac{2700}{8430}=0.32

    \displaystyle P(Y=4 \lvert Y=X+1)=\frac{4050}{8430}=0.48

    \displaystyle P(Y=5 \lvert Y=X+1)=\frac{1215}{8430}=0.144

_____________________________________________________________________________________

\copyright \ 2013 \text{ by Dan Ma}

Another Example on Calculating Covariance

In a previous post called An Example on Calculating Covariance, we calculated the covariance and correlation coefficient of a discrete joint distribution where the conditional mean E(Y \lvert X=x) is a linear function of x. In this post, we give examples in the continuous case. Problem A is worked out and Problem B is left as an exercise.

The examples presented here are also found in the post called Another Example of a Joint Distribution. Some of the needed calculations are found in this previous post.

____________________________________________________________________

Problem A
Let X be a random variable with the density function f_X(x)=\alpha^2 \ x \ e^{-\alpha x} where x>0. For each realized value X=x, the conditional variable Y \lvert X=x is uniformly distributed over the interval (0,x), denoted symbolically by Y \lvert X=x \sim U(0,x). Obtain solutions for the following:

  1. Calculate the density function, the mean and the variance for the conditional variable Y \lvert X=x.
  2. Calculate the density function, the mean and the variance for the conditional variable X \lvert Y=y.
  3. Use the fact that the conditional mean E(Y \lvert X=x) is a linear function of x to calculate the covariance Cov(X,Y) and the correlation coefficient \rho.

Problem B
Let X be a random variable with the density function f_X(x)=4 \ x^3 where 0<x<1. For each realized value X=x, the conditional variable Y \lvert X=x is uniformly distributed over the interval (0,x), denoted symbolically by Y \lvert X=x \sim U(0,x). Obtain solutions for the following:

  1. Calculate the density function, the mean and the variance for the conditional variable Y \lvert X=x.
  2. Calculate the density function, the mean and the variance for the conditional variable X \lvert Y=y.
  3. Use the fact that the conditional mean E(Y \lvert X=x) is a linear function of x to calculate the covariance Cov(X,Y) and the correlation coefficient \rho.

____________________________________________________________________

Background Results

Here’s the idea behind the calculation of correlation coefficient in this post. Suppose X and Y are jointly distributed. When the conditional mean E(Y \lvert X=x) is a linear function of x, that is, E(Y \lvert X=x)=a+bx for some constants a and b, it can be written as the following:

    \displaystyle E(Y \lvert X=x)=\mu_Y + \rho \ \frac{\sigma_Y}{\sigma_X} \ (x - \mu_X)

Here, \mu_X=E(X) and \mu_Y=E(Y). The notations \sigma_X and \sigma_Y refer to the standard deviation of X and Y, respectively. Of course, \rho refers to the correlation coefficient in the joint distribution of X and Y and is defined by:

    \displaystyle \rho=\frac{Cov(X,Y)}{\sigma_X \ \sigma_Y}

where Cov(X,Y) is the covariance of X and Y and is defined by

    Cov(X,Y)=E[(X-\mu_X) \ (Y-\mu_Y)]

or equivalently by Cov(X,Y)=E(XY)-\mu_X \mu_Y.

Just to make it clear, in the joint distribution of X and Y, if the conditional mean E(X \lvert Y=y) is a linear function of y, then we have:

    \displaystyle E(X \lvert Y=y)=\mu_X + \rho \ \frac{\sigma_X}{\sigma_Y} \ (y - \mu_Y)

____________________________________________________________________

Discussion of Problem A

Problem A-1

Since for each x, Y \lvert X=x has the uniform distribution U(0,x), we have the following:

    \displaystyle f_{Y \lvert X=x}(y \lvert x)=\frac{1}{x} for 0<y<x

    \displaystyle E(Y \lvert X=x)=\frac{x}{2}

    \displaystyle Var(Y \lvert X=x)=\frac{x^2}{12}

Problem A-2

In a previous post called Another Example of a Joint Distribution, the joint density function of X and Y is calculated to be: f_{X,Y}(x,y)=\alpha^2 \ e^{-\alpha x}. In the same post, the marginal density of Y is calculated to be: f_Y(y)=\alpha e^{-\alpha y} (exponentially distributed). Thus we have:

    \displaystyle \begin{aligned} f_{X \lvert Y=y}(x \lvert y)&=\frac{f_{X,Y}(x,y)}{f_Y(y)} \\&=\frac{\alpha^2 \ e^{-\alpha x}}{\alpha \ e^{-\alpha \ y}} \\&=\alpha \ e^{-\alpha \ (x-y)} \text{ where } y<x<\infty \end{aligned}

Thus the conditional variable X \lvert Y=y has an exponential distribution that is shifted to the right by the amount y. Thus we have:

    \displaystyle E(X \lvert Y=y)=\frac{1}{\alpha}+y

    \displaystyle Var(X \lvert Y=y)=\frac{1}{\alpha^2}

Problem A-3

To compute the covariance Cov(X,Y), one approach is to use the definition indicated above (to see this calculation, see Another Example of a Joint Distribution). Here we use the idea that the conditional mean \displaystyle E(Y \lvert X=x) is linear in x. From the previous post Another Example of a Joint Distribution, we have:

    \displaystyle \sigma_X=\frac{\sqrt{2}}{\alpha}

    \displaystyle \sigma_Y=\frac{1}{\alpha}

Plugging in \sigma_X and \sigma_Y, we have the following calculation:

    \displaystyle \rho \ \frac{\sigma_Y}{\sigma_X}=\frac{1}{2}

    \displaystyle \rho = \frac{\sigma_X}{\sigma_Y} \times \frac{1}{2}=\frac{\sqrt{2}}{2}=\frac{1}{\sqrt{2}}=0.7071

    \displaystyle Cov(X,Y)=\rho \ \sigma_X \ \sigma_Y=\frac{1}{\alpha^2}

____________________________________________________________________

Answers for Problem B

Problem B-1

    \displaystyle E(Y \lvert X=x)=\frac{x}{2}

    \displaystyle Var(Y \lvert X=x)=\frac{x^2}{12}

Problem B-2

    \displaystyle f_{X \lvert Y=y}(x \lvert y)=\frac{3 \ x^2}{1-y^3} where 0<y<1 and y<x<1

Problem B-3

    \displaystyle \rho=\frac{\sqrt{3}}{2 \ \sqrt{7}}=0.3273268

    \displaystyle Cov(X,Y)=\frac{1}{75}

____________________________________________________________________
\copyright \ 2013

Another Example of a Joint Distribution

In an earlier post called An Example of a Joint Distribution, we worked a problem involving a joint distribution that is constructed by taking the product of a conditional distribution and a marginal distribution (both discrete distributions). In this post, we work on similar problems for the continuous case. We work Problem A. Problem B is left as an exercise.

_________________________________________________________________

Problem A
Let X be a random variable with the density function f_X(x)=\alpha^2 \ x \ e^{-\alpha x} where x>0. For each realized value X=x, the conditional variable Y \lvert X=x is uniformly distributed over the interval (0,x), denoted symbolically by Y \lvert X=x \sim U(0,x). Obtain solutions for the following:

  1. Discuss the joint density function for X and Y.
  2. Calculate the marginal distribution of X, in particular the mean and variance.
  3. Calculate the marginal distribution of Y, in particular, the density function, mean and variance.
  4. Use the joint density in part A-1 to calculate the covariance Cov(X,Y) and the correlation coefficient \rho.

_________________________________________________________________

Problem B
Let X be a random variable with the density function f_X(x)=4 \ x^3 where 0<x<1. For each realized value X=x, the conditional variable Y \lvert X=x is uniformly distributed over the interval (0,x), denoted symbolically by Y \lvert X=x \sim U(0,x). Obtain solutions for the following:

  1. Discuss the joint density function for X and Y.
  2. Calculate the marginal distribution of X, in particular the mean and variance.
  3. Calculate the marginal distribution of Y, in particular, the density function, mean and variance.
  4. Use the joint density in part B-1 to calculate the covariance Cov(X,Y) and the correlation coefficient \rho.

_________________________________________________________________

Discussion of Problem A

Problem A-1

The support of the joint density function f_{X,Y}(x,y) is the unbounded lower triangle in the xy-plane (see the shaded region in green in the figure below).

Figure 1

The unbounded green region consists of vertical lines: for each x>0, y ranges from 0 to x (the red vertical line in the figure below is one such line).

Figure 2

For each point (x,y) in each vertical line, we assign a density value f_{X,Y}(x,y), a positive number. Integrated over the green region, these density values total 1.0 and describe the behavior of the variables X and Y across the region. If a realized value of X is x, then the conditional density function of Y \lvert X=x is:

    \displaystyle f_{Y \lvert X=x}(y \lvert x)=\frac{f_{X,Y}(x,y)}{f_X(x)}

Thus we have f_{X,Y}(x,y) = f_{Y \lvert X=x}(y \lvert x) \times f_X(x). In our problem at hand, the joint density function is:

    \displaystyle \begin{aligned} f_{X,Y}(x,y)&=f_{Y \lvert X=x}(y \lvert x) \times f_X(x) \\&=\frac{1}{x} \times \alpha^2 \ x \ e^{-\alpha x} \\&=\alpha^2 \ e^{-\alpha x}  \end{aligned}

As indicated above, the support of f_{X,Y}(x,y) is the region x>0 and 0<y<x (the region shaded green in the above figures).

Problem A-2

The unconditional density function of X, f_X(x)=\alpha^2 \ x \ e^{-\alpha x} (given above in the problem), is the density function of the sum of two independent exponential variables with the common density f(x)=\alpha e^{-\alpha x} (see this blog post for the derivation using the convolution method). Since X is the independent sum of two identical exponential distributions, the mean and variance of X are twice those of the underlying exponential distribution. We have:

    \displaystyle E(X)=\frac{2}{\alpha}

    \displaystyle Var(X)=\frac{2}{\alpha^2}

Problem A-3

To find the marginal density of Y, for each applicable y, we need to sum out the x. According to the following figure, for each y, we sum out all x values in a horizontal line such that y<x<\infty (see the blue horizontal line).

Figure 3

Thus we have:

    \displaystyle \begin{aligned} f_Y(y)&=\int_y^\infty f_{X,Y}(x,y) \ dx \\&=\int_y^\infty \alpha^2 \ e^{-\alpha x} \ dx \\&=\alpha \int_y^\infty \alpha \ e^{-\alpha x} \ dx \\&= \alpha e^{-\alpha y}  \end{aligned}

Thus the marginal distribution of Y is an exponential distribution. The mean and variance of Y are:

    \displaystyle E(Y)=\frac{1}{\alpha}

    \displaystyle Var(Y)=\frac{1}{\alpha^2}

Problem A-4

The covariance of X and Y is defined as Cov(X,Y)=E[(X-\mu_X) (Y-\mu_Y)], which is equivalent to:

    \displaystyle Cov(X,Y)=E(X Y)-\mu_X \mu_Y

where \mu_X=E(X) and \mu_Y=E(Y). Knowing the joint density f_{X,Y}(x,y), we can calculate Cov(X,Y) directly. We have:

    \displaystyle \begin{aligned} E(X Y)&=\int_0^\infty \int_0^x  xy \ f_{X,Y}(x,y) \ dy \ dx \\&=\int_0^\infty \int_0^x xy \ \alpha^2 \ e^{-\alpha x} \ dy \ dx \\&=\int_0^\infty \frac{\alpha^2}{2} \ x^3 \ e^{-\alpha x} \ dx \\&= \frac{3}{\alpha^2} \int_0^\infty \frac{\alpha^4}{3!} \ x^{4-1} \ e^{-\alpha x} \ dx \\&= \frac{3}{\alpha^2} \end{aligned}

Note that the integrand in the last integral in the above derivation is the density function of a gamma distribution (hence the integral is 1.0). Now the covariance of X and Y is:

    \displaystyle Cov(X,Y)=\frac{3}{\alpha^2}-\frac{2}{\alpha} \frac{1}{\alpha}=\frac{1}{\alpha^2}

The following is the calculation of the correlation coefficient:

    \displaystyle \begin{aligned} \rho&=\frac{Cov(X,Y)}{\sigma_X \ \sigma_Y} = \frac{\displaystyle \frac{1}{\alpha^2}}{\displaystyle \frac{\sqrt{2}}{\alpha} \ \frac{1}{\alpha}} \\&=\frac{1}{\sqrt{2}} = 0.7071 \end{aligned}

Even without the calculation of \rho, we know that X and Y are positively and quite strongly correlated. The conditional distribution of Y \lvert X=x is U(0,x) which increases with x. The calculation of Cov(X,Y) and \rho confirms our observation.
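The value \rho=0.7071 can also be sanity-checked by simulation. The following is a Monte Carlo sketch with \alpha=1 (the correlation does not depend on \alpha); the seed and sample size are arbitrary choices of mine.

```python
import random
from math import sqrt

random.seed(2013)
n = 200_000
xs, ys = [], []
for _ in range(n):
    # X is the sum of two independent Exp(1) variables (density x e^{-x}).
    x = random.expovariate(1.0) + random.expovariate(1.0)
    xs.append(x)
    # Y | X=x is uniform on (0, x).
    ys.append(random.uniform(0.0, x))

mx, my = sum(xs) / n, sum(ys) / n
cov = sum(x * y for x, y in zip(xs, ys)) / n - mx * my   # ~ 1/alpha^2 = 1
vx  = sum(x * x for x in xs) / n - mx * mx
vy  = sum(y * y for y in ys) / n - my * my
rho = cov / sqrt(vx * vy)                                 # ~ 0.7071
```

With 200,000 draws the sample correlation lands close to 1/\sqrt{2}, consistent with the exact calculation above.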

_________________________________________________________________

Answers for Problem B

Problem B-1

    \displaystyle f_{X,Y}(x,y)=4 \ x^2 where 0<y<x<1.

Problem B-2

    \displaystyle E(X)=\frac{4}{5}
    \displaystyle Var(X)=\frac{2}{75}

Problem B-3

    \displaystyle f_Y(y)=\frac{4}{3} \ (1- y^3) where 0<y<1

    \displaystyle E(Y)=\frac{2}{5}

    \displaystyle Var(Y)=\frac{14}{225}

Problem B-4

    \displaystyle Cov(X,Y)=\frac{1}{75}

    \displaystyle \rho = \frac{\sqrt{3}}{2 \sqrt{7}}=0.327327
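The Problem B answers can be verified without any integration by combining the law of total variance with the linear conditional mean. A sketch using exact fractions:

```python
from fractions import Fraction
from math import sqrt

# Moments of X with density 4x^3 on (0,1): E[X^k] = 4/(k+4).
EX    = Fraction(4, 5)
EX2   = Fraction(4, 6)
var_x = EX2 - EX**2                 # 2/75

# Y | X=x ~ U(0,x), so E(Y|X) = X/2 and Var(Y|X) = X^2/12.
EY    = EX / 2                      # 2/5
var_y = EX2 / 12 + var_x / 4        # law of total variance: 14/225

# E(Y|X=x) = x/2 is linear in x with slope 1/2,
# so Cov(X,Y) = slope * Var(X) = Var(X)/2.
cov = var_x / 2                     # 1/75
rho = float(cov) / sqrt(float(var_x) * float(var_y))
```

The identity Cov(X,Y)=b \, Var(X) for a linear conditional mean E(Y \lvert X=x)=a+bx is the same fact exploited in the Background Results section above.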

_________________________________________________________________

An Example on Calculating Covariance

The practice problems presented here are a continuation of the problems in this previous post.

Problem 1

Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{4} (we use the notation \text{binom}(x,\frac{1}{4}) to denote this binomial distribution).

  1. Compute the mean and variance of X.
  2. Compute the mean and variance of Y.
  3. Compute the covariance Cov(X,Y) and the correlation coefficient \rho.

Problem 2

Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{2} (we use the notation \text{binom}(x,\frac{1}{2}) to denote this binomial distribution).

  1. Compute the mean and variance of X.
  2. Compute the mean and variance of Y.
  3. Compute the covariance Cov(X,Y) and the correlation coefficient \rho.

Problem 2 is left as exercise. A similar problem is also found in this post.

Discussion of Problem 1

The joint variables X and Y are identical to the ones in this previous post. However, we do not plan on following the approach in the previous post, which is to first find the probability functions for the joint distribution and then the marginal distribution of Y. The calculation of covariance in Problem 1.3 would be very tedious with that approach.

Problem 1.1
We start with the easiest part, which is the random variable X (the roll of the die). The variance is computed by Var(X)=E(X^2)-E(X)^2.

(1)……\displaystyle  E(X)=\frac{1}{6} \biggl[1+2+3+4+5+6 \biggr]=\frac{21}{6}=3.5

(2)……\displaystyle  E(X^2)=\frac{1}{6} \biggl[1^2+2^2+3^2+4^2+5^2+6^2 \biggr]=\frac{91}{6}

(3)……\displaystyle  Var(X)=\frac{91}{6}-\biggl[\frac{21}{6}\biggr]^2=\frac{105}{36}=\frac{35}{12}

Problem 1.2

We now compute the mean and variance of Y. Finding the joint distribution and then the marginal distribution of Y is tedious and has been done in this previous post. We do not take this approach here. Instead, we find the unconditional mean E(Y) by weighting the conditional means E(Y \lvert X=x). The weights are the probabilities P(X=x). The following is the idea.

(4)……\displaystyle \begin{aligned}  E(Y)&=E_X[E(Y \lvert X=x)] \\&= E(Y \lvert X=1) \times P(X=1) \\&+ E(Y \lvert X=2) \times P(X=2)\\&+ E(Y \lvert X=3)  \times P(X=3) \\&+ E(Y \lvert X=4)  \times P(X=4) \\&+E(Y \lvert X=5)  \times P(X=5) \\&+E(Y \lvert X=6)  \times P(X=6) \end{aligned}

We have P(X=x)=\frac{1}{6} for each x. Before we do the weighting, we need a few facts about the conditional distribution Y \lvert X=x. Since Y \lvert X=x has a binomial distribution, we have:

(5)……\displaystyle  E(Y \lvert X=x)=\frac{1}{4} \ x

(6)……\displaystyle Var(Y \lvert X=x)=\frac{1}{4} \ \frac{3}{4} \ x=\frac{3}{16} \ x

For any random variable W, Var(W)=E(W^2)-E(W)^2 and E(W^2)=Var(W)+E(W)^2. The following is the second moment of Y \lvert X=x, which is needed in calculating the unconditional variance Var(Y).

(7)……\displaystyle \begin{aligned} E(Y^2 \lvert X=x)&=\frac{3}{16} \ x+\biggl[\frac{1}{4} \ x \biggr]^2 \\&=\frac{3x}{16}+\frac{x^2}{16} \\&=\frac{3x+x^2}{16}  \end{aligned}

We can now do the weighting to obtain the moments of the variable Y.

(8)……\displaystyle \begin{aligned}  E(Y)&=\frac{1}{6} \biggl[\frac{1}{4} +\frac{2}{4}+\frac{3}{4}+ \frac{4}{4}+\frac{5}{4}+\frac{6}{4}\biggr] \\&=\frac{7}{8} \\&=0.875  \end{aligned}

(9)……\displaystyle \begin{aligned}  E(Y^2)&=\frac{1}{6} \biggl[\frac{3(1)+1^2}{16} +\frac{3(2)+2^2}{16}+\frac{3(3)+3^2}{16} \\&+ \frac{3(4)+4^2}{16}+\frac{3(5)+5^2}{16}+\frac{3(6)+6^2}{16}\biggr] \\&=\frac{154}{96} \\&=\frac{77}{48}  \end{aligned}

(10)……\displaystyle \begin{aligned}  Var(Y)&=E(Y^2)-E(Y)^2 \\&=\frac{77}{48}-\biggl[\frac{7}{8}\biggr]^2 \\&=\frac{161}{192} \\&=0.8385 \end{aligned}
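The weighting in (8) through (10) is easy to mechanize. The following Python sketch (illustrative only) uses the conditional moments in (5) and (7):

```python
from fractions import Fraction

p_x = Fraction(1, 6)  # P(X = x) for a fair die

# Conditional moments for Y | X = x ~ binom(x, 1/4):
#   E(Y | X = x) = x/4,  E(Y^2 | X = x) = (3x + x^2)/16
ey = sum(p_x * Fraction(x, 4) for x in range(1, 7))
ey2 = sum(p_x * Fraction(3 * x + x * x, 16) for x in range(1, 7))
var_y = ey2 - ey**2

print(ey)     # 7/8
print(ey2)    # 77/48
print(var_y)  # 161/192
```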

Problem 1.3

The following is the definition of covariance of X and Y:

(11)……\displaystyle  Cov(X,Y)=E[(X-\mu_X)(Y-\mu_Y)]

where \mu_X=E(X) and \mu_Y=E(Y).

The definition (11) can be simplified as:

(12)……\displaystyle  Cov(X,Y)=E[XY]-E[X] E[Y]

To compute E[XY], we could use the joint probability function of X and Y, but this is tedious. Anyone who wants to try can go to this previous post to obtain the joint distribution.

Note that the conditional mean E(Y \lvert X=x)=\frac{x}{4} is a linear function of x. It is a well-known result in probability and statistics that whenever the conditional mean E(Y \lvert X=x) is a linear function of x, it can be written as:

(13)……\displaystyle E(Y \lvert X=x)=\mu_Y+\rho \ \frac{\sigma_Y}{\sigma_X} \ (x-\mu_X)

where \mu is the mean of the respective variable, \sigma is the standard deviation of the respective variable and \rho is the correlation coefficient. The following relates the correlation coefficient with the covariance.

(14)……\displaystyle  \rho=\frac{Cov(X,Y)}{\sigma_X \ \sigma_Y}

Comparing (5) and (13), we have \displaystyle \rho \frac{\sigma_Y}{\sigma_X}=\frac{1}{4} and

(15)……\displaystyle \rho = \frac{\sigma_X}{4 \ \sigma_Y}

Equating (14) and (15), we have Cov(X,Y)=\frac{\sigma_X^2}{4}. Thus we deduce that Cov(X,Y) is one-fourth of the variance of X. Using (3), we have:

(16)……\displaystyle  Cov(X,Y) = \frac{1}{4} \times \frac{35}{12}=\frac{35}{48}=0.72917

Plugging the values from (3), (10), and (16) into (14), we obtain \rho=0.46625. Both \rho and Cov(X,Y) are positive, an indication that the two variables move together: when one increases, the other tends to increase as well. This makes sense based on the definition of the variables. For example, when the value of the die is large, the number of trials of Y is greater (hence a larger mean).
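The shortcut Cov(X,Y)=\frac{\sigma_X^2}{4} can also be double-checked by brute force over the 27-point joint distribution from the previous post. A Python sketch (illustrative, not the post's own method):

```python
from fractions import Fraction
from math import comb, sqrt

# P(X = x, Y = y) = (1/6) * C(x, y) * (1/4)^y * (3/4)^(x - y)
def joint(x, y):
    return Fraction(1, 6) * comb(x, y) * Fraction(1, 4)**y * Fraction(3, 4)**(x - y)

pts = [(x, y) for x in range(1, 7) for y in range(x + 1)]
exy = sum(x * y * joint(x, y) for x, y in pts)  # E[XY]
cov = exy - Fraction(7, 2) * Fraction(7, 8)     # E[XY] - E[X]E[Y]
print(cov)  # 35/48

rho = float(cov) / sqrt(Fraction(35, 12) * Fraction(161, 192))
print(round(rho, 5))  # 0.46625
```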

A similar problem is also found in this post.

.

.

.

.

.

.

.

.

Answers to Problem 2

    \displaystyle E[X]=\frac{7}{2}

    \displaystyle Var[X]=\frac{35}{12}

    \displaystyle E[Y]=\frac{7}{4}

    \displaystyle Var[Y]=\frac{77}{48}

    \displaystyle \text{Cov}(X,Y)=\frac{35}{24}

    \displaystyle \rho=\sqrt{\frac{5}{11}}=0.67419986
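These answers can be confirmed with a brute-force pass over the joint distribution of Problem 2, where Y \lvert X=x is \text{binom}(x,\frac{1}{2}). The sketch below is illustrative:

```python
from fractions import Fraction
from math import comb, sqrt

# Problem 2: P(X = x, Y = y) = (1/6) * C(x, y) * (1/2)^x  (since p = q = 1/2)
def joint(x, y):
    return Fraction(1, 6) * comb(x, y) * Fraction(1, 2)**x

pts = [(x, y) for x in range(1, 7) for y in range(x + 1)]
ex = sum(x * joint(x, y) for x, y in pts)
ey = sum(y * joint(x, y) for x, y in pts)
cov = sum(x * y * joint(x, y) for x, y in pts) - ex * ey
print(cov)  # 35/24

var_x = sum(x * x * joint(x, y) for x, y in pts) - ex**2
var_y = sum(y * y * joint(x, y) for x, y in pts) - ey**2
rho = float(cov) / sqrt(var_x * var_y)
print(round(rho, 8))  # 0.67419986
```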


\copyright 2012-2019 – Dan Ma

An Example of a Joint Distribution

This is an excellent problem on the joint distribution of the random variables X and Y where both variables are discrete. The focus is on calculation as well as the intuitive understanding of joint distributions.

Usually a joint distribution is defined by specifying the joint probability function, that is, by specifying P[X=x,Y=y] for all possible values of x and y. The joint distribution presented here is instead defined by the distribution of X (the value of a roll of a die) and the conditional distribution Y \lvert X=x, which is declared to be a binomial distribution with n=x and p=1/4. From this definition, the joint probability function P[X=x,Y=y] is derived. Once the joint probability function is known, the marginal distribution of Y is obtained by summing out the x. The backward conditional distribution X \lvert Y=y is then obtained by way of Bayes' theorem.

Problem 1

Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{4} (we use the notation \text{binom}(x,\frac{1}{4}) to denote this binomial distribution).

  1. Compute the conditional binomial distributions Y \lvert X=x where x=1,2,3,4,5,6.
  2. Discuss how the joint probability function P[X=x,Y=y] is computed for x=1,2,3,4,5,6 and y=0,1, \cdots, x.
  3. Compute the marginal probability function of Y and the mean and variance of Y.
  4. Compute P(X=x \lvert Y=y) for all applicable x and y.

Readers are encouraged to take out pencil and paper and work Problem 1. Problem 2, found at the end of the post, provides additional practice.

A similar problem is also found in this post.

Discussion of Problem 1

This is an example of a joint distribution that is constructed by taking the product of a conditional distribution and a marginal distribution. The marginal distribution of X is a uniform distribution on the set \left\{1,2,3,4,5,6 \right\} (rolling a fair die). Conditional on X=x, Y has a binomial distribution \text{binom}(x,\frac{1}{4}). Think of the conditional variable Y \lvert X=x as tossing a coin x times where the probability of a head is p=\frac{1}{4}. The following is the sample space of the joint distribution of X and Y.

Figure 1

There are 27 points in Figure 1. Each column of points in Figure 1 is a binomial distribution conditional on that x-value. It will be helpful to first nail down the conditional distributions.

Problem 1.1 – conditional binomial distributions

The following shows the calculation of the binomial distributions.

(1)…..\displaystyle \begin{aligned} Y \lvert X=1 \ \ \ \ \ &P(Y=0 \lvert X=1)=\frac{3}{4} \\&P(Y=1 \lvert X=1)=\frac{1}{4} \end{aligned}

(2)…..\displaystyle \begin{aligned} Y \lvert X=2 \ \ \ \ \ &P(Y=0 \lvert X=2)=\binom{2}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^2=\frac{9}{16} \\&P(Y=1 \lvert X=2)=\binom{2}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^1=\frac{6}{16} \\&P(Y=2 \lvert X=2)=\binom{2}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{16} \end{aligned}

(3)…..\displaystyle \begin{aligned} Y \lvert X=3 \ \ \ \ \ &P(Y=0 \lvert X=3)=\binom{3}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^3=\frac{27}{64} \\&P(Y=1 \lvert X=3)=\binom{3}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^2=\frac{27}{64} \\&P(Y=2 \lvert X=3)=\binom{3}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^1=\frac{9}{64} \\&P(Y=3 \lvert X=3)=\binom{3}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{64} \end{aligned}

(4)…..\displaystyle \begin{aligned} Y \lvert X=4 \ \ \ \ \ &P(Y=0 \lvert X=4)=\binom{4}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^4=\frac{81}{256} \\&P(Y=1 \lvert X=4)=\binom{4}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^3=\frac{108}{256} \\&P(Y=2 \lvert X=4)=\binom{4}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^2=\frac{54}{256} \\&P(Y=3 \lvert X=4)=\binom{4}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^1=\frac{12}{256} \\&P(Y=4 \lvert X=4)=\binom{4}{4} \biggl(\frac{1}{4}\biggr)^4 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{256} \end{aligned}

(5)…..\displaystyle \begin{aligned} Y \lvert X=5 \ \ \ \ \ &P(Y=0 \lvert X=5)=\binom{5}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^5=\frac{243}{1024} \\&P(Y=1 \lvert X=5)=\binom{5}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^4=\frac{405}{1024} \\&P(Y=2 \lvert X=5)=\binom{5}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^3=\frac{270}{1024} \\&P(Y=3 \lvert X=5)=\binom{5}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^2=\frac{90}{1024} \\&P(Y=4 \lvert X=5)=\binom{5}{4} \biggl(\frac{1}{4}\biggr)^4 \biggl(\frac{3}{4}\biggr)^1=\frac{15}{1024} \\&P(Y=5 \lvert X=5)=\binom{5}{5} \biggl(\frac{1}{4}\biggr)^5 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{1024} \end{aligned}

(6)…..\displaystyle \begin{aligned} Y \lvert X=6 \ \ \ \ \ &P(Y=0 \lvert X=6)=\binom{6}{0} \biggl(\frac{1}{4}\biggr)^0 \biggl(\frac{3}{4}\biggr)^6=\frac{729}{4096} \\&P(Y=1 \lvert X=6)=\binom{6}{1} \biggl(\frac{1}{4}\biggr)^1 \biggl(\frac{3}{4}\biggr)^5=\frac{1458}{4096} \\&P(Y=2 \lvert X=6)=\binom{6}{2} \biggl(\frac{1}{4}\biggr)^2 \biggl(\frac{3}{4}\biggr)^4=\frac{1215}{4096} \\&P(Y=3 \lvert X=6)=\binom{6}{3} \biggl(\frac{1}{4}\biggr)^3 \biggl(\frac{3}{4}\biggr)^3=\frac{540}{4096} \\&P(Y=4 \lvert X=6)=\binom{6}{4} \biggl(\frac{1}{4}\biggr)^4 \biggl(\frac{3}{4}\biggr)^2=\frac{135}{4096} \\&P(Y=5 \lvert X=6)=\binom{6}{5} \biggl(\frac{1}{4}\biggr)^5 \biggl(\frac{3}{4}\biggr)^1=\frac{18}{4096} \\&P(Y=6 \lvert X=6)=\binom{6}{6} \biggl(\frac{1}{4}\biggr)^6 \biggl(\frac{3}{4}\biggr)^0=\frac{1}{4096} \end{aligned}
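The six columns above can be reproduced programmatically. Below is a Python sketch of the conditional probability function (note that `Fraction` prints in lowest terms, so for example 270/1024 displays as 135/512):

```python
from fractions import Fraction
from math import comb

# cond(x, y) returns P(Y = y | X = x) for Y | X = x ~ binom(x, 1/4)
def cond(x, y):
    return comb(x, y) * Fraction(1, 4)**y * Fraction(3, 4)**(x - y)

# Reproduce column (5), the conditional distribution given X = 5
for y in range(6):
    print(y, cond(5, y))

# Each column is a probability distribution, so it sums to 1
assert all(sum(cond(x, y) for y in range(x + 1)) == 1 for x in range(1, 7))
```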

Problem 1.2 – joint probability function

The joint probability function of X and Y may be written as:

(7)…..\displaystyle P(X=x,Y=y)=P(Y=y \lvert X=x) \times P(X=x)

Thus the probability at each point in Figure 1 is the product of P(X=x), which is \frac{1}{6}, and the conditional probability P(Y=y \lvert X=x), which is binomial. In other words, P(X=x,Y=y) is derived from multiplying the binomial probability P(Y=y \lvert X=x) (calculated above) by 1/6. For example, the following diagram and equation demonstrate the calculation of P(X=4,Y=3).

Figure 2

(7a)…..\displaystyle \begin{aligned}P(X=4,Y=3)&=P(Y=3 \lvert X=4) \times P(X=4) \\&=\binom{4}{3} \biggl[\frac{1}{4}\biggr]^3 \biggl[\frac{3}{4}\biggr]^1 \times \frac{1}{6} \\&=\frac{12}{1536}  \end{aligned}
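In code, the product in (7) is a one-liner. The sketch below (illustrative) reproduces (7a) and checks that the 27 joint probabilities sum to 1:

```python
from fractions import Fraction
from math import comb

# P(X = x, Y = y) = P(Y = y | X = x) * P(X = x), with Y | X = x ~ binom(x, 1/4)
def joint(x, y):
    cond = comb(x, y) * Fraction(1, 4)**y * Fraction(3, 4)**(x - y)
    return cond * Fraction(1, 6)

print(joint(4, 3))  # 1/128, i.e. 12/1536 in lowest terms
total = sum(joint(x, y) for x in range(1, 7) for y in range(x + 1))
print(total)  # 1
```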

Problem 1.3 – marginal distribution

To find the marginal probability P(Y=y), we sum P(X=x,Y=y) over all applicable x. For example, P(Y=2) is the sum of P(X=x,Y=2) for x=2,3,4,5,6.

Figure 3

As indicated in (7), each P(X=x,Y=2) is the product of a conditional probability P(Y=y \lvert X=x) and P(X=x)=\frac{1}{6}. Thus the probability indicated in Figure 3 can be translated as:

(8)…..\displaystyle \begin{aligned}P(Y=2)&=\sum \limits_{x=2}^6 P(Y=2 \lvert X=x) P(X=x)  \end{aligned}

We now begin the calculation.

(9)…..\displaystyle \begin{aligned} P(Y=0)&=\sum \limits_{x=1}^6 P(Y=0 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{3}{4}+\frac{9}{16}+\frac{27}{64} \\&+ \ \ \ \frac{81}{256}+\frac{243}{1024}+\frac{729}{4096} \biggr] \\&=\frac{10101}{24576} \end{aligned}

(10)…..\displaystyle \begin{aligned} P(Y=1)&=\sum \limits_{x=1}^6 P(Y=1 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{4}+\frac{6}{16}+\frac{27}{64} \\&+ \ \ \ \frac{108}{256}+\frac{405}{1024}+\frac{1458}{4096} \biggr] \\&=\frac{9094}{24576} \end{aligned}

(11)…..\displaystyle \begin{aligned}  P(Y=2)&=\sum \limits_{x=2}^6 P(Y=2 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{16}+\frac{9}{64} \\&+ \ \ \ \frac{54}{256}+\frac{270}{1024}+\frac{1215}{4096} \biggr] \\&=\frac{3991}{24576} \end{aligned}

(12)…..\displaystyle \begin{aligned} P(Y=3)&=\sum \limits_{x=3}^6 P(Y=3 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{64} \\&+ \ \ \ \frac{12}{256}+\frac{90}{1024}+\frac{540}{4096} \biggr] \\&=\frac{1156}{24576} \end{aligned}

(13)…..\displaystyle \begin{aligned} P(Y=4)&=\sum \limits_{x=4}^6 P(Y=4 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{256}+\frac{15}{1024}+\frac{135}{4096} \biggr] \\&=\frac{211}{24576} \end{aligned}

(14)…..\displaystyle \begin{aligned} P(Y=5)&=\sum \limits_{x=5}^6 P(Y=5 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{1024}+\frac{18}{4096} \biggr] \\&=\frac{22}{24576} \end{aligned}

(15)…..\displaystyle \begin{aligned} P(Y=6)&=\sum \limits_{x=6}^6 P(Y=6 \lvert X=x) P(X=x) \\&=\frac{1}{6} \biggl[ \frac{1}{4096} \biggr] \\&=\frac{1}{24576} \end{aligned}
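Summing out the x as in (9) through (15) can be automated. The following Python sketch (illustrative) rebuilds the marginal probability function of Y:

```python
from fractions import Fraction
from math import comb

# P(X = x, Y = y) = (1/6) * C(x, y) * (1/4)^y * (3/4)^(x - y)
def joint(x, y):
    return Fraction(1, 6) * comb(x, y) * Fraction(1, 4)**y * Fraction(3, 4)**(x - y)

# Marginal: P(Y = y) = sum of P(X = x, Y = y) over x = max(y, 1), ..., 6
marg = {y: sum(joint(x, y) for x in range(max(y, 1), 7)) for y in range(7)}

print(marg[2])  # 3991/24576
assert sum(marg.values()) == 1
```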

The following is the calculation of the mean and variance of Y.

(16)…..\displaystyle \begin{aligned} E(Y)&=\frac{10101}{24576} \times 0+\frac{9094}{24576} \times 1+\frac{3991}{24576} \times 2  \\&+ \ \ \ \ \frac{1156}{24576} \times 3+\frac{211}{24576} \times 4+\frac{22}{24576} \times 5 \\&+ \ \ \ \ \frac{1}{24576} \times 6  \\&=\frac{21504}{24576}\\&=0.875 \end{aligned}

(17)…..\displaystyle \begin{aligned} E(Y^2)&=\frac{10101}{24576} \times 0+\frac{9094}{24576} \times 1+\frac{3991}{24576} \times 2^2  \\&+ \ \ \ \ \frac{1156}{24576} \times 3^2+\frac{211}{24576} \times 4^2+\frac{22}{24576} \times 5^2 \\&+ \ \ \ \ \frac{1}{24576} \times 6^2  \\&=\frac{39424}{24576}\\&=\frac{77}{48} \end{aligned}

(18)…..\displaystyle  Var(Y)=\frac{77}{48}-0.875^2=\frac{161}{192}=0.8385

Problem 1.4 – the backward conditional distribution

The conditional probability P(Y=y \lvert X=x) is easy to compute since it is given that Y is a binomial variable conditional on a value of X. Now we want to find the backward probability P(X=x \lvert Y=y): given the binomial observation Y=y, what is the probability that the roll of the die is X=x? This is an application of Bayes' theorem. We can start by looking at Figure 3 once more.

Consider P(X=x \lvert Y=2). In calculating this conditional probability, we only consider the 5 sample points encircled in Figure 3 and disregard all the other points. These 5 points become a new sample space if you will (this is the essence of conditional probability and conditional distribution). The sum of the joint probability P(X=x,Y=y) for these 5 points is P(Y=2), calculated in the previous step. The conditional probability P(X=x \lvert Y=2) is simply the probability of one of these 5 points as a fraction of the total probability P(Y=2). Thus we have:

(19)…..\displaystyle \begin{aligned} P(X=x \lvert Y=2)&=\frac{P(X=x,Y=2)}{P(Y=2)} \end{aligned}

We do not have to evaluate the components that go into (19) from scratch. As a practical matter, to find P(X=x \lvert Y=2) we take each of the 5 probabilities shown in (11) and evaluate it as a fraction of the total probability P(Y=2). Thus we have:

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 2 \bold )
(20a)…..\displaystyle \begin{aligned} P(X=2 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{16}}{\displaystyle \frac{3991}{24576}} =\frac{256}{3991} \end{aligned}

(20b)…..\displaystyle \begin{aligned} P(X=3 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{9}{64}}{\displaystyle \frac{3991}{24576}} =\frac{576}{3991} \end{aligned}

(20c)…..\displaystyle \begin{aligned} P(X=4 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{54}{256}}{\displaystyle \frac{3991}{24576}} =\frac{864}{3991} \end{aligned}

(20d)…..\displaystyle \begin{aligned} P(X=5 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{270}{1024}}{\displaystyle \frac{3991}{24576}} =\frac{1080}{3991} \end{aligned}

(20e)…..\displaystyle \begin{aligned} P(X=6 \lvert Y=2)&=\frac{\displaystyle \frac{1}{6} \times \frac{1215}{4096}}{\displaystyle \frac{3991}{24576}} =\frac{1215}{3991} \end{aligned}

Here’s the rest of the Bayes’ calculation:

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 0 \bold )
(21a)…..\displaystyle \begin{aligned} P(X=1 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{3}{4}}{\displaystyle \frac{10101}{24576}} =\frac{3072}{10101} \end{aligned}

(21b)…..\displaystyle \begin{aligned} P(X=2 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{9}{16}}{\displaystyle \frac{10101}{24576}} =\frac{2304}{10101} \end{aligned}

(21c)…..\displaystyle \begin{aligned} P(X=3 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{27}{64}}{\displaystyle \frac{10101}{24576}} =\frac{1728}{10101} \end{aligned}

(21d)…..\displaystyle \begin{aligned} P(X=4 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{81}{256}}{\displaystyle \frac{10101}{24576}} =\frac{1296}{10101} \end{aligned}

(21e)…..\displaystyle \begin{aligned} P(X=5 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{243}{1024}}{\displaystyle \frac{10101}{24576}} =\frac{972}{10101} \end{aligned}

(21f)…..\displaystyle \begin{aligned} P(X=6 \lvert Y=0)&=\frac{\displaystyle \frac{1}{6} \times \frac{729}{4096}}{\displaystyle \frac{10101}{24576}} =\frac{729}{10101} \end{aligned}

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 1 \bold )
(22a)…..\displaystyle \begin{aligned} P(X=1 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{4}}{\displaystyle \frac{9094}{24576}} =\frac{1024}{9094} \end{aligned}

(22b)…..\displaystyle \begin{aligned} P(X=2 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{6}{16}}{\displaystyle \frac{9094}{24576}} =\frac{1536}{9094} \end{aligned}

(22c)…..\displaystyle \begin{aligned} P(X=3 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{27}{64}}{\displaystyle \frac{9094}{24576}} =\frac{1728}{9094} \end{aligned}

(22d)…..\displaystyle \begin{aligned} P(X=4 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{108}{256}}{\displaystyle \frac{9094}{24576}} =\frac{1728}{9094} \end{aligned}

(22e)…..\displaystyle \begin{aligned} P(X=5 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{405}{1024}}{\displaystyle \frac{9094}{24576}} =\frac{1620}{9094} \end{aligned}

(22f)…..\displaystyle \begin{aligned} P(X=6 \lvert Y=1)&=\frac{\displaystyle \frac{1}{6} \times \frac{1458}{4096}}{\displaystyle \frac{9094}{24576}} =\frac{1458}{9094} \end{aligned}

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 2 \bold ) done earlier

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 3 \bold )
(23a)…..\displaystyle \begin{aligned} P(X=3 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{64}}{\displaystyle \frac{1156}{24576}} =\frac{64}{1156} \end{aligned}

(23b)…..\displaystyle \begin{aligned} P(X=4 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{12}{256}}{\displaystyle \frac{1156}{24576}} =\frac{192}{1156} \end{aligned}

(23c)…..\displaystyle \begin{aligned} P(X=5 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{90}{1024}}{\displaystyle \frac{1156}{24576}} =\frac{360}{1156} \end{aligned}

(23d)…..\displaystyle \begin{aligned} P(X=6 \lvert Y=3)&=\frac{\displaystyle \frac{1}{6} \times \frac{540}{4096}}{\displaystyle \frac{1156}{24576}} =\frac{540}{1156} \end{aligned}

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 4 \bold )
(24a)…..\displaystyle \begin{aligned} P(X=4 \lvert Y=4)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{256}}{\displaystyle \frac{211}{24576}} =\frac{16}{211} \end{aligned}

(24b)…..\displaystyle \begin{aligned} P(X=5 \lvert Y=4)&=\frac{\displaystyle \frac{1}{6} \times \frac{15}{1024}}{\displaystyle \frac{211}{24576}} =\frac{60}{211} \end{aligned}

(24c)…..\displaystyle \begin{aligned} P(X=6 \lvert Y=4)&=\frac{\displaystyle \frac{1}{6} \times \frac{135}{4096}}{\displaystyle \frac{211}{24576}} =\frac{135}{211} \end{aligned}

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 5 \bold )
(25a)…..\displaystyle \begin{aligned} P(X=5 \lvert Y=5)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{1024}}{\displaystyle \frac{22}{24576}} =\frac{4}{22} \end{aligned}

(25b)…..\displaystyle \begin{aligned} P(X=6 \lvert Y=5)&=\frac{\displaystyle \frac{1}{6} \times \frac{18}{4096}}{\displaystyle \frac{22}{24576}} =\frac{18}{22} \end{aligned}

Calculation of \bold P \bold ( \bold X \bold = \bold x \bold \lvert \bold Y \bold = \bold 6 \bold )
(26)…..\displaystyle \begin{aligned} P(X=6 \lvert Y=6)&=\frac{\displaystyle \frac{1}{6} \times \frac{1}{4096}}{\displaystyle \frac{1}{24576}} =1 \end{aligned}
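All of the Bayes calculations above follow one pattern: divide a joint probability by a marginal probability. A Python sketch of the backward conditional probability function (illustrative):

```python
from fractions import Fraction
from math import comb

# P(X = x, Y = y) = (1/6) * C(x, y) * (1/4)^y * (3/4)^(x - y)
def joint(x, y):
    return Fraction(1, 6) * comb(x, y) * Fraction(1, 4)**y * Fraction(3, 4)**(x - y)

# Bayes: P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
def backward(x, y):
    return joint(x, y) / sum(joint(k, y) for k in range(max(y, 1), 7))

print(backward(6, 2))  # 1215/3991, matching (20e)
assert backward(6, 6) == 1
```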

Problem 2

Let X be the value of one roll of a fair die. If the value of the die is x, we are given that Y \lvert X=x has a binomial distribution with n=x and p=\frac{1}{2} (we use the notation \text{binom}(x,\frac{1}{2}) to denote this binomial distribution).

  1. Compute the conditional binomial distributions Y \lvert X=x where x=1,2,3,4,5,6.
  2. Discuss how the joint probability function P[X=x,Y=y] is computed for x=1,2,3,4,5,6 and y=0,1, \cdots, x.
  3. Compute the marginal probability function of Y and the mean and variance of Y.
  4. Compute P(X=x \lvert Y=y) for all applicable x and y.

Continuations

The practice problems presented here are continued in the next post – calculating covariance and correlation coefficient.

A similar problem is also found in this post.

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

\text{ }

Answers to Problem 2

Problem 2.3

\displaystyle \begin{aligned} P(Y=y): \ \ \ \ &P(Y=0)=\frac{63}{384} \\&\text{ }  \\&P(Y=1)=\frac{120}{384} \\&\text{ } \\&P(Y=2)=\frac{99}{384} \\&\text{ } \\&P(Y=3)=\frac{64}{384} \\&\text{ } \\&P(Y=4)=\frac{29}{384} \\&\text{ } \\&P(Y=5)=\frac{8}{384} \\&\text{ } \\&P(Y=6)=\frac{1}{384} \end{aligned}

\displaystyle E(Y)=\frac{7}{4}=1.75

\displaystyle Var(Y)=\frac{77}{48}
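The answers for Problem 2.3 can also be confirmed in code. A sketch with Y \lvert X=x distributed \text{binom}(x,\frac{1}{2}) (illustrative; `Fraction` reduces, so 63/384 displays as 21/128):

```python
from fractions import Fraction
from math import comb

# Problem 2: P(X = x, Y = y) = (1/6) * C(x, y) * (1/2)^x
def joint(x, y):
    return Fraction(1, 6) * comb(x, y) * Fraction(1, 2)**x

marg = {y: sum(joint(x, y) for x in range(max(y, 1), 7)) for y in range(7)}
assert marg[0] == Fraction(63, 384) and marg[6] == Fraction(1, 384)

ey = sum(y * p for y, p in marg.items())
var_y = sum(y * y * p for y, p in marg.items()) - ey**2
print(ey)     # 7/4
print(var_y)  # 77/48
```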

Problem 2.4

\displaystyle \begin{aligned} P(X=x \lvert Y=0): \ \ \ \ &P(X=1 \lvert Y=0)=\frac{32}{63} \\&\text{ }  \\&P(X=2 \lvert Y=0)=\frac{16}{63} \\&\text{ } \\&P(X=3 \lvert Y=0)=\frac{8}{63} \\&\text{ } \\&P(X=4 \lvert Y=0)=\frac{4}{63} \\&\text{ } \\&P(X=5 \lvert Y=0)=\frac{2}{63} \\&\text{ } \\&P(X=6 \lvert Y=0)=\frac{1}{63}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=1): \ \ \ \ &P(X=1 \lvert Y=1)=\frac{32}{120} \\&\text{ }  \\&P(X=2 \lvert Y=1)=\frac{32}{120} \\&\text{ } \\&P(X=3 \lvert Y=1)=\frac{24}{120} \\&\text{ } \\&P(X=4 \lvert Y=1)=\frac{16}{120} \\&\text{ } \\&P(X=5 \lvert Y=1)=\frac{10}{120} \\&\text{ } \\&P(X=6 \lvert Y=1)=\frac{6}{120}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=2): \ \ \ \ &P(X=2 \lvert Y=2)=\frac{16}{99} \\&\text{ } \\&P(X=3 \lvert Y=2)=\frac{24}{99} \\&\text{ } \\&P(X=4 \lvert Y=2)=\frac{24}{99} \\&\text{ } \\&P(X=5 \lvert Y=2)=\frac{20}{99} \\&\text{ } \\&P(X=6 \lvert Y=2)=\frac{15}{99}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=3): \ \ \ \ &P(X=3 \lvert Y=3)=\frac{8}{64} \\&\text{ } \\&P(X=4 \lvert Y=3)=\frac{16}{64} \\&\text{ } \\&P(X=5 \lvert Y=3)=\frac{20}{64} \\&\text{ } \\&P(X=6 \lvert Y=3)=\frac{20}{64}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=4): \ \ \ \ &P(X=4 \lvert Y=4)=\frac{4}{29} \\&\text{ } \\&P(X=5 \lvert Y=4)=\frac{10}{29} \\&\text{ } \\&P(X=6 \lvert Y=4)=\frac{15}{29}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=5): \ \ \ \ &P(X=5 \lvert Y=5)=\frac{2}{8} \\&\text{ } \\&P(X=6 \lvert Y=5)=\frac{6}{8}  \end{aligned}

\displaystyle \begin{aligned} P(X=x \lvert Y=6): \ \ \ \ &P(X=6 \lvert Y=6)=1  \end{aligned}


\copyright 2012-2019 – Dan Ma