
Statistics for Data Science - II

Week-3 Graded Assignment Solution


Jan 2025 Term
Week-3
1. The joint PMF of two discrete random variables X and Y is given as:

       Y \ X    −1      0
        −1       k     1/6
         0      1/6    1/3
         1       k     1/6

(i) Find the Cov(X, Y ):

Answer : 0

Explanation:
We are given the joint probability mass function (PMF) of two discrete random variables X and Y:

       Y \ X    −1      0
        −1       k     1/6
         0      1/6    1/3
         1       k     1/6

Finding Cov(X, Y). The covariance formula is:

    Cov(X, Y) = E[XY] − E[X]E[Y]

Let’s determine k first.


Step 1: Use the Total Probability Rule. Since the given table represents a joint probability mass function (PMF), the sum of all probabilities must be 1:

    Σ_{x,y} P(X = x, Y = y) = 1

From the given table:

    k + 1/6 + 1/6 + 1/3 + k + 1/6 = 1
    2k + 5/6 = 1
    2k = 1/6
    k = 1/12
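As a quick check, the value of k can be reproduced with exact rational arithmetic in Python (a sketch; the variable names are illustrative, not part of the assignment):

```python
from fractions import Fraction as F

# The four known cells of the table are 1/6, 1/6, 1/3, 1/6;
# the two remaining cells are both k.
known_total = F(1, 6) + F(1, 6) + F(1, 3) + F(1, 6)   # = 5/6
k = (1 - known_total) / 2                             # from 2k + 5/6 = 1
print(k)  # 1/12
```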
Step 2: Build the table with k. Substituting k = 1/12 into the original table:

       Y \ X    −1      0
        −1     1/12    1/6
         0     1/6     1/3
         1     1/12    1/6

This is the correct joint probability table with k substituted.

Step 3: Compute E[X]:

    E[X] = Σ_x x · P(X = x)

Using the marginal probabilities P(X = −1) = 1/12 + 1/6 + 1/12 = 1/3 and P(X = 0) = 1/6 + 1/3 + 1/6 = 2/3:

    E[X] = (−1) · (1/3) + (0) · (2/3) = −1/3

Step 4: Compute E[Y]:

    E[Y] = Σ_y y · P(Y = y)

Using the marginal probabilities P(Y = −1) = 1/4, P(Y = 0) = 1/2, P(Y = 1) = 1/4:

    E[Y] = (−1) · (1/4) + (0) · (1/2) + (1) · (1/4) = −1/4 + 1/4 = 0

Step 5: Compute E[XY]:

    E[XY] = Σ_x Σ_y x · y · P(X = x, Y = y)

Using the joint probabilities:

    E[XY] = (−1)(−1) · (1/12) + (−1)(0) · (1/6) + (−1)(1) · (1/12)
          + (0)(−1) · (1/6) + (0)(0) · (1/3) + (0)(1) · (1/6)
          = 1/12 + 0 − 1/12 + 0 + 0 + 0 = 0

Thus, E[XY] = 0.
Step 6: Apply the covariance formula:

    Cov(X, Y) = E[XY] − E[X]E[Y]

Plugging in the computed values:

    Cov(X, Y) = 0 − (−1/3) × 0 = 0

    Cov(X, Y) = 0
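The whole computation can be verified numerically. The following sketch encodes the joint table (with k = 1/12 filled in) as a Python dict and recomputes the expectations with exact fractions:

```python
from fractions import Fraction as F

# Joint PMF: (x, y) -> P(X = x, Y = y)
pmf = {(-1, -1): F(1, 12), (0, -1): F(1, 6),
       (-1,  0): F(1, 6),  (0,  0): F(1, 3),
       (-1,  1): F(1, 12), (0,  1): F(1, 6)}
assert sum(pmf.values()) == 1

E_X  = sum(x * p     for (x, y), p in pmf.items())   # -1/3
E_Y  = sum(y * p     for (x, y), p in pmf.items())   #  0
E_XY = sum(x * y * p for (x, y), p in pmf.items())   #  0
print(E_XY - E_X * E_Y)  # Cov(X, Y) = 0
```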

(ii) Define a new random variable U = X + Y. Find the value of Cov(X, U). Enter the answer correct to two decimal places.

Answer: 0.22 ; Range: 0.19 to 0.25

Explanation:
Step 1: Define the new random variable U = X + Y. We need to find the probability mass function (PMF) of U by listing the possible values of U and computing their probabilities from the joint PMF of (X, Y).

Possible values of U: since X takes values {−1, 0} and Y takes values {−1, 0, 1}, the possible values of U = X + Y are:

    U ∈ {−2, −1, 0, 1}

Step 2: Compute P(U = u) for each value of U. We sum the joint probabilities over the pairs (X, Y) with X + Y = u.

P(U = −2): occurs only when (X, Y) = (−1, −1):

    P(U = −2) = P(X = −1, Y = −1) = 1/12

P(U = −1): occurs when (X, Y) = (−1, 0) or (0, −1):

    P(U = −1) = P(X = −1, Y = 0) + P(X = 0, Y = −1) = 1/6 + 1/6 = 1/3

P(U = 0): occurs when (X, Y) = (−1, 1) or (0, 0):

    P(U = 0) = P(X = −1, Y = 1) + P(X = 0, Y = 0) = 1/12 + 1/3 = 1/12 + 4/12 = 5/12

P(U = 1): occurs only when (X, Y) = (0, 1):

    P(U = 1) = P(X = 0, Y = 1) = 1/6

Step 3: Construct the PMF table for U:

       u     P(U = u)
      −2      1/12
      −1      1/3
       0      5/12
       1      1/6

This is the PMF of U = X + Y. Now we can proceed to computing Cov(X, U).
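This grouping step can be done mechanically: fold the joint probabilities on u = x + y. A sketch (the pmf dict encodes the joint table with k = 1/12; names are illustrative):

```python
from collections import defaultdict
from fractions import Fraction as F

pmf = {(-1, -1): F(1, 12), (0, -1): F(1, 6),
       (-1,  0): F(1, 6),  (0,  0): F(1, 3),
       (-1,  1): F(1, 12), (0,  1): F(1, 6)}

# Group joint probabilities by u = x + y to obtain the PMF of U.
pmf_U = defaultdict(F)            # Fraction() defaults to 0
for (x, y), p in pmf.items():
    pmf_U[x + y] += p

for u in sorted(pmf_U):
    print(u, pmf_U[u])  # -2 1/12, -1 1/3, 0 5/12, 1 1/6
```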


To calculate the expectation E[U], we use the formula:

    E[U] = Σ_u u · P(U = u)

Substituting the values from the PMF of U:

    E[U] = (−2) · (1/12) + (−1) · (1/3) + 0 · (5/12) + 1 · (1/6)
         = −2/12 − 4/12 + 2/12 = −4/12 = −1/3

So E[U] = −1/3.
Table of the distribution of X, Y, U = X + Y and P(X, Y):

     X     Y    U = X + Y    P(X, Y)
    −1    −1       −2         1/12
     0    −1       −1         1/6
    −1     0       −1         1/6
     0     0        0         1/3
    −1     1        0         1/12
     0     1        1         1/6
We compute the covariance between X and U by first finding the expectation of the product XU, then subtracting the product of the means. In formula form,

    Cov(X, U) = E[XU] − E[X]E[U]

Given E[X] = −1/3 and E[U] = −1/3, we first calculate E[XU] using the table:

- For (X, Y) = (−1, −1), with U = −2 and probability 1/12:  (−1)(−2) × 1/12 = 2 × 1/12 = 1/6
- For (0, −1), with U = −1 and probability 1/6:  0 × (−1) × 1/6 = 0
- For (−1, 0), with U = −1 and probability 1/6:  (−1)(−1) × 1/6 = 1 × 1/6 = 1/6
- For (0, 0), with U = 0 and probability 1/3:  0 × 0 × 1/3 = 0
- For (−1, 1), with U = 0 and probability 1/12:  (−1)(0) × 1/12 = 0
- For (0, 1), with U = 1 and probability 1/6:  0 × 1 × 1/6 = 0

Summing these up, we get

    E[XU] = 1/6 + 1/6 = 2/6 = 1/3

Next, substitute into the covariance formula:

    Cov(X, U) = 1/3 − (−1/3)(−1/3) = 1/3 − 1/9 = 3/9 − 1/9 = 2/9

Numerically, 2/9 ≈ 0.22.
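The value 2/9 can be cross-checked directly from the joint table, since E[XU] = Σ x(x + y) P(X = x, Y = y). A sketch with exact fractions (the pmf dict encodes the joint table with k = 1/12):

```python
from fractions import Fraction as F

pmf = {(-1, -1): F(1, 12), (0, -1): F(1, 6),
       (-1,  0): F(1, 6),  (0,  0): F(1, 3),
       (-1,  1): F(1, 12), (0,  1): F(1, 6)}

E_X  = sum(x * p           for (x, y), p in pmf.items())  # -1/3
E_U  = sum((x + y) * p     for (x, y), p in pmf.items())  # -1/3
E_XU = sum(x * (x + y) * p for (x, y), p in pmf.items())  #  1/3

cov_XU = E_XU - E_X * E_U
print(cov_XU, float(cov_XU))  # 2/9, approximately 0.22
```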
2. Let X be a random variable with mean µ = 10 and standard deviation
σ = 8. Let Z be the centralized and normalized version of X. Which of the
following statement(s) is(are) true?

(a) E[Z] = 0
(b) V ar(Z) = 1
(c) Using Markov’s inequality, P (Z ≥ 2) ≤ 0.25
(d) Using Markov’s inequality, P (X ≥ 64) ≤ 0.16
(e) P (Z > 2) = 0.25
Answer : a,b,d

Explanation:

Given: X has mean µ = 10 and standard deviation σ = 8, and Z is the standardized version of X:

    Z = (X − µ)/σ
Now, let’s check the given statements:
(a) E[Z] = 0. Since Z is standardized, its expected value is:

    E[Z] = E[(X − µ)/σ] = (E[X] − µ)/σ = (µ − µ)/σ = 0

True.
(b) Var(Z) = 1. The variance of Z is:

    Var(Z) = Var((X − µ)/σ) = Var(X)/σ² = σ²/σ² = 1

True.
(c) Markov's inequality applies to nonnegative random variables, but Z = (X − µ)/σ can take negative values, so Markov's inequality cannot be applied to Z in its standard form. The statement "Using Markov's inequality, P(Z ≥ 2) ≤ 0.25" is therefore false.

False.
(d) Using Markov's inequality, P(X ≥ 64) ≤ 0.16. Applying Markov's inequality (which presumes X is nonnegative):

    P(X ≥ 64) ≤ E[X]/64 = 10/64 = 0.15625 ≈ 0.16

True.
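The Markov bound in (d) is a one-line computation (a sketch; the values come from the problem statement):

```python
mu = 10          # E[X]
threshold = 64
bound = mu / threshold   # Markov: P(X >= 64) <= E[X]/64
print(bound)     # 0.15625, which rounds to 0.16
```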
(e) P(Z > 2) = 0.25. The distribution of X is not specified, so no exact value can be assigned to P(Z > 2); Chebyshev's inequality gives only the upper bound P(|Z| ≥ 2) ≤ 1/4, not an equality. (If X were normally distributed, the standard normal table would give P(Z > 2) ≈ 0.0228, again not 0.25.)

False.

Final Correct Answers: (a), (b), (d)
