Lecture 5

This document discusses joint probability distributions for two random variables X and Y. It provides examples of discrete joint probability distributions represented by tables showing the probabilities f(x, y) for all combinations of values of X and Y. It also shows how to calculate marginal distributions from a joint distribution and how to determine whether random variables are independent.


JOINT PROBABILITY DISTRIBUTIONS
INTENDED LEARNING OUTCOMES

1. Understand and use joint probability mass functions and joint probability density functions to calculate probabilities and calculate marginal probability distributions.

2. Understand and calculate conditional probability distributions from joint probability distributions and assess independence of random variables.

3. Calculate means and variances for linear functions of random variables and calculate probabilities for linear functions of normally distributed random variables.

4. Determine the distribution of a general function of a random variable.
In the previous section, we studied probability distributions for
a single random variable. There will be situations, however, where
we may find it desirable to record the simultaneous outcomes of
several random variables.
For example,
a. we might measure the amount of precipitate P and volume V of gas released from a controlled chemical experiment, giving rise to a two-dimensional sample space consisting of the outcomes (p, v);
b. we might be interested in the hardness H and tensile strength T of cold-drawn copper, resulting in the outcomes (h, t).
If X and Y are two discrete random variables, the probability
distribution for their simultaneous occurrence can be represented by
a function with values f(x, y) for any pair of values (x, y) within
the range of the random variables X and Y. It is customary to refer
to this function as the joint probability distribution of X and Y.
Hence, in the discrete case,

f (x, y) = P (X = x, Y = y)

that is, the value f(x, y) gives the probability that the outcomes x and y occur at the same time.

Discrete case.

The function f(x, y) is a joint probability distribution or probability mass function of the discrete random variables X and Y if

1. f(x, y) ≥ 0 for all (x, y)

2. Σ_x Σ_y f(x, y) = 1

3. P(X = x, Y = y) = f(x, y)
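
These conditions can be checked directly when a joint pmf is given as a table of values. Below is a minimal Python sketch (not from the lecture; the dictionary layout and the helper name is_valid_joint_pmf are illustrative assumptions) that tests the first two conditions for a small hypothetical pmf.

```python
from fractions import Fraction

def is_valid_joint_pmf(f):
    """Check the two defining properties of a discrete joint pmf
    given as a dict {(x, y): probability}."""
    nonnegative = all(p >= 0 for p in f.values())   # condition 1
    sums_to_one = sum(f.values()) == 1              # condition 2
    return nonnegative and sums_to_one

# Tiny hypothetical pmf: two independent fair coin flips X and Y.
f = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
     (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}
print(is_valid_joint_pmf(f))  # True
```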
The table below shows the joint distribution of two random variables X and Y.

                     Values of Y
                   1     2     3     4
Values of X   1   6c    3c    2c    4c
              2   4c    2c    4c     0
              3   2c     c     0    2c

a. Find c.
b. Find the marginal distributions of X and Y.

i. The sum of all 12 table entries is 30c. These probabilities must add up to 1, so c = 1/30.
Adding row and column totals:

                     Values of Y
                   1     2     3     4    Total
Values of X   1   6c    3c    2c    4c    15c
              2   4c    2c    4c     0    10c
              3   2c     c     0    2c     5c
          Total  12c    6c    6c    6c    30c = 1   ⇒   c = 1/30

Substituting c = 1/30:

                     Values of Y
                   1      2      3      4     Total
Values of X   1   1/5    1/10   1/15   2/15   1/2
              2   2/15   1/15   2/15    0     1/3
              3   1/15   1/30    0     1/15   1/6
          Total   2/5    1/5    1/5    1/5    1

ii. The marginal distributions are given by the row and column totals. Hence:

P(X=1) = 15c = 15/30 = 1/2          P(Y=1) = 12c = 12/30 = 2/5
P(X=2) = 10c = 10/30 = 1/3          P(Y=2) = 6c  = 6/30  = 1/5
P(X=3) = 5c  = 5/30  = 1/6          P(Y=3) = 6c  = 6/30  = 1/5
                                    P(Y=4) = 6c  = 6/30  = 1/5
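
The same computation can be scripted. A minimal Python sketch (an illustration, not part of the original slides) that finds c from the normalization condition and then sums out each variable to get the marginals:

```python
from fractions import Fraction

# Table entries as multiples of the unknown constant c
# (keys are (x, y); rows x = 1, 2, 3 and columns y = 1, 2, 3, 4).
counts = {(1, 1): 6, (1, 2): 3, (1, 3): 2, (1, 4): 4,
          (2, 1): 4, (2, 2): 2, (2, 3): 4, (2, 4): 0,
          (3, 1): 2, (3, 2): 1, (3, 3): 0, (3, 4): 2}

# All entries must sum to 1, so c = 1 / (sum of the multipliers).
c = Fraction(1, sum(counts.values()))
joint = {xy: n * c for xy, n in counts.items()}
print(c)   # 1/30

# Marginals: sum the joint pmf over the other variable.
p_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (1, 2, 3)}
p_y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (1, 2, 3, 4)}
print(p_x)  # {1: 1/2, 2: 1/3, 3: 1/6}
print(p_y)  # {1: 2/5, 2: 1/5, 3: 1/5, 4: 1/5}
```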
B. Calculate E(X) and Var(X), and show that the covariance Cov(X, Y) = 0.

Using the marginal distributions:

x            1      2      3               y            1      2      3      4
P(X = x)    1/2    1/3    1/6              P(Y = y)    2/5    1/5    1/5    1/5

E(X) = 1(1/2) + 2(1/3) + 3(1/6) = 1/2 + 2/3 + 1/2 = 5/3

E(Y) = 1(2/5) + 2(1/5) + 3(1/5) + 4(1/5) = 2/5 + 2/5 + 3/5 + 4/5 = 11/5

E(X²) = 1(1/2) + 4(1/3) + 9(1/6) = 1/2 + 4/3 + 3/2 = 10/3

Var(X) = E(X²) − (E(X))² = 10/3 − (5/3)² = 30/9 − 25/9 = 5/9

Distribution of XY (each cell shows the product xy with its probability):

                          Values of Y
                    1           2           3           4
Values of X   1   1 (1/5)     2 (1/10)    3 (1/15)    4 (2/15)
              2   2 (2/15)    4 (1/15)    6 (2/15)    8 (0)
              3   3 (1/15)    6 (1/30)    9 (0)      12 (1/15)

Probability distribution of XY:

Value of XY     P(XY)     XY · P(XY)
1               1/5       1/5
2               7/30      7/15
3               2/15      2/5
4               1/5       4/5
6               1/6       1
12              1/15      4/5
Total           1         11/3

So E(XY) = 11/3, and also E(X)E(Y) = (5/3)(11/5) = 55/15 = 11/3.

Cov(X, Y) = E(XY) − E(X)E(Y) = 11/3 − 11/3 = 0
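
A short Python sketch (illustrative only, not from the slides) that recovers E(X), E(Y), Var(X) and Cov(X, Y) directly from the joint table:

```python
from fractions import Fraction

c = Fraction(1, 30)
joint = {(1, 1): 6*c, (1, 2): 3*c, (1, 3): 2*c, (1, 4): 4*c,
         (2, 1): 4*c, (2, 2): 2*c, (2, 3): 4*c, (2, 4): 0*c,
         (3, 1): 2*c, (3, 2): 1*c, (3, 3): 0*c, (3, 4): 2*c}

e_x  = sum(x * p     for (x, y), p in joint.items())   # E(X)   = 5/3
e_y  = sum(y * p     for (x, y), p in joint.items())   # E(Y)   = 11/5
e_x2 = sum(x**2 * p  for (x, y), p in joint.items())   # E(X^2) = 10/3
e_xy = sum(x * y * p for (x, y), p in joint.items())   # E(XY)  = 11/3

var_x  = e_x2 - e_x**2        # 10/3 - 25/9 = 5/9
cov_xy = e_xy - e_x * e_y     # 11/3 - 11/3 = 0
print(e_x, e_y, var_x, cov_xy)
```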
Example 1. Two ballpoint pens are selected at random from a box that contains 3 blue pens, 2 red pens, and 3 green pens. If X is the number of blue pens selected and Y is the number of red pens selected, find
a.) the joint probability function f(x, y);
b.) 𝑃[(𝑋,𝑌)∈𝐴], where A is the region {(𝑥,𝑦)|𝑥+𝑦≤1}.

The 28 equally likely pairs of pens, with the number of blue pens (X) and red pens (Y) in each:

 Pair    Blue  Red       Pair    Blue  Red
 B1B2     2     0        B3R2     1     1
 B1B3     2     0        B3G1     1     0
 B1R1     1     1        B3G2     1     0
 B1R2     1     1        B3G3     1     0
 B1G1     1     0        R1R2     0     2
 B1G2     1     0        R1G1     0     1
 B1G3     1     0        R1G2     0     1
 B2B3     2     0        R1G3     0     1
 B2R1     1     1        R2G1     0     1
 B2R2     1     1        R2G2     0     1
 B2G1     1     0        R2G3     0     1
 B2G2     1     0        G1G2     0     0
 B2G3     1     0        G1G3     0     0
 B3R1     1     1        G2G3     0     0

a.) Counting these outcomes, each with probability c = 1/28, gives the joint distribution:

                          Values of BLUE (X)
                        0      1      2     Total
Values of RED (Y)  0   3c     9c     3c     15c
                   1   6c     6c      0     12c
                   2   1c      0      0      1c
               Total  10c    15c     3c     28c = 1

Marginal distributions (column and row totals):

P(B=0) = 10/28 = 5/14          P(R=0) = 15/28
P(B=1) = 15/28                 P(R=1) = 12/28 = 3/7
P(B=2) = 3/28                  P(R=2) = 1/28
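
The enumeration above is easy to reproduce by brute force. A minimal Python sketch (the pen labels and variable names are illustrative assumptions) that lists all C(8, 2) = 28 pairs and tallies the joint distribution:

```python
from itertools import combinations
from collections import Counter
from fractions import Fraction

pens = ['B1', 'B2', 'B3', 'R1', 'R2', 'G1', 'G2', 'G3']

# Enumerate all C(8, 2) = 28 equally likely pairs and tally (blue, red) counts.
tally = Counter()
for pair in combinations(pens, 2):
    x = sum(p.startswith('B') for p in pair)   # number of blue pens in the pair
    y = sum(p.startswith('R') for p in pair)   # number of red pens in the pair
    tally[(x, y)] += 1

n_pairs = sum(tally.values())                  # 28
joint = {xy: Fraction(k, n_pairs) for xy, k in tally.items()}
print(joint[(0, 0)], joint[(1, 0)], joint[(1, 1)], joint[(0, 2)])  # 3/28 9/28 6/28 1/28
```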
X = BLUE        0        1        2
P(X = x)      10/28    15/28     3/28

E(X) = 0(10/28) + 1(15/28) + 2(3/28) = 21/28 = 3/4

Y = RED         0        1        2
P(Y = y)      15/28    12/28     1/28

E(Y) = 0(15/28) + 1(12/28) + 2(1/28) = 14/28 = 1/2
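
A small Python sketch (illustrative, using the joint table above) that confirms these expected values:

```python
from fractions import Fraction

c = Fraction(1, 28)
# Joint pmf from the table above: keys are (blue, red).
joint = {(0, 0): 3*c, (1, 0): 9*c, (2, 0): 3*c,
         (0, 1): 6*c, (1, 1): 6*c, (2, 1): 0*c,
         (0, 2): 1*c, (1, 2): 0*c, (2, 2): 0*c}

e_x = sum(x * p for (x, y), p in joint.items())   # E(X) = 3/4
e_y = sum(y * p for (x, y), p in joint.items())   # E(Y) = 1/2
print(e_x, e_y)
```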
(b) The probability that (X, Y) falls in the region A is

𝑃[(𝑋,𝑌)∈𝐴] = 𝑃(𝑋+𝑌≤1)
           = 𝑓(0,0) + 𝑓(0,1) + 𝑓(1,0)
           = 3/28 + 6/28 + 9/28
           = 18/28 = 9/14

Distribution of XY (each cell shows the product xy with its probability):

                          Values of BLUE (X)
                        0           1           2
Values of RED (Y)  0   0 (3/28)    0 (9/28)    0 (3/28)
                   1   0 (6/28)    1 (6/28)    2 (0)
                   2   0 (1/28)    2 (0)       4 (0)

Probability distribution of XY:

Value of XY     P(XY)     XY · P(XY)
0               22/28     0
1               6/28      6/28
2               0         0
4               0         0
Total           1         6/28

E(XY) = 6/28 = 3/14
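
The same quantities can be read off the joint pmf in code. A minimal Python sketch (illustrative only) for P(X + Y ≤ 1) and E(XY):

```python
from fractions import Fraction

c = Fraction(1, 28)
joint = {(0, 0): 3*c, (1, 0): 9*c, (2, 0): 3*c,
         (0, 1): 6*c, (1, 1): 6*c, (2, 1): 0*c,
         (0, 2): 1*c, (1, 2): 0*c, (2, 2): 0*c}

# P[(X, Y) in A] with A = {(x, y) : x + y <= 1}.
p_A = sum(p for (x, y), p in joint.items() if x + y <= 1)   # 9/14

# E(XY), the mean of the product XY.
e_xy = sum(x * y * p for (x, y), p in joint.items())        # 3/14
print(p_A, e_xy)
```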
Continuous case.

The case where both variables are continuous is obtained by analogy with the discrete case, replacing sums by integrals. The joint probability function of the random variables X and Y is then more commonly called the joint density function of X and Y. The function f(x, y) is a joint density function of the continuous random variables X and Y if

1. f(x, y) ≥ 0 for all (x, y)

2. ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1

3. P[(X, Y) ∈ A] = ∬_A f(x, y) dx dy, for any region A in the xy-plane
Example: Given the joint probability density

f(x, y) = 4xy,  0 < x < 1, 0 < y < 1
          0,    otherwise

verify that f(x, y) is a valid joint density function of the two variables X and Y.
∫₀¹ ∫₀¹ f(x, y) dx dy = ∫₀¹ ∫₀¹ 4xy dx dy
                      = 4 ∫₀¹ y [ x²/2 ]₀¹ dy
                      = 4 ∫₀¹ y (1²/2 − 0²/2) dy
                      = 2 ∫₀¹ y dy
                      = 2 [ y²/2 ]₀¹
                      = 2 (1²/2 − 0²/2)
                      = 1
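
The same check can be done symbolically. Below is a minimal Python sketch using sympy (the library choice is an assumption; any computer algebra system would do); the second computation, P(X + Y ≤ 1), is an extra illustration of condition 3 and is not part of the original example.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 4 * x * y  # joint density on 0 < x < 1, 0 < y < 1; zero elsewhere

# Condition 2: the density integrates to 1 over the unit square.
total = sp.integrate(f, (x, 0, 1), (y, 0, 1))
print(total)  # 1

# Condition 3 (extra illustration): P(X + Y <= 1) is the double integral
# of f over the triangle {(x, y) : x + y <= 1} inside the unit square.
p_A = sp.integrate(f, (x, 0, 1 - y), (y, 0, 1))
print(p_A)  # 1/6
```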
