NPT19 jointRV2
We discussed the conditional CDF and the conditional PDF of a random variable conditioned on an event B defined in terms of the same random variable. We observed that
F_X(x/B) = \frac{P(\{X \le x\} \cap B)}{P(B)}, \quad P(B) \ne 0
and
f_X(x/B) = \frac{d}{dx} F_X(x/B)
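As a quick numerical illustration (not part of the original notes), the sketch below estimates F_X(x/B) by Monte Carlo for an assumed exponential random variable X with B = {X > 1}, and compares it with the value obtained directly from the definition above; the distribution and the event are illustrative choices only.

import numpy as np

rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0, size=1_000_000)   # assumed X ~ exponential(1)

def cond_cdf_mc(x, a=1.0):
    # Monte Carlo estimate of F_X(x / B) with B = {X > a}
    in_B = samples > a
    return np.mean((samples <= x) & in_B) / np.mean(in_B)

def cond_cdf_exact(x, a=1.0):
    # P({X <= x} ∩ B) / P(B) for exponential(1) and B = {X > a}
    return max(np.exp(-a) - np.exp(-x), 0.0) / np.exp(-a)

print(cond_cdf_mc(2.0), cond_cdf_exact(2.0))   # both close to 1 - e^{-1} ≈ 0.632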
We can define analogous quantities for two random variables, starting with the conditional probability mass function.
Suppose X and Y are two jointly distributed discrete random variables with the joint PMF p_{X,Y}(x, y). The conditional PMF of Y given X = x is defined as
p_{Y/X}(y/x) = P(\{Y = y\}/\{X = x\})
             = \frac{P(\{X = x\} \cap \{Y = y\})}{P\{X = x\}}
             = \frac{p_{X,Y}(x, y)}{p_X(x)} \quad \text{provided } p_X(x) \ne 0
Similarly, we can define the conditional probability mass function p_{X/Y}(x/y).
Example: Consider the random variables X and Y with the joint probability mass function presented in the following table.
          X = 0    X = 1    X = 2    p_Y(y)
Y = 0     0.25     0.10     0.15     0.5
Y = 1     0.14     0.35     0.01     0.5
p_X(x)    0.39     0.45     0.16
The marginal probabilities are shown in the last row and the last column. For example,

p_{Y/X}(1/0) = \frac{p_{X,Y}(0, 1)}{p_X(0)} = \frac{0.14}{0.39} \approx 0.359
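The small Python sketch below (an assumed illustration, not part of the lecture) recomputes the marginals and the conditional PMF from the joint PMF table above; the names joint, p_X, p_Y and p_Y_given_X are ad hoc.

joint = {  # p_{X,Y}(x, y) keyed by (x, y), taken from the table above
    (0, 0): 0.25, (1, 0): 0.10, (2, 0): 0.15,
    (0, 1): 0.14, (1, 1): 0.35, (2, 1): 0.01,
}

# marginals: sum the joint PMF over the other variable
p_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1, 2)}
p_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

def p_Y_given_X(y, x):
    # p_{Y/X}(y / x) = p_{X,Y}(x, y) / p_X(x), defined when p_X(x) > 0
    return joint[(x, y)] / p_X[x]

print(p_X)                 # {0: 0.39, 1: 0.45, 2: 0.16}
print(p_Y_given_X(1, 0))   # 0.14 / 0.39 ≈ 0.359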
Consider two jointly distributed continuous random variables X and Y with the joint distribution function F_{X,Y}(x, y). We are interested in finding the conditional distribution of one of the random variables given a particular value of the other.
We cannot directly write

F_{Y/X}(y/x) = P(Y \le y / X = x) = \frac{P(Y \le y, X = x)}{P(X = x)}

because P(X = x) = 0 for a continuous random variable, so the ratio is of the indeterminate form 0/0. The conditional distribution function is therefore defined in the limiting sense as follows:

F_{Y/X}(y/x) = \lim_{\Delta x \to 0} \frac{P(Y \le y, x < X \le x + \Delta x)}{P(x < X \le x + \Delta x)} = \frac{\int_{-\infty}^{y} f_{X,Y}(x, u)\, du}{f_X(x)}
Differentiating with respect to y, the conditional density is

\therefore f_{Y/X}(y/x) = \frac{f_{X,Y}(x, y)}{f_X(x)} \qquad (2)
Similarly, we have

f_{X/Y}(x/y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} \qquad (3)
Two random variables are statistically independent if, for all (x, y),

f_{Y/X}(y/x) = f_Y(y)

or equivalently

f_{X,Y}(x, y) = f_X(x) f_Y(y) \qquad (4)
From (2) and (3) we get the Bayes rule for two jointly distributed continuous random variables:

f_{X/Y}(x/y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}

\therefore f_{X/Y}(x/y) = \frac{f_X(x) f_{Y/X}(y/x)}{f_Y(y)}
                        = \frac{f_{X,Y}(x, y)}{\int_{-\infty}^{\infty} f_{X,Y}(u, y)\, du}
                        = \frac{f_{Y/X}(y/x) f_X(x)}{\int_{-\infty}^{\infty} f_X(u) f_{Y/X}(y/u)\, du} \qquad (5)
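As a hedged numeric sketch of the continuous Bayes rule (5), the code below uses an assumed Gaussian model, X ~ N(0, 1) and Y given X = x distributed N(x, 1), which is not taken from the lecture; for this model the conditional density f_{X/Y}(x/y) is known to be the N(y/2, 1/2) density, so the quadrature result should agree with the closed form.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

f_X = norm(0, 1).pdf                             # assumed prior density f_X(x)
f_Y_given_X = lambda y, x: norm(x, 1).pdf(y)     # assumed conditional density f_{Y/X}(y/x)

def f_X_given_Y(x, y):
    # Bayes rule (5): f_X(x) f_{Y/X}(y/x) divided by the integral of f_X(u) f_{Y/X}(y/u) du
    denom, _ = quad(lambda u: f_X(u) * f_Y_given_X(y, u), -np.inf, np.inf)
    return f_X(x) * f_Y_given_X(y, x) / denom

x, y = 0.3, 1.0
print(f_X_given_Y(x, y))                 # posterior density by numerical Bayes rule
print(norm(y / 2, np.sqrt(0.5)).pdf(x))  # known closed form N(y/2, 1/2); should agree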
Given the joint density function, we can find the conditional density functions.
Example: For random variables X and Y, the joint probability density function is given by

f_{X,Y}(x, y) = \frac{1 + xy}{4}, \quad |x| \le 1, \ |y| \le 1
              = 0 \quad \text{otherwise}

Find the marginal densities f_X(x), f_Y(y) and the conditional density f_{Y/X}(y/x). Are X and Y independent?
f_X(x) = \int_{-1}^{1} \frac{1 + xy}{4}\, dy = \frac{1}{2}, \quad -1 \le x \le 1
Similarly,

f_Y(y) = \frac{1}{2}, \quad -1 \le y \le 1
and

f_{Y/X}(y/x) = \frac{f_{X,Y}(x, y)}{f_X(x)} = \frac{1 + xy}{2} \ne f_Y(y)

Hence, X and Y are not independent.
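The following sympy sketch (assumed verification code, not from the notes) reproduces the computations of this example: the marginals, the conditional density, and the check that the joint density does not factor into the product of the marginals.

import sympy as sp

x, y = sp.symbols('x y', real=True)
f_xy = (1 + x * y) / 4                        # joint density on |x| <= 1, |y| <= 1

f_x = sp.integrate(f_xy, (y, -1, 1))          # marginal of X -> 1/2
f_y = sp.integrate(f_xy, (x, -1, 1))          # marginal of Y -> 1/2
f_y_given_x = sp.simplify(f_xy / f_x)         # conditional density -> (1 + x*y)/2

print(f_x, f_y, f_y_given_x)
print(sp.simplify(f_xy - f_x * f_y))          # x*y/4, nonzero, so X and Y are dependent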
The Bayes rule also holds in the mixed case where X is a discrete random variable with PMF p_X(x) and Y is a continuous random variable with conditional density f_{Y/X}(y/x):

p_{X/Y}(x/y) = \lim_{\Delta y \to 0} P(X = x / y < Y \le y + \Delta y)
             = \lim_{\Delta y \to 0} \frac{P(X = x, y < Y \le y + \Delta y)}{P(y < Y \le y + \Delta y)}
             = \lim_{\Delta y \to 0} \frac{p_X(x) f_{Y/X}(y/x) \Delta y}{f_Y(y) \Delta y}
             = \frac{p_X(x) f_{Y/X}(y/x)}{f_Y(y)}
             = \frac{p_X(x) f_{Y/X}(y/x)}{\sum_x p_X(x) f_{Y/X}(y/x)}
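A short sketch of this mixed Bayes rule under an assumed model (not from the lecture): X is a binary symbol with PMF p_X, and Y is X plus unit-variance Gaussian noise, so f_{Y/X}(y/x) is a normal density centred at x. The prior and all names are illustrative choices.

from scipy.stats import norm

p_X = {0: 0.5, 1: 0.5}                           # assumed prior PMF of the discrete X
f_Y_given_X = lambda y, x: norm(x, 1).pdf(y)     # Y = X + unit-variance Gaussian noise

def p_X_given_Y(x, y):
    # p_X(x) f_{Y/X}(y/x) divided by the sum over x of p_X(x) f_{Y/X}(y/x)
    denom = sum(p_X[k] * f_Y_given_X(y, k) for k in p_X)
    return p_X[x] * f_Y_given_X(y, x) / denom

print(p_X_given_Y(1, 0.8))   # posterior probability that X = 1 given the observation y = 0.8
print(p_X_given_Y(0, 0.8))   # the two posteriors sum to 1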
Example:
Let X and Y be two independent random variables. Since the events \{X \le x\} and \{Y \le y\} are then independent for every (x, y), the joint distribution function factors:

F_{X,Y}(x, y) = P\{X \le x, Y \le y\}
             = P\{X \le x\} P\{Y \le y\}
             = F_X(x) F_Y(y)

\therefore f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x \, \partial y} = f_X(x) f_Y(y)

and equivalently

f_{Y/X}(y/x) = f_Y(y)