NPT19 jointRV2

Conditional distributions can be defined for both discrete and continuous random variables. For discrete random variables, the conditional probability mass function pY|X(y|x) gives the probability of Y=y given X=x. For continuous random variables, the conditional density function fY|X(y|x) is defined in the limiting sense. Bayes' rule allows calculating the posterior probability distribution or density given the joint distribution or density and marginal distributions or densities. It can be applied to both discrete and continuous random variables as well as mixtures.


Conditional Distributions

We discussed the conditional CDF and conditional PDF of a random variable conditioned on events defined in terms of the same random variable. We observed that
$$F_X(x/B) = \frac{P(\{X \le x\} \cap B)}{P(B)}, \qquad P(B) \ne 0,$$
and
$$f_X(x/B) = \frac{d}{dx} F_X(x/B).$$
We can define these quantities for two random variables. We start with the conditional
probability mass functions of two random variables.

Conditional probability mass functions

Suppose X and Y are two jointly distributed discrete random variables with the joint PMF $p_{X,Y}(x,y)$. The conditional PMF of Y given X = x is defined as
$$p_{Y/X}(y/x) = P(\{Y = y\}/\{X = x\}) = \frac{P(\{X = x\} \cap \{Y = y\})}{P\{X = x\}} = \frac{p_{X,Y}(x,y)}{p_X(x)}, \qquad \text{provided } p_X(x) \ne 0.$$
Similarly, we can define the conditional probability mass function $p_{X/Y}(x/y)$.

Remarks

The conditional PMF satisfies the properties of a probability mass function.

From the definition of conditional probability mass functions, we can define independence of two discrete random variables. Two discrete random variables X and Y are said to be independent if and only if
$$\forall (x,y) \in R_X \times R_Y, \qquad p_{Y/X}(y/x) = p_Y(y),$$
so that
$$p_{X,Y}(x,y) = p_X(x)\, p_Y(y).$$

Bayes rule for discrete random variables


Suppose X and Y are two jointly distributed discrete random variables. Given $p_X(x)$ and $p_{Y/X}(y/x)$, we can determine the a posteriori probability mass function $p_{X/Y}(x/y)$ by using the Bayes rule as follows:
$$p_{X/Y}(x/y) = P(\{X = x\}/\{Y = y\}) = \frac{P(\{X = x\} \cap \{Y = y\})}{P\{Y = y\}} = \frac{p_{X,Y}(x,y)}{p_Y(y)} = \frac{p_{X,Y}(x,y)}{\sum_{x \in R_X} p_{X,Y}(x,y)} \quad \text{(using the total probability law)}.$$

Example Consider the random variables X and Y with the joint probability mass function as presented in the following table:

    Y \ X      0      1      2     p_Y(y)
    0        0.25   0.10   0.15    0.5
    1        0.14   0.35   0.01    0.5
    p_X(x)   0.39   0.45   0.16

The marginal probabilities are shown in the last column and the last row. For example,
$$p_{Y/X}(1/0) = \frac{p_{X,Y}(0,1)}{p_X(0)} = \frac{0.14}{0.39} \approx 0.359.$$
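The table lookup above can be sketched in a few lines of Python; the dictionary keys and function names are illustrative, with the joint PMF values taken from the example table.

```python
# Sketch: conditional PMFs computed from the joint PMF table in the example.
joint = {  # p_{X,Y}(x, y), keyed by (x, y)
    (0, 0): 0.25, (1, 0): 0.10, (2, 0): 0.15,
    (0, 1): 0.14, (1, 1): 0.35, (2, 1): 0.01,
}

def marginal_x(x):
    """p_X(x) = sum over y of p_{X,Y}(x, y)."""
    return sum(p for (xv, yv), p in joint.items() if xv == x)

def cond_y_given_x(y, x):
    """p_{Y|X}(y|x) = p_{X,Y}(x, y) / p_X(x)."""
    return joint[(x, y)] / marginal_x(x)

print(cond_y_given_x(1, 0))  # p_{Y|X}(1|0) = 0.14 / 0.39 ≈ 0.359
```

Note that for each fixed x the conditional values over y sum to 1, as a conditional PMF must.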

Conditional Probability Distribution function

Consider two jointly distributed continuous random variables X and Y with the joint probability distribution function $F_{X,Y}(x,y)$. We are interested in finding the conditional distribution of one of the random variables given a particular value of the other random variable.

We cannot define the conditional distribution of the random variable Y given the event $\{X = x\}$ by the relation
$$F_{Y/X}(y/x) = P(Y \le y / X = x) = \frac{P(Y \le y, X = x)}{P(X = x)}$$
because $P(X = x) = 0$ for a continuous random variable X. The conditional distribution function is instead defined in the limiting sense as follows:
$$F_{Y/X}(y/x) = \lim_{\Delta x \to 0} P(Y \le y / x < X \le x + \Delta x)$$
$$= \lim_{\Delta x \to 0} \frac{P(Y \le y,\, x < X \le x + \Delta x)}{P(x < X \le x + \Delta x)}$$
$$= \lim_{\Delta x \to 0} \frac{\int_{-\infty}^{y} f_{X,Y}(x,u)\,\Delta x\, du}{f_X(x)\,\Delta x}$$
$$= \frac{\int_{-\infty}^{y} f_{X,Y}(x,u)\, du}{f_X(x)}$$

Conditional Probability Density Function

$f_{Y/X}(y/X = x) = f_{Y/X}(y/x)$ is called the conditional density of Y given X. It is defined in the limiting sense as follows:
$$f_{Y/X}(y/X = x) = \lim_{\Delta y \to 0} \frac{F_{Y/X}(y + \Delta y/X = x) - F_{Y/X}(y/X = x)}{\Delta y}$$
$$= \lim_{\Delta y \to 0,\, \Delta x \to 0} \frac{F_{Y/X}(y + \Delta y / x < X \le x + \Delta x) - F_{Y/X}(y / x < X \le x + \Delta x)}{\Delta y} \qquad (1)$$
because $\{X = x\} = \lim_{\Delta x \to 0} \{x < X \le x + \Delta x\}$.

The right-hand side of equation (1) is
$$\lim_{\Delta y \to 0,\, \Delta x \to 0} \frac{F_{Y/X}(y + \Delta y / x < X \le x + \Delta x) - F_{Y/X}(y / x < X \le x + \Delta x)}{\Delta y}$$
$$= \lim_{\Delta y \to 0,\, \Delta x \to 0} \frac{P(y < Y \le y + \Delta y / x < X \le x + \Delta x)}{\Delta y}$$
$$= \lim_{\Delta y \to 0,\, \Delta x \to 0} \frac{P(y < Y \le y + \Delta y,\, x < X \le x + \Delta x)}{P(x < X \le x + \Delta x)\,\Delta y}$$
$$= \lim_{\Delta y \to 0,\, \Delta x \to 0} \frac{f_{X,Y}(x,y)\,\Delta x\,\Delta y}{f_X(x)\,\Delta x\,\Delta y}$$
$$= \frac{f_{X,Y}(x,y)}{f_X(x)}$$

$$\therefore f_{Y/X}(y/x) = \frac{f_{X,Y}(x,y)}{f_X(x)} \qquad (2)$$

Similarly, we have
$$f_{X/Y}(x/y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} \qquad (3)$$

Two random variables are statistically independent if, for all $(x,y) \in \mathbb{R}^2$,
$$f_{Y/X}(y/x) = f_Y(y)$$
or equivalently
$$f_{X,Y}(x,y) = f_X(x)\, f_Y(y) \qquad (4)$$

Bayes rule for continuous random variables:

From (2) and (3) we get the Bayes rule for two jointly distributed continuous random variables:
$$f_{X/Y}(x/y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}$$
$$\therefore f_{X/Y}(x/y) = \frac{f_X(x)\, f_{Y/X}(y/x)}{f_Y(y)} = \frac{f_{Y/X}(y/x)\, f_X(x)}{\int_{-\infty}^{\infty} f_X(u)\, f_{Y/X}(y/u)\, du} \qquad (5)$$

Given the joint density function, we can find the conditional density function.

Example: For random variables X and Y, the joint probability density function is given
by
$$f_{X,Y}(x,y) = \frac{1 + xy}{4}, \qquad |x| \le 1,\ |y| \le 1,$$
$$= 0 \quad \text{otherwise}.$$
Find the marginal densities $f_X(x)$, $f_Y(y)$ and the conditional density $f_{Y/X}(y/x)$. Are X and Y independent?

$$f_X(x) = \int_{-1}^{1} \frac{1 + xy}{4}\, dy = \frac{1}{2}, \qquad -1 \le x \le 1.$$
Similarly,
$$f_Y(y) = \frac{1}{2}, \qquad -1 \le y \le 1,$$
and
$$f_{Y/X}(y/x) = \frac{f_{X,Y}(x,y)}{f_X(x)} = \frac{1 + xy}{2} \ne f_Y(y).$$
Hence, X and Y are not independent.
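The marginal and conditional densities in this example can be checked numerically with a simple midpoint Riemann sum; the grid size and function names below are illustrative choices.

```python
# Sketch: numeric check of the marginals of f_{X,Y}(x,y) = (1+xy)/4 on [-1,1]^2,
# using a midpoint Riemann sum (no external libraries assumed).
N = 2000
h = 2.0 / N  # grid step on [-1, 1]

def f_joint(x, y):
    return (1.0 + x * y) / 4.0 if abs(x) <= 1 and abs(y) <= 1 else 0.0

def f_X(x):
    """Marginal f_X(x) = integral of f_{X,Y}(x, y) dy over [-1, 1]."""
    return sum(f_joint(x, -1 + (k + 0.5) * h) for k in range(N)) * h

x0 = 0.5
print(f_X(x0))                      # ≈ 0.5, independent of x as derived above
print(f_joint(x0, 0.3) / f_X(x0))   # f_{Y|X}(0.3|0.5) = (1 + 0.5*0.3)/2 = 0.575
```

The xy term integrates to zero over the symmetric interval, which is why the marginal is the constant 1/2 even though the joint density is not a product.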

Bayes Rule for mixed random variables

Let X be a discrete random variable with probability mass function $p_X(x)$ and let Y be a continuous random variable defined on the same sample space with the conditional probability density function $f_{Y/X}(y/x)$. In practical problems we may have to estimate the conditional PMF of X given the observed value Y = y. We can define this conditional PMF also in the limiting sense:
$$p_{X/Y}(x/y) = \lim_{\Delta y \to 0} P(X = x / y < Y \le y + \Delta y)$$
$$= \lim_{\Delta y \to 0} \frac{P(X = x,\, y < Y \le y + \Delta y)}{P(y < Y \le y + \Delta y)}$$
$$= \lim_{\Delta y \to 0} \frac{p_X(x)\, f_{Y/X}(y/x)\,\Delta y}{f_Y(y)\,\Delta y}$$
$$= \frac{p_X(x)\, f_{Y/X}(y/x)}{f_Y(y)}$$
$$= \frac{p_X(x)\, f_{Y/X}(y/x)}{\sum_{x} p_X(x)\, f_{Y/X}(y/x)}$$

Example

Suppose
$$Y = X + V$$
where X is a binary random variable with
$$X = \begin{cases} 1 & \text{with probability } p \\ -1 & \text{with probability } 1 - p \end{cases}$$
and V is Gaussian noise with mean 0 and variance $\sigma^2$. Then
$$p_{X/Y}(x = 1/y) = \frac{p_X(x)\, f_{Y/X}(y/x)}{\sum_x p_X(x)\, f_{Y/X}(y/x)} = \frac{p\, e^{-(y-1)^2/2\sigma^2}}{p\, e^{-(y-1)^2/2\sigma^2} + (1 - p)\, e^{-(y+1)^2/2\sigma^2}}$$
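This posterior is easy to evaluate in code; the particular values of p and sigma below are illustrative, not part of the example.

```python
# Sketch: posterior P(X = 1 | Y = y) for the binary-signal-in-Gaussian-noise
# example above. The Gaussian normalizing constant 1/(sigma*sqrt(2*pi)) is the
# same in numerator and denominator, so it cancels and is omitted.
import math

def posterior_x1(y, p=0.5, sigma=1.0):
    """Mixed Bayes rule: prior p_X(x) times Gaussian likelihood centered at
    the transmitted value (+1 or -1), normalized over both values of X."""
    l_pos = p * math.exp(-(y - 1) ** 2 / (2 * sigma ** 2))
    l_neg = (1 - p) * math.exp(-(y + 1) ** 2 / (2 * sigma ** 2))
    return l_pos / (l_pos + l_neg)

print(posterior_x1(0.0))  # 0.5: y = 0 is equidistant from both signal values
print(posterior_x1(2.0))  # close to 1: observation strongly favors X = +1
```

With equal priors this is exactly the soft-decision statistic used for binary detection in Gaussian noise: the posterior crosses 1/2 at y = 0.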

Independent Random Variables

Let X and Y be two random variables characterized by the joint distribution function
$$F_{X,Y}(x,y) = P\{X \le x, Y \le y\}$$
and the corresponding joint density function
$$f_{X,Y}(x,y) = \frac{\partial^2}{\partial x\, \partial y} F_{X,Y}(x,y).$$

Then X and Y are independent if $\forall (x,y) \in \mathbb{R}^2$, $\{X \le x\}$ and $\{Y \le y\}$ are independent events. Thus,
$$F_{X,Y}(x,y) = P\{X \le x, Y \le y\} = P\{X \le x\}\, P\{Y \le y\} = F_X(x)\, F_Y(y)$$
$$\therefore f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\, \partial y} = f_X(x)\, f_Y(y)$$
and equivalently
$$f_{Y/X}(y/x) = f_Y(y).$$
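The factorization criterion can be tested pointwise on a grid. A minimal sketch, reusing the two densities from the earlier continuous example (the uniform product density on the square passes; (1+xy)/4 fails); the grid points and helper names are illustrative.

```python
# Sketch: grid check of the independence criterion f_{X,Y}(x,y) = f_X(x) f_Y(y).
def independent_on_grid(f_joint, f_x, f_y, pts, tol=1e-9):
    """Return True if f_joint factorizes as f_x * f_y at every grid point."""
    return all(abs(f_joint(x, y) - f_x(x) * f_y(y)) < tol
               for x in pts for y in pts)

pts = [-0.9, -0.3, 0.0, 0.4, 0.8]
half = lambda t: 0.5  # both examples have uniform marginals 1/2 on [-1, 1]

print(independent_on_grid(lambda x, y: 0.25, half, half, pts))            # True
print(independent_on_grid(lambda x, y: (1 + x * y) / 4, half, half, pts)) # False
```

A grid check can only refute independence, not prove it; here it correctly flags the (1+xy)/4 density, whose marginals factor to 0.25 while the joint value differs whenever xy is nonzero.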
