Joint Continuous Random Variables
Recall that a single continuous random variable $X$ has a pdf $f(x)$, and probabilities of events are computed by integrating it:
$$P(a \le X \le b) = \int_a^b f(x)\,dx.$$
Just as in the discrete case, we can extend this concept to the case where we consider the joint probability of two continuous random variables. Let $X$ and $Y$ be two continuous random variables. Now an event for both random variables might be something of the form $\{a \le X \le b\} \cap \{c \le Y \le d\}$, meaning the pair $(X, Y)$ fell inside the box $[a, b] \times [c, d]$. The joint pdf for $X$ and $Y$ is a function $f(x, y)$ satisfying
1. $f(x, y) \ge 0$, for all $x, y$
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$
3. $P(a \le X \le b,\ c \le Y \le d) = \int_c^d \int_a^b f(x, y)\,dx\,dy$
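Rule #2 is easy to sanity-check numerically for any candidate density. Here is a minimal sketch in Python using SciPy, with an illustrative density (an assumption for this sketch, not one of the densities in these notes): $f(x, y) = x + y$ on the unit square.

    from scipy.integrate import dblquad

    # Illustrative joint pdf (not from the notes):
    # f(x, y) = x + y for 0 <= x <= 1, 0 <= y <= 1, and 0 otherwise.
    # dblquad integrates its FIRST argument (here y) on the inside,
    # so the integrand is written as a function of (y, x).
    total, err = dblquad(lambda y, x: x + y, 0, 1, lambda x: 0, lambda x: 1)

    print(total)  # ~1.0, so rule #2 holds for this candidate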
Double Integrals. Computing probabilities of events for joint random variables requires double integrals
like the one in rule #3 above. Double integrals are not that scary (if you can integrate once, you can integrate
twice!). Here is the procedure for evaluating the integral
$$P(a \le X \le b,\ c \le Y \le d) = \int_c^d \int_a^b f(x, y)\,dx\,dy.$$
1. First evaluate the inside integral: $\int_a^b f(x, y)\,dx$. Treat the variable $y$ as if it were a constant (like you would the number 2). Evaluating the integral over the interval $[a, b]$ will result in a function $F(y)$, where the variable $x$ does not appear.
2. Next evaluate the outside integral: $\int_c^d F(y)\,dy$. This will result in the answer you are looking for.
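The two steps map directly onto symbolic computation. A minimal sketch in Python with SymPy, reusing the illustrative integrand $f(x, y) = x + y$ from the normalization check above:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x + y  # illustrative integrand over the box [0, 1] x [0, 1]

    # Step 1: inside integral over x, treating y as a constant.
    F = sp.integrate(f, (x, 0, 1))      # F(y) = y + 1/2; x is gone
    # Step 2: outside integral of F(y) over y.
    answer = sp.integrate(F, (y, 0, 1))

    print(F, answer)  # y + 1/2, 1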
Example: Consider random variables $X, Y$ with joint probability density function:
$$f(x, y) = \begin{cases} 2y \sin(x) & \text{for } 0 \le x \le \pi/2,\ 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}$$
What is $P(0 \le X \le \pi/4,\ 0.5 \le Y \le 1)$?
Let's go through the two steps above to evaluate this probability as a double integral:
$$\begin{aligned}
P(0 \le X \le \pi/4,\ 0.5 \le Y \le 1) &= \int_{0.5}^{1} \int_{0}^{\pi/4} 2y \sin(x)\,dx\,dy \\
&= \int_{0.5}^{1} \Big[-2y \cos(x)\Big]_{x=0}^{x=\pi/4}\,dy \\
&= \int_{0.5}^{1} 2y \left(1 - \frac{\sqrt{2}}{2}\right) dy \\
&= y^2 \left(1 - \frac{\sqrt{2}}{2}\right) \Bigg|_{y=0.5}^{y=1} \\
&= \left(1 - \frac{1}{4}\right) \left(1 - \frac{\sqrt{2}}{2}\right) \\
&= \frac{6 - 3\sqrt{2}}{8} \approx 0.2197
\end{aligned}$$
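The arithmetic can be double-checked symbolically; a sketch with SymPy:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = 2*y*sp.sin(x)  # the example's joint pdf on its support

    # Inner integral over x in [0, pi/4], then outer over y in [1/2, 1].
    p = sp.integrate(f, (x, 0, sp.pi/4), (y, sp.Rational(1, 2), 1))

    print(p, float(p))  # exact value equal to (6 - 3*sqrt(2))/8, ~0.2197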
Marginal Probabilities. Remember that for joint discrete random variables, the process of marginalizing
one of the variables just means to sum over it. For continuous random variables, we have the same process,
just replace a sum with an integral. So, to get the pdf for X or the pdf for Y from the joint pdf f (x, y), we
just integrate out the other variable:
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx.$$
Example: For the joint pdf in the example above, the marginal pdf of $Y$ is
$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_0^{\pi/2} 2y \sin(x)\,dx = \Big[-2y \cos(x)\Big]_{x=0}^{x=\pi/2} = 2y$$
for $0 \le y \le 1$ (and $0$ otherwise).
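These one-variable integrals are easy to reproduce symbolically; a minimal SymPy sketch computing both marginals of the example density:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = 2*y*sp.sin(x)  # joint pdf on 0 <= x <= pi/2, 0 <= y <= 1

    # Integrate out the other variable over its support.
    f_X = sp.integrate(f, (y, 0, 1))         # sin(x)
    f_Y = sp.integrate(f, (x, 0, sp.pi/2))   # 2*y

    print(f_X, f_Y)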
Conditional Probability. Conditional probability works much like the discrete case. For random variables X, Y with joint pdf f (x, y) and marginal pdfs fX (x) and fY (y), we define the conditional density
function:
$$f(x \mid Y = y) = \begin{cases} \dfrac{f(x, y)}{f_Y(y)} & \text{for all values of } y \text{ where } f_Y(y) \neq 0 \\ 0 & \text{otherwise} \end{cases}$$
Now, conditional probabilities are found by integrating f (x|Y = y):
$$P(a \le X \le b \mid Y = y) = \int_a^b f(x \mid Y = y)\,dx.$$
Example: Again, using the joint pdf in the example above, what is the conditional density $f(x \mid Y = y)$? We just need to use the formula we found above for $f_Y(y)$:
$$f(x \mid Y = y) = \frac{f(x, y)}{f_Y(y)} = \frac{2y \sin(x)}{2y} = \sin(x) \quad \text{for } 0 \le x \le \pi/2 \text{ (and } 0 \text{ otherwise)}.$$
Notice that this turned out to be just $f_X(x)$. Let's try a more interesting example.
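First, though, here is a quick symbolic check of the conditional density above, along with a conditional probability computed from it; a minimal SymPy sketch:

    import sympy as sp

    x = sp.symbols('x')
    y = sp.symbols('y', positive=True)  # y > 0, so the 2y factor cancels

    f = 2*y*sp.sin(x)                        # joint pdf from the example
    f_Y = sp.integrate(f, (x, 0, sp.pi/2))   # marginal: 2*y

    cond = sp.simplify(f / f_Y)              # f(x | Y = y) = sin(x)
    # e.g. P(0 <= X <= pi/4 | Y = y):
    p = sp.integrate(cond, (x, 0, sp.pi/4))  # 1 - sqrt(2)/2

    print(cond, p)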
In-class Exercise: Given the joint density of $X, Y$:
$$f(x, y) = \begin{cases} x^2 + \frac{4}{3}xy + y^2 & \text{for } 0 \le x \le 1,\ 0 \le y \le 1 \\ 0 & \text{otherwise} \end{cases}$$
What are the marginal densities $f_X(x)$ and $f_Y(y)$, and what are the conditional densities $f(x \mid Y = y)$ and $f(y \mid X = x)$?
Independence. Again, independence is just the same as in the discrete case; we just have pdfs instead of pmfs. The three equivalent definitions for independence of $X$ and $Y$ are, for all possible $x \in \mathbb{R}$ and $y \in \mathbb{R}$:
1. $f(x, y) = f_X(x) f_Y(y)$
2. $f(x \mid Y = y) = f_X(x)$
3. $f(y \mid X = x) = f_Y(y)$
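In practice you can check the first condition by computing both marginals and testing whether the joint pdf factors. A minimal SymPy sketch, using an illustrative density (again $f(x, y) = x + y$ on the unit square, deliberately not one of the densities from the exercises):

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x + y  # illustrative density on the unit square (integrates to 1)

    f_X = sp.integrate(f, (y, 0, 1))   # x + 1/2
    f_Y = sp.integrate(f, (x, 0, 1))   # y + 1/2

    # Independent iff f(x, y) = f_X(x) * f_Y(y) on the support.
    print(sp.simplify(f - f_X*f_Y) == 0)  # False: X and Y are dependent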
In-class Exercise: For the two joint densities in the examples above, determine if X and Y are independent.