Lecture #8: Functions of Multivariate Random Variables
\[
X = \frac{U}{V} \quad \text{and} \quad Y = V - 1.
\]
The Jacobian of this transformation is
\[
J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix}
  = \begin{vmatrix} v^{-1} & -uv^{-2} \\ 0 & 1 \end{vmatrix} = v^{-1},
\]
so that
\[
f_{U,V}(u, v) = f_{X,Y}\!\left(\frac{u}{v},\, v - 1\right)|J| = \frac{u}{v}\, e^{-u} \cdot v^{-1} = \frac{u e^{-u}}{v^2}, \quad u > 0,\ v > 1.
\]
The marginal for U is therefore
\[
f_U(u) = \int_1^{\infty} \frac{u e^{-u}}{v^2}\, dv = u e^{-u} \Big[ -\frac{1}{v} \Big]_1^{\infty} = u e^{-u}
\]
for u > 0 (which happens to be the density function of a Γ(2, 1) random variable). As an added bonus, we have also shown that X(1 + Y) and (1 + Y) are independent.
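As a quick numerical sanity check (my addition, not part of the notes), the sketch below verifies that the derived density u e^{-u} integrates to 1 over (0, ∞) and has mean 2, as a Γ(2, 1) density must; the grid size and cutoff are arbitrary choices.

```python
import math

# Density derived in the example above: f_U(u) = u * exp(-u) for u > 0,
# i.e. the Gamma(2, 1) density.
def f_U(u):
    return u * math.exp(-u)

# Midpoint Riemann sum on (0, 60]; the mass beyond 60 is negligible.
N = 200_000
h = 60.0 / N
mids = [(k + 0.5) * h for k in range(N)]
total = sum(f_U(u) for u in mids) * h        # should be close to 1
mean = sum(u * f_U(u) for u in mids) * h     # Gamma(2, 1) mean is 2
print(total, mean)
```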
Example. Suppose that the random vector (X, Y)′ has joint density function
\[
f_{X,Y}(x, y) =
\begin{cases}
e^{-y}, & \text{if } 0 < x < y < \infty, \\
0, & \text{otherwise.}
\end{cases}
\]
Determine the density function of the random variable X + Y.
Solution. If U = X + Y and V = Y, then solving for X and Y gives
\[
X = U - V \quad \text{and} \quad Y = V.
\]
The Jacobian of this transformation is
\[
J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix}
  = \begin{vmatrix} 1 & -1 \\ 0 & 1 \end{vmatrix} = 1.
\]
Therefore, we conclude
\[
f_{U,V}(u, v) = f_{X,Y}(u - v, v)\,|J| = e^{-v} \cdot 1 = e^{-v} \tag{$*$}
\]
provided that 0 < v < u < 2v < ∞ (or, equivalently, u/2 < v < u). The marginal for U is given by
\[
f_U(u) = \int_{-\infty}^{\infty} f_{U,V}(u, v)\, dv = \int_{u/2}^{u} e^{-v}\, dv = \Big[ -e^{-v} \Big]_{u/2}^{u} = e^{-u/2} - e^{-u}, \quad u > 0.
\]
In particular, note that U and V are NOT independent, even though (∗) suggests that they are. The reason, of course, is the dependence of u on v in the limits of integration.
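Since this example is easy to simulate, here is a Monte Carlo spot-check (my addition, not part of the notes): under f(x, y) = e^{-y} on 0 < x < y < ∞, the marginal of Y is y e^{-y} (a Γ(2, 1) density) and, given Y = y, X is uniform on (0, y), so we can sample (X, Y) directly and compare U = X + Y against the derived density e^{-u/2} − e^{-u}; the sample size and seed are arbitrary.

```python
import math
import random

random.seed(8)

# Sample (X, Y) from f(x, y) = exp(-y) on 0 < x < y < infinity:
# Y ~ Gamma(2, 1) (sum of two Exp(1)), then X | Y = y ~ Uniform(0, y).
n = 200_000
us = []
for _ in range(n):
    y = -math.log(random.random()) - math.log(random.random())
    x = random.random() * y
    us.append(x + y)

# Under the derived density e^{-u/2} - e^{-u}:
#   E[U] = 4 - 1 = 3  and  P(U > 1) = 2 e^{-1/2} - e^{-1}.
mean_u = sum(us) / n
tail = sum(u > 1.0 for u in us) / n
print(mean_u, tail)
```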
Example. The purpose of this example is to prove a relationship between the gamma function and the beta integral that was used last lecture. That is, show that if a > 0 and b > 0, then
\[
\int_0^1 x^{a-1} (1 - x)^{b-1}\, dx = \frac{\Gamma(a)\,\Gamma(b)}{\Gamma(a+b)}.
\]
Proof. The solution is to consider the product Γ(a)Γ(b) as a double integral, change variables, and simplify. That is,
\[
\Gamma(a)\,\Gamma(b) = \int_0^{\infty} x^{a-1} e^{-x}\, dx \int_0^{\infty} y^{b-1} e^{-y}\, dy
 = \int_0^{\infty}\!\!\int_0^{\infty} x^{a-1} y^{b-1} e^{-(x+y)}\, dx\, dy.
\]
Let u = x + y and v = x/(x + y), so that x = uv and y = u(1 − v). The Jacobian of this transformation is
\[
J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix}
  = \begin{vmatrix} v & u \\ 1 - v & -u \end{vmatrix} = -uv - u(1 - v) = -u,
\]
implying
\[
\int_0^{\infty}\!\!\int_0^{\infty} x^{a-1} y^{b-1} e^{-(x+y)}\, dx\, dy
 = \int_0^1 \int_0^{\infty} (uv)^{a-1} (u(1 - v))^{b-1} e^{-u}\, u\, du\, dv
\]
\[
 = \int_0^{\infty} u^{a+b-1} e^{-u}\, du \int_0^1 v^{a-1} (1 - v)^{b-1}\, dv
 = \Gamma(a + b) \int_0^1 v^{a-1} (1 - v)^{b-1}\, dv.
\]
Dividing both sides by Γ(a + b) gives
\[
\int_0^1 v^{a-1} (1 - v)^{b-1}\, dv = \frac{\Gamma(a)\,\Gamma(b)}{\Gamma(a+b)},
\]
as required.
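The identity just proved is easy to spot-check numerically. The sketch below (my addition, with arbitrarily chosen sample values a, b > 1 so the integrand stays bounded) compares a midpoint-rule approximation of the beta integral with Γ(a)Γ(b)/Γ(a+b).

```python
import math

def beta_integral(a, b, n=200_000):
    # Midpoint rule for the beta integral: integral over (0, 1) of
    # x^(a-1) * (1-x)^(b-1) dx.
    h = 1.0 / n
    return sum(((k + 0.5) * h) ** (a - 1) * (1 - (k + 0.5) * h) ** (b - 1)
               for k in range(n)) * h

for a, b in [(2.0, 3.0), (2.5, 3.5)]:
    lhs = beta_integral(a, b)
    rhs = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    print(a, b, lhs, rhs)
```

For (a, b) = (2, 3) the exact value is Γ(2)Γ(3)/Γ(5) = 2/24 = 1/12, which the quadrature reproduces to several decimal places.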