
Statistics 351 (Fall 2015)

Prof. Michael Kozdron

September 25, 2015

Lecture #8: Functions of Multivariate Random Variables


Example (Chapter 1, Problem #31). Suppose that (X, Y)′ has joint density
$$f(x, y) = \begin{cases} x e^{-x - xy}, & \text{for } x > 0,\ y > 0, \\ 0, & \text{otherwise.} \end{cases}$$
Determine the distribution of X(1 + Y).
Solution. Let U = X(1 + Y) and V = 1 + Y so that solving for X and Y gives
$$X = \frac{U}{V} \quad \text{and} \quad Y = V - 1.$$
The Jacobian of this transformation is given by
$$J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix} = \begin{vmatrix} v^{-1} & -uv^{-2} \\ 0 & 1 \end{vmatrix} = v^{-1}.$$
The density of (U, V)′ is therefore given by
$$f_{U,V}(u, v) = f_{X,Y}(u/v,\ v - 1)\, |J| = \frac{u}{v}\, e^{-(u/v) \cdot v} \cdot \frac{1}{v} = \frac{u}{v^2}\, e^{-u}$$
for u > 0 and v > 1. The marginal for U is
$$f_U(u) = \int_1^\infty f_{U,V}(u, v)\, dv = u e^{-u} \int_1^\infty \frac{dv}{v^2} = u e^{-u} \left[ -\frac{1}{v} \right]_1^\infty = u e^{-u}$$
for u > 0 (which happens to be the density function of a Γ(2, 1) random variable). As an added bonus, since f_{U,V}(u, v) = (u e^{-u}) · v^{-2} factors into a function of u alone times a function of v alone, we have also shown that X(1 + Y) and (1 + Y) are independent.
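Remark. This example can be checked numerically (a check added here, not part of the original notes). Integrating f(x, y) = x e^{-x-xy} over y gives the marginal f_X(x) = e^{-x}, so X ~ Exp(1), and given X = x the conditional density of Y is x e^{-xy}, an exponential with rate x. The standard-library Python sketch below uses this to simulate X(1 + Y) and compares its sample mean and variance with the Γ(2, 1) values, both equal to 2.

```python
import random

random.seed(42)

# f(x, y) = x e^{-x-xy} factors as e^{-x} (marginal of X, an Exp(1) density)
# times x e^{-xy} (conditional of Y given X = x, exponential with rate x),
# which gives a direct sampling scheme.
n = 200_000
u = []
for _ in range(n):
    x = random.expovariate(1.0)   # X ~ Exp(1)
    y = random.expovariate(x)     # Y | X = x ~ Exp(rate x)
    u.append(x * (1 + y))

mean = sum(u) / n
var = sum((s - mean) ** 2 for s in u) / n
print(mean, var)  # both should be close to 2, the Gamma(2, 1) mean and variance
```

The agreement of both moments with Γ(2, 1) is consistent with, though of course weaker than, the exact computation above.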
Example. Suppose that the random vector (X, Y)′ has joint density function
$$f_{X,Y}(x, y) = \begin{cases} e^{-y}, & \text{if } 0 < x < y < \infty, \\ 0, & \text{otherwise.} \end{cases}$$
Determine the density function of the random variable X + Y.
Solution. If U = X + Y and V = Y, then solving for X and Y gives
$$X = U - V \quad \text{and} \quad Y = V.$$
The Jacobian of this transformation is
$$J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix} = \begin{vmatrix} 1 & -1 \\ 0 & 1 \end{vmatrix} = 1.$$
Therefore, we conclude
$$f_{U,V}(u, v) = f_{X,Y}(u - v,\ v)\, |J| = e^{-v} \cdot 1 = e^{-v} \qquad (\star)$$
provided that 0 < v < u < 2v < ∞ (or, equivalently, u/2 < v < u). The marginal for U is given by
$$f_U(u) = \int_{-\infty}^{\infty} f_{U,V}(u, v)\, dv = \int_{u/2}^{u} e^{-v}\, dv = \left[ -e^{-v} \right]_{u/2}^{u} = e^{-u/2} - e^{-u}, \quad u > 0.$$
In particular, note that U and V are NOT independent, even though (⋆) suggests that they are. The reason, of course, is the dependence of u on v in the limits of integration.
Example. The purpose of this example is to prove a relationship between the gamma function and the beta integral that was used last lecture. That is, show that if a > 0 and b > 0, then
$$\int_0^1 x^{a-1} (1 - x)^{b-1}\, dx = \frac{\Gamma(a)\,\Gamma(b)}{\Gamma(a + b)}.$$
Proof. The solution is to consider the product Γ(a)Γ(b) as a double integral, change variables, and simplify. That is,
$$\Gamma(a)\,\Gamma(b) = \int_0^\infty x^{a-1} e^{-x}\, dx \int_0^\infty y^{b-1} e^{-y}\, dy = \int_0^\infty \int_0^\infty x^{a-1} y^{b-1} e^{-(x+y)}\, dx\, dy.$$
Let u = x + y and v = x/(x + y) so that x = uv and y = u(1 − v). The Jacobian of this transformation is
$$J = \begin{vmatrix} \partial x/\partial u & \partial x/\partial v \\ \partial y/\partial u & \partial y/\partial v \end{vmatrix} = \begin{vmatrix} v & u \\ 1 - v & -u \end{vmatrix} = -uv - u(1 - v) = -u,$$
so that dx dy = |J| du dv = u du dv, implying
$$\int_0^\infty \int_0^\infty x^{a-1} y^{b-1} e^{-(x+y)}\, dx\, dy = \int_0^1 \int_0^\infty (uv)^{a-1} \bigl(u(1 - v)\bigr)^{b-1} e^{-u}\, u\, du\, dv$$
$$= \int_0^\infty u^{a+b-1} e^{-u}\, du \int_0^1 v^{a-1} (1 - v)^{b-1}\, dv = \Gamma(a + b) \int_0^1 v^{a-1} (1 - v)^{b-1}\, dv,$$
as required.
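Remark. The identity is easy to verify numerically (a check added here, not in the original notes). For the particular choice a = 2.5 and b = 1.5, both sides equal Γ(2.5)Γ(1.5)/Γ(4) = π/16, and a midpoint-rule approximation of the beta integral in standard-library Python reproduces this value.

```python
import math

# Midpoint-rule check of  int_0^1 x^{a-1} (1-x)^{b-1} dx = Gamma(a)Gamma(b)/Gamma(a+b)
# for one choice of a, b > 1 (so the integrand is bounded at both endpoints).
a, b = 2.5, 1.5
n = 100_000
h = 1.0 / n
integral = sum(
    ((i + 0.5) * h) ** (a - 1) * (1 - (i + 0.5) * h) ** (b - 1) for i in range(n)
) * h

gamma_side = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
print(integral, gamma_side)  # both are approximately 0.19635 (= pi/16 here)
```

Any other a, b > 1 works the same way; for 0 < a < 1 or 0 < b < 1 the integrand is unbounded at an endpoint and a naive midpoint rule converges more slowly.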
