Lecture 16: General Transformations of Random Variables
In the previous lectures, we have seen a few elementary transformations such as sums of random variables
as well as maximum and minimum of random variables. Now we will look at general transformations of
random variables. The motivation behind transformation of a random variable is illustrated by the following
example. Consider a situation where the velocity of a particle is distributed according to a random variable
V . Based on a particular realisation of the velocity, there will be a corresponding value of kinetic energy E
and we are interested in the distribution of kinetic energy. Clearly, this is a scenario where we are asking
for the distribution of a new random variable, which depends on the original random variable through a
transformation. Such situations occur often in practical applications.
Figure 16.1: A random variable X : Ω → R on the probability space (Ω, F, P), composed with a Borel measurable function f : R → R; each outcome ω is mapped to X(ω) ∈ R and then to f(X(ω)) ∈ R.
Consider a random variable X : Ω → R and let g : R → R be a Borel measurable function. Then Y = g (X)
is also a random variable and we wish to find the distribution of Y . Specifically, we are interested in finding
the CDF FY (y) given the CDF FX (x).
Let By be the set of all x such that g(x) ≤ y. Then FY (y) = PX (By ).
We now illustrate this with the help of an example.
Example 1: Let X be a Gaussian random variable of mean 0 and variance 1 i. e. X ∼ N (0, 1). Find the
distribution of Y = X 2 .
Solution:
\[
F_Y(y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{1}{\sqrt{2\pi}} e^{-t^2/2}\, dt = \Phi(\sqrt{y}) - \Phi(-\sqrt{y}),
\]
and differentiating with respect to y,
\[
f_Y(y) = \frac{1}{\sqrt{2\pi y}}\, e^{-y/2}, \quad \text{for } y > 0.
\]
Note:
1. The random variable Y can take only non-negative values, as it is the square of a real-valued random
variable.
2. The distribution of the square of a standard Gaussian random variable, fY (y), is also known as the
Chi-squared distribution (with one degree of freedom).
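This chi-squared CDF is easy to check numerically. The sketch below (a minimal example using only the Python standard library; the sample size and tolerance are arbitrary choices, not from the lecture) compares the empirical CDF of X² against Φ(√y) − Φ(−√y):

```python
import math
import random

random.seed(0)

def phi(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def chi2_cdf(y):
    """F_Y(y) = Phi(sqrt(y)) - Phi(-sqrt(y)) for Y = X^2, X ~ N(0, 1)."""
    return phi(math.sqrt(y)) - phi(-math.sqrt(y))

# Empirical check: square standard Gaussian samples and compare CDFs.
n = 200_000
samples = [random.gauss(0.0, 1.0) ** 2 for _ in range(n)]
for y in (0.5, 1.0, 2.0):
    empirical = sum(s <= y for s in samples) / n
    assert abs(empirical - chi2_cdf(y)) < 0.01
```

At y = 1 this recovers the familiar value Φ(1) − Φ(−1) ≈ 0.6827.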
Thus, we see that given the distribution of a random variable X, the distribution of any function of X can
be obtained by first principles. We now come up with a direct formula to find the distribution of a function
of the random variable in the cases where the function is differentiable and monotonic.
Let X have a density fX (x) and g be a monotonically increasing function and let Y = g(X). We then have
\[
F_Y(y) = P(Y \le y) = P(g(X) \le y) = \int_{-\infty}^{g^{-1}(y)} f_X(x)\, dx.
\]
Substituting x = g^{-1}(t),
\[
F_Y(y) = \int_{-\infty}^{y} f_X(g^{-1}(t)) \, \frac{dt}{g'(g^{-1}(t))}.
\]
Differentiating, we get
\[
f_Y(y) = f_X(g^{-1}(y)) \, \frac{1}{g'(g^{-1}(y))}.
\]
The second term on the right hand side of the above equation is referred to as the Jacobian of the transfor-
mation g(·).
It can be shown easily that a similar argument holds for a monotonically decreasing function g as well and
we obtain
\[
f_Y(y) = f_X(g^{-1}(y)) \, \frac{-1}{g'(g^{-1}(y))}.
\]
Hence, the general formula for the distribution of a monotonic function of a random variable is
\[
f_Y(y) = \frac{f_X(g^{-1}(y))}{\left| g'(g^{-1}(y)) \right|}. \tag{16.1}
\]
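Equation (16.1) can be exercised on a concrete monotone transformation. The sketch below is a hypothetical example (not from the lecture): it takes X ∼ N(0, 1) and g(x) = eˣ, so that f_Y given by (16.1) is the log-normal density, and checks the formula against simulation:

```python
import math
import random

random.seed(1)

def f_X(x):
    """Standard normal density."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

# Transformation g(x) = e^x: monotonically increasing,
# with g^{-1}(y) = ln(y) and g'(x) = e^x, so g'(g^{-1}(y)) = y.
def f_Y(y):
    """Density of Y = e^X via equation (16.1)."""
    return f_X(math.log(y)) / abs(y)

# Compare P(a <= Y <= b) estimated from samples with the
# integral of f_Y over [a, b] (midpoint rule).
n = 200_000
samples = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]
a, b = 0.5, 2.0
empirical = sum(a <= s <= b for s in samples) / n
m = 1000
h = (b - a) / m
integral = sum(f_Y(a + (k + 0.5) * h) for k in range(m)) * h
assert abs(empirical - integral) < 0.01
```

The Jacobian factor here is simply 1/y, which is why the log-normal density carries a 1/y in front of the Gaussian kernel.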
Example 3: Let U ∼ unif [0, 1], i.e., U is a uniform random variable on the interval [0, 1]. Find the
distribution of Y = − ln(U ).
Solution: Note that g(u) = − ln(u) is a differentiable, monotonically decreasing function. We have
g^{-1}(y) = e^{-y} and g'(u) = −1/u, so that g'(g^{-1}(y)) = −e^{y}. The Jacobian is therefore
|g'(g^{-1}(y))| = e^{y}, and equation (16.1) gives
\[
f_Y(y) = \frac{f_U(e^{-y})}{e^{y}} = e^{-y}, \quad \text{for } y > 0,
\]
that is, Y is exponentially distributed with parameter 1.
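A short simulation confirms that Y = − ln(U) is exponential with parameter 1; this sketch (standard library only, arbitrary sample size and tolerances) checks the empirical CDF against 1 − e^{−y}:

```python
import math
import random

random.seed(2)

n = 200_000
# 1 - random.random() lies in (0, 1], so the logarithm is always defined.
ys = [-math.log(1.0 - random.random()) for _ in range(n)]  # Y = -ln(U)

# Exponential(1) has CDF 1 - e^{-y}; compare with the empirical CDF.
for y in (0.5, 1.0, 3.0):
    empirical = sum(v <= y for v in ys) / n
    assert abs(empirical - (1.0 - math.exp(-y))) < 0.01

# The mean of an exp(1) random variable is 1.
assert abs(sum(ys) / n - 1.0) < 0.02
```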
If X is a continuous random variable with CDF FX (·), then it can be shown that the random variable
Y = FX (X) is uniformly distributed over [0, 1] (see Exercise 2(a)). Conversely, any continuous random
variable with CDF FY (·) can be generated from a uniform random variable X ∼ unif [0, 1] by the
transformation Z = FY^{-1}(X); the random variable Z then has CDF FY (·).
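This inverse-CDF method is straightforward to implement. A minimal sketch, assuming (as a hypothetical target, not from the lecture text) that we want Rayleigh samples, whose CDF is F(r) = 1 − e^{−r²/2} and hence F^{-1}(u) = √(−2 ln(1 − u)):

```python
import math
import random

random.seed(3)

def sample_via_inverse_cdf(inv_cdf, n):
    """Inverse-transform sampling: if U ~ unif[0,1],
    then inv_cdf(U) has CDF F."""
    # random.random() is in [0, 1), so 1 - u below stays in (0, 1].
    return [inv_cdf(random.random()) for _ in range(n)]

def rayleigh_inv(u):
    """Inverse of the Rayleigh CDF F(r) = 1 - exp(-r^2/2)."""
    return math.sqrt(-2.0 * math.log(1.0 - u))

n = 200_000
rs = sample_via_inverse_cdf(rayleigh_inv, n)

# Check the empirical CDF against 1 - e^{-r^2/2}.
for r in (0.5, 1.0, 2.0):
    empirical = sum(v <= r for v in rs) / n
    assert abs(empirical - (1.0 - math.exp(-r * r / 2.0))) < 0.01
```

The same `sample_via_inverse_cdf` helper works for any distribution whose inverse CDF is available in closed form.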
Equation 16.1 can be extended to transformations of multiple random variables. Consider an n-tuple of
random variables (X1 , X2 , . . . , Xn ) whose joint density is given by f(X1 ,X2 ,...,Xn ) (x1 , x2 , . . . , xn ), and let
Yi = gi (X1 , . . . , Xn ), i = 1, . . . , n, where the transformation g = (g1 , . . . , gn ) is invertible with inverse
x = g^{-1}(y). Then
Figure 16.2: Mapping of a realization (X(ω), Y (ω)) to the polar co-ordinates (R(ω), Θ(ω))
\[
f_{Y_1,Y_2,\dots,Y_n}(y_1, y_2, \dots, y_n) = f_{(X_1,X_2,\dots,X_n)}(g^{-1}(y)) \, |J(y)|, \tag{16.2}
\]
where J(y) is the Jacobian determinant of the inverse transformation x = g^{-1}(y).
As a concrete example, consider the transformation to polar co-ordinates, with x = r cos θ and y = r sin θ.
Let us first evaluate the Jacobian. We have ∂x/∂r = cos θ, ∂y/∂r = sin θ, ∂x/∂θ = −r sin θ and
∂y/∂θ = r cos θ, so
\[
J = \begin{vmatrix} \partial x/\partial r & \partial y/\partial r \\ \partial x/\partial \theta & \partial y/\partial \theta \end{vmatrix}
  = \begin{vmatrix} \cos\theta & \sin\theta \\ -r\sin\theta & r\cos\theta \end{vmatrix}
  = r\cos^2\theta + r\sin^2\theta = r(\cos^2\theta + \sin^2\theta) = r.
\]
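The identity |J| = r can also be verified numerically with central finite differences (a sketch; the step size h and the test points are arbitrary choices):

```python
import math

def polar_to_cart(r, theta):
    """x = r cos(theta), y = r sin(theta)."""
    return r * math.cos(theta), r * math.sin(theta)

def jacobian_det(r, theta, h=1e-6):
    """Estimate det d(x, y)/d(r, theta) by central differences."""
    x_rp, y_rp = polar_to_cart(r + h, theta)
    x_rm, y_rm = polar_to_cart(r - h, theta)
    x_tp, y_tp = polar_to_cart(r, theta + h)
    x_tm, y_tm = polar_to_cart(r, theta - h)
    dx_dr, dy_dr = (x_rp - x_rm) / (2 * h), (y_rp - y_rm) / (2 * h)
    dx_dt, dy_dt = (x_tp - x_tm) / (2 * h), (y_tp - y_tm) / (2 * h)
    return dx_dr * dy_dt - dy_dr * dx_dt

# The determinant should equal r, regardless of theta.
for r in (0.5, 1.0, 3.0):
    for theta in (0.1, 1.0, 2.5):
        assert abs(jacobian_det(r, theta) - r) < 1e-5
```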
Now let X and Y be i.i.d. standard Gaussian random variables, so that
\[
f_{X,Y}(x, y) = f_X(x) f_Y(y) = \frac{1}{2\pi}\, e^{-(x^2+y^2)/2}, \quad x, y \in \mathbb{R}.
\]
Applying equation (16.2) with |J| = r, the joint density of (R, Θ) is
\[
f_{R,\Theta}(r, \theta) = \frac{r}{2\pi}\, e^{-r^2/2}, \quad \text{for } r \ge 0,\ \theta \in [0, 2\pi].
\]
The marginal densities of R and Θ can be obtained from the joint distribution as given below.
\[
f_R(r) = \int_0^{2\pi} f_{R,\Theta}(r, \theta)\, d\theta = r e^{-r^2/2}, \quad \text{for } r \ge 0,
\]
\[
f_\Theta(\theta) = \int_0^{\infty} f_{R,\Theta}(r, \theta)\, dr = \frac{1}{2\pi}, \quad \text{for } \theta \in [0, 2\pi].
\]
The distribution fR (r) is called the Rayleigh distribution, which is encountered quite often in wireless
communications to model the gain of a fading channel. Note that the random variables R and Θ are
independent, since the joint distribution factorizes into the product of the marginals, i.e.,
f_{R,Θ}(r, θ) = fR (r) fΘ (θ).
We now illustrate how transformations of random variables help us to generate random variables with
different distributions, given that we can generate only uniform random variables. Specifically, consider
the case where all we can generate is a uniform random variable between 0 and 1, i.e., unif [0, 1], and we
wish to generate random variables having Rayleigh, exponential and Gaussian distributions.
Generate U1 and U2 as i.i.d. unif [0, 1]. Next, let Θ = 2πU1 and Z = − ln(U2 ). It can be verified that
Z ∼ exp(1) (as in Example 3), so that R = √(2Z) is Rayleigh distributed; then X = R cos Θ and
Y = R sin Θ are i.i.d. N (0, 1) random variables, by the polar-coordinate derivation above.
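This generation scheme (essentially the Box–Muller method) takes only a few lines of Python; the moment checks below are rough sanity tests on the generated Gaussians, not proofs:

```python
import math
import random

random.seed(4)

n = 200_000
xs = []
for _ in range(n):
    theta = 2.0 * math.pi * random.random()   # Theta = 2*pi*U1
    z = -math.log(1.0 - random.random())      # Z = -ln(U2) ~ exp(1)
    r = math.sqrt(2.0 * z)                    # R is Rayleigh distributed
    xs.append(r * math.cos(theta))            # X = R cos(Theta) ~ N(0, 1)

# A standard Gaussian has mean 0 and variance 1.
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
assert abs(mean) < 0.01
assert abs(var - 1.0) < 0.02
```

Taking `r * math.sin(theta)` as well would yield a second, independent N(0, 1) sample from the same pair (U1, U2).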
16.3 Exercises
1. Let X ∼ exp(0.5). Prove that Y = √X is a Rayleigh distributed random variable.
2. (a) Let X be a continuous random variable with CDF F (·).
(i) Show that the random variable Y = F (X) is uniformly distributed over [0, 1]. [Hint: Although
F is the distribution of X, regard it simply as a function satisfying certain properties required
to make it a CDF!]
(ii) Now, given that Y = y, a random variable Z is distributed as Geometric with parameter y.
Find the unconditional PMF of Z. Also, given Z = z for some z ≥ 1, z ∈ N, find the
conditional PMF of Y .
(b) Let X be a continuous random variable with the pdf
\[
f_X(x) = \begin{cases} e^{-x}, & x \ge 0 \\ 0, & x < 0. \end{cases}
\]
Find a transformation g(·) such that Y = g(X) has the pdf
\[
f_Y(y) = \begin{cases} \frac{1}{2\sqrt{y}}, & 0 < y < 1 \\ 0, & \text{otherwise.} \end{cases}
\]
5. A random variable Y has the pdf fY (y) = Ky −(b+1) , y ≥ 2 (and zero otherwise), where b > 0. This
random variable is obtained as the monotonically increasing transformation Y = g(X) of the random
variable X with pdf e−x , x ≥ 0.
(a) Determine K in terms of b.
(b) Determine the transformation g(·) in terms of b.
6. (a) Two particles start from the same point on a two-dimensional plane, and move with speed V each,
such that the angle between them is uniformly distributed in [0, 2π]. Find the distribution of the
magnitude of the relative velocity between the two particles.
(b) A point is picked uniformly from inside a unit circle. What is the density of R, the distance of
the point from the center?
7. Let X and Y be independent exponentially distributed random variables with parameter 1. Find the
joint density of U = X + Y and V = X/(X + Y), and show that V is uniformly distributed.