02-Random Variables
Conditions for a function to be a RV
For a function X to be a random variable:
1. It must not be multi-valued, i.e., every point ω in Ω must correspond to exactly one value X(ω). (The mapping may be one-to-one or many-to-one.)
2. The set {X ≤ x} must be an event for any real number x. This set corresponds to those points ω in Ω for which the RV X(ω) does not exceed the number x. The probability of this event, P{X ≤ x}, equals the sum of the probabilities of all the elementary events corresponding to {X ≤ x}; it is called the Cumulative Distribution Function (CDF).
3. P{X = ∞} = 0 and P{X = −∞} = 0, i.e., the outcomes have no chance of being infinite.
Example:
Consider a random experiment of tossing a fair coin 3 times. The sequence of heads and tails is noted, and the sample space Ω is given by:
Ω = {HHH, HHT, HTH, THH, THT, HTT, TTH, TTT}
Let X be the number of heads in the three coin tosses. X assigns to each possible outcome ω in the sample space Ω a number from the set R_X = {0, 1, 2, 3}.
iii. lim_{x→−∞} F_X(x) = 0
The Cumulative Distribution Function Cont’d…..
iv. F_X(x) is a non-decreasing function of x, i.e., if x₁ < x₂, then F_X(x₁) ≤ F_X(x₂)
v. P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁)
vi. P(X > x) = 1 − F_X(x)
Example:
Find the cdf of the random variable X which is defined as the
number of heads in three tosses of a fair coin.
The Cumulative Distribution Function
Solution:
We know that X takes on only the values 0, 1, 2 and 3 with probabilities 1/8, 3/8, 3/8 and 1/8, respectively. Thus, F_X(x) is simply the sum of the probabilities of the outcomes from the set {0, 1, 2, 3} that are less than or equal to x.

F_X(x) = 0,    x < 0
       = 1/8,  0 ≤ x < 1
       = 1/2,  1 ≤ x < 2
       = 7/8,  2 ≤ x < 3
       = 1,    x ≥ 3
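As a sanity check, the pmf and cdf above can be computed by brute-force enumeration of the sample space; this is a minimal sketch using only the standard library (the names `pmf` and `cdf` are ours):

```python
from itertools import product
from fractions import Fraction

# Enumerate the sample space of three fair coin tosses.
omega = list(product("HT", repeat=3))   # 8 equally likely outcomes
p = Fraction(1, len(omega))             # each elementary event has probability 1/8

# X(w) = number of heads; build the pmf by summing elementary probabilities.
pmf = {}
for w in omega:
    x = w.count("H")
    pmf[x] = pmf.get(x, Fraction(0)) + p

# CDF: F_X(x) is the sum of the pmf over values <= x.
def cdf(x):
    return sum(pX for xi, pX in pmf.items() if xi <= x)

print(pmf)        # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8} as Fractions
print(cdf(1.5))   # 1/2, matching the piecewise cdf above
```

Using exact fractions avoids floating-point noise when comparing against the tabulated values.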
Types of Random Variables
There are two basic types of random variables.
i. Continuous Random Variable
A random variable whose cdf, F_X(x), is continuous everywhere and can be written as an integral of some non-negative function f(x), i.e.,
F_X(x) = ∫_{−∞}^{x} f(u) du
ii. Discrete Random Variable
A random variable whose cdf changes only in jumps (at countably many points) and is constant between jumps.
The Probability Mass Function
The probability mass function (pmf) of a discrete random variable X is defined as:
i. P_X(x_i) = P(X = x_i) = F_X(x_i) − F_X(x_{i−1})
ii. P_X(x) = 0, if x ≠ x_k, k = 1, 2, …
iii. Σ_k P_X(x_k) = 1
Calculating the Cumulative Distribution Function
The cdf of a continuous random variable X can be obtained by integrating the pdf, i.e.,
F_X(x) = ∫_{−∞}^{x} f_X(u) du
The cdf of a discrete random variable X can be obtained from its pmf as:
F_X(x) = Σ_{x_k ≤ x} P_X(x_k) U(x − x_k)
Expected Value, Variance and Moments
I. Expected Value (Mean)
The expected value (mean) of a continuous random variable X, denoted by μ_X or E(X), is defined as:
μ_X = E(X) = ∫_{−∞}^{∞} x f_X(x) dx
II. Variance
The variance σ_X² of a discrete random variable X is given by:
σ_X² = Var(X) = Σ_k (x_k − μ_X)² P_X(x_k)
Expected Value, Variance and Moments Cont’d…..
The standard deviation of a random variable X, denoted by σ_X, is simply the square root of the variance, i.e.,
σ_X = √(E[(X − μ_X)²]) = √Var(X)
III. Moments
The nth moment of a continuous random variable X is defined as:
E(X^n) = ∫_{−∞}^{∞} x^n f_X(x) dx,  n ≥ 1
2. Binomial Distribution
A r.v. X is called a binomial r.v. with parameters (n, p) if its pmf is given by
P(X = k) = C(n, k) p^k q^{n−k},  k = 0, 1, 2, …, n,  where q = 1 − p and C(n, k) = n!/(k!(n − k)!).
It is associated with experiments in which n independent Bernoulli trials are performed and X represents the number of successes that occur in the n trials.
Note that a Bernoulli r.v. is just a binomial r.v. with parameters (1, p).
Its mean and variance are E(X) = np and Var(X) = npq.
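A short sketch of the binomial pmf, with a numerical check that the mean and variance match np and npq (the helper name `binomial_pmf` is ours):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 3, 0.5  # e.g., three fair coin tosses
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(pmf)    # [0.125, 0.375, 0.375, 0.125]

# Mean and variance computed from the pmf agree with np and npq.
mean = sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
var = sum((k - mean) ** 2 * binomial_pmf(k, n, p) for k in range(n + 1))
print(mean, var)    # 1.5 0.75
```

With p = 0.5 this reproduces the heads-count pmf of the earlier coin-toss example.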
3. Poisson Distribution
A r.v. X is called a Poisson r.v. with parameter λ (> 0) if its pmf is given by
P(X = k) = e^{−λ} λ^k / k!,  k = 0, 1, 2, …
It may be used as an approximation for a binomial r.v. with parameters (n, p) when n is large and p is small enough so that np is of moderate size.
Some examples of Poisson r.v.'s include:
I. the number of telephone calls arriving at a switching center during various time intervals
II. the number of misprints on a page of a book
III. the number of customers entering a bank during various intervals of time
IV. the number of events in the photoelectric effect and radioactive decay
V. computer message traffic arriving at a queue for transmission.
The mean and variance of the Poisson r.v. X are both equal to λ.
Example (Poisson):
Suppose that the probability of a transistor manufactured by a certain firm being defective is 0.015. What is the probability that there is no defective transistor in a batch of 100?
Solution: Let X be the number of defective transistors in 100. The desired probability (binomial) is
P(X = 0) = (0.985)^100 ≈ 0.221.
Using the Poisson approximation with λ = np = 100(0.015) = 1.5,
P(X = 0) ≈ e^{−1.5} ≈ 0.223.
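The exact binomial value and the Poisson approximation can be compared directly; a minimal sketch:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    # Exact binomial probability P(X = k).
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # Poisson probability P(X = k) with parameter lam.
    return exp(-lam) * lam**k / factorial(k)

n, p = 100, 0.015
lam = n * p                     # 1.5
exact = binom_pmf(0, n, p)      # (0.985)**100
approx = poisson_pmf(0, lam)    # e**(-1.5)
print(round(exact, 4), round(approx, 4))   # 0.2206 0.2231
```

The two values differ by less than 0.003, illustrating why the Poisson pmf is a good approximation when n is large and np is moderate.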
Example (Geometric):
A driver is eagerly eyeing a precious parking space some distance down the street. There are five cars in front of the driver, each of which has a probability 0.2 of taking the space. What is the probability that the car immediately ahead will enter the parking space?
Solution: For this problem, we have a geometric distribution and need to evaluate P(X = k) = p(1 − p)^{k−1} with k = 5 and p = 0.2. Thus,
P(X = 5) = (0.2)(0.8)⁴ ≈ 0.082.
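A one-line check of the geometric evaluation (the helper name is illustrative):

```python
def geometric_pmf(k, p):
    """P(X = k): probability that the first success occurs on trial k."""
    return p * (1 - p) ** (k - 1)

# Four cars pass up the space, the fifth (the car immediately ahead) takes it.
print(round(geometric_pmf(5, 0.2), 5))   # 0.08192
```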
5. Hypergeometric Distribution
The hypergeometric random variable arises in the following situation. We have a collection of N items, d of which are defective. Rather than test all N items, we select at random a small number of items, say n < N. Let X denote the number of defectives out of the n items tested. It can be shown that
P(X = k) = C(d, k) C(N − d, n − k) / C(N, n),  k = 0, 1, …, min(d, n).
6. Negative Binomial Distribution
A natural generalization of the geometric distribution is the distribution of the random variable X representing the number of Bernoulli trials necessary for the rth success to occur, where r is a given positive integer.
In order to determine p_X(k) for this case, let A be the event that the first k − 1 trials yield exactly r − 1 successes, regardless of their order, and B the event that a success turns up at the kth trial. Then, owing to independence,
p_X(k) = P(A)P(B) = C(k − 1, r − 1) p^{r−1} (1 − p)^{k−r} · p = C(k − 1, r − 1) p^r (1 − p)^{k−r},  k = r, r + 1, …
Example (Negative Binomial):
A curbside parking facility has a capacity for three cars. Determine the probability that it will be full within 10 minutes. It is estimated that 6 cars will pass this parking space within the time span and, on average, 80% of all cars will want to park there.
Solution: The desired probability is simply the probability that the number of trials to the third success (taking the parking space) is less than or equal to 6. If X is this number, it has a negative binomial distribution with r = 3 and p = 0.8, so
P(X ≤ 6) = Σ_{k=3}^{6} C(k − 1, 2)(0.8)³(0.2)^{k−3} ≈ 0.983.
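The sum over k = 3, …, 6 can be verified numerically; a sketch:

```python
from math import comb

def negbin_pmf(k, r, p):
    """P(X = k): the rth success occurs on trial k (k >= r)."""
    return comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

r, p = 3, 0.8
# Parking lot is full within 6 passing cars iff the 3rd success occurs by trial 6.
prob_full = sum(negbin_pmf(k, r, p) for k in range(r, 7))
print(round(prob_full, 5))   # 0.98304
```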
Some Special Distributions with their Special Applications
II. Continuous Probability Distributions
1. Uniform Distribution
When an experiment results in outcomes that are "equally likely" or "totally random" over a finite interval, we model it with a uniform random variable.
The pdf and cdf of X, which is constant over the interval (a, b), have the form
f_X(x) = 1/(b − a),  a < x < b  (and 0 otherwise)
F_X(x) = 0 for x ≤ a;  (x − a)/(b − a) for a < x < b;  1 for x ≥ b
Example 1 (Uniform Distribution)
Owing to unpredictable traffic situations, the time required by a certain student to travel from her home to her morning class is uniformly distributed between 22 and 30 minutes. If she leaves home at precisely 7:35 a.m., what is the probability that she will not be late for class, which begins promptly at 8:00 a.m.?
Solution: Let X be the class arrival time of the student in minutes after 8:00 a.m. It then has a uniform distribution over (−3, 5), since a travel time of 22 to 30 minutes places the arrival between 7:57 and 8:05 a.m. She is not late if X ≤ 0, so
P(X ≤ 0) = (0 − (−3))/(5 − (−3)) = 3/8.
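A quick Monte Carlo check of the 3/8 answer, assuming only that the travel time is uniform on (22, 30) and that she is on time when the trip takes at most 25 minutes:

```python
import random

random.seed(1)
trials = 100_000
# Travel time ~ Uniform(22, 30); on time iff travel time <= 25 minutes.
on_time = sum(random.uniform(22, 30) <= 25 for _ in range(trials))
print(on_time / trials)   # close to 3/8 = 0.375
```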
2. Exponential Distribution
A RV X is called exponential, written X ∼ exp(λ), with parameter λ > 0, if its pdf is
f_X(x) = λ e^{−λx},  x ≥ 0  (and 0 for x < 0),
with cdf F_X(x) = 1 − e^{−λx} for x ≥ 0.
Example 2 (Exponential)
All manufactured devices and machines fail to work sooner or later. Suppose that the failure rate is constant and the time to failure (in hours) is an exponential r.v. X with parameter λ.
(a) Measurements show that the probability that the time to failure for computer memory chips in a given class exceeds 10⁴ hours is 0.368. Calculate the value of the parameter λ.
(b) Using the value of the parameter λ determined in part (a), calculate the time x₀ such that the probability that the time to failure is less than x₀ is 0.05.
Solution:
(a) P(X > 10⁴) = e^{−λ·10⁴} = 0.368 ≈ e^{−1}, so λ = 10^{−4} per hour.
(b) P(X < x₀) = 1 − e^{−λx₀} = 0.05 gives x₀ = −ln(0.95)/λ ≈ 513 hours.
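Both parts can be solved numerically from the exponential cdf; a sketch:

```python
from math import log

p_exceed = 0.368               # measured P(X > 1e4)
lam = -log(p_exceed) / 1e4     # solve e**(-lam * 1e4) = 0.368 for lam
x0 = -log(1 - 0.05) / lam      # solve 1 - e**(-lam * x0) = 0.05 for x0
print(lam)   # ~1e-4 per hour
print(x0)    # ~513 hours
```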
3. Laplace / Double-Sided Exponential Distribution
For λ > 0, we write X ∼ Laplace(λ) if its pdf is
f(x) = (λ/2) e^{−λ|x|},  −∞ < x < ∞.
Solution. The desired probability can be written as
P({−3 ≤ X ≤ −2} ∪ {0 ≤ X ≤ 3}).
Since these are disjoint events, the probability of the union is the sum of the individual probabilities. We therefore need to compute P(−3 ≤ X ≤ −2) and P(0 ≤ X ≤ 3).
4. Cauchy Distribution
The pdf of a Cauchy random variable X ∼ Cauchy(λ) with parameter λ > 0 is given by
f(x) = (λ/π) · 1/(λ² + x²),  −∞ < x < ∞.
5. Gaussian or Normal Distribution
The most important density is the Gaussian or normal. For σ² > 0, we write X ∼ N(m, σ²) if its pdf is given by
f(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²)),  −∞ < x < ∞.
Due to the central limit theorem, the Gaussian density is a good approximation for computing probabilities involving a sum of many independent random variables. For example, let
X = X₁ + ··· + X_n,
where the X_i are i.i.d. with common mean m and common variance σ². For large n, if the X_i are continuous random variables, then
P(X ≤ x) ≈ Φ((x − nm)/(σ√n)),
where Φ denotes the standard normal cdf. Similarly, for any a < b,
P(a < X ≤ b) ≈ Φ((b − nm)/(σ√n)) − Φ((a − nm)/(σ√n)).
Example 1 (normal): If X is a normal random variable
with mean m = 3 and variance σ2 = 16, find
(a) P{X < 11}; (b) P{X > −1}; (c) P{2 < X < 7}.
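The three probabilities can be evaluated through the standard normal cdf Φ, built here from `math.erf`; a sketch whose printed values serve as a check:

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal cdf, expressed via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

m, var = 3, 16
s = sqrt(var)   # standard deviation = 4
print(round(Phi((11 - m) / s), 4))                      # (a) P(X < 11)  = Phi(2)
print(round(1 - Phi((-1 - m) / s), 4))                  # (b) P(X > -1) = Phi(1)
print(round(Phi((7 - m) / s) - Phi((2 - m) / s), 4))    # (c) P(2 < X < 7)
```

The answers come out near 0.9772, 0.8413 and 0.4401, respectively.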
Example 2 (normal): A production line manufactures 1000-ohm resistors that have 10 percent tolerance. Let X denote the resistance of a resistor. Assuming that X is a normal r.v. with mean 1000 and variance 2500, find the probability that a resistor picked at random will be rejected.
Solution: Let A be the event that a resistor is rejected. Then A = {X < 900} ∪ {X > 1100}. Since {X < 900} ∩ {X > 1100} = ∅, we have
P(A) = P(X < 900) + P(X > 1100) = Φ(−2) + [1 − Φ(2)] = 2[1 − Φ(2)] ≈ 0.0455.
Location & scale parameters and the gamma densities
6. Gamma Distribution
An important application of the scale parameter arises with the basic gamma density with parameter p > 0. This density is given by
g_p(x) = x^{p−1} e^{−x} / Γ(p),  x > 0,
where Γ(p) = ∫₀^∞ x^{p−1} e^{−x} dx is the gamma function.
Example-1:
The pdf of a continuous random variable is given by:
f_X(x) = kx,  0 < x < 1
       = 0,   otherwise
where k is a constant.
a. Determine the value of k.
b. Find the corresponding cdf of X.
c. Find P(1/4 ≤ X ≤ 1).
d. Evaluate the mean and variance of X.
Random Variable Examples Cont’d……
Solution:
a. ∫_{−∞}^{∞} f_X(x) dx = 1
∫₀¹ kx dx = 1
k [x²/2]₀¹ = 1
k/2 = 1
k = 2
Therefore,
f_X(x) = 2x,  0 < x < 1
       = 0,   otherwise
Random Variable Examples Cont’d……
Solution:
b. The cdf of X is given by:
F_X(x) = ∫_{−∞}^{x} f_X(u) du
Case 1: for x < 0
F_X(x) = 0, since f_X(x) = 0 for x < 0
Case 2: for 0 ≤ x < 1
F_X(x) = ∫₀^x f_X(u) du = ∫₀^x 2u du = [u²]₀^x = x²
Random Variable Examples Cont’d……
Solution:
Case 3: for x ≥ 1
F_X(x) = ∫₀¹ f_X(u) du = ∫₀¹ 2u du = [u²]₀¹ = 1
The cdf is given by
F_X(x) = 0,   x < 0
       = x²,  0 ≤ x < 1
       = 1,   x ≥ 1
Random Variable Examples Cont’d……
Solution:
c. P(1/4 ≤ X ≤ 1)
i. Using the pdf:
P(1/4 ≤ X ≤ 1) = ∫_{1/4}^{1} f_X(x) dx = ∫_{1/4}^{1} 2x dx = [x²]_{1/4}^{1} = 1 − 1/16 = 15/16
ii. Using the cdf:
P(1/4 ≤ X ≤ 1) = F_X(1) − F_X(1/4) = 1 − (1/4)² = 15/16
Random Variable Examples Cont’d……
Solution:
d. Mean and Variance
i. Mean:
μ_X = E(X) = ∫₀¹ x f_X(x) dx = ∫₀¹ 2x² dx = [2x³/3]₀¹ = 2/3
ii. Variance:
σ_X² = Var(X) = E(X²) − [E(X)]²
E(X²) = ∫₀¹ x² f_X(x) dx = ∫₀¹ 2x³ dx = 1/2
σ_X² = Var(X) = 1/2 − (2/3)² = 1/18
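The results k = 2, μ_X = 2/3 and Var(X) = 1/18 can be confirmed with a midpoint-rule numerical integration; a sketch:

```python
# Midpoint-rule integration of f_X(x) = 2x on (0, 1).
N = 100_000
dx = 1.0 / N
xs = [(i + 0.5) * dx for i in range(N)]   # midpoints of the interval (0, 1)

f = lambda x: 2 * x
total = sum(f(x) * dx for x in xs)        # normalization: should be 1 (so k = 2)
mean = sum(x * f(x) * dx for x in xs)     # E(X): should be 2/3
ex2 = sum(x * x * f(x) * dx for x in xs)  # E(X**2): should be 1/2
var = ex2 - mean**2                       # Var(X): should be 1/18
print(round(total, 6), round(mean, 6), round(var, 6))
```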
Random Variable Examples Cont’d……..
Example-2:
Consider a discrete random variable X whose pmf is given by:
P_X(x_k) = 1/3,  x_k = −1, 0, 1
         = 0,    otherwise
Find the mean and variance of X.
Random Variable Examples Cont’d……
Solution:
i. Mean:
μ_X = E(X) = Σ_{k=−1}^{1} x_k P_X(x_k) = (1/3)(−1 + 0 + 1) = 0
ii. Variance:
σ_X² = Var(X) = E(X²) − [E(X)]²
E(X²) = Σ_{k=−1}^{1} x_k² P_X(x_k) = (1/3)[(−1)² + (0)² + (1)²] = 2/3
σ_X² = Var(X) = 2/3 − (0)² = 2/3
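The same computation in exact arithmetic, as a sketch with `fractions.Fraction`:

```python
from fractions import Fraction

# pmf of Example-2: P_X(x) = 1/3 for x in {-1, 0, 1}.
pmf = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}

mean = sum(x * p for x, p in pmf.items())       # E(X) = 0
ex2 = sum(x * x * p for x, p in pmf.items())    # E(X**2) = 2/3
var = ex2 - mean**2                             # Var(X) = 2/3
print(mean, var)    # 0 2/3
```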
Functions of random variables
Most modern systems are composed of many subsystems in which the output of one stage, itself a random variable, becomes the input of the next; we therefore need to characterize the distribution of a function of a random variable.
Functions of One Random Variable
Let X be a continuous random variable with pdf f_X(x), and suppose g(x) is a function of the random variable X defined as:
Y = g(X)
We can determine the cdf and pdf of Y in terms of those of X. Consider some of the following functions:
Y = g(X) = aX + b, X², sin X, 1/X, |X|, log X, e^X, |X| U(x)
Functions of a Random Variable Cont’d…..
Steps to determine f_Y(y) from f_X(x):
Method I:
1. Sketch the graph of Y = g(X) and determine the range space of Y.
2. Determine the cdf of Y using the following basic approach:
F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y)
3. The pdf of Y is then
f_Y(y) = dF_Y(y)/dy
Functions of a Random Variable Cont’d…..
Method II:
1. Sketch the graph of Y = g(X) and determine the range space of Y.
2. If Y = g(X) is a one-to-one function and has an inverse transformation x = g⁻¹(y) = h(y), then the pdf of Y is given by:
f_Y(y) = f_X(x) |dx/dy| = f_X[h(y)] |dh(y)/dy|
Functions of a Random Variable Cont’d…..
Method III (when Y = g(X) is not one-to-one):
i. Solve y = g(x) for x and find all real roots x_i.
ii. Determine the derivative of the function g at every real root x_i, i.e., g′(x_i).
iii. Find the pdf of Y by using the following formula:
f_Y(y) = Σ_i f_X(x_i)/|g′(x_i)| = Σ_i f_X(x_i) |dx_i/dy|
Examples on Functions of One Random Variable
Examples:
a. Let Y = aX + b. Find f_Y(y).
b. Let Y = X². Find f_Y(y).
c. Let Y = 1/X. Find f_Y(y).
d. The random variable X is uniform in the interval [−π/2, π/2]. If Y = tan X, determine the pdf of Y.
Examples on Functions of One Random Variable…..
Solutions:
a. Y = aX + b
i. Using Method I
Suppose that a > 0:
F_Y(y) = P(Y ≤ y) = P(aX + b ≤ y) = P(X ≤ (y − b)/a)
F_Y(y) = F_X((y − b)/a)
f_Y(y) = dF_Y(y)/dy = (1/a) f_X((y − b)/a)    … (i)
Examples on Functions of One Random Variable…..
Solutions:
a. Y = aX + b
i. Using Method I
Suppose that a < 0:
F_Y(y) = P(Y ≤ y) = P(aX + b ≤ y) = P(X ≥ (y − b)/a)
F_Y(y) = 1 − F_X((y − b)/a)
f_Y(y) = dF_Y(y)/dy = −(1/a) f_X((y − b)/a)    … (ii)
Examples on Functions of One Random Variable…..
Solutions:
a. Y = aX + b
i. Using Method I
Combining (i) and (ii):
f_Y(y) = (1/|a|) f_X((y − b)/a),  for all a ≠ 0
Examples on Functions of One Random Variable…..
Solutions:
a. Y = aX + b
ii. Using Method II
The function Y = aX + b is one-to-one and the range space of Y is ℝ.
For any y, x = h(y) = (y − b)/a is the principal solution.
dx/dy = dh(y)/dy = 1/a, so |dx/dy| = 1/|a|
f_Y(y) = f_X(x) |dx/dy| = f_X[h(y)] |dh(y)/dy| = (1/|a|) f_X((y − b)/a)
Examples on Functions of One Random Variable…..
Solutions:
b. Y = X². For y > 0, the equation y = x² has two real roots, x₁ = √y and x₂ = −√y.
dx₁/dy = 1/(2√y), so |dx₁/dy| = 1/(2√y)
dx₂/dy = −1/(2√y), so |dx₂/dy| = 1/(2√y)
f_Y(y) = (1/(2√y)) [f_X(√y) + f_X(−√y)],  y > 0
       = 0,  otherwise
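The two-root formula can be spot-checked by simulation. Here we pick X ∼ N(0, 1) (our choice, not the slide's) and compare a Monte Carlo estimate of P(a < Y < b) for Y = X² against the integral of the derived density; a sketch:

```python
import random
from math import sqrt, pi, exp

# f_X: standard normal pdf (the chosen test distribution for X).
f_X = lambda x: exp(-x * x / 2) / sqrt(2 * pi)

# Density of Y = X**2 predicted by the two-root formula above.
def f_Y(y):
    r = sqrt(y)
    return (f_X(r) + f_X(-r)) / (2 * r)

random.seed(0)
a, b = 0.5, 1.5
samples = [random.gauss(0, 1) ** 2 for _ in range(200_000)]
mc = sum(a < y < b for y in samples) / len(samples)   # Monte Carlo P(a < Y < b)

# Midpoint-rule integral of f_Y over (a, b).
n = 10_000
dy = (b - a) / n
integral = sum(f_Y(a + (i + 0.5) * dy) * dy for i in range(n))
print(round(mc, 3), round(integral, 3))   # the two estimates should agree closely
```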
Examples on Functions of One Random Variable…..
Solutions:
c. The function Y = 1/X is one-to-one and the range space of Y is ℝ \ {0}.
For any y ≠ 0, x = h(y) = 1/y is the principal solution.
dx/dy = dh(y)/dy = −1/y², so |dx/dy| = 1/y²
f_Y(y) = f_X(x) |dx/dy| = f_X[h(y)] |dh(y)/dy| = (1/y²) f_X(1/y)
f_Y(y) = (1/y²) f_X(1/y),  y ∈ ℝ \ {0}
Examples on Functions of One Random Variable…..
Solutions:
d. The function Y = tan X is one-to-one on [−π/2, π/2] and the range space of Y is (−∞, ∞).
For any y, x = h(y) = arctan y, so
dx/dy = dh(y)/dy = 1/(1 + y²)
Since X is uniform on [−π/2, π/2], f_X(x) = 1/π on that interval, and
f_Y(y) = f_X[h(y)] |dh(y)/dy| = (1/π) · 1/(1 + y²)
f_Y(y) = 1/(π(1 + y²)),  −∞ < y < ∞,
which is a Cauchy density with λ = 1.
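Simulating Y = tan X for X uniform on [−π/2, π/2] and comparing against the Cauchy cdf (F_Y(y) = 1/2 + arctan(y)/π, obtained by integrating the derived pdf) confirms the result; a sketch:

```python
import random
from math import pi, tan, atan

random.seed(0)
# Draw X ~ Uniform(-pi/2, pi/2) and transform: Y = tan(X).
samples = [tan(random.uniform(-pi / 2, pi / 2)) for _ in range(200_000)]

# cdf corresponding to f_Y(y) = 1 / (pi * (1 + y**2)).
cauchy_cdf = lambda y: 0.5 + atan(y) / pi

for y in (-1.0, 0.0, 2.0):
    emp = sum(s <= y for s in samples) / len(samples)   # empirical P(Y <= y)
    print(y, round(emp, 3), round(cauchy_cdf(y), 3))    # empirical vs theoretical
```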
Transform methods
We will see two transforms that help us recover probability mass functions and densities and that greatly simplify finding expectations, variances and moments of random variables.
1. Moment generating functions
The moment generating function (mgf) of a real-valued random variable X is defined by
M_X(s) = E[e^{sX}]
Taking s = 0, we have M_X(0) = E[e^{0·X}] = 1.
Assignment-II
where k is a constant.
Find:
a. the value of k.
b. the cdf of X.
c. P(X ≤ 1).
d. the mean and variance of X.
Assignment-II Cont’d…..
where k is a constant.
Determine:
a. the value of k.
b. the pdf of X.
c. the mean and variance of X.
Functions That Give Moments
Two functions can be defined that allow moments to be calculated for a random variable X. They are the characteristic function and the moment generating function.
Characteristic Function
The characteristic function of a random variable X is defined by
Φ_X(ω) = E[e^{jωX}] = ∫_{−∞}^{∞} f_X(x) e^{jωx} dx,
where j = √(−1). Apart from the sign of x, this is the Fourier transform of f_X(x).
Because of this fact, if Φ_X(ω) is known, f_X(x) can be found from the inverse Fourier transform (with the sign of x reversed).
Moments can be obtained from the derivatives of Φ_X(ω) at ω = 0:
m_n = E[X^n] = (−j)ⁿ dⁿΦ_X(ω)/dωⁿ |_{ω=0}
Example: To illustrate the calculation and use of the moment generating function, let us reconsider the exponential density of the earlier example.
Moment Generating Function
Another statistical average closely related to the characteristic function is the moment generating function, defined by
M_X(v) = E[e^{vX}] = ∫_{−∞}^{∞} f_X(x) e^{vx} dx