Week 11

Families of Continuous Random Variables
• The PDF of a discrete random variable can be written in terms of its PMF using Dirac delta functions:

$$f_X(x) = \sum_{\text{all } x_i} P_X(x_i)\,\delta(x - x_i)$$
• When the PDF includes delta functions of the form $\delta(x - x_i)$, we say there is an impulse at $x_i$.
• When we graph a PDF $f_X(x)$ that contains an impulse at $x_i$, we draw a vertical arrow labeled by the constant that multiplies the impulse. We draw each arrow representing an impulse at the same height because the PDF is infinite at each such point; the label, not the arrow height, indicates the weight of the impulse.
• To calculate the expected value:

$$E[X] = \int_{-\infty}^{+\infty} x \left( \sum_{\text{all } x_i} P_X(x_i)\,\delta(x - x_i) \right) dx = \sum_{\text{all } x_i} x_i\,P_X(x_i)$$

• The sifting property of the impulse reduces the integral to the familiar PMF sum.
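• As a quick numerical check, a minimal Python sketch (with an assumed example PMF) of how the sifting property turns the impulse-train integral into a plain PMF sum:

```python
# Expected value of a discrete RV whose PDF is an impulse train:
# integrating x * sum_i P(x_i) * delta(x - x_i) picks out x_i * P(x_i)
# for each impulse (sifting property), leaving the ordinary PMF sum.

pmf = {1: 0.2, 2: 0.5, 3: 0.3}  # assumed example PMF

expected_value = sum(x_i * p for x_i, p in pmf.items())
print(expected_value)  # 2.1
```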
• Example 1: Suppose Y takes on the values 1, 2, 3 with equal probability. The PMF of Y is

$$P_Y(y) = \begin{cases} 1/3 & y = 1, 2, 3 \\ 0 & \text{otherwise} \end{cases}$$
• Using the unit step function $u(y)$, we can write the CDF $F_Y(y)$, and using the impulse function we can write the PDF:

$$F_Y(y) = \tfrac{1}{3}u(y-1) + \tfrac{1}{3}u(y-2) + \tfrac{1}{3}u(y-3), \qquad f_Y(y) = \tfrac{1}{3}\delta(y-1) + \tfrac{1}{3}\delta(y-2) + \tfrac{1}{3}\delta(y-3)$$
• Example 2: Random variable X has CDF

$$F_X(x) = \begin{cases} 0 & x < -1 \\ (x+1)/4 & -1 \le x < 1 \\ 1 & x \ge 1 \end{cases}$$
• Sketch the CDF and find the following:
• (1) P[X ≤ 1]
• (2) P[X < 1]
• (3) P[X = 1]
• (4) the PDF $f_X(x)$
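• A minimal sketch of how these quantities can be read off the CDF numerically; the function F below transcribes the piecewise CDF above, and the impulse weight is the size of the jump at x = 1:

```python
def F(x):
    """Piecewise CDF of X from Example 2."""
    if x < -1:
        return 0.0
    if x < 1:
        return (x + 1) / 4
    return 1.0

eps = 1e-9
p_le_1 = F(1)                 # P[X <= 1] = F(1) = 1
p_lt_1 = F(1 - eps)           # P[X < 1]  = left limit of F at 1 = 1/2
p_eq_1 = p_le_1 - p_lt_1      # P[X = 1]  = size of the jump = 1/2
print(p_le_1, round(p_lt_1, 6), round(p_eq_1, 6))

# The PDF is the derivative plus an impulse at the jump:
# f_X(x) = 1/4 for -1 <= x < 1, plus (1/2) * delta(x - 1).
```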
• Example 3: Observe someone dialing a telephone and record the duration of the call. In a simple model of the experiment, 1/3 of the calls never begin, either because no one answers or because the line is busy; the duration of these calls is 0 minutes. Otherwise, with probability 2/3, the call duration is uniformly distributed between 0 and 3 minutes. Let Y denote the call duration. Find the CDF, the PDF, and the expected value E[Y].
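• A small Monte Carlo sketch of this mixed distribution; since E[Y] = (1/3)(0) + (2/3)(3/2) = 1 minute, the sample mean should settle near 1:

```python
import random

# Mixed RV: with probability 1/3 the call never begins (Y = 0);
# otherwise Y is uniform on (0, 3) minutes.
N = 100_000
total = 0.0
for _ in range(N):
    if random.random() < 1/3:
        y = 0.0
    else:
        y = random.uniform(0, 3)
    total += y

print(total / N)  # close to E[Y] = (2/3) * 1.5 = 1
```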
• Conditioning a Continuous Random Variable:
• Definition: Conditional PDF given an Event
• For a random variable X with PDF $f_X(x)$ and an event $B \subset S_X$ with $P[B] > 0$, the conditional PDF of X given B is

$$f_{X|B}(x) = \begin{cases} \dfrac{f_X(x)}{P[B]} & x \in B \\ 0 & \text{otherwise} \end{cases}$$
• Theorem: Given an event space $\{B_i\}$ and the conditional PDFs $f_{X|B_i}(x)$,

$$f_X(x) = \sum_{\text{all } i} f_{X|B_i}(x)\,P[B_i]$$
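• A numeric spot check of this theorem, assuming a uniform PDF on [0, 10) and the event space B1 = {X ≤ 6}, B2 = {X > 6}:

```python
def f_X(x):
    return 0.1 if 0 <= x < 10 else 0.0

P_B1, P_B2 = 0.6, 0.4  # P[B1] = P[X <= 6], P[B2] = P[X > 6]

def f_given_B1(x):
    return f_X(x) / P_B1 if 0 <= x <= 6 else 0.0

def f_given_B2(x):
    return f_X(x) / P_B2 if 6 < x < 10 else 0.0

for x in [1.0, 5.0, 7.5, 9.9]:
    total = f_given_B1(x) * P_B1 + f_given_B2(x) * P_B2
    assert abs(total - f_X(x)) < 1e-9  # law of total probability for PDFs
print("total-probability check passed")
```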
• Definition: Conditional Expected Value Given an Event
• The conditional expected value of X given the event B is

$$E[X|B] = \int_{-\infty}^{+\infty} x\,f_{X|B}(x)\,dx$$
• To find the conditional variance, first compute the conditional second moment:

$$E[X^2|B] = \int_{-\infty}^{+\infty} x^2\,f_{X|B}(x)\,dx$$

$$\operatorname{Var}[X|B] = E[X^2|B] - \left(E[X|B]\right)^2$$
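• A sketch of both integrals using scipy, for an assumed uniform (0, 1) random variable conditioned on B = {X > 1/2}, so that $f_{X|B}(x) = 2$ on (1/2, 1):

```python
from scipy.integrate import quad

# X ~ uniform(0, 1), B = {X > 1/2}: f_X(x) = 1 on (0, 1) and P[B] = 1/2,
# so the conditional PDF is f_{X|B}(x) = 2 on (1/2, 1).
f_cond = lambda x: 2.0

mean, _ = quad(lambda x: x * f_cond(x), 0.5, 1.0)      # E[X|B]   = 3/4
m2, _   = quad(lambda x: x**2 * f_cond(x), 0.5, 1.0)   # E[X^2|B] = 7/12
var = m2 - mean**2                                     # Var[X|B] = 1/48
print(mean, var)
```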
• Example: Suppose the duration T (in minutes) of a telephone call is an exponential (1/3) random variable:

$$f_T(t) = \begin{cases} \tfrac{1}{3}e^{-t/3} & t \ge 0 \\ 0 & \text{otherwise} \end{cases}$$

• For calls that last at least 2 minutes, what is the conditional PDF of the call duration? Find its expected value.
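• A numeric check: $P[T \ge 2] = e^{-2/3}$, so the conditional PDF is the original density rescaled on $t \ge 2$, and the memoryless property predicts $E[T|T \ge 2] = 2 + 3 = 5$:

```python
import math
from scipy.integrate import quad

lam = 1/3
P_B = math.exp(-lam * 2)                           # P[T >= 2] = e^(-2/3)
f_cond = lambda t: lam * math.exp(-lam * t) / P_B  # f_{T|T>=2}(t) for t >= 2

mean, _ = quad(lambda t: t * f_cond(t), 2, math.inf)
print(mean)  # 5.0: matches 2 + E[T] by the memoryless property
```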
• Example: The probability density function of random variable Y is

$$f_Y(y) = \begin{cases} 1/10 & 0 \le y < 10 \\ 0 & \text{otherwise} \end{cases}$$
• Find the following:
• (1) P[Y ≤ 6]
• (2) the conditional PDF $f_{Y|Y \le 6}(y)$
• (3) P[Y > 8]
• (4) the conditional PDF $f_{Y|Y > 8}(y)$
• (5) E[Y | Y ≤ 6]
• (6) E[Y | Y > 8]
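• A short sketch computing all six quantities directly from the uniform density; the conditional PDFs come out uniform on the conditioning interval, so the conditional means are the interval midpoints:

```python
# Y ~ uniform on [0, 10): f_Y(y) = 1/10 there.
p_le_6 = 6 / 10                    # (1) P[Y <= 6] = 0.6
f_given_le_6 = (1/10) / p_le_6     # (2) f_{Y|Y<=6}(y) = 1/6 on [0, 6]
p_gt_8 = 2 / 10                    # (3) P[Y > 8] = 0.2
f_given_gt_8 = (1/10) / p_gt_8     # (4) f_{Y|Y>8}(y) = 1/2 on (8, 10)
e_given_le_6 = (0 + 6) / 2         # (5) E[Y|Y<=6] = 3 (midpoint)
e_given_gt_8 = (8 + 10) / 2        # (6) E[Y|Y>8] = 9 (midpoint)
print(p_le_6, f_given_le_6, p_gt_8, f_given_gt_8, e_given_le_6, e_given_gt_8)
```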
Joint Random Variables
• Example: Test two integrated circuits one after the other. On each test, the possible
outcomes are a (accept) and r (reject). Assume that all circuits are acceptable with
probability 0.9 and that the outcomes of successive tests are independent. Count the
number of acceptable circuits X and count the number of successful tests Y before
you observe the first reject. (If both tests are successful, let Y = 2.) Draw a tree
diagram for the experiment and find the joint PMF of X and Y.
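• A minimal sketch that enumerates the four leaves of the tree and tabulates the joint PMF (the variable names below are illustrative):

```python
from itertools import product

p_accept = 0.9
joint_pmf = {}  # maps (x, y) -> probability

for outcome in product("ar", repeat=2):  # leaves of the tree: aa, ar, ra, rr
    prob = 1.0
    for test in outcome:
        prob *= p_accept if test == "a" else (1 - p_accept)
    x = outcome.count("a")                               # acceptable circuits
    y = 2 if "r" not in outcome else outcome.index("r")  # tests before first reject
    joint_pmf[(x, y)] = joint_pmf.get((x, y), 0.0) + prob

# {(2, 2): 0.81, (1, 1): 0.09, (1, 0): 0.09, (0, 0): 0.01}, up to float rounding
print(joint_pmf)
```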
• Theorem: For discrete random variables X and Y and any set B in the X, Y plane, the probability of the event {(X, Y) ∈ B} is

$$P[B] = \sum_{(x,y) \in B} P_{X,Y}(x, y)$$
• Example: Continuing the previous example, find the probability of the event B that X, the number of acceptable circuits, equals Y, the number of tests before observing the first failure.
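• Continuing with the joint_pmf dictionary built above, summing over the set B = {(x, y) : x = y} gives the requested probability:

```python
# P[X = Y]: sum the joint PMF over the set B = {(x, y) : x = y}
p_B = sum(p for (x, y), p in joint_pmf.items() if x == y)
print(p_B)  # 0.81 + 0.09 + 0.01 = 0.91, up to float rounding
```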
• Example: The joint PMF $P_{Q,G}(q, g)$ for random variables Q and G is given in the following table:
• Calculate the following probabilities:
• (1) P[Q = 0]
• (2) P[Q = G]
• (3) P[G > 1]
• (4) P[G > Q]
• Marginal PMF
• In an experiment that produces two random variables X and Y, it is always possible to consider one of the random variables, Y, and ignore the other one, X. In this case, we can use the methods of Chapter 2 to analyze the experiment and derive $P_Y(y)$, which contains the probability model for the random variable of interest.
• On the other hand, if we have already analyzed the experiment to derive the joint PMF $P_{X,Y}(x, y)$, it would be convenient to derive $P_Y(y)$ from $P_{X,Y}(x, y)$ without reexamining the details of the experiment.
• Theorem: For discrete random variables X and Y with joint PMF $P_{X,Y}(x, y)$,

$$P_X(x) = \sum_{y \in S_Y} P_{X,Y}(x, y), \qquad P_Y(y) = \sum_{x \in S_X} P_{X,Y}(x, y)$$

• This implies that we can find $P_Y(y)$ by summing $P_{X,Y}(x, y)$ over all points in $S_{X,Y}$ with the property $Y = y$. In the sum, y is a constant, and each term corresponds to a value of $x \in S_X$. Similarly, we can find $P_X(x)$ by summing $P_{X,Y}(x, y)$ over all points (X, Y) such that $X = x$.
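• Using the joint_pmf dictionary from the circuit-testing sketch above, the marginals fall out by summing over the ignored variable:

```python
from collections import defaultdict

P_X = defaultdict(float)
P_Y = defaultdict(float)
for (x, y), p in joint_pmf.items():
    P_X[x] += p   # sum over y: P_X(x) = sum_y P_{X,Y}(x, y)
    P_Y[y] += p   # sum over x: P_Y(y) = sum_x P_{X,Y}(x, y)

print(dict(P_X))  # {2: 0.81, 1: 0.18, 0: 0.01}
print(dict(P_Y))  # {2: 0.81, 1: 0.09, 0: 0.10}
```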
• Definition: Joint PDF
• The joint PDF of continuous random variables X and Y is the second mixed partial derivative of the joint CDF:

$$f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\,\partial y}$$
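• A symbolic sketch with sympy; the joint CDF below is an assumed example (two independent exponential (1) random variables), differentiated to recover the joint PDF:

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)

# Assumed joint CDF of two independent exponential(1) RVs (x, y >= 0).
F = (1 - sp.exp(-x)) * (1 - sp.exp(-y))

f = sp.diff(F, x, y)     # f_{X,Y}(x, y) = d^2 F / (dx dy)
print(sp.simplify(f))    # exp(-x - y), valid for x, y >= 0
```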
• Theorem:

$$P[x_1 \le X \le x_2,\; y_1 \le Y \le y_2] = F_{X,Y}(x_2, y_2) - F_{X,Y}(x_2, y_1) - F_{X,Y}(x_1, y_2) + F_{X,Y}(x_1, y_1)$$
• Theorem: A joint PDF $f_{X,Y}(x, y)$ has the following properties, corresponding to the first and second axioms of probability:
• $f_{X,Y}(x, y) \ge 0$ for all $(x, y)$
• $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\,dy = 1$
• Theorem: The probability that the continuous random variables (X, Y) are in the region A is

$$P[A] = \iint_A f_{X,Y}(x, y)\,dx\,dy$$
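• A numeric sketch with scipy's dblquad, assuming the joint PDF $f_{X,Y}(x, y) = e^{-x-y}$ for $x, y \ge 0$ and the region A = {X ≤ 1, Y ≤ 1}:

```python
import math
from scipy.integrate import dblquad

# Assumed joint PDF of two independent exponential(1) RVs (x, y >= 0).
f = lambda y, x: math.exp(-x - y)   # dblquad integrates over y first

# P[A] for A = {0 <= X <= 1, 0 <= Y <= 1}
p_A, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
print(p_A)  # (1 - e**-1)**2, about 0.3996
```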
• Marginal PDF
• Just as with the marginal PMF, we sometimes need to find a marginal PDF $f_X(x)$ or $f_Y(y)$ from the joint PDF $f_{X,Y}(x, y)$. We refer to $f_X(x)$ and $f_Y(y)$ as the marginal probability density functions of $f_{X,Y}(x, y)$.
• Theorem: If X and Y are random variables with joint PDF $f_{X,Y}(x, y)$,

$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx$$
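• A symbolic sketch of marginalization, assuming the joint PDF $f_{X,Y}(x, y) = x + y$ on the unit square (and 0 elsewhere):

```python
import sympy as sp

x, y = sp.symbols("x y")

# Assumed joint PDF: f(x, y) = x + y on the unit square, 0 elsewhere.
f_joint = x + y

f_X = sp.integrate(f_joint, (y, 0, 1))   # integrate out y
f_Y = sp.integrate(f_joint, (x, 0, 1))   # integrate out x
print(f_X)  # x + 1/2, valid for 0 <= x <= 1
print(f_Y)  # y + 1/2, valid for 0 <= y <= 1
```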