Week 11

PROBABILITY AND STATISTICS

Dr. Öğr. Üyesi Sadra Mousavi


1st semester, 2022-2023
Families of Continuous Random Variables
• Delta Functions
• Unit impulse (delta) function:
• Consider the following function:

    d_ε(x) = 1/ε for −ε/2 ≤ x ≤ ε/2,   and 0 otherwise

• The unit impulse function is the limit:

    δ(x) = lim_{ε→0} d_ε(x)
Families of Continuous Random Variables

• For each ε, the area under the curve of d_ε(x) equals 1.


Families of Continuous Random Variables
• Theorem: For any continuous function g(x),

    ∫_{−∞}^{+∞} g(x) δ(x − x₀) dx = g(x₀)
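The sifting property can be checked numerically by replacing δ with the finite pulse d_ε and integrating. The sketch below (function names are illustrative, not from the slides) verifies both that d_ε has unit area and that the integral of g(x) d_ε(x − x₀) approaches g(x₀) for small ε:

```python
import math

def d_eps(x, eps):
    """Rectangular pulse of width eps and height 1/eps, centered at 0."""
    return 1.0 / eps if -eps / 2 <= x <= eps / 2 else 0.0

def sift(g, x0, eps, n=1000):
    """Midpoint Riemann sum of g(x) * d_eps(x - x0) over the pulse support."""
    a, b = x0 - eps / 2, x0 + eps / 2
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) * d_eps(a + (k + 0.5) * h - x0, eps) * h
               for k in range(n))

# With g = 1 the integral is the pulse area, which is 1 for every eps...
area = sift(lambda x: 1.0, 0.0, 0.5)
# ...and for small eps the integral of g(x) d_eps(x - x0) is close to g(x0).
approx = sift(math.cos, 0.5, 1e-3)
```

As ε → 0 the approximation error shrinks like ε², which is why the limit recovers g(x₀) exactly.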
• Unit step function:

    u(x) = 0 for x < 0,   1 for x ≥ 0

• The relation between the unit impulse and unit step functions:

    δ(x) = du(x)/dx
Families of Continuous Random Variables
• Suppose a random variable has a PDF of the form

    f_X(x) = Σ_{all x_i} P_X(x_i) δ(x − x_i)

• When the PDF includes delta functions of the form δ(x − x_i), we say there is an impulse at x_i.
Families of Continuous Random Variables
• When we graph a PDF f_X(x) that contains an impulse at x_i, we draw a vertical arrow labeled by the constant that multiplies the impulse. We draw every such arrow at the same height because the PDF is infinite at each impulse.
• To calculate the expected value, the sifting property collapses the integral into a sum:

    E[X] = ∫_{−∞}^{+∞} x ( Σ_{all x_i} P_X(x_i) δ(x − x_i) ) dx = Σ_{all x_i} x_i P_X(x_i)
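The resulting sum is just the usual expected value of a discrete random variable. A minimal sketch (the PMF here is a hypothetical example, matching the equal-probability variable used next):

```python
def expected_value(pmf):
    """E[X] = sum of x_i * P_X(x_i); the integral against the impulse PDF
    collapses to this sum by the sifting property."""
    return sum(x * p for x, p in pmf.items())

# Hypothetical PMF: X takes the values 1, 2, 3 with equal probability.
pmf = {1: 1/3, 2: 1/3, 3: 1/3}
mean = expected_value(pmf)
```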
Families of Continuous Random Variables
• Example 1: Suppose Y takes on the values 1, 2, 3 with equal probability. The PMF and the corresponding CDF of Y are shown in the figure on the previous page:

    P_Y(y) = 1/3 for y = 1, 2, 3,   and 0 otherwise

• Using the unit step function u(y) we can write the CDF F_Y(y), and using the impulse function we can find the PDF, as follows:

    F_Y(y) = (1/3) u(y − 1) + (1/3) u(y − 2) + (1/3) u(y − 3)
    f_Y(y) = (1/3) δ(y − 1) + (1/3) δ(y − 2) + (1/3) δ(y − 3)

• Consequently, the discrete random variable Y can be represented graphically by a PMF, a CDF, or a PDF.
Families of Continuous Random Variables
• Example 2: The cumulative distribution function of random variable X is

    F_X(x) = 0 for x < −1,   (x + 1)/4 for −1 ≤ x < 1,   1 for x ≥ 1

• Sketch the CDF and find the following:
• (1) P[X ≤ 1]
• (2) P[X < 1]
• (3) P[X = 1]
• (4) the PDF f_X(x)
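One way to check answers to questions like (1)-(3) is to evaluate the CDF directly: P[X ≤ x] is F(x), P[X < x] is the left limit F(x⁻), and P[X = x] is the jump between them. A sketch for this example (the small offset used for the left limit is an illustrative device):

```python
def F(x):
    """CDF from the example: a ramp on [-1, 1) plus a point mass at x = 1."""
    if x < -1:
        return 0.0
    if x < 1:
        return (x + 1) / 4
    return 1.0

p_le_1 = F(1.0)              # P[X <= 1] = F(1)
p_lt_1 = F(1.0 - 1e-12)      # P[X < 1] = F(1^-), the left limit
p_eq_1 = p_le_1 - p_lt_1     # P[X = 1] is the jump of the CDF at x = 1
# The jump of 1/2 at x = 1 shows up in the PDF as an impulse (1/2) * delta(x - 1),
# on top of the density 1/4 on [-1, 1).
```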
Families of Continuous Random Variables
• Example 3: Observe someone dialing a telephone and record the duration of the call. In a simple model of the experiment, 1/3 of the calls never begin, either because no one answers or because the line is busy; the duration of these calls is 0 minutes. Otherwise, with probability 2/3, the call duration is uniformly distributed between 0 and 3 minutes. Let Y denote the call duration. Find the CDF, the PDF, and the expected value E[Y].
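A mixed random variable like Y is easy to simulate, which gives a quick sanity check on the analytic answers (E[Y] should equal (2/3)·1.5 = 1 minute, and the point mass at zero should have probability 1/3). The sample size and seed below are arbitrary choices:

```python
import random

random.seed(1)

def call_duration():
    """Mixed model: with probability 1/3 the call never begins (Y = 0);
    otherwise Y is uniform on (0, 3)."""
    if random.random() < 1/3:
        return 0.0
    return random.uniform(0.0, 3.0)

n = 200_000
samples = [call_duration() for _ in range(n)]
mean = sum(samples) / n                        # should be near E[Y] = 1.0
p_zero = sum(s == 0.0 for s in samples) / n    # should be near P[Y = 0] = 1/3
```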
Families of Continuous Random Variables
• Conditioning a Continuous Random Variable
• Definition: Conditional PDF Given an Event
• For a random variable X with PDF f_X(x) and an event B ⊂ S_X with P[B] > 0, the conditional PDF of X given B is

    f_{X|B}(x) = f_X(x)/P[B] for x ∈ B,   and 0 otherwise

• Theorem: Given an event space {B_i} and the conditional PDFs f_{X|B_i}(x),

    f_X(x) = Σ_{all i} f_{X|B_i}(x) P[B_i]
Families of Continuous Random Variables
• Definition: Conditional Expected Value Given an Event
• If {x ∈ B}, the conditional expected value of X is

    E[X|B] = ∫_{−∞}^{+∞} x f_{X|B}(x) dx

• To find the conditional variance:

    E[X²|B] = ∫_{−∞}^{+∞} x² f_{X|B}(x) dx

    Var[X|B] = E[X²|B] − (E[X|B])²
Families of Continuous Random Variables
• Example: Suppose the duration T (in minutes) of a telephone call is an exponential (1/3) random variable:

    f_T(t) = (1/3) e^{−t/3} for t ≥ 0,   and 0 otherwise

• For calls that last at least 2 minutes, what is the conditional PDF of the call duration? Find its expected value.
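Because the exponential distribution is memoryless, the conditional PDF given T ≥ 2 is the same exponential shape shifted to start at 2, which suggests E[T | T ≥ 2] = 2 + 3 = 5. The simulation below (sample size and seed are arbitrary) checks that value by keeping only the long calls:

```python
import random

random.seed(7)

# Sample T ~ exponential with rate 1/3 (mean 3 minutes), then keep only
# the calls lasting at least 2 minutes to estimate E[T | T >= 2].
n = 300_000
samples = [random.expovariate(1/3) for _ in range(n)]
long_calls = [t for t in samples if t >= 2.0]

cond_mean = sum(long_calls) / len(long_calls)   # should be near 2 + 3 = 5
```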
Families of Continuous Random Variables
• Example: The probability density function of random variable Y is

    f_Y(y) = 1/10 for 0 ≤ y < 10,   and 0 otherwise

• Find the following:
• (1) P[Y ≤ 6]
• (2) the conditional PDF f_{Y|Y≤6}(y)
• (3) P[Y > 8]
• (4) the conditional PDF f_{Y|Y>8}(y)
• (5) E[Y|Y ≤ 6]
• (6) E[Y|Y > 8]
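For a uniform random variable, conditioning on a sub-interval just renormalizes to a uniform density on that sub-interval, so the conditional means are interval midpoints. A sketch (the helper name is illustrative):

```python
def cond_mean_uniform(a, b, lo, hi):
    """E[Y | lo <= Y <= hi] for Y uniform on [a, b): conditioning a uniform
    variable on a sub-interval gives a uniform variable on that sub-interval,
    so the conditional mean is the midpoint."""
    lo, hi = max(a, lo), min(b, hi)
    return (lo + hi) / 2

p_le_6 = 6 / 10          # P[Y <= 6] for Y uniform on [0, 10)
p_gt_8 = (10 - 8) / 10   # P[Y > 8]
e_given_le_6 = cond_mean_uniform(0, 10, 0, 6)    # f_{Y|Y<=6} = 1/6 on [0, 6)
e_given_gt_8 = cond_mean_uniform(0, 10, 8, 10)   # f_{Y|Y>8} = 1/2 on (8, 10)
```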
Joint Random Variables

• Joint Cumulative Distribution Function


• Definition: Joint Cumulative Distribution Function (CDF)
• The joint cumulative distribution function of random variables X and Y is

    F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]
• The joint CDF is a complete probability model.
• The joint CDF has properties that are direct consequences of the definition. For
example, we note that the event {X ≤ x} suggests that Y can have any value so long
as the condition on X is met. This corresponds to the joint event {X ≤ x, Y < ∞}.
Therefore,
    F_X(x) = P[X ≤ x] = P[X ≤ x, Y < ∞] = lim_{y→∞} F_{X,Y}(x, y) = F_{X,Y}(x, ∞)
• We obtain a similar result when we consider the event {Y ≤ y}.
Joint Random Variables

• Theorem: For any pair of random variables X, Y:

1. 0 ≤ F_{X,Y}(x, y) ≤ 1,
2. F_X(x) = F_{X,Y}(x, +∞),
3. F_Y(y) = F_{X,Y}(+∞, y),
4. F_{X,Y}(−∞, y) = F_{X,Y}(x, −∞) = 0,
5. If x ≤ x₁ and y ≤ y₁, then F_{X,Y}(x, y) ≤ F_{X,Y}(x₁, y₁),
6. F_{X,Y}(+∞, +∞) = 1.
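These properties can be spot-checked on any concrete joint CDF. The sketch below uses a hypothetical joint CDF of two independent unit-exponential random variables, F(x, y) = (1 − e^{−x})(1 − e^{−y}) for x, y ≥ 0 (this particular CDF is my illustration, not from the slides), with a large constant standing in for +∞:

```python
import math

def F(x, y):
    """Hypothetical joint CDF of two independent unit-exponential variables."""
    if x < 0 or y < 0:
        return 0.0
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

BIG = 50.0   # stands in for +infinity

marginal_x = F(1.0, BIG)    # properties 2/3: marginal is the limit in the other argument
zero = F(-BIG, 1.0)         # property 4: F(-inf, y) = 0
one = F(BIG, BIG)           # property 6: F(+inf, +inf) = 1
monotone = F(0.5, 0.5) <= F(1.0, 1.0)   # property 5
```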
• Joint Probability Mass Function
• Definition: Joint Probability Mass Function (PMF)
• The joint probability mass function of discrete random variables X and Y is

    P_{X,Y}(x, y) = P[X = x, Y = y]
Joint Random Variables

• Example: Test two integrated circuits one after the other. On each test, the possible
outcomes are a (accept) and r (reject). Assume that all circuits are acceptable with
probability 0.9 and that the outcomes of successive tests are independent. Count the
number of acceptable circuits X and count the number of successful tests Y before
you observe the first reject. (If both tests are successful, let Y = 2.) Draw a tree
diagram for the experiment and find the joint PMF of X and Y.
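One way to work this example is to enumerate the four leaves of the tree directly. Under the stated model (each circuit acceptable with probability 0.9, tests independent), the sketch below builds the joint PMF; the variable names are my own:

```python
from itertools import product

P_ACCEPT = 0.9

# Enumerate the four leaves (aa, ar, ra, rr) of the tree diagram.
joint = {}   # (x, y) -> probability
for outcome in product("ar", repeat=2):
    p = 1.0
    for o in outcome:
        p *= P_ACCEPT if o == "a" else 1 - P_ACCEPT
    x = outcome.count("a")       # X: number of acceptable circuits
    y = 0                        # Y: successful tests before the first reject
    for o in outcome:
        if o == "r":
            break
        y += 1
    joint[(x, y)] = joint.get((x, y), 0.0) + p
```

The four leaves map to P(2,2) = 0.81, P(1,1) = 0.09, P(1,0) = 0.09, and P(0,0) = 0.01; note that X and Y differ only on the outcome ra, where the second circuit is acceptable but comes after the first reject.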
Joint Random Variables

• Theorem: For discrete random variables X and Y and any set B in the (X, Y) plane, the probability of the event {(X, Y) ∈ B} is

    P[B] = Σ_{(x,y)∈B} P_{X,Y}(x, y)
• Example: Continuing the previous Example, find the probability of the event B that
X, the number of acceptable circuits, equals Y, the number of tests before observing
the first failure.

• Example: The joint PMF 𝑃𝑄,𝐺 (𝑞, 𝑔) for random variables Q and G is given in the
following table:
• Calculate the following probabilities:
• (1) P[Q = 0]   (2) P[Q = G]
• (3) P[G > 1]   (4) P[G > Q]
Joint Random Variables

• Marginal PMF
• In an experiment that produces two random variables X and Y , it is always possible to
consider one of the random variables, Y , and ignore the other one, X. In this case, we can use
the methods of Chapter 2 to analyze the experiment and derive 𝑃𝑌 (𝑦), which contains the
probability model for the random variable of interest.

• On the other hand, if we have already analyzed the experiment to derive the joint PMF
𝑃𝑋,𝑌 (𝑥, 𝑦), it would be convenient to derive 𝑃𝑌 (𝑦) from 𝑃𝑋,𝑌 (𝑥, 𝑦) without reexamining the
details of the experiment.

• Theorem: For discrete random variables X and Y with joint PMF P_{X,Y}(x, y),

    P_X(x) = Σ_{y∈S_Y} P_{X,Y}(x, y),   P_Y(y) = Σ_{x∈S_X} P_{X,Y}(x, y)

• This implies that we can find P_Y(y) by summing P_{X,Y}(x, y) over all points in S_{X,Y} with the property Y = y. In that sum, y is a constant and each term corresponds to a value of x ∈ S_X. Similarly, we can find P_X(x) by summing P_{X,Y}(x, y) over all points (x, y) such that X = x.
Joint Random Variables

• Example: In the first example we found the joint PMF of X and Y from the tree diagram. Find the marginal PMFs for the random variables X and Y.
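Summing the joint PMF along each axis gives the marginals. The joint PMF values below are what the tree-diagram analysis of the circuit-testing example yields (my own working, so treat them as an assumption):

```python
# Joint PMF from the circuit-testing example, assuming the tree diagram
# gives P(0,0)=0.01, P(1,0)=0.09, P(1,1)=0.09, P(2,2)=0.81.
joint = {(0, 0): 0.01, (1, 0): 0.09, (1, 1): 0.09, (2, 2): 0.81}

def marginal(joint_pmf, axis):
    """Sum the joint PMF over the other variable (axis 0 -> P_X, axis 1 -> P_Y)."""
    out = {}
    for pair, p in joint_pmf.items():
        out[pair[axis]] = out.get(pair[axis], 0.0) + p
    return out

p_x = marginal(joint, 0)   # P_X sums each row of the joint PMF
p_y = marginal(joint, 1)   # P_Y sums each column
```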


Joint Random Variables

• The most useful probability model of a pair of continuous random variables is a generalization of the PDF of a single random variable.
• Definition: Joint Probability Density Function (PDF)
• The joint PDF of the continuous random variables X and Y is a function f_{X,Y}(x, y) with the property

    F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) du dv

• Equivalently, f_{X,Y}(x, y) is the mixed second derivative of the CDF:

    f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x∂y
Joint Random Variables

• Theorem:

    P[x₁ ≤ X ≤ x₂, y₁ ≤ Y ≤ y₂] = F_{X,Y}(x₂, y₂) − F_{X,Y}(x₂, y₁) − F_{X,Y}(x₁, y₂) + F_{X,Y}(x₁, y₁)

• Theorem: A joint PDF f_{X,Y}(x, y) has the following properties, corresponding to the first and second axioms of probability:
• f_{X,Y}(x, y) ≥ 0 for all (x, y)
• ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1

• Theorem: The probability that the continuous random variables (X, Y) are in the region A is

    P[A] = ∬_A f_{X,Y}(x, y) dx dy
Joint Random Variables

• Example: Random variables X and Y have joint PDF

    f_{X,Y}(x, y) = c for 0 ≤ x ≤ 5, 0 ≤ y ≤ 3,   and 0 otherwise

• Find the constant c and P[A] = P[2 ≤ X < 3, 1 ≤ Y < 3].
• What is P[A] = P[Y > X]?
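Since the total probability must be 1, c must satisfy c·(5·3) = 1, i.e. c = 1/15. The double integrals for the two events can then be approximated with a midpoint Riemann sum, as a numerical check (grid size is an arbitrary choice):

```python
def double_sum(f, x_rng, y_rng, n=400):
    """Midpoint double Riemann sum of f over the rectangle x_rng x y_rng."""
    (x0, x1), (y0, y1) = x_rng, y_rng
    hx, hy = (x1 - x0) / n, (y1 - y0) / n
    total = 0.0
    for i in range(n):
        x = x0 + (i + 0.5) * hx
        for j in range(n):
            y = y0 + (j + 0.5) * hy
            total += f(x, y) * hx * hy
    return total

c = 1 / 15   # normalization: c * area of the 5-by-3 rectangle = 1
f = lambda x, y: c if (0 <= x <= 5 and 0 <= y <= 3) else 0.0

total = double_sum(f, (0, 5), (0, 3))                 # total probability, ~1
p_rect = double_sum(f, (2, 3), (1, 3))                # ~ (1 * 2) / 15 = 2/15
p_y_gt_x = double_sum(lambda x, y: f(x, y) if y > x else 0.0,
                      (0, 5), (0, 3))                 # ~ 4.5 / 15 = 0.3
```

For the triangular event, the exact answer comes from the area of {y > x} inside the rectangle: ∫₀³ (3 − x) dx = 4.5, so P[Y > X] = 4.5/15 = 0.3.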
Joint Random Variables

• Marginal PDF
• As with the marginal PMF, we sometimes need to find the marginal PDF f_X(x) or f_Y(y) from the joint PDF f_{X,Y}(x, y). We refer to f_X(x) and f_Y(y) as the marginal probability density functions of f_{X,Y}(x, y).

• If X and Y are random variables with joint PDF f_{X,Y}(x, y),

    f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy,   f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx
Joint Random Variables

• Example: The joint PDF of X and Y is

    f_{X,Y}(x, y) = 5y/4 for −1 ≤ x ≤ 1, x² ≤ y ≤ 1,   and 0 otherwise

• Find the marginal PDFs f_X(x) and f_Y(y).
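Integrating out y for fixed x gives f_X(x) = ∫_{x²}^{1} (5y/4) dy = (5/8)(1 − x⁴) for |x| ≤ 1, and integrating out x for fixed y gives f_Y(y) = ∫_{−√y}^{√y} (5y/4) dx = (5/2) y^{3/2} for 0 ≤ y ≤ 1. The sketch below checks these closed forms numerically at sample points (grid size is an arbitrary choice):

```python
def f_joint(x, y):
    """Joint PDF from the example: 5y/4 on -1 <= x <= 1, x**2 <= y <= 1."""
    return 5 * y / 4 if (-1 <= x <= 1 and x * x <= y <= 1) else 0.0

def integrate(g, a, b, n=20000):
    """Midpoint Riemann sum of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) * h for k in range(n))

# Numeric marginals at sample points, compared with the closed forms
# f_X(x) = (5/8)(1 - x**4) and f_Y(y) = (5/2) * y**1.5.
fx_half = integrate(lambda y: f_joint(0.5, y), -2, 2)      # marginal of X at x = 0.5
fy_quarter = integrate(lambda x: f_joint(x, 0.25), -2, 2)  # marginal of Y at y = 0.25
```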
