Lecture No 37 - November 13, 2023

1) The document summarizes key concepts related to continuous random variables, including joint and marginal probability density functions (PDFs), conditional PDFs, independence, and the total probability and expectation theorems.
2) It provides examples of calculating the joint PDF, marginal PDFs, conditional PDF, and expected value given another random variable, and of using the total expectation theorem for a uniform distribution over a triangle.
3) It reviews important rules for working with continuous random variables, including how to calculate probabilities, densities, expectations, and independence.


EE 354/CE 361/MATH 310 L1 section (Fall 2023) –
Introduction to Probability and Statistics

Aamir Hasan

Week 13 – Lecture No 37 – 13th November 2023


Announcements!
➢ Quiz No 07 on November 22, 2023 – Good Luck

➢ Quiz No 08 on December 06, 2023 – Good Luck

➢ Practice Class will be on December 08, 2023

Agenda for today

• Unit 5: Continuous Random Variables (continued)
  • Joint PDFs & Marginal PDFs – completed
More than two random variables

Discrete: joint PMF p_{X,Y,Z}(x, y, z).   Continuous: joint PDF f_{X,Y,Z}(x, y, z).

Σ_x Σ_y Σ_z p_{X,Y,Z}(x, y, z) = 1

Marginals are obtained by summing out the unwanted variables; in the continuous case the sums become integrals and the PMFs become PDFs:

p_X(x) = Σ_y Σ_z p_{X,Y,Z}(x, y, z)

p_{X,Y}(x, y) = Σ_z p_{X,Y,Z}(x, y, z)
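The marginalization rules above can be checked numerically. This is a minimal sketch with an arbitrary toy PMF (the grid shape and values are assumptions, not from the lecture): summing out an axis of the joint gives a marginal, and every marginal still sums to 1.

```python
import numpy as np

# A toy joint PMF p_{X,Y,Z} on a 2x3x4 grid; the values are arbitrary,
# normalized so the triple sum over x, y, z equals 1.
rng = np.random.default_rng(0)
p_XYZ = rng.random((2, 3, 4))
p_XYZ /= p_XYZ.sum()

# Marginals: sum out the unwanted variables.
p_X = p_XYZ.sum(axis=(1, 2))   # p_X(x)     = sum_y sum_z p_{X,Y,Z}(x, y, z)
p_XY = p_XYZ.sum(axis=2)       # p_{X,Y}(x,y) = sum_z p_{X,Y,Z}(x, y, z)

# Every marginal must itself sum to 1.
print(round(p_X.sum(), 9), round(p_XY.sum(), 9))   # → 1.0 1.0
```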
Functions of multiple random variables

Z = g(X, Y)

Expected value rule:

E[g(X, Y)] = Σ_x Σ_y g(x, y) p_{X,Y}(x, y)          (discrete)

E[g(X, Y)] = ∫∫ g(x, y) f_{X,Y}(x, y) dx dy         (continuous)

Linearity of expectations:

E[aX + b] = a E[X] + b
E[X + Y] = E[X] + E[Y]
E[X_1 + … + X_n] = E[X_1] + … + E[X_n]
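A quick illustration of linearity, using arbitrary simulated data (the choice Y = X² is an assumption for the sketch): linearity of expectation needs no independence, so it holds even when Y is a deterministic function of X.

```python
import numpy as np

# E[X + Y] = E[X] + E[Y] holds even for dependent variables:
# here Y = X**2 is a deterministic function of X.
rng = np.random.default_rng(1)
X = rng.random(100_000)
Y = X ** 2                      # strongly dependent on X

lhs = np.mean(X + Y)            # E[X + Y], estimated from samples
rhs = np.mean(X) + np.mean(Y)   # E[X] + E[Y]
print(abs(lhs - rhs) < 1e-9)    # → True
```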
The joint CDF

F_X(x) = P(X ≤ x) = ∫_{−∞}^{x} f_X(t) dt,        f_X(x) = dF_X(x)/dx

F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_{X,Y}(s, t) ds dt

f_{X,Y}(x, y) = ∂²F_{X,Y}/∂x∂y (x, y)

Example (uniform on the unit square): if f_{X,Y}(x, y) = 1 for 0 < x < 1 and 0 < y < 1 (and 0 otherwise), then F_{X,Y}(x, y) = xy on that square.
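The unit-square example can be checked by simulation. A sketch (the sample size and tolerance are loose Monte Carlo choices): for independent Uniform(0,1) coordinates, the empirical joint CDF should be close to xy.

```python
import numpy as np

# Empirical joint CDF for X, Y i.i.d. Uniform(0,1); theory says
# F_{X,Y}(x, y) = P(X <= x, Y <= y) = x * y on the unit square.
rng = np.random.default_rng(2)
X = rng.random(200_000)
Y = rng.random(200_000)

x0, y0 = 0.4, 0.7
F_hat = np.mean((X <= x0) & (Y <= y0))   # empirical CDF at (0.4, 0.7)
print(abs(F_hat - x0 * y0) < 0.01)       # → True
```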
Conditional PDFs, given another r.v.

Discrete: p_{X|Y}(x|y) = P(X = x | Y = y) = p_{X,Y}(x, y) / p_Y(y),  defined when p_Y(y) > 0.

The continuous analogues of p_{X,Y}(x, y), p_{X|A}(x), p_{X|Y}(x|y) are f_{X,Y}(x, y), f_{X|A}(x), f_{X|Y}(x|y).

Continuous: f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y),  defined when f_Y(y) > 0.

Recall that P(x ≤ X ≤ x + δ | A) ≈ f_{X|A}(x) δ, which requires P(A) > 0. For the event Y = y we have P(Y = y) = 0, so condition on the small interval y ≤ Y ≤ y + Δ instead:

P(x ≤ X ≤ x + δ | y ≤ Y ≤ y + Δ) ≈ f_{X,Y}(x, y) δ Δ / (f_Y(y) Δ) = f_{X|Y}(x|y) δ
Conditional PDFs, given another r.v.

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y)

P(X ∈ B | Y = y) = ∫_B f_{X|Y}(x|y) dx

• Some comments:

  • f_{X|Y}(x|y) ≥ 0

  • Think of the value of Y as fixed at y: the shape of f_{X|Y}(·|y) is a slice of the joint PDF, renormalized so that it integrates to 1.

  • ∫_{−∞}^{∞} f_{X|Y}(x|y) dx = (∫_{−∞}^{∞} f_{X,Y}(x, y) dx) / f_Y(y) = f_Y(y) / f_Y(y) = 1

  • Multiplication rule:
    f_{X,Y}(x, y) = f_{X|Y}(x|y) f_Y(y) = f_{Y|X}(y|x) f_X(x)
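The "slice of the joint" comment can be illustrated numerically. As an assumed example (not from the lecture), take f_{X,Y}(x, y) = x + y on the unit square, which is a valid joint PDF; each horizontal slice, divided by the marginal f_Y(y) = y + 1/2, integrates to 1.

```python
import numpy as np

# Assumed joint density f_{X,Y}(x, y) = x + y on the unit square
# (it integrates to 1). Each renormalized slice f_{X|Y}(.|y) must
# itself integrate to 1.
def f_XY(x, y):
    return x + y

n = 100_000
dx = 1.0 / n
x = (np.arange(n) + 0.5) * dx            # midpoints of [0, 1]

for y in (0.1, 0.5, 0.9):
    f_Y = f_XY(x, y).sum() * dx          # midpoint rule: f_Y(y) = y + 1/2
    cond = f_XY(x, y) / f_Y              # f_{X|Y}(x | y), the renormalized slice
    total = cond.sum() * dx              # should equal 1 for every y
    print(round(f_Y, 6), round(total, 6))
```

For y = 0.1, 0.5, 0.9 this prints marginal values 0.6, 1.0, 1.4 alongside a slice integral of 1.0 each time.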
Expected Value Rule

E[g(X) | Y = y] = Σ_x g(x) p_{X|Y}(x|y)                 (discrete)

E[g(X) | Y = y] = ∫_{−∞}^{∞} g(x) f_{X|Y}(x|y) dx       (continuous)
Independence

X and Y are independent when the joint factors into the marginals:

p_{X,Y}(x, y) = p_X(x) p_Y(y)   for all x, y            (discrete)

f_{X,Y}(x, y) = f_X(x) f_Y(y)   for all x, y            (continuous)

Since f_{X,Y}(x, y) = f_{X|Y}(x|y) f_Y(y) is always true, independence is equivalent to conditioning having no effect:

f_X(x) = f_{X|Y}(x|y)   for all y with f_Y(y) > 0 and all x

f_Y(y) = f_{Y|X}(y|x)   for all x with f_X(x) > 0 and all y
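The factorization condition can be tested empirically on a histogram. A sketch (bin count and tolerance are arbitrary choices): for independent uniforms, the empirical joint cell probabilities should approximately equal the product of the marginal cell probabilities.

```python
import numpy as np

# For independent X, Y (i.i.d. Uniform(0,1)), the joint cell probabilities
# of a 2-D histogram should factor into the product of the marginal cell
# probabilities, mirroring f_{X,Y}(x, y) = f_X(x) f_Y(y).
rng = np.random.default_rng(3)
X = rng.random(500_000)
Y = rng.random(500_000)

H, _, _ = np.histogram2d(X, Y, bins=4, range=[[0, 1], [0, 1]])
joint = H / H.sum()                                     # empirical joint probabilities
prod = np.outer(joint.sum(axis=1), joint.sum(axis=0))   # product of marginals
print(np.max(np.abs(joint - prod)) < 0.005)             # → True
```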


Total probability & expectation theorem

Total probability:
p_X(x) = Σ_y p_Y(y) p_{X|Y}(x|y)                 f_X(x) = ∫_{−∞}^{∞} f_Y(y) f_{X|Y}(x|y) dy

Conditional expectation:
E[X | Y = y] = Σ_x x p_{X|Y}(x|y)                E[X | Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx

Total expectation:
E[X] = Σ_y p_Y(y) E[X | Y = y]                   E[X] = ∫_{−∞}^{∞} f_Y(y) E[X | Y = y] dy
How do you prove the last statement?

Think – Discuss – Prove

Proof

Start from the right-hand side and substitute the expected value rule:

∫_{−∞}^{∞} f_Y(y) E[X | Y = y] dy = ∫_{−∞}^{∞} f_Y(y) ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx dy

= ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f_Y(y) f_{X|Y}(x|y) dy dx        (swap the order of integration)

= ∫_{−∞}^{∞} x ∫_{−∞}^{∞} f_Y(y) f_{X|Y}(x|y) dy dx        (x is constant in the inner integral)

= ∫_{−∞}^{∞} x f_X(x) dx = E[X]                             (total probability theorem)
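The total expectation theorem can also be sanity-checked by simulation. This sketch uses an assumed hierarchical model (not from the lecture): Y ~ Uniform(0, 1) and X | Y = y ~ Uniform(0, y), so E[X | Y = y] = y/2 and the theorem gives E[X] = E[Y]/2 = 1/4.

```python
import numpy as np

# Assumed model: Y ~ Uniform(0, 1) and X | Y = y ~ Uniform(0, y).
# Then E[X | Y = y] = y / 2, and the total expectation theorem gives
# E[X] = integral of f_Y(y) * (y / 2) dy = E[Y] / 2 = 1/4.
rng = np.random.default_rng(4)
Y = rng.random(1_000_000)
X = rng.random(1_000_000) * Y         # draw X | Y = y from Uniform(0, y)

print(abs(X.mean() - 0.25) < 0.002)   # → True
```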
Example

(X, Y) is uniform over the triangle with vertices (0, 0), (1, 0), (0, 1), i.e. the region bounded by the axes and the line x + y = 1. Find:

a) f_{X,Y}(x, y)

The triangle has area 1/2, so the uniform joint PDF is

f_{X,Y}(x, y) = 2   if x > 0, y > 0, x + y < 1;   0 otherwise

b) f_Y(y)

f_Y(y) = ∫_0^{1−y} f_{X,Y}(x, y) dx = 2(1 − y)   for 0 ≤ y ≤ 1;   0 otherwise

(The marginal decreases linearly from f_Y(0) = 2 down to f_Y(1) = 0.)
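The marginal f_Y(y) = 2(1 − y) can be verified by simulation. A sketch (bin width and tolerance are loose Monte Carlo choices): rejection-sample the triangle from the unit square, then estimate the density of Y near one point.

```python
import numpy as np

# Rejection-sample the uniform distribution on the triangle
# {x > 0, y > 0, x + y < 1} and check the marginal f_Y(y) = 2(1 - y)
# by estimating the density of Y in a thin bin around y0 = 0.25.
rng = np.random.default_rng(5)
pts = rng.random((2_000_000, 2))          # uniform on the unit square
x, y = pts[:, 0], pts[:, 1]
y = y[x + y < 1]                          # keep points inside the triangle

y0, h = 0.25, 0.01
f_hat = np.mean((y0 <= y) & (y < y0 + h)) / h     # empirical density near y0
print(abs(f_hat - 2 * (1 - y0 - h / 2)) < 0.05)   # → True
```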
Example (continued)

c) f_{X|Y}(x|y)

For 0 ≤ y < 1 and 0 ≤ x ≤ 1 − y:

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = 2 / (2(1 − y)) = 1/(1 − y)

So f_{X|Y}(x|y) = 1/(1 − y) for 0 ≤ x ≤ 1 − y (and 0 otherwise): given Y = y, X is uniform on [0, 1 − y].
Example (continued)

d) E[X | Y = y]

E[X | Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx = ∫_0^{1−y} x · (1/(1 − y)) dx

= (1/(1 − y)) · (1 − y)²/2

= (1 − y)/2   for 0 ≤ y < 1

(As expected: given Y = y, X is uniform on [0, 1 − y], so its mean is the midpoint of that interval.)
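The conditional mean (1 − y)/2 can also be checked by simulation. A sketch (since P(Y = y) = 0, we condition on a thin band around y0, echoing the Y ≈ y idea from the conditional-PDF slide):

```python
import numpy as np

# Check E[X | Y = y] = (1 - y)/2 for the uniform triangle: take samples
# whose Y lands in a thin band around y0 and average their X values.
rng = np.random.default_rng(6)
pts = rng.random((2_000_000, 2))
x, y = pts[:, 0], pts[:, 1]
inside = x + y < 1
x, y = x[inside], y[inside]               # uniform points on the triangle

y0 = 0.4
band = (y0 - 0.005 < y) & (y < y0 + 0.005)         # Y ≈ y0
print(abs(x[band].mean() - (1 - y0) / 2) < 0.01)   # → True
```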
Example (continued)

e) E[X]

By the total expectation theorem:

E[X] = ∫_{−∞}^{∞} f_Y(y) E[X | Y = y] dy = ∫_{−∞}^{∞} f_Y(y) (1 − y)/2 dy

= (1/2) ∫_{−∞}^{∞} f_Y(y) dy − (1/2) ∫_{−∞}^{∞} y f_Y(y) dy

= 1/2 − (1/2) E[Y]

By the symmetry of the triangle about the line x = y, E[Y] = E[X], so

E[X] = 1/2 − (1/2) E[X]
(1 + 1/2) E[X] = 1/2
E[X] = E[Y] = 1/3
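The final answer can be confirmed by a direct Monte Carlo estimate over the triangle (a sketch; sample size and tolerance are arbitrary):

```python
import numpy as np

# Monte Carlo check of E[X] = E[Y] = 1/3 for the uniform distribution
# on the triangle {x > 0, y > 0, x + y < 1}.
rng = np.random.default_rng(7)
pts = rng.random((3_000_000, 2))
x, y = pts[:, 0], pts[:, 1]
inside = x + y < 1                        # rejection step: keep triangle points
print(abs(x[inside].mean() - 1/3) < 0.002,
      abs(y[inside].mean() - 1/3) < 0.002)   # → True True
```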
