Lecture04 Ch 04 ContinuousDistributions Baron Inf Stats FA24

Probability and Statistics for Computer

Scientists
Third Edition, By Michael Baron

Chapter 4:
Continuous Distributions
Continuous Random Variables
• A continuous random variable can take any value in an
interval, open or closed, so it has uncountably many values
 Examples: the height or weight of a chair
• For such a variable X, the probability assigned to any exact
value, P(X = a), is always 0, though the probability that it
falls into an interval [a, b], that is, P(a ≤ X ≤ b), can be
positive
• Also, the CDF of such a variable is a continuous function,
given as F(x) = P(X ≤ x)
Quiz 3.1: Yates and Goodman
Quiz 3.1: Solution
Probability Density Function
One way to get P(a ≤ X ≤ b) is to integrate the
probability density function of X over [a, b]
Probability Density Function (2)

P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a < X < b) = P(a ≤ X < b)


Probability Density Function (3)
Cumulative Distribution Function
F(b) is still defined as P(X ≤ b), or P(-∞ < X ≤ b)
CDF (2)
Another way to get P(a ≤ X ≤ b):
P(a < X ≤ b) = P(X ≤ b) – P(X ≤ a) = F(b) – F(a)
A discrete random variable has no pdf f(x), and a
continuous random variable has no useful pmf
p(x), but both have a cdf F(x)
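The two routes to P(a ≤ X ≤ b) — integrating the pdf, or differencing the cdf — can be checked against each other numerically. A minimal sketch, assuming an Exp(2) variable and the interval [0.5, 1.5] (both illustrative choices, not from the slides):

```python
import math

lam = 2.0  # rate of an illustrative Exp(2) variable

def f(x):
    """pdf of Exp(lam): f(x) = lam * e^(-lam*x) for x >= 0."""
    return lam * math.exp(-lam * x)

def F(x):
    """cdf of Exp(lam): F(x) = 1 - e^(-lam*x) for x >= 0."""
    return 1 - math.exp(-lam * x)

a, b = 0.5, 1.5  # illustrative interval

# Route 1: integrate the pdf over [a, b] (midpoint Riemann sum)
n = 100_000
h = (b - a) / n
integral = sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Route 2: difference of cdf values
cdf_diff = F(b) - F(a)

print(integral, cdf_diff)  # both ≈ 0.3181
```

Both routes agree to many decimal places, which is the point of the slide: for a continuous variable the pdf and cdf carry the same information.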
              pmf p / pdf f                          cdf F
Discrete      p(a) = P(X = a)                        F(a) = P(X ≤ a)
Continuous    f(a) ≈ P(a − ε < X < a + ε) / (2ε)     F(a) = P(X ≤ a)

PMF p(x) versus PDF f(x)
Review: Derivative and Integral
• Derivatives of elementary functions
power, exponential and logarithmic functions
• Rules for finding the derivative
Combined functions
• Integral as antiderivative
In particular, for the power function x^t:
∫_a^b x^t dx = (b^(t+1) − a^(t+1)) / (t + 1) when t ≠ −1
∫_a^b x^(−1) dx = ∫_a^b (1/x) dx = ln(b) − ln(a)
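The closed-form antiderivative can be verified against a midpoint Riemann sum; a quick sketch (the limits 1 and 2 and the exponents are illustrative choices):

```python
import math

def power_integral(a, b, t):
    """Closed form of the integral of x**t from a to b (0 < a < b)."""
    if t == -1:
        return math.log(b) - math.log(a)
    return (b**(t + 1) - a**(t + 1)) / (t + 1)

def riemann(a, b, t, n=200_000):
    """Midpoint Riemann sum approximation of the same integral."""
    h = (b - a) / n
    return sum((a + (i + 0.5) * h)**t for i in range(n)) * h

print(power_integral(1, 2, 3))   # (2^4 - 1^4) / 4 = 3.75
print(power_integral(1, 2, -1))  # ln 2 ≈ 0.6931
```

The numeric sum matches the closed form in both the t ≠ −1 and t = −1 cases.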
Example 4.1 (1): M. Baron

TRY!
Example 4.1 (1): M. Baron
Example 4.1 (2)
Quiz 3.2: Yates and Goodman

Try yourself!
Joint distributions: Continuous
Joint distributions: Continuous (2)
PMF p(x) versus PDF f(x): Joint
Axioms: Joint distribution
Ex 4.4: Yates and Goodman

TRY!
Ex 4.4: Solution
Ex 4.6: Yates and Goodman
Ex 4.6: Solution
Ex 4.7: Yates and Goodman
Expectation of Continuous Variable
p(x) vs. f(x): E[X] and Var(X)
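For a continuous variable, E[X] = ∫ x f(x) dx and Var(X) = ∫ (x − μ)² f(x) dx, the integral analogues of the discrete sums. A numeric sketch for an illustrative U(2, 5) variable, whose answers are known in closed form:

```python
# Numeric check of E[X] = ∫ x f(x) dx and Var(X) = ∫ (x-μ)² f(x) dx
# for U(2, 5); the parameter values are an illustrative choice.
alpha, beta = 2.0, 5.0
f_val = 1 / (beta - alpha)  # constant density on [α, β]

n = 100_000
h = (beta - alpha) / n
xs = [alpha + (i + 0.5) * h for i in range(n)]  # midpoints

mean = sum(x * f_val for x in xs) * h
var = sum((x - mean)**2 * f_val for x in xs) * h

print(mean, var)  # ≈ 3.5 and ≈ 0.75, i.e. (α+β)/2 and (β−α)²/12
```

The results match the closed-form uniform moments listed later in the summary table.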
Example 4.2
Uniform Distribution

The CDF of a random variable that has a U(α, β)


distribution is given by
F(x) = 0 if x < α
F(x) = (x − α) / (β − α) if α ≤ x ≤ β
F(x) = 1 if x > β
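The piecewise CDF above translates directly to code; the parameter and test values below are illustrative:

```python
def uniform_cdf(x, alpha, beta):
    """CDF of U(alpha, beta), piecewise as on the slide."""
    if x < alpha:
        return 0.0
    if x > beta:
        return 1.0
    return (x - alpha) / (beta - alpha)

print(uniform_cdf(1.0, 2, 5))  # 0.0  (below the support)
print(uniform_cdf(3.5, 2, 5))  # 0.5  (midpoint of [2, 5])
print(uniform_cdf(6.0, 2, 5))  # 1.0  (above the support)
```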
Uniform Distribution (2)
Uniform Distribution (3)
U(0, 1) is called the Standard Uniform distribution
(the uniform distribution with α = 0 and β = 1)
Its density is f(x) = 1 for 0 < x < 1
If X is U(α, β), then Y = (X − α) / (β − α) is U(0, 1)
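The standardizing transform can be sanity-checked by simulation; a sketch with an illustrative U(3, 7) variable and a fixed seed for reproducibility:

```python
import random

random.seed(0)      # fixed seed so the check is reproducible
a, b = 3.0, 7.0     # illustrative U(a, b) parameters

xs = [random.uniform(a, b) for _ in range(100_000)]
ys = [(x - a) / (b - a) for x in xs]  # should behave like U(0, 1)

print(min(ys) >= 0 and max(ys) <= 1)  # True: values land in [0, 1]
print(sum(ys) / len(ys))              # ≈ 0.5, the U(0, 1) mean
```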
Exponential Distribution
When the number of events is Poisson, the time
between events is exponential

E[X] = 1 / λ, Var(X) = 1 / λ2
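The mean and variance formulas can be checked by simulating Exp(λ) with the inverse-transform method: if U ~ U(0, 1), then −ln(1 − U)/λ ~ Exp(λ). The rate λ = 0.5 below is an illustrative choice:

```python
import math
import random

random.seed(1)
lam = 0.5  # illustrative rate: mean inter-event time 1/λ = 2

# Inverse-transform sampling: -ln(1 - U)/λ ~ Exp(λ) for U ~ U(0, 1)
samples = [-math.log(1.0 - random.random()) / lam
           for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean)**2 for s in samples) / len(samples)
print(mean, var)  # ≈ 2 and ≈ 4, i.e. 1/λ and 1/λ²
```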
Exponential Distribution (2)
Exponential Distribution (3)
PDF and CDF
Example 3.12: Yates, Goodman

Recall: f_T(t) = dF_T(t)/dt and P(2 ≤ T ≤ 4) = F_T(4) − F_T(2)
Example 3.12: Solution
Gamma Distribution
When a process consists of α independent
steps, and each step takes an Exp(λ) amount of
time, the total time has a Gamma(α, λ)
distribution; in general, the Gamma family also
allows non-integer values of α
Gamma Distribution (2)
• The Gamma distribution is widely used for the total time
of a multistage scheme, e.g., one related to downloading or
installing a number of files.
• In a process of rare events, with Exponential times
between any two consecutive events, the time of the α-th
event has Gamma distribution because it consists of α
independent Exponential times.
• Besides the case when a Gamma variable represents a
sum of independent Exponential variables, Gamma
distribution is often used for the amount of money being
paid, amount of a commodity being used (gas, electricity,
etc.), a loss incurred by some accident, etc.
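The sum-of-exponentials view gives a direct way to simulate a Gamma(α, λ) variable for integer α; a sketch with illustrative α = 3, λ = 2 and a fixed seed:

```python
import math
import random

random.seed(2)
alpha_steps, lam = 3, 2.0  # illustrative: 3 independent Exp(2) stages

def gamma_sample():
    """One Gamma(3, 2) draw as a sum of 3 Exp(2) stage times."""
    return sum(-math.log(1.0 - random.random()) / lam
               for _ in range(alpha_steps))

samples = [gamma_sample() for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean)**2 for s in samples) / len(samples)
print(mean, var)  # ≈ α/λ = 1.5 and ≈ α/λ² = 0.75
```

The simulated moments match the Gamma formulas in the summary table, as expected when the variable really is a sum of α independent Exp(λ) times.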
Gamma Distribution (3)
Normal Distribution
The Normal (Gaussian) distribution N(μ, σ²) is often
used as a model for physical variables like weight,
height, temperature, or examination grade.
Normal Distribution (2)
Normal Distribution (3)
Bin(n, p) ≈ N(np, np(1 − p)) when n is large and
p is moderate.
N(0, 1) is called the Standard Normal distribution;
its density is denoted φ(x) and its CDF Φ(x).
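The Binomial-to-Normal approximation can be checked directly, using Φ(z) = (1 + erf(z/√2))/2; the parameters n = 100, p = 0.4 and cutoff k = 45 below are illustrative, and a continuity correction (k + 0.5) is applied:

```python
import math

def phi_cdf(z):
    """Standard Normal CDF Φ(z) via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, p = 100, 0.4  # illustrative Bin(100, 0.4)
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

k = 45
# Exact P(X ≤ 45) from the Binomial pmf
exact = sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
            for i in range(k + 1))
# Normal approximation with continuity correction
approx = phi_cdf((k + 0.5 - mu) / sigma)

print(exact, approx)  # both ≈ 0.87
```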
Central Limit Theorem
• The Central Limit Theorem (CLT) states that, in most
situations, when many independent random variables
of the same type are added, their properly normalized
sum tends towards a Normal distribution, even if the
original variables themselves are not normally
distributed
• The theorem is very powerful because it applies to
RVs X1, X2, … with virtually any distribution that has
finite expectation and variance
• As long as n is large (a common rule of thumb is
n > 30), one can use the Normal distribution to
compute probabilities about the sum Sn
Central Limit Theorem (2)
Let X1, X2, … be independent random variables
with the same expectation μ = E(Xi) and the
same standard deviation σ = Std(Xi), and let
Sn = X1 + … + Xn.

As n → ∞, the standardized sum

Zn = (Sn − nμ) / (σ√n)

converges in distribution to a Standard Normal
random variable, that is, P(Zn ≤ z) → Φ(z) for all z
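The convergence can be watched empirically; a sketch summing n = 50 standard uniform variables per trial (an illustrative choice, satisfying n > 30) and checking two Standard Normal probabilities:

```python
import math
import random

random.seed(3)
n, trials = 50, 20_000  # n summands per sum, number of sums

mu, sigma = 0.5, 1 / math.sqrt(12)  # mean and std of U(0, 1)

def standardized_sum():
    """One draw of Zn = (Sn - nμ) / (σ√n) with uniform summands."""
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

zs = [standardized_sum() for _ in range(trials)]

# For a Standard Normal, P(Z ≤ 0) = 0.5 and P(|Z| ≤ 1) ≈ 0.683
frac_neg = sum(z <= 0 for z in zs) / trials
frac_1sd = sum(abs(z) <= 1 for z in zs) / trials
print(frac_neg, frac_1sd)  # ≈ 0.5 and ≈ 0.683
```

Even though each summand is uniform, the standardized sums already behave like Standard Normal draws.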
Central Limit Theorem (3)
Summary
1. Continuous Random Variable X
P(a ≤ X ≤ b) from fX and FX
2. Random Vector (X, Y)
f(X,Y) and F(X,Y), Independence
3. Features
E[X], Var(X)
4. Families
U(α, β), Exp(λ), Gamma(α, λ), N(μ, σ²)
Summary (2)

FAMILY        f(x)                                       E[X]          Var(X)

U(α, β)       1 / (β − α) for α ≤ x ≤ β                  (α + β) / 2   (β − α)² / 12

Exp(λ)        λe^(−λx) for x ≥ 0                         1 / λ         1 / λ²

Gamma(α, λ)   λ^α x^(α−1) e^(−λx) / Γ(α) for x ≥ 0       α / λ         α / λ²

N(μ, σ²)      e^(−(x−μ)²/(2σ²)) / (σ√(2π))               μ             σ²
