
Stat 230 - Summer 2022 American University of Beirut Karakazian

Poisson, Exponential & Gamma Distributions

Summary:
a) The Poisson distribution gives the pmf for the number of occurrences (of a certain event) in a
given time interval of length t, where the average rate of occurrences is α per unit of time.

b) The Exponential distribution gives the pdf for the time until the first occurrence, where the av-
erage rate of occurrences is α per unit of time (which is λ in our textbook).

c) The Gamma distribution gives the pdf for the time until the αth occurrence, where the average
rate of occurrence is 1/β per unit of time.

1. Poisson Processes
Consider an event (such as a car passing or a customer arriving) that occurs μ > 0 times on average during a time interval of fixed length I. We showed in Section 3.6 that the random variable

X = # of times this event might occur in a given time interval of same length I
is such that
X ∼ Pois(μ), i.e. ℙ(X = k) = μ^k e^(−μ) / k!  for k = 0, 1, 2, ...
In general, we may want to know how many times this event might occur in a given time interval
of length t which is not necessarily equal to I. For this purpose, we introduce a new parameter

α = average rate of occurrence of this event per unit of time,


and set
μ = αt = rate of occurrence of this event per t units of time.
Thus the random variable

Xt = # of times this event might occur in a given time interval of the length t

is such that
Xt ∼ Pois(αt), i.e. ℙ(Xt = k) = (αt)^k e^(−αt) / k!  for k = 0, 1, 2, ...
In this context, Xt is called the Poisson random variable with parameter μ = αt. The occurrence of events over time is called a Poisson process, and the parameter α is the rate of the process. In a Poisson process, we assume that the number of events that might occur during a given time interval is independent of the number that have already occurred prior to this time interval. This assumption was used in the form of independent Bernoulli trials when we derived the Poisson distribution in Section 3.6 as a limit of the Binomial distribution. FYI, there is a whole field of mathematics dedicated to Stochastic Processes.

Example 1: Suppose customers arrive at an average rate of 15 per hour. If the unit of time is minutes, then α = 15/60 = 0.25 customers/min. What's the chance that in a given 3-minute interval (i.e. t = 3 min), at least two customers might arrive?

Answer: ℙ(X3 ≥ 2) = 1 − ℙ(X3 < 2)
                  = 1 − ℙ(X3 = 1) − ℙ(X3 = 0)
                  = 1 − 0.75 e^(−0.75) − e^(−0.75)
                  ≈ 0.17336
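This arithmetic is easy to verify numerically. Below is a minimal Python sketch; the helper name `poisson_pmf` is our own, not from the handout.

```python
import math

def poisson_pmf(k, mu):
    """P(X = k) for X ~ Pois(mu)."""
    return mu**k * math.exp(-mu) / math.factorial(k)

# Example 1: alpha = 0.25 customers/min and t = 3 min, so mu = alpha * t = 0.75.
mu = 0.25 * 3
# P(X3 >= 2) = 1 - P(X3 = 0) - P(X3 = 1)
prob_at_least_two = 1 - poisson_pmf(0, mu) - poisson_pmf(1, mu)
```

Running this reproduces the value ≈ 0.17336 obtained above.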

2. The Exponential Distribution


Suppose now, in the context of a Poisson process, instead of focusing on the number of occurrences of some specified event in a given time interval, we focus on the (waiting) time T until the first occurrence, i.e. we focus on the non-negative continuous random variable

T = (waiting) time until the first occurrence of that event

To calculate the pdf of T, we first calculate its cdf and then differentiate it.

Observe that for any t ≥ 0, the cdf of T is the probability that the next event occurs before time t and is given by

F(t; α) = ℙ(T ≤ t) = 1 − ℙ(T > t)
        = 1 − ℙ(no event in (0, t])
        = 1 − ℙ(Xt = 0)              (using the Poisson distribution)
        = 1 − (αt)^0 e^(−αt) / 0!
        = 1 − e^(−αt)

Useful Fact: The probability that the next event occurs after time t is ℙ(T > t) = e^(−αt).

and so the pdf of T is given by

f (t; α) = { α e^(−αt)   if t ≥ 0
           { 0           otherwise

We call this the Exponential distribution with parameter α, and write

T ∼ Exp(α)
Using integration by parts, it can be shown that

E(T ) = ∫₀^∞ t α e^(−αt) dt = 1/α   and   V(T ) = 1/α²
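These moments can be sanity-checked by simulation with Python's standard library; the sample size and seed below are our arbitrary choices, not from the handout.

```python
import random

# Monte Carlo check that E(T) = 1/alpha and V(T) = 1/alpha^2 for T ~ Exp(alpha).
random.seed(0)
alpha = 0.25
samples = [random.expovariate(alpha) for _ in range(200_000)]

mean_T = sum(samples) / len(samples)                       # should be near 1/alpha = 4
var_T = sum((x - mean_T) ** 2 for x in samples) / len(samples)  # near 1/alpha^2 = 16
```

With α = 0.25 as in Example 1, the sample mean lands near 4 minutes and the sample variance near 16.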

Example 2: In the context of Example 1 above (where α = 0.25 customers/min), we expect that the next customer will arrive after 1/α = 4 min on average. This makes sense: customers arrive at an average rate of 15 per hour, and there are fifteen 4-min intervals in an hour.

a) What’s the chance that the next customer might arrive at least after a minute?
Answer: ℙ(T > 1) = e^(−0.25×1) = e^(−0.25)   (by the 'Useful Fact')

b) What’s the chance that the next customer might arrive within 3 to 5 minutes?
Answer: ℙ(3 ≤ T ≤ 5) = ℙ(T > 3) − ℙ(T > 5) = e^(−0.25×3) − e^(−0.25×5) = e^(−0.75) − e^(−1.25)
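Both answers reduce to evaluating the survival function e^(−αt); a short Python check (the helper name `exp_survival` is ours):

```python
import math

def exp_survival(t, alpha):
    """P(T > t) for T ~ Exp(alpha), i.e. the 'Useful Fact' e^(-alpha*t)."""
    return math.exp(-alpha * t)

alpha = 0.25
prob_after_one_min = exp_survival(1, alpha)                       # part a)
prob_between = exp_survival(3, alpha) - exp_survival(5, alpha)    # part b)
```

Evaluating gives e^(−0.25) ≈ 0.7788 for part a) and e^(−0.75) − e^(−1.25) ≈ 0.1859 for part b).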

The Memoryless Property of the Exponential Distribution:


The exponential distribution has a property referred to as the "memoryless property": if we have not been observing the process for a period of length t0 and then suddenly begin observing, the remaining waiting time until the next occurrence has the same distribution as the original T. In other words, in the context of Examples 1 and 2 above, it doesn't matter when past customers arrived, as long as we weren't observing. This is because we have

ℙ(T > t + t0 | T > t0) = ℙ({T > t + t0} ∩ {T > t0}) / ℙ(T > t0)
                       = ℙ(T > t + t0) / ℙ(T > t0)
                       = e^(−α(t+t0)) / e^(−αt0)
                       = e^(−αt) = ℙ(T > t)
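The memoryless property can also be seen empirically: among simulated waiting times that already exceed t0, the fraction exceeding t0 + t matches the unconditional fraction exceeding t. A simulation sketch (the particular values of t, t0, sample size, and seed are our own choices):

```python
import random

# For T ~ Exp(alpha): P(T > t + t0 | T > t0) should match P(T > t).
random.seed(1)
alpha, t, t0 = 0.25, 1.0, 3.0
draws = [random.expovariate(alpha) for _ in range(200_000)]

survived_t0 = [x for x in draws if x > t0]
cond_prob = sum(1 for x in survived_t0 if x > t0 + t) / len(survived_t0)
uncond_prob = sum(1 for x in draws if x > t) / len(draws)
# Both fractions estimate e^(-alpha*t) = e^(-0.25).
```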

3. The Gamma Distribution


Suppose now, instead of focusing on the (waiting) time until the first occurrence of some event, we focus on

Tα = (waiting) time until the αth occurrence of that event

It turns out (after a similar but longer derivation) that the distribution of Tα is given by the so-called Gamma Distribution

f (t; α, β) = { t^(α−1) e^(−t/β) / (β^α (α − 1)!)   for t ≥ 0
             { 0                                    otherwise

where 1/β (= α of the exponential distribution) is the average rate of occurrences of that event. In this context, β is referred to as the scale parameter, and we write

Tα ∼ Gamma(α, β)

Its name comes from the Gamma function, as the term (α − 1)! in the denominator can be further generalized to Γ(α) for any real number α > 0. Using integration by parts, it can be shown that

E(Tα ) = αβ   and   V(Tα ) = αβ²
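These moments follow from viewing Tα as the sum of α independent exponential waiting times, each with mean β, which also gives a quick simulation check (shape, scale, sample size, and seed below are our arbitrary choices):

```python
import random

# T_alpha as a sum of alpha independent Exp(1/beta) waiting times.
random.seed(2)
alpha_shape, beta = 3, 2.0
n = 100_000
totals = [sum(random.expovariate(1 / beta) for _ in range(alpha_shape))
          for _ in range(n)]

mean_T = sum(totals) / n                              # near alpha * beta = 6
var_T = sum((x - mean_T) ** 2 for x in totals) / n    # near alpha * beta^2 = 12
```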

Note that the gamma distribution can also be used to model survival time (death being the occurring event). Here, a smaller β means a faster death rate, and so a larger probability of smaller survival times (see graph below).

Regarding probability calculations, one can either use the online applet or the table of the Incomplete Gamma Function, which gives the cdf of the standard gamma distribution (when β = 1)

F(t; α) := ℙ(Tα ≤ t).

The cdf of the gamma distribution is then given by

F(t; α, β ) := F(t /β; α).
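When α is a positive integer (the case covered by the pdf above, whose denominator uses (α − 1)!), the gamma cdf can also be computed directly from the Poisson connection: ℙ(Tα ≤ t) is the probability of at least α events by time t at rate 1/β. A Python sketch; the function name `gamma_cdf_int` is ours, not from the handout or the textbook.

```python
import math

def gamma_cdf_int(t, alpha, beta):
    """P(T_alpha <= t) for T_alpha ~ Gamma(alpha, beta), alpha a positive integer.

    Uses P(T_alpha <= t) = P(X_t >= alpha) = 1 - sum_{k<alpha} Pois pmf terms,
    where X_t ~ Pois(t / beta).
    """
    x = t / beta
    return 1 - sum(x**k * math.exp(-x) / math.factorial(k) for k in range(alpha))

# Sanity check: with alpha = 1 this reduces to the exponential cdf 1 - e^(-t/beta).
p = gamma_cdf_int(4.0, 1, 4.0)   # equals 1 - e^(-1)
```

Note this also respects the scaling relation F(t; α, β) = F(t/β; α), since only the ratio t/β enters the formula.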

Example 3: Example 4.24 from textbook.
