확통1 (Probability and Statistics 1) Lecture Note 07 on Memoryless Random Processes

Uploaded by

jedem10224

Memoryless Random Processes

Definition:
A sequence (or a continuous-time family) of random variables with a special
memoryless property: {𝑋𝑛 , 𝑛 ≥ 1} or {𝑋(𝑡), 𝑡 ≥ 0}

Examples:
• Arrival example: Number of arrivals to a bank at time 𝑡
• Queuing example: Length of a waiting line at a bank at time 𝑛
• Gambler's ruin: The gambler's fortune at time 𝑛 (Markov Chains)
• Engineering example: Signal corrupted with noise at time 𝑡
Bernoulli Process
• A sequence of independent Bernoulli trials.
• At each trial:
− 𝐏(success) = 𝐏(𝑋 = 1) = 𝑝
− 𝐏(failure) = 𝐏(𝑋 = 0) = 1 − 𝑝
• Examples:
− Sequence of ups and downs of stock markets
− Sequence of lottery wins/losses
− Arrivals (each second) to a bank.

Number of successes S in n time slots

• 𝐏(𝑆 = 𝑘) = C(𝑛, 𝑘) 𝑝^𝑘 (1 − 𝑝)^(𝑛−𝑘), 𝑘 = 0, 1, ⋯ , 𝑛 (Binomial)

• Mean: 𝐄[𝑆] = 𝑛𝑝

• Variance: 𝐕[𝑆] = 𝑛𝑝(1 − 𝑝)
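As a quick numerical check of these formulas, the sketch below sums the binomial PMF directly; the values of 𝑛 and 𝑝 are illustrative choices, not values from the notes.

```python
from math import comb

# Sum the binomial PMF P[S = k] = C(n, k) p^k (1 - p)^(n - k)
# and verify E[S] = n*p and V[S] = n*p*(1 - p).
# n and p are illustrative choices.
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * q for k, q in enumerate(pmf))
var = sum((k - mean)**2 * q for k, q in enumerate(pmf))
print(round(mean, 6), round(var, 6))  # 3.0 2.1
```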

Interarrival Times

• 𝑇1 : number of trials until the first success, i.e., the first
interarrival time
• Interarrival times have a memoryless property

• 𝐏(𝑇1 = 𝑡) = 𝑝(1 − 𝑝)^(𝑡−1), 𝑡 = 1, 2, ⋯ (Geometric)

• Mean: 𝐄[𝑇1 ] = 1/𝑝

• Variance: 𝐕[𝑇1 ] = (1 − 𝑝)/𝑝^2
Fresh Start and Memoryless Properties

• Fresh Start:
Given 𝑛, the process is said to start fresh at time 𝑛 if the future
sequence 𝑋𝑛+1 , 𝑋𝑛+2 , ⋯ is also a Bernoulli process and is independent
of the past {𝑋𝑘 , 0 ≤ 𝑘 ≤ 𝑛}.

• Memoryless:
Suppose we observe the process for 𝑛 trials and no success has
occurred. The process is said to be memoryless if the PMF of the
remaining time until the first arrival does not depend on 𝑛:
𝐏(𝑇1 = 𝑛 + 𝑡 | 𝑇1 > 𝑛) = 𝐏(𝑇1 = 𝑡)
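This identity is easy to verify numerically from the geometric PMF; the sketch below uses illustrative values of 𝑝 and 𝑛.

```python
# Numeric check of P[T1 = n + t | T1 > n] = P[T1 = t] for the
# geometric PMF. p and n are illustrative values.
p, n = 0.25, 4

def geom_pmf(t):
    return p * (1 - p)**(t - 1)

tail = (1 - p)**n   # P[T1 > n]: no success in the first n trials
max_diff = max(abs(geom_pmf(n + t) / tail - geom_pmf(t))
               for t in range(1, 6))
print(max_diff < 1e-12)  # True
```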

Time of the 𝑘th Arrival (1)

• 𝑌𝑘 : number of trials until the 𝑘th success

• 𝑇𝑘 = 𝑌𝑘 − 𝑌𝑘−1 , 𝑘 = 2, 3, ⋯ : the 𝑘th interarrival time

• It follows that:
𝑌𝑘 = 𝑇1 + 𝑇2 + ⋯ + 𝑇𝑘

Time of the 𝑘th Arrival (2)

• 𝐏(𝑌𝑘 = 𝑡) = C(𝑡 − 1, 𝑘 − 1) 𝑝^𝑘 (1 − 𝑝)^(𝑡−𝑘), 𝑡 = 𝑘, 𝑘 + 1, ⋯ (Pascal)

• Mean: 𝐄[𝑌𝑘 ] = 𝑘/𝑝

• Variance: 𝐕[𝑌𝑘 ] = 𝑘(1 − 𝑝)/𝑝^2
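The Pascal mean and variance can be checked by summing the PMF numerically; 𝑘, 𝑝, and the truncation point below are illustrative choices.

```python
from math import comb

# Check E[Y_k] = k/p and V[Y_k] = k(1-p)/p^2 by summing the Pascal
# PMF P[Y_k = t] = C(t-1, k-1) p^k (1-p)^(t-k), truncated at t_max
# (the neglected tail is astronomically small). Values illustrative.
k, p, t_max = 3, 0.4, 400

pmf = {t: comb(t - 1, k - 1) * p**k * (1 - p)**(t - k)
       for t in range(k, t_max + 1)}
mean = sum(t * q for t, q in pmf.items())
var = sum((t - mean)**2 * q for t, q in pmf.items())
print(round(mean, 6), round(var, 6))  # 7.5 11.25
```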

Bernoulli Process: Summary

• Discrete time process; success probability in each slot = 𝑝


• PMF of the number of arrivals in 𝑛 time slots: Binomial
• PMF of the interarrival time: Geometric
• PMF of the time to the 𝑘th arrival: Pascal
• Memoryless
• What about continuous arrival times? See the Poisson process next.
− Example: number of customer arrivals to a bank.
Poisson Process: Definition

• Let 𝐏(𝑘, 𝜏) be the probability of 𝑘 arrivals (or events) in an
interval of duration 𝜏.
• Assumptions:
− The numbers of arrivals in disjoint time intervals are
independent (this is called independent increments)
− For very small 𝛿, we have
𝐏(0, 𝛿) ≈ 1 − 𝜆𝛿, 𝐏(1, 𝛿) ≈ 𝜆𝛿, 𝐏(𝑘, 𝛿) ≈ 0 for 𝑘 ≥ 2,
where 𝜆 = "arrival rate" of the process
− Note that the distribution of the number of arrivals depends only
on the duration 𝛿, not on the interval's location (this is called
stationary increments)
From Bernoulli to Poisson (1)

• Bernoulli: Arrival probability in each time slot = 𝑝

• Poisson: Arrival probability in each 𝛿-interval = 𝜆𝛿

• Let 𝑛 = 𝑡/𝛿 and 𝑝 = 𝜆𝛿 = 𝜆𝑡/𝑛 :

Number of arrivals in a 𝑡-interval (Poisson)
= Number of successes in 𝑛 time slots (Binomial)
From Bernoulli to Poisson (2)

• Number of arrivals in a 𝑡-interval as 𝑛 → ∞ :
C(𝑛, 𝑘) (𝜆𝑡/𝑛)^𝑘 (1 − 𝜆𝑡/𝑛)^(𝑛−𝑘) → (𝜆𝑡)^𝑘 𝑒^(−𝜆𝑡) / 𝑘!
PMF of the Number of Arrivals

• 𝑁 = the number of arrivals in a 𝜏-interval:

𝐏(𝑁 = 𝑘) = 𝐏(𝑘, 𝜏) = (𝜆𝜏)^𝑘 𝑒^(−𝜆𝜏) / 𝑘!, 𝑘 = 0, 1, ⋯ (Poisson)

• Mean: 𝐄[𝑁] = 𝜆𝜏

• Variance: 𝐕[𝑁] = 𝜆𝜏

• Transform: 𝑀𝑁 (𝑠) = 𝑒^(−𝜆𝜏(1−𝑒^𝑠))
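A short check that the Poisson PMF above really has mean and variance both equal to 𝜆𝜏; the value of 𝜆𝜏 is an illustrative choice.

```python
from math import exp, factorial

# The Poisson PMF P[N = k] = (lt)^k e^(-lt) / k! with lt = lam*tau;
# mean and variance both equal lam*tau. lt = 2.0 is illustrative,
# and the sum is truncated where the tail is negligible.
lt = 2.0
pmf = [lt**k * exp(-lt) / factorial(k) for k in range(60)]

mean = sum(k * q for k, q in enumerate(pmf))
var = sum((k - mean)**2 * q for k, q in enumerate(pmf))
print(round(mean, 6), round(var, 6))  # 2.0 2.0
```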

Email Example

• You get email according to a Poisson process, at a rate of 0.4
messages per hour (𝜆 = 0.4, unit time = one hour). You check
your email every thirty minutes (𝜏 = half an hour).

− 𝜆𝜏 = 0.4 / 2 = 0.2

− 𝐏[no new messages] = (0.2)^0 𝑒^(−0.2) / 0! = 𝑒^(−0.2) ≈ 0.819

− 𝐏[one new message] = (0.2)^1 𝑒^(−0.2) / 1! = 0.2 ∙ 𝑒^(−0.2) ≈ 0.164
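The example's arithmetic, reproduced as a sketch (same 𝜆 and 𝜏 as in the slide):

```python
from math import exp

# The email example's numbers: lam = 0.4 messages/hour, tau = 0.5 hour.
lam, tau = 0.4, 0.5
lt = lam * tau                 # 0.2 expected messages per check

p_none = exp(-lt)              # (0.2)^0 e^{-0.2} / 0!
p_one = lt * exp(-lt)          # (0.2)^1 e^{-0.2} / 1!
print(round(p_none, 3), round(p_one, 3))  # 0.819 0.164
```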

Interarrival Time (1)

• 𝑌1 : time of the first arrival

• "First order" interarrival time:

𝑓𝑌1 (𝑡) = 𝜆𝑒^(−𝜆𝑡), 𝑡 ≥ 0 (Exponential)

• Why?
𝐏(𝑌1 ≤ 𝑡) = 1 − 𝐏(0, 𝑡) = 1 − 𝑒^(−𝜆𝑡)

Interarrival Time (2)

• Fresh-start property: The time of the next arrival is
independent of the past.

• Memoryless property: Suppose we observe the process
for 𝑇 seconds and no arrival has occurred. Then the density of
the remaining time until the next arrival is still exponential.

• Email example: You start checking your email. How long
will you wait, on average, until you receive your next email?

𝐄[𝑌1 ] = 1 / 0.4 = 2.5 hours
Time of the 𝑘th Arrival (1)

• 𝑌𝑘 : time of the 𝑘th arrival

• 𝑇𝑘 = 𝑌𝑘 − 𝑌𝑘−1 : the 𝑘th interarrival time, 𝑘 = 2, 3, ⋯

• It follows that:
𝑌𝑘 = 𝑇1 + 𝑇2 + ⋯ + 𝑇𝑘
Time of the 𝑘th Arrival (2)

[Timeline: 𝑌𝑘 ≤ 𝑡 if and only if 𝑁𝑡 ≥ 𝑘, where 𝑁𝑡 is the number of arrivals in [0, 𝑡]]

• 𝑓𝑌𝑘 (𝑡) = 𝜆(𝜆𝑡)^(𝑘−1) 𝑒^(−𝜆𝑡) / (𝑘 − 1)!, 𝑡 ≥ 0 (Erlang of order 𝒌)

(∵) 𝐹𝑌𝑘 (𝑡) = 𝐏(𝑌𝑘 ≤ 𝑡) = 𝐏(𝑁𝑡 ≥ 𝑘) = Σ𝑛≥𝑘 𝐏(𝑛, 𝑡) = Σ𝑛≥𝑘 (𝜆𝑡)^𝑛 𝑒^(−𝜆𝑡) / 𝑛!

Differentiating term by term,
𝑓𝑌𝑘 (𝑡) = Σ𝑛≥𝑘 𝑛𝜆(𝜆𝑡)^(𝑛−1) 𝑒^(−𝜆𝑡) / 𝑛! − Σ𝑛≥𝑘 𝜆(𝜆𝑡)^𝑛 𝑒^(−𝜆𝑡) / 𝑛!
       = Σ𝑛≥𝑘 𝜆(𝜆𝑡)^(𝑛−1) 𝑒^(−𝜆𝑡) / (𝑛 − 1)! − Σ𝑛≥𝑘 𝜆(𝜆𝑡)^𝑛 𝑒^(−𝜆𝑡) / 𝑛!
       = 𝜆(𝜆𝑡)^(𝑘−1) 𝑒^(−𝜆𝑡) / (𝑘 − 1)!, since the two sums telescope.
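The identity 𝐹𝑌𝑘 (𝑡) = 𝐏(𝑁𝑡 ≥ 𝑘) can be checked numerically: a Poisson tail sum should match a numerical integral of the Erlang density. The values of 𝜆, 𝑘, 𝑡 below are illustrative.

```python
from math import exp, factorial

# Check: the Erlang CDF F_Yk(t) = P[N_t >= k] (a Poisson tail sum)
# should equal the integral over [0, t] of the density
# f_Yk(u) = lam*(lam*u)^(k-1)*exp(-lam*u)/(k-1)!.
lam, k, t = 1.5, 3, 2.0

# Poisson tail: P[N_t >= k] = 1 - sum_{n<k} (lam*t)^n e^{-lam*t}/n!
tail = 1 - sum((lam * t)**n * exp(-lam * t) / factorial(n)
               for n in range(k))

def f(u):  # Erlang-k density
    return lam * (lam * u)**(k - 1) * exp(-lam * u) / factorial(k - 1)

# Trapezoidal integration of the density over [0, t]
steps = 100_000
h = t / steps
integral = h * ((f(0) + f(t)) / 2 + sum(f(i * h) for i in range(1, steps)))

print(abs(tail - integral) < 1e-6)  # True
```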
Poisson Process: Summary

• Numbers of arrivals in disjoint time intervals are independent

𝐏(0, 𝛿) ≈ 1 − 𝜆𝛿, 𝐏(1, 𝛿) ≈ 𝜆𝛿, 𝐏(𝑘, 𝛿) ≈ 0 for 𝑘 ≥ 2 (for very small 𝛿)
𝐏(𝑘, 𝜏) = (𝜆𝜏)^𝑘 𝑒^(−𝜆𝜏) / 𝑘! (Poisson)

• Interarrival times (𝑘 = 1):

𝑓𝑌1 (𝑡) = 𝜆𝑒^(−𝜆𝑡), 𝑡 ≥ 0 (Exponential)

• Time to the 𝑘th arrival:

𝑓𝑌𝑘 (𝑡) = 𝜆(𝜆𝑡)^(𝑘−1) 𝑒^(−𝜆𝑡) / (𝑘 − 1)!, 𝑡 ≥ 0 (Erlang)
Bernoulli vs. Poisson

                                 Bernoulli       Poisson
Arrival times                    Discrete        Continuous
Arrival rate                     𝑝 per trial     𝜆 per unit time
Number-of-arrivals distribution  Binomial        Poisson
Interarrival-time distribution   Geometric       Exponential
𝑘th-arrival-time distribution    Pascal          Erlang

(The interarrival and 𝑘th-arrival distributions are PMFs in the
Bernoulli case and PDFs in the Poisson case.)
Random Incidence Paradox (1)
[Timeline: arrivals at 𝑡𝑛−2 , 𝑡𝑛−1 , 𝑡𝑛 , 𝑡𝑛+1 ; the chosen instant 𝑡∗ falls in the interval of length 𝑋𝑠 , split into the elapsed time 𝑋𝑒 = 𝑡∗ − 𝑡𝑛−1 and the residual time 𝑌 = 𝑡𝑛 − 𝑡∗ ]

• Fix a time instant 𝑡∗ and consider the length 𝑋𝑠 of the
interarrival interval that contains 𝑡∗ .
• One might argue that 𝑋𝑠 is a typical interarrival time, and
hence exponentially distributed, but this is false.
• Let [𝑡𝑛−1 , 𝑡𝑛 ] be the interval that contains 𝑡∗ , so that
𝑋𝑠 = 𝑡𝑛 − 𝑡𝑛−1 . We split 𝑋𝑠 into two parts,
𝑋𝑠 = 𝑋𝑒 + 𝑌 = (𝑡∗ − 𝑡𝑛−1 ) + (𝑡𝑛 − 𝑡∗ )
where 𝑋𝑒 = 𝑡∗ − 𝑡𝑛−1 is the elapsed time since the (𝑛 − 1)th
arrival, and 𝑌 = 𝑡𝑛 − 𝑡∗ is the residual time until the 𝑛th
arrival.
Random Incidence Paradox (2)
• By the independence property of the Poisson process, the two
parts are independent, exponentially distributed random variables.
• By the memoryless property, the Poisson process starts fresh
at time 𝑡∗ , so the residual time 𝑌 is exponential with
parameter 𝜆. The elapsed time 𝑋𝑒 is also exponential with
parameter 𝜆, because
𝐏(𝑋𝑒 > 𝜏) = 𝐏(no arrival in [𝑡∗ − 𝜏, 𝑡∗ ]) = 𝐏(0, 𝜏).
• The selected interval 𝑋𝑠 is therefore the sum of two independent
exponential random variables with parameter 𝜆, i.e., Erlang of
order 2, with mean 2/𝜆.
• The key issue is that an observer who arrives at an arbitrary
time is more likely to fall in a large rather than a small
interarrival interval. Thus, the expected length seen by the
observer is not 1/𝜆 but a higher value, 2/𝜆.
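The 2/𝜆 effect is easy to see in simulation: generate exponential interarrival times and record the length of the interval containing a fixed inspection time. All parameter values below are illustrative.

```python
import random

# Simulation sketch of the paradox: the interval containing a fixed
# inspection time t_star has average length close to 2/lam, not 1/lam.
random.seed(0)
lam, t_star, trials = 1.0, 50.0, 20000

lengths = []
for _ in range(trials):
    t = 0.0
    while True:
        x = random.expovariate(lam)     # one interarrival time
        if t + x > t_star:              # [t, t+x] contains t_star
            lengths.append(x)
            break
        t += x

mean_len = sum(lengths) / trials
print(round(mean_len, 2))  # close to 2.0 = 2/lam
```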
Random Incidence Paradox (3)
• Let us derive the density 𝑓𝑋𝑠 (𝑥) of the selected interval 𝑋𝑠 . The
basic observation is that longer intervals between arrival points
occupy a larger segment of the time axis than shorter intervals do,
and therefore it is more likely that 𝑡∗ falls in a long interval.
• Thus, for the selected interval 𝑋𝑠 , we have
𝑓𝑋𝑠 (𝑥) 𝑑𝑥 = 𝐾 𝑥 𝑓(𝑥) 𝑑𝑥
where 𝑓(𝑥) is the common PDF of the interarrival times {𝑋𝑛 }; the
left-hand side is 𝐏{𝑥 < 𝑋𝑠 ≤ 𝑥 + 𝑑𝑥}, and the right-hand side is
proportional both to the interval length 𝑥 and to the relative
occurrence of such intervals, 𝐏{𝑥 < 𝑋𝑛 ≤ 𝑥 + 𝑑𝑥} = 𝑓(𝑥) 𝑑𝑥.
Integrating both sides, we find 𝐾 = 1/𝑚 with 𝑚 = 𝐄[𝑋𝑛 ].
• Thus, 𝑓𝑋𝑠 (𝑥) is given in terms of the common density 𝑓(𝑥) of
interarrival times by
𝑓𝑋𝑠 (𝑥) = 𝑥 𝑓(𝑥) / 𝑚.
Random Incidence Paradox (4)
• Let us proceed to find the density 𝑓𝑌 (𝑦) of the residual time 𝑌.
If we are told that 𝑋𝑠 = 𝑥, then 𝑡∗ is uniformly placed within the
interval, so the conditional CDF is
𝐹𝑌|𝑋𝑠=𝑥 (𝑦|𝑥) = 𝑦/𝑥, 0 ≤ 𝑦 ≤ 𝑥.
Then, for 0 ≤ 𝑦 ≤ 𝑥,
𝑓𝑌|𝑋𝑠=𝑥 (𝑦|𝑥) = 1/𝑥,
𝑓𝑋𝑠,𝑌 (𝑥, 𝑦) = 𝑓𝑋𝑠 (𝑥) ∙ 𝑓𝑌|𝑋𝑠=𝑥 (𝑦|𝑥) = 𝑓(𝑥)/𝑚.
• Integrating 𝑓𝑋𝑠,𝑌 (𝑥, 𝑦) over 𝑥, we obtain 𝑓𝑌 (𝑦), namely
𝑓𝑌 (𝑦) = (1/𝑚) ∫_{𝑥=𝑦}^{∞} 𝑓(𝑥) 𝑑𝑥.
This immediately gives the final result:
𝑓𝑌 (𝑦) = (1 − 𝐹(𝑦)) / 𝑚.
• This gives the density of the residual time in terms of the common
distribution 𝐹(∙) of the interarrival times and its mean 𝑚.
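As a sanity check, specializing this formula to exponential interarrival times should give back an exponential density, consistent with the memoryless property; 𝜆 below is an illustrative value.

```python
from math import exp

# Specializing f_Y(y) = (1 - F(y))/m to exponential interarrivals,
# F(y) = 1 - e^{-lam*y} and m = 1/lam, should give lam*e^{-lam*y}.
lam = 0.7
m = 1 / lam
checks = []
for y in [0.0, 0.5, 1.3, 4.0]:
    F = 1 - exp(-lam * y)
    f_Y = (1 - F) / m               # residual-time density at y
    checks.append(abs(f_Y - lam * exp(-lam * y)) < 1e-12)
print(all(checks))  # True
```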
Random Incidence of Non-Poisson Arrival
• Assume that buses arrive at a station deterministically, on the
hour, and 5 minutes after the hour. Then the interarrival times
alternate between 5 and 55 minutes. The average interarrival
time is 30 minutes.
• A person shows up at the bus station at a random time.
Question: What is the expected time until the next bus arrival?

• Answer: Such a person falls into an interarrival interval of
length 5 with probability 1/12, and into an interval of
length 55 with probability 11/12. The expected length is
5 ∙ (1/12) + 55 ∙ (11/12) ≈ 50.83,
which is considerably larger than 30, the average interarrival
time.
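The same arithmetic as a one-line sketch:

```python
# The bus example: interval lengths alternate between 5 and 55
# minutes, and a random observer lands in an interval with
# probability proportional to its length.
p5, p55 = 5 / 60, 55 / 60       # 1/12 and 11/12
expected = 5 * p5 + 55 * p55    # length of the interval the observer hits
print(round(expected, 2))  # 50.83
```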
Merging of Poisson Processes

• The sum of independent Poisson random variables is Poisson.

• The merging of independent Poisson processes is a Poisson
process (with rate 𝜆1 + 𝜆2 ).
• What is the probability that the next arrival comes from the
first process?
𝜆1 𝛿 / (𝜆1 𝛿 + 𝜆2 𝛿) = 𝜆1 / (𝜆1 + 𝜆2 )
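The 𝜆1 /(𝜆1 + 𝜆2 ) formula can be checked by simulating the two streams directly; the rates and trial count below are illustrative.

```python
import random

# Simulation sketch: for two independent Poisson streams, the next
# arrival comes from stream 1 with probability lam1/(lam1 + lam2).
random.seed(1)
lam1, lam2, trials = 2.0, 3.0, 50_000

from_first = 0
for _ in range(trials):
    t1 = random.expovariate(lam1)   # time to next arrival in stream 1
    t2 = random.expovariate(lam2)   # time to next arrival in stream 2
    if t1 < t2:
        from_first += 1

freq = from_first / trials
print(round(freq, 2))  # close to 0.4 = lam1/(lam1 + lam2)
```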

Splitting of Poisson Processes

• Each message is routed along the first stream with
probability 𝑝, and along the second stream with
probability 1 − 𝑝.

− Routing decisions for different messages are independent.

− Each output stream is Poisson (with rates 𝑝𝜆 and (1 − 𝑝)𝜆).

Example: Email Filter (1)

• You have incoming email from two sources: valid email, and
spam. We assume both to be Poisson.
• You receive, on average, two valid emails per hour, and one
spam email every 5 hours: 𝜆1 = 2, 𝜆2 = 0.2
• Total incoming email rate 𝜆 = 𝜆1 + 𝜆2 = 2.2 emails per
hour
• 𝐏{a received email is spam} = 𝜆2 / (𝜆1 + 𝜆2 ) = 0.2 / 2.2 ≈ 0.09

Example: Email Filter (2)

• You install a spam filter that catches spam correctly 80% of the
time, but also misclassifies a valid email as spam 5% of the time.
Let 𝑝 = 0.95 be the probability that a valid email reaches the
inbox, and 𝑞 = 0.2 the probability that a spam email does.

• Inbox email rate = 𝑝𝜆1 + 𝑞𝜆2 = 0.95 × 2 + 0.2 × 0.2 = 1.94

• Spam folder email rate = 2.2 − 1.94 = 0.26

Example: Email Filter (3)

• 𝐏{an email in the inbox is spam}
= 𝑞𝜆2 / (𝑝𝜆1 + 𝑞𝜆2 ) = (0.2 × 0.2) / 1.94 ≈ 0.02

• 𝐏{an email in the spam folder is valid}
= (1 − 𝑝)𝜆1 / ((1 − 𝑝)𝜆1 + (1 − 𝑞)𝜆2 ) = (0.05 × 2) / 0.26 ≈ 0.38

• How often should you check your spam folder to find, on
average, one valid email per check?
𝐄[𝑁] = 𝜆1 (1 − 𝑝)𝜏 = 1 ⇒ 𝜏 = 1/(0.05 ∙ 2) = 10 hours
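The filter example's arithmetic collected in one sketch (same rates and filter probabilities as in the slides):

```python
# p = probability a valid email reaches the inbox,
# q = probability a spam email reaches the inbox.
lam1, lam2 = 2.0, 0.2          # valid rate, spam rate (per hour)
p, q = 0.95, 0.2

inbox_rate = p * lam1 + q * lam2                 # 1.94 per hour
spam_rate = (lam1 + lam2) - inbox_rate           # 0.26 per hour

p_inbox_spam = q * lam2 / inbox_rate             # ~0.02
p_spam_valid = (1 - p) * lam1 / spam_rate        # ~0.38
tau = 1 / ((1 - p) * lam1)                       # ~10 hours per valid email

print(round(inbox_rate, 2), round(spam_rate, 2),
      round(p_inbox_spam, 2), round(p_spam_valid, 2), round(tau, 1))
```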
Homework #7
Textbook: Introduction to Probability, 2nd Edition, D. Bertsekas and J. Tsitsiklis
Chapter 6, pp. 326-338, Problems 2, 3, 9, 10, 11, 12, 16
Due date: check the assignment posting on 아주BB
