Lecture Notes on
Stochastic Processes
STAT 321
2018
Table of Contents
Review of Random Variables and Random Processes
Markov Chains
Poisson Processes
Branching Processes
Random variable
A random variable is a real-valued function whose numerical value is determined by the
outcome of a random experiment. In other words, the random variable X is a function that
associates each element of the sample space Ω with a real number (i.e. X : Ω → R).
Notation:
X (capital letter): denotes the random variable.
x (small letter): denotes a value of the random variable X.
Discrete random variable
A random variable X is called a discrete random variable if its set of possible values is
countable.
Continuous random variable
A random variable X is called a continuous random variable if it can take values on a
continuous scale.
Discrete probability distribution
If X is a discrete random variable taking distinct values x_1, x_2, ..., then the function

    f(x) = P(X = x) if x = x_i (i = 1, 2, ...),  and f(x) = 0 otherwise,

is the probability distribution (probability mass function) of X.
Continuous probability distribution
A function f(x) is the probability density function (pdf) of a continuous random variable X if:
1. f(x) ≥ 0 for all x;   2. ∫ f(x) dx = 1 over the whole real line;   3. P(a < X < b) = ∫_a^b f(x) dx.
Linear property:
Let X be a random variable with pdf f(x), and let a and b be constants. Then:

    E(aX + b) = a E(X) + b
The moments:
Let X be a random variable with pdf f(x). The r-th moment about the origin of X is
given by:

    μ'_r = E(X^r) = Σ_x x^r f(x)        (discrete case)
    μ'_r = E(X^r) = ∫ x^r f(x) dx       (continuous case)

if the expectation exists.
As a special case, μ'_1 = E(X) = μ, the mean of X.
Let X be a random variable with pdf f(x). The r-th central moment of X about μ is
defined as:

    μ_r = E[(X − μ)^r] = Σ_x (x − μ)^r f(x)       (discrete case)
    μ_r = E[(X − μ)^r] = ∫ (x − μ)^r f(x) dx      (continuous case)

As a special case:
    μ_2 = E[(X − μ)²] = σ², the variance of X.
Moment-Generating Function (MGF):
Let X be a random variable with pdf f(x). The moment-generating function of X is
given by E(e^{tX}) and is denoted by M_X(t). Hence:

    M_X(t) = E(e^{tX}) = Σ_x e^{tx} f(x)          (discrete case)
    M_X(t) = E(e^{tX}) = ∫ e^{tx} f(x) dx         (continuous case)

Moment-generating functions exist only if the sum or integral in the above definition
converges. If the moment-generating function of a random variable X does exist, it can be
used to generate all the moments of that variable.
Definition:
Let X be a random variable with moment-generating function M_X(t). Then:

    d^r M_X(t)/dt^r evaluated at t = 0 equals μ'_r.

Therefore M'_X(0) = μ'_1 = μ and M''_X(0) = μ'_2, so that

    σ² = μ'_2 − μ² = M''_X(0) − [M'_X(0)]².
Example. Find the moment-generating function of the binomial random variable X and
then use it to verify that μ = np and σ² = npq.

    f(x) = C(n, x) p^x q^{n−x},  x = 0, 1, ..., n,  where q = 1 − p.

    M_X(t) = E(e^{tX}) = Σ_x e^{tx} C(n, x) p^x q^{n−x} = Σ_x C(n, x) (pe^t)^x q^{n−x} = (pe^t + q)^n

Now:
    M'_X(t) = n(pe^t + q)^{n−1} pe^t
and:
    M''_X(t) = np e^t [(n − 1)(pe^t + q)^{n−2} pe^t + (pe^t + q)^{n−1}]

Therefore, μ = μ'_1 = M'_X(0) = np,
and μ'_2 = M''_X(0) = np[(n − 1)p + 1], so σ² = μ'_2 − μ² = np(1 − p) = npq.
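As a quick sanity check, the two derivatives can also be evaluated symbolically; the following is a minimal sketch using sympy (the symbol names are arbitrary):

import sympy as sp

t, p = sp.symbols('t p', positive=True)
n = sp.symbols('n', positive=True, integer=True)

M = (p*sp.exp(t) + 1 - p)**n           # M_X(t) = (pe^t + q)^n with q = 1 - p

mu1 = sp.diff(M, t).subs(t, 0)         # first moment, E(X)
mu2 = sp.diff(M, t, 2).subs(t, 0)      # second moment, E(X^2)

print(sp.simplify(mu1))                # n*p
print(sp.simplify(mu2 - mu1**2))       # n*p*(1 - p), i.e. npq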
Probability Generating Function (PGF):
For a discrete random variable X, the probability generating function is

    G_X(s) = E(s^X) = Σ_x s^x f(x)

Example. Let X have a binomial distribution, X ~ Bin(n, p). The PGF is
given by

    G_X(s) = E(s^X) = Σ_x C(n, x) (ps)^x q^{n−x} = (ps + q)^n

Differentiating,

    G'_X(s) = n(ps + q)^{n−1} p,   so  G'_X(1) = np = E(X),
    G''_X(s) = n(n − 1)(ps + q)^{n−2} p²,   so  G''_X(1) = n(n − 1)p² = E[X(X − 1)].

The alternative and preferred formula for σ² is:

    σ² = G''_X(1) + G'_X(1) − [G'_X(1)]²

For the binomial this gives σ² = n(n − 1)p² + np − n²p² = np(1 − p) = npq, as before.
Linear combination
Let X and Y be random variables with joint probability distribution f(x, y), and let a and b be
constants. Then

    E(aX + bY) = a E(X) + b E(Y)
    Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)

If X and Y are independent random variables, then Cov(X, Y) = 0 and

    Var(aX + bY) = a² Var(X) + b² Var(Y)
Correlation coefficient
Let X and Y be two random variables with covariance σ_XY and standard deviations σ_X and σ_Y
respectively. The correlation coefficient of X and Y is

    ρ_XY = σ_XY / (σ_X σ_Y)
Definition
A stochastic process (random process) is a family of random variables {X(t), t ∈ T} or
{X_t, t ∈ T}. That is, for each t in the index set T, X(t) is a random variable.
A random process may also be defined as a random variable that is a function of time t: X(t)
is a random variable for every time instant t, or a random variable indexed by
time.
We know that a random variable is a function defined on the sample space Ω. Thus a
random process {X(t), t ∈ T} is a real function of two arguments, {X(t, ζ), t ∈ T, ζ ∈ Ω}.
For fixed t = t_0, X(t_0, ζ) is a random variable, denoted by X(t_0), as ζ varies
over the sample space Ω. On the other hand, for a fixed sample point ζ_0 ∈ Ω, X(t, ζ_0)
is a single function of time t, called a sample function or a realization of the process.
The totality of all sample functions is called an ensemble.
If both t and ζ are fixed, X(t, ζ) is a real number. We use the notation X(t) to
represent X(t, ζ).
Description of a Random Process
In a random process {X(t), t ∈ T} the index t is called the time parameter (or simply the
time) and T is called the parameter set of the random process. Each X(t) takes values
in some set S called the state space; X(t) is the state of the process at time t, and
if X(t) = x we say the process is in state x at time t.
Definition:-
{X(t), t ∈ T} is a discrete-time (discrete-parameter) process if the index set T of the
random process is discrete. A discrete-parameter process is also called a random sequence
and is denoted by {X(n), n ∈ N} or {X_n}.
In practice this generally means T = {1, 2, 3, ...}.
Thus a discrete-time process is {X(1), X(2), X(3), ...}: a new random number recorded at
every time 1, 2, 3, ...
Definition:-
{X(t), t ∈ T} is a continuous-time (continuous-parameter) process if the index set T is
continuous.
In practice this generally means T = [0, ∞), or T = [0, K] for some K.
Discrete-time, discrete-state processes
Example 4: the number of occupied channels in a telephone link at the arrival time of the
nth customer, n = 1, 2, ...
Continuous-time, discrete-state processes
Example 6: the number of occupied channels in a telephone link at time t > 0.
Example 7: the number of packets in the buffer of a statistical multiplexer at time t > 0.
The joint (n-th order) distribution of a random process is

    F(x_1, ..., x_n; t_1, ..., t_n) = P(X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n)
Mean and Variance functions of a random process:
As in the case of random variables, random processes are often described by using statistical
averages. For the random process {X(t), t ∈ T}, the mean function μ_X(t) is defined as

    μ_X(t) = E[X(t)] = ∫ x f_{X(t)}(x) dx

The above definition is valid for both continuous-time and discrete-time random processes.
In particular, if {X(n), n ∈ N} is a discrete-time random process, then

    μ_X(n) = E[X(n)]

The mean function gives us an idea of how the random process behaves on average as
time evolves (it is a function of time). For example, if X(t) is the temperature in a certain city,
the mean function μ_X(t) is a periodic-looking curve: the expected value of X(t) is
lowest in the winter and highest in the summer.
Example. Let X(t) = cos(ωt + Θ), where Θ ~ Uniform(0, 2π).
i. Mean function of X(t):

    μ_X(t) = E[cos(ωt + Θ)] = ∫_0^{2π} cos(ωt + θ) (1/2π) dθ
           = (1/2π) [sin(ωt + θ)] from θ = 0 to 2π
           = (1/2π) {sin(ωt + 2π) − sin(ωt)}
           = 0

ii. Autocorrelation function of X(t):

    R_X(t_1, t_2) = E{X(t_1) X(t_2)}

Let t_1 = t and t_2 = t + τ, where τ is the time shift. Then

    R_X(t, t + τ) = E{[cos(ωt + Θ)][cos(ω(t + τ) + Θ)]}

Using the identity cos α cos β = ½[cos(α − β) + cos(α + β)] with α = ωt + Θ and
β = ω(t + τ) + Θ, so that β − α = ωτ and α + β = 2ωt + ωτ + 2Θ:

    R_X(t, t + τ) = ½ E{cos(ωτ) + cos(2ωt + ωτ + 2Θ)}
                  = ½ (cos(ωτ) + E{cos(2ωt + ωτ + 2Θ)})
                  = ½ cos(ωτ),

since E{cos(2ωt + ωτ + 2Θ)} = 0 by the same computation as for the mean.
Example:
A random process {X(t), t ∈ T} has mean μ_X(t) = 3 and autocorrelation

    R_X(t_1, t_2) = 9 + 4e^{−0.2|t_1 − t_2|}.

Determine the mean, the variance and the covariance of the random variables Z = X(5)
and W = X(8).
Solution:

    E(Z) = E[X(5)] = μ_X(5) = 3,   E(W) = E[X(8)] = μ_X(8) = 3

    Var(Z) = E{[X(5)]²} − {E[X(5)]}²

Since R_X(t, t) = E{[X(t)]²},

    Var(Z) = R_X(5, 5) − {μ_X(5)}² = (9 + 4e^0) − 9 = 4

Similarly,

    Var(W) = R_X(8, 8) − {μ_X(8)}² = (9 + 4e^0) − 9 = 4

    Cov(Z, W) = E[X(5)X(8)] − E[X(5)]E[X(8)] = R_X(5, 8) − μ_X(5) μ_X(8)

Since R_X(5, 8) = 9 + 4e^{−0.2|5−8|} = 9 + 4e^{−0.6},

    Cov(Z, W) = 9 + 4e^{−0.6} − 9 = 4e^{−0.6} ≈ 2.195

This can be written in general as

    C_X(t_1, t_2) = R_X(t_1, t_2) − μ_X(t_1) μ_X(t_2)
B. Wide-Sense Stationary Processes:
A random process is called weak-sense stationary or wide-sense stationary (WSS) if its
mean function and its autocorrelation function do not change by shifts in time. More
precisely, X(t) is WSS if, for all t:
1. μ_X(t) = E[X(t)] = μ, a constant (stationary mean in time);
2. for t_1 = t and time shift τ,
   R_X(t, t + τ) = E[X(t) X(t + τ)] = R_X(τ).
Note that the first condition states that the mean function μ_X(t) is not a function of time t,
thus we can write μ_X(t) = μ_X. The second condition states that the correlation
function R_X(t, t + τ) is only a function of the time shift τ and not of the specific times.
Definition
A continuous-time random process {X(t), t ∈ T} is weak-sense stationary or wide-sense
stationary (WSS) if
1. μ_X(t) = μ_X
2. R_X(t_1, t_2) = R_X(t_2 − t_1) = R_X(τ)
Definition
A discrete-time random process {X(n), n ∈ N} is weak-sense stationary or wide-sense
stationary (WSS) if
1. μ_X(n) = μ_X
2. R_X(n, n + m) = R_X(m)
Note that a strict-sense stationary process is also a WSS process, but in general, the
converse is not true.
Example: wireless signal model
Consider the random process X(t) = A cos(ωt + Θ),
where A: amplitude, ω: carrier frequency, Θ: phase,
Θ ~ Uniform(0, 2π), and A and Θ are independent. Show that X(t) is WSS.
Solution
i. The mean function of X(t):

    μ_X(t) = E[A cos(ωt + Θ)] = E[A] E[cos(ωt + Θ)]    (A and Θ independent)

    E[cos(ωt + Θ)] = ∫_0^{2π} cos(ωt + θ) (1/2π) dθ
                   = (1/2π) [sin(ωt + θ)] from θ = 0 to 2π
                   = (1/2π) [sin(ωt + 2π) − sin(ωt)] = 0

Therefore μ_X(t) = 0, a constant.
ii. Correlation function:

    R_X(t, t + τ) = E{X(t) X(t + τ)} = E{A cos(ωt + Θ) · A cos(ω(t + τ) + Θ)}

Because A and Θ are independent,

    R_X(t, t + τ) = E{A²} E{cos(ωt + Θ) cos(ω(t + τ) + Θ)}

Using cos α cos β = ½[cos(α − β) + cos(α + β)] with α = ωt + Θ and β = ω(t + τ) + Θ:

    R_X(t, t + τ) = E{A²} · ½ [cos(ωτ) + E{cos(2ωt + ωτ + 2Θ)}]

Since E{cos(2ωt + ωτ + 2Θ)} = 0, therefore

    R_X(t, t + τ) = (E{A²}/2) cos(ωτ) = R_X(τ)

X(t) is a WSS random process because the mean function is a constant (= 0) and the
autocorrelation function is only a function of the time difference τ.
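Since the derivation holds for any amplitude A independent of Θ, a Monte Carlo check can use any convenient amplitude law; the sketch below assumes A ~ Uniform(0, 2), so that E{A²} = 4/3:

import numpy as np

rng = np.random.default_rng(0)
w = 2.0                                   # carrier frequency (arbitrary choice)
n = 200_000
A = rng.uniform(0.0, 2.0, size=n)         # E[A^2] = 4/3 for Uniform(0, 2)
Theta = rng.uniform(0.0, 2*np.pi, size=n)

t, tau = 1.3, 0.7                         # arbitrary time and lag
X_t = A * np.cos(w*t + Theta)
X_ttau = A * np.cos(w*(t + tau) + Theta)

print(X_t.mean())                         # ~ 0, the constant mean
print(np.mean(X_t * X_ttau))              # ~ (4/3)/2 * cos(w*tau)
print((4/3)/2 * np.cos(w*tau))            # theoretical value for comparison

Repeating with a different t leaves both estimates essentially unchanged, which is exactly the WSS property.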
A. Independent Processes:
In a random process X(t), if X(t_i) for i = 1, 2, ..., n are independent random variables, so that
for n = 1, 2, ...

    F(x_1, ..., x_n; t_1, ..., t_n) = ∏_{i=1}^n F(x_i; t_i)

and

    f(x_1, ..., x_n; t_1, ..., t_n) = ∏_{i=1}^n f(x_i; t_i),

or equivalently P(X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n) = P(X(t_1) ≤ x_1) ··· P(X(t_n) ≤ x_n),
then we call X(t) an independent random process. Thus, a first-order distribution is
sufficient to characterize an independent random process X(t).
Example.
Consider the random process {X_n, n ∈ N} in which X_1, X_2, ... are iid standard normal
random variables.
(a) Write down f_{X_n}(x) for n = 1, 2, ...
(b) Write down the joint pdf f(x_1, ..., x_n).
Solution.
(a) Since X_n ~ N(0, 1), we have

    f_{X_n}(x) = (1/√(2π)) e^{−x²/2}

(b) By independence,

    f(x_1, ..., x_n) = ∏_{i=1}^n (1/√(2π)) e^{−x_i²/2} = (2π)^{−n/2} e^{−(x_1² + ... + x_n²)/2}
Problem 1
Let X_1, X_2, ... be a sequence of iid random variables with mean E[X_i] = μ and Var[X_i] = σ².
Define the discrete-time random process {X(n) = X_n, n = 1, 2, ...}. Find:
(a) the mean function μ_X(n);
(b) the autocovariance function C_X(m, n);
(c) the autocorrelation function R_X(m, n).
Solutions (1)
Problem 1 (Solution)
(a) μ_X(n) = E[X(n)] = E[X_n] = μ for all n.
(b) Let m ≠ n. Then

    C_X(m, n) = Cov(X_m, X_n) = E[(X_m − μ)(X_n − μ)]
              = E[X_m X_n] − μ E[X_m] − μ E[X_n] + μ²
              = E[X_m] E[X_n] − μ²    (by independence)
              = μ² − μ² = 0.

For m = n,

    C_X(n, n) = E[(X_n − μ)²] = Var[X_n] = σ².

(c) R_X(m, n) = C_X(m, n) + μ_X(m) μ_X(n), so

    R_X(m, n) = μ² for m ≠ n,  and  R_X(n, n) = σ² + μ².
Problem 2 (Solution)
(a) Let S_n = X_1 + X_2 + ... + X_n. Then

    μ_S(n) = E[S_n] = E[X_1] + ... + E[X_n] = nμ,

and, by independence,

    Var[S_n] = Var[X_1] + ... + Var[X_n] = nσ².

(b) For m ≤ n,

    C_S(m, n) = Cov(S_m, S_n) = Cov(S_m, S_m + (S_n − S_m))
              = Var(S_m) + Cov(S_m, S_n − S_m)
              = Var(S_m) + 0    (the increment S_n − S_m is independent of S_m)
              = mσ².

In general, C_S(m, n) = min(m, n) σ².
Problem 3 (Solution)
We need to check two conditions:
1. μ_X(t) = μ_X, a constant, and
2. R_X(t_1, t_2) = R_X(τ), a function of τ = t_2 − t_1 only.
We have

    μ_X(t) = E[X(t)] = E[cos(ωt + Θ)]
           = ∫_0^{2π} cos(ωt + θ) (1/2π) dθ = 0.

Let t_1 = t and t_2 = t + τ. Then

    R_X(t, t + τ) = E[cos(ωt + Θ) cos(ω(t + τ) + Θ)]

We know that cos α cos β = ½[cos(α − β) + cos(α + β)]. By using this,

    R_X(t, t + τ) = ½ cos(ωτ) + ½ E[cos(2ωt + ωτ + 2Θ)]
                  = ½ cos(ωτ),

which depends only on the time shift τ. Both conditions hold, so X(t) is WSS.
(c) The autocovariance is

    C_X(t, t + τ) = R_X(t, t + τ) − μ_X(t) μ_X(t + τ).

Since μ_X(t) = 0,

    C_X(t, t + τ) = R_X(τ) = ½ cos(ωτ).
Basic Definitions
Let {X_n, n = 0, 1, 2, ...} be a stochastic process taking values in a state space S that has N
states. To understand the behavior of this process we will need to calculate probabilities like

    P{X_0 = i_0, X_1 = i_1, ..., X_n = i_n}.

This can be computed by multiplying conditional probabilities as follows:

    P(X_0 = i_0) P(X_1 = i_1 | X_0 = i_0) P(X_2 = i_2 | X_0 = i_0, X_1 = i_1) ···
      P(X_n = i_n | X_0 = i_0, ..., X_{n−1} = i_{n−1}).
Example A.
We randomly select playing cards from an ordinary deck.
The state space is S = {R (red), B (black)}. Let's calculate the chance of observing a given
colour sequence using two different sampling methods.
(a) Without replacement:

    P{X_0 = i_0, X_1 = i_1, X_2 = i_2}
      = P(X_0 = i_0) P(X_1 = i_1 | X_0 = i_0) P(X_2 = i_2 | X_0 = i_0, X_1 = i_1),

where each conditional probability depends on all the cards already removed.
Definition.
The process {X_n, n ≥ 0} is called a Markov chain if for any n and any collection of states
i_0, i_1, ..., i_{n+1} we have:

    P(X_{n+1} = i_{n+1} | X_0 = i_0, X_1 = i_1, ..., X_n = i_n) = P(X_{n+1} = i_{n+1} | X_n = i_n)

For a Markov chain, the future depends only on the current state and not on the history.
Exercise.
In example A, calculate P(X_2 = i_2 | X_0 = i_0, X_1 = i_1) and confirm that only "with replacement"
do we get a Markov chain.
The transition probabilities are collected in the transition matrix P = [p_ij], where

    p_ij = P(X_{n+1} = j | X_n = i).

The figure below shows the state transition diagram for this Markov chain. In this diagram
there are three possible states 1, 2 and 3, and the arrows from each state to other states show
the transition probabilities p_ij. When there is no arrow from state i to state j, it means that p_ij = 0.
Example
Consider the Markov chain shown in the figure above. Find
(a) P(X_4 = 3 | X_3 = 2)
(b) P(X_3 = 1 | X_2 = 1)
(c) If we know P(X_0 = 1) = p, find P(X_0 = 1, X_1 = 2)
(d) If we know P(X_0 = 1) = p, find P(X_0 = 1, X_1 = 2, X_2 = 3)
Solution
(a) By definition, P(X_4 = 3 | X_3 = 2) = p_23.
(b) By definition, P(X_3 = 1 | X_2 = 1) = p_11.
(c) P(X_0 = 1, X_1 = 2) = P(X_0 = 1) P(X_1 = 2 | X_0 = 1) = p · p_12.
(d) P(X_0 = 1, X_1 = 2, X_2 = 3)
      = P(X_0 = 1) P(X_1 = 2 | X_0 = 1) P(X_2 = 3 | X_0 = 1, X_1 = 2)
      = P(X_0 = 1) P(X_1 = 2 | X_0 = 1) P(X_2 = 3 | X_1 = 2)    (by the Markov property)
      = p · p_12 · p_23.
Example.
A man either drives his car or takes a train to work each day. Suppose he never takes the
train two days in a row, but if he drives to work, then the next day he is just as likely to
drive again as he is to take the train.
The state space of the system is S = {t (train), d (drive)}. This stochastic process is a Markov
chain since the outcome on any day depends only on what happened the preceding day.
The transition matrix of the Markov chain is

           t    d
    t  [   0    1  ]
    d  [  1/2  1/2 ]

The first row of the matrix corresponds to the fact that he never takes the train two days in
a row, and so he definitely will drive the day after he takes the train. The second row of the
matrix corresponds to the fact that the day after he drives he will drive or take the train
with equal probability.
Example.
Card colour with replacement. Since each draw is made from the full deck, each colour
appears with probability 1/2 regardless of the current state:

           B    R
    B  [  1/2  1/2 ]
    R  [  1/2  1/2 ]
Example.
Three boys A, B and C are throwing a ball to each other. A always throws the ball to B and
B always throws the ball to C; but C is just as likely to throw the ball to B as to A. Let X_n
denote the nth person to be thrown the ball. The state space of the system is {A, B, C}. This is
a Markov chain since the person throwing the ball is not influenced by those who
previously had the ball. The transition matrix of the Markov chain is

           A    B    C
    A  [   0    1    0  ]
    B  [   0    0    1  ]
    C  [  1/2  1/2   0  ]

The first row of the matrix corresponds to the fact that A always throws the ball to B. The
second row corresponds to the fact that B always throws the ball to C. The last
row corresponds to the fact that C throws the ball to A or B with equal probability (and
does not throw it to himself).
Two-step transition probabilities
We can find the probability of moving from state i to state j in two steps by applying the law
of total probability. In particular, we argue that X_1 can take one of the possible values in S.
Thus we can write

    p_ij^(2) = P(X_2 = j | X_0 = i)
             = Σ_{k∈S} P(X_2 = j | X_1 = k, X_0 = i) P(X_1 = k | X_0 = i)
             = Σ_{k∈S} p_kj p_ik    (by the Markov property)

We conclude

    p_ij^(2) = Σ_{k∈S} p_ik p_kj.

That means, in order to get to state j, we need to pass through some intermediate state k.
But the right-hand side is exactly the (i, j) entry of the matrix product P · P.
Thus, we conclude that the two-step transition matrix can be obtained by squaring the state
transition matrix, i.e.,

    P^(2) = P · P = P².
Similarly, the three-step transition matrix is P^(3) = P³.
Generally, we can define the n-step transition probabilities as

    p_ij^(n) = P(X_n = j | X_0 = i).

That means, in order to get to state j in n steps, we may pass through some intermediate
states, and the n-step transition matrix P^(n) collects these probabilities:

    P^(n) = [ p_ij^(n) ].
Similar to the case of two-step transition probabilities, we can show that

    P^(n) = Pⁿ.

More generally, let m and n be two positive integers and assume i, j ∈ S. In order to get to
state j in (m + n) steps, the chain will be at some intermediate state k after m steps. To
obtain p_ij^(m+n), we sum over all possible intermediate states:

    p_ij^(m+n) = Σ_{k∈S} p_ik^(m) p_kj^(n)

The above equation is called the Chapman-Kolmogorov equation.
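In matrix form the Chapman-Kolmogorov equation reads P^(m+n) = P^(m) P^(n); a quick numerical check with numpy, reusing the train/drive matrix from the earlier example:

import numpy as np

P = np.array([[0.0, 1.0],
              [0.5, 0.5]])

m, n = 2, 3
lhs = np.linalg.matrix_power(P, m + n)                          # P^(m+n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)   # P^(m) P^(n)

print(np.allclose(lhs, rhs))   # True
print(lhs)                     # the 5-step transition probabilities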
The probability distribution of X_n
Consider a Markov chain {X_n, n = 0, 1, 2, ...} where X_n ∈ S = {0, 1, 2, ...}. Suppose that we
know the probability distribution of X_0. More specifically, define the row vector π^(0) as

    π^(0) = [P(X_0 = 0), P(X_0 = 1), P(X_0 = 2), ...]

How can we obtain the probability distribution of X_1, X_2, ...? We can use the law of
total probability. More specifically, for any j ∈ S, we can write

    P(X_1 = j) = Σ_k P(X_1 = j | X_0 = k) P(X_0 = k) = Σ_k p_kj π_k^(0)

If we generally define

    π^(n) = [P(X_n = 0), P(X_n = 1), P(X_n = 2), ...]

we can rewrite the above result in the form of matrix multiplication:

    π^(1) = π^(0) P,  and in general  π^(n) = π^(0) Pⁿ = π^(n−1) P.
Example.
Consider the Markov chain for the example of the man who either drives his car or takes a
train to work:

           t    d
    t  [   0    1  ]
    d  [  1/2  1/2 ]

Here t is the state of taking a train to work and d of driving to work. Squaring twice,

    P² = [ 1/2  1/2 ]        P⁴ = P² P² = [ 3/8   5/8  ]
         [ 1/4  3/4 ]                     [ 5/16  11/16 ]

Thus the probability that the process changes from, say, state t to state d in exactly 4 steps
is p_td^(4) = 5/8. Similarly, p_tt^(4) = 3/8, p_dt^(4) = 5/16 and p_dd^(4) = 11/16.
Suppose that on the first day of work, the man tossed a fair die and drove to work if and only
if a 6 appeared. In other words, π^(0) = [5/6, 1/6] is the initial probability distribution.
Then

    π^(4) = π^(0) P⁴ = [5/6, 1/6] [ 3/8   5/8  ]  = [35/96, 61/96]
                                  [ 5/16  11/16 ]

is the probability distribution after 4 days, i.e. P(X_4 = t) = 35/96 and P(X_4 = d) = 61/96.
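The arithmetic above is easy to reproduce exactly with Python's Fraction type; a minimal sketch:

from fractions import Fraction as F
import numpy as np

P = np.array([[F(0), F(1)],
              [F(1, 2), F(1, 2)]], dtype=object)
pi0 = np.array([F(5, 6), F(1, 6)], dtype=object)

P4 = P.copy()
for _ in range(3):              # P^4 by repeated multiplication
    P4 = P4.dot(P)

print(P4)                       # [[3/8, 5/8], [5/16, 11/16]]
print(pi0.dot(P4))              # [35/96, 61/96]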
Example.
Consider a system that can be in one of two possible states, S = {0, 1}, with a given
transition matrix P.
Suppose that the system is in state 0 at time n = 0, i.e., X_0 = 0. Find the probability that
the system is in state 1 at time 3.
Solution:
Method 1: the required probability is the (0, 1) entry of the three-step transition matrix,

    P(X_3 = 1 | X_0 = 0) = p_01^(3),  read from P^(3) = P³.

Method 2: here we know π^(0) = [P(X_0 = 0), P(X_0 = 1)] = [1, 0]. Thus

    π^(3) = π^(0) P³,

and the probability that the system is in state 1 at time 3 is the second entry of π^(3).
Both methods give the same value.
Example.
Consider the Markov chain for the example of the three boys A, B and C who are throwing
a ball to each other:

           A    B    C
    A  [   0    1    0  ]
    B  [   0    0    1  ]
    C  [  1/2  1/2   0  ]

Suppose C was the first person with the ball, i.e. π^(0) = [0, 0, 1] is the initial probability
distribution. Then

    π^(1) = π^(0) P = [1/2, 1/2, 0]
    π^(2) = π^(1) P = [0, 1/2, 1/2]
    π^(3) = π^(2) P = [1/4, 1/4, 1/2]

Thus, after 3 throws, the probability that A has the ball is 1/4, that B has the ball is 1/4 and
that C has the ball is 1/2: P(X_3 = A) = 1/4, P(X_3 = B) = 1/4 and P(X_3 = C) = 1/2.
Example. A school contains 200 boys and 150 girls. One student is selected after another
to take an eye examination.
Explain whether this process is a Markov chain or not, and why.
Solution
The state space of the stochastic process is S = {b (boy), g (girl)}. However, this
process is not a Markov chain since, for example, the probability that the third person is a
girl depends not only on the outcome of the second trial but on both the first and second
trials.
Stationary Distribution
Let P be the transition probability matrix of a Markov chain {X_n, n ≥ 0}. If there exists a
probability vector π̂ such that:

    π̂ P = π̂    ... (1)

then π̂ is called a stationary distribution for the Markov chain.
Example.
Find the stationary distribution π̂ for the transition matrix P of the train/drive Markov chain:

    P = [  0    1  ]
        [ 1/2  1/2 ]

Writing π̂ = [x, 1 − x], the equation π̂ P = π̂ gives

    [(1 − x)/2, x + (1 − x)/2] = [x, 1 − x],  so  (1 − x)/2 = x,  i.e.  x = 1/3.

Thus π̂ = [1/3, 2/3].
Thus in the long run, the man will take the train to work 1/3 of the time, and drive to work
2/3 of the time.
Example
Find the stationary distribution π̂ for the transition matrix P of the ball-throwing Markov chain:

    P = [  0    1    0  ]
        [  0    0    1  ]
        [ 1/2  1/2   0  ]

Solving π̂ P = π̂ with π̂ = [a, b, c] and a + b + c = 1 gives a = c/2, b = a + c/2 and c = b, so

    π̂ = [1/5, 2/5, 2/5].

Thus in the long run, A will be thrown the ball 20% of the time, and B and C 40% of the
time.
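Numerically, a stationary distribution can be found as the left eigenvector of P for eigenvalue 1; a sketch covering both chains above:

import numpy as np

def stationary(P):
    # Left eigenvector of P for eigenvalue 1, normalised to sum to 1.
    vals, vecs = np.linalg.eig(P.T)
    k = np.argmin(np.abs(vals - 1.0))
    pi = np.real(vecs[:, k])
    return pi / pi.sum()

P_train = np.array([[0.0, 1.0],
                    [0.5, 0.5]])
P_ball = np.array([[0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [0.5, 0.5, 0.0]])

print(stationary(P_train))   # [1/3, 2/3]
print(stationary(P_ball))    # [0.2, 0.4, 0.4]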
Regular Markov Chains
A stochastic matrix P is said to be regular if all the entries of some power P^m are positive.
For a regular Markov chain, the powers Pⁿ converge as n → ∞:

    Pⁿ → Π̂  and  π^(0) Pⁿ → π̂  for any initial distribution π^(0),

where Π̂ is a matrix whose rows are identical and equal to the stationary distribution π̂ for
the Markov chain defined by Eq. (1). In other words, Pⁿ approaches Π̂ means that each
entry of Pⁿ approaches the corresponding entry of Π̂, and π^(n) approaches π̂ means that
each component of π^(n) approaches the corresponding component of π̂.
Example. For a regular transition matrix P of a Markov chain, the matrix Π̂ can be found
either by computing the stationary distribution π̂ from π̂ P = π̂ and stacking it as identical
rows, or by observing the successive powers P², P³, P⁴, ..., whose entries settle down to the
entries of Π̂.
Theorem.
If a stochastic matrix P has a 1 on the main diagonal, then P is not regular (unless P is the
1 × 1 matrix [1]).
Calculating probability
The probabilities for a Markov chain are computed using the initial probabilities

    π_i^(0) = P(X_0 = i)

and the transition probabilities

    p_ij = P(X_{n+1} = j | X_n = i).

Example.
Consider a Markov chain on states {0, 1} with the following transition matrix:

           0    1
    0  [  1/4  3/4 ]
    1  [  1/6  5/6 ]

1. To find the probability that the process follows a certain path, you multiply the
initial probability by successive conditional probabilities. For example, what is the chance
that the process begins with 0, 1, 0, 1, 0?

    P(X_0 = 0, X_1 = 1, X_2 = 0, X_3 = 1, X_4 = 0)
      = P(X_0 = 0) p_01 p_10 p_01 p_10
      = π_0^(0) (3/4)(1/6)(3/4)(1/6) = π_0^(0)/64.
The same recipe applies to any path: multiply the initial probability of the first state by the
one-step transition probabilities along the path.
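A small helper makes such path calculations mechanical; a sketch, assuming the matrix reconstructed above and an arbitrary initial distribution [0.5, 0.5]:

import numpy as np

P = np.array([[0.25, 0.75],     # the two-state matrix from the example above
              [1/6,  5/6 ]])
pi0 = np.array([0.5, 0.5])      # illustrative initial distribution

def path_prob(path, pi0, P):
    # P(X_0 = path[0], ..., X_n = path[n]) for a Markov chain.
    pr = pi0[path[0]]
    for i, j in zip(path, path[1:]):
        pr *= P[i, j]
    return pr

print(path_prob([0, 1, 0, 1, 0], pi0, P))   # pi0[0] * (3/4)(1/6)(3/4)(1/6)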
Example.
Let π^(0) = [P(X_0 = 0), P(X_0 = 1)] and let P be a two-state transition matrix.
If we start in state zero, then

    π^(0) = [1, 0]  and  π^(1) = π^(0) P = [p_00, p_01],

i.e. the first row of P.
On the other hand, if we flip a fair coin to choose the starting position, then

    π^(0) = [1/2, 1/2]  and  π^(1) = π^(0) P = [(p_00 + p_10)/2, (p_01 + p_11)/2].
Accessible States:
State j is said to be accessible from state i if p_ij^(n) > 0 for some n ≥ 0, written as i → j.
Communicating states
Two states i and j are called communicating states, written as i ↔ j, if they are accessible
from each other. In other words, i ↔ j means i → j and j → i.
Irreducible Markov Chain
A Markov chain is said to be irreducible if all states communicate with each other.
Communication is an equivalence relation. That means that
1. every state communicates with itself (i ↔ i);
2. i ↔ j implies j ↔ i;
3. i ↔ j and j ↔ k together imply i ↔ k.
- If the transition matrix is not irreducible, then it is not regular.
- If the transition matrix is irreducible and at least one entry of the main diagonal is
nonzero, then it is regular.
Class structure
The communication relation divides the states into classes. Within each class, all states
communicate with each other, but no pair of states in different classes communicates. The
chain is irreducible if there is only one class. The classes can also be computed
mechanically, as in the sketch below.
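A minimal class-finding routine based on mutual reachability (breadth-first search); the test matrix is one arbitrary choice of positive entries consistent with the five-state example that follows, with states relabelled 0-4:

import numpy as np
from collections import deque

def reachable(P, i):
    # Set of states reachable from i along positive-probability arrows.
    seen, queue = {i}, deque([i])
    while queue:
        u = queue.popleft()
        for v in np.nonzero(P[u] > 0)[0]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def classes(P):
    reach = [reachable(P, i) for i in range(len(P))]
    out, done = [], set()
    for i in range(len(P)):
        if i in done:
            continue
        cls = {j for j in reach[i] if i in reach[j]}   # mutual reachability
        out.append(sorted(cls))
        done |= cls
    return out

P = np.array([[0.0, 0.5, 0.5, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0, 0.0],
              [0.0, 0.5, 0.0, 0.5, 0.0],
              [0.5, 0.5, 0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0, 0.0, 0.0]])
print(classes(P))   # [[0, 1, 2, 3], [4]]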
Example:
Consider the Markov chain shown in the figure below.
(State transition diagram with states 1, 2, 3, 4, 5.)
Any state among 1, 2, 3, 4 is accessible from any of the five states, but 5 is not accessible
from 1, 2, 3, 4. So we have two classes: {1, 2, 3, 4} and {5}. The chain is not irreducible.
Example
Consider the chain on states 1, 2, 3 with the transition matrix shown in the diagram below;
determine whether it is irreducible or not.
Solution
There is only one class: because 1 ↔ 2 and 2 ↔ 3, this is an irreducible Markov chain.
Another solution is to square the transition matrix and observe that all entries p_ij^(2) > 0,
so every state is accessible from every other state and the chain is irreducible.
Example
Consider the chain on states 1, 2, 3, 4 with the transition diagram and matrix shown below.
This chain has three classes, {1, 2}, {3} and {4}, and is hence not irreducible: for some
pairs i, j we have p_ij^(n) = 0 for every n.
Example
Consider the Markov chain shown in the figure below. It is assumed that when there is an
arrow from state i to state j, then p_ij > 0. Find the classes for this Markov chain.
Solution
There are 4 communicating classes in this Markov chain. States 1, 2 communicate with each
other, and states 3, 4 communicate with each other, but these pairs do not communicate with
any other states. State 5 does not communicate with any other state, so by itself it is a class.
States 6, 7 and 8 construct another class. Thus, the classes are

    {1, 2}, {3, 4}, {5}, {6, 7, 8}.
Absorbing chains: canonical form
By reordering the states so that the non-absorbing states come first, the transition matrix of
a chain with absorbing states can be written in block form:

    P = [ Q  R ]
        [ 0  I ]

Note that the elements of R are the one-step transition probabilities from non-absorbing to
absorbing states, and the elements of Q are the one-step transition probabilities among the
non-absorbing states.
Example.
Suppose the following matrix is a transition matrix of a Markov chain:

            a1   a2   a3   a4   a5
    a1  [   0   1/4  1/4  1/4  1/4 ]
    a2  [   0    1    0    0    0  ]
    a3  [  1/2   0   1/4   0   1/4 ]
    a4  [   0    1    0    0    0  ]
    a5  [   0    0    0    0    1  ]

The states a2 and a5 are each absorbing, since each of the second and fifth rows has a 1 on
the main diagonal.
Example.
Consider the chain tracking a run of heads in repeated coin tosses. Each row of its transition
matrix, except the last, corresponds to the fact that a string of heads is either broken if a
tail occurs or is extended by one if a head occurs. The last row corresponds to the fact that
the game ends if 3 heads are tossed in a row; that final state is an absorbing state.
Recurrent and Transient States
A state is said to be recurrent if, any time that we leave that state, we will return to that
state in the future with probability one. On the other hand, if the probability of returning is
less than one, the state is called transient.
Definition.
State i is a transient state if there exists a state j that is reachable from i, but from which
state i is not reachable. If state i is not transient, it is a recurrent state.
For instance, in the chain pictured below, states 2 and 3 are transient; the remaining states
form a closed and irreducible class, so they are recurrent.
Example.
Consider the chain on states 1, 2, 3, 4 with the transition matrix shown below.
Example
Consider a Markov chain with the following state diagram.
It is not ergodic.
Example
Consider a Markov chain with the following state diagram.
It is ergodic, because all states are recurrent and communicate with each other, and the
chain is aperiodic: no state can be revisited only at multiples of some period greater than one.
Problem 5:
Consider a Markov chain with three possible states S = {1, 2, 3} that has the following
transition matrix P. Find:
(a) P(X_4 = 3 | X_3 = 2)
(b) P(X_3 = 1 | X_2 = 1)
(c) P(X_0 = 1, X_1 = 2, X_2 = 3)
Problem 8
Determine whether each of the following matrices is a stochastic matrix or not, and why.
Problem 9
Given a transition matrix P and an initial distribution π^(0), find:
(a) P^(2)   (b) π^(2)   (c) π^(3)
Problem 10
Given a transition matrix P and an initial distribution π^(0), find:
(a) P^(2)   (b) P^(3)   (c) π^(2)   (d) π^(3)
Problem 11
A salesman's area consists of three cities, A, B and C. He never sells in the same city on
successive days. If he sells in city A, then the next day he sells in city B. However, if he
sells in either B or C, then the next day he is twice as likely to sell in city A as in the other
city. Find:
(a) the transition matrix of this process;
(b) in the long run, how often he sells in each of the cities.
Problem 12
There are 2 white balls in urn A and 3 red balls in urn B. At each step of the process a ball
is selected from each urn and the two balls selected are interchanged. Let the state a_i of the
process be the number i of red balls in urn A. Find:
(a) the transition matrix of this process;
(b) the probability that there are 2 red balls in urn A after 3 steps;
(c) in the long run, the probability that there are 2 red balls in urn A.
Problem 13
Consider the transition matrix P of a Markov chain on S = {0, 1}.
Given the initial probabilities P(X_0 = 0) and P(X_0 = 1):
(a) find the distribution of X_1;
(b) find the distribution of X_n when n → ∞.
Problem 14
Consider a Markov chain with a given transition matrix on the state space S = {0, 1}.
Compute:
(a) P(X_1 = 1 | X_0 = 0)   (b) P(X_2 = 1 | X_0 = 0)   (c) P(X_3 = 1 | X_0 = 0)
Problem 15
Determine whether each of the given matrices is recurrent or not.
Problem 16
Consider the two Markov chains shown in the figure below. Identify the transient and
recurrent states, and the irreducible closed classes in each one.
(a) (b)
Problem 17
Consider the Markov chains shown in the figure below. Identify the transient, recurrent,
periodic and aperiodic states.
Problem 18
Determine whether the following matrix is ergodic or not, and why.
Problem 19. Determine whether each of the following matrices is regular or not, and why.
Solutions (2)
Problem 1: (Solution)
(a) The Bernoulli process {X_n, n = 1, 2, ...} is a discrete-parameter, discrete-state process.
The state space is S = {0, 1}, and the parameter set is N = {1, 2, 3, ...}.
(b) A sample sequence of the Bernoulli process can be obtained by tossing a coin
consecutively. If a head appears, we assign 1, and if a tail appears, we assign 0.

    n             1 2 3 4 5 6 7 8 9 10 ...
    Coin tossing  H T T H H H T H H T  ...
    X_n           1 0 0 1 1 1 0 1 1 0  ...

(c) Since the X_n are independent with P(X_n = 1) = p, the transition matrix is

           0      1
    0  [  1−p    p ]
    1  [  1−p    p ]
Problem 2: (Solution)
(a) The binomial process {S_n, n = 1, 2, ...} is a discrete-parameter, discrete-state process.
The state space is S = {0, 1, 2, ...}, and the parameter set is N = {1, 2, 3, ...}.
(b) Transition probabilities:

    P(S_{n+1} = j | S_n = i, S_{n−1} = i_{n−1}, ..., S_1 = i_1) = p_ij
      = p       if j = i + 1   (i = 0, 1, ...)
      = 1 − p   if j = i       (i = 0, 1, ...)
      = 0       otherwise

Each row of the matrix corresponds to the fact that in the next trial the process either stays at
i successes (the trial failed, with probability 1 − p) or moves to i + 1 successes (the trial
succeeded, with probability p). Moving from i to i + 2, i + 3, ... in one step is impossible.
Problem 3: (Solution)
(a) The simple random walk process {X_n, n = 0, 1, 2, ...} is a discrete-parameter,
discrete-state process. The state space is S = {..., −2, −1, 0, 1, 2, ...}, and the parameter set is
T = {0, 1, 2, 3, ...}.
(b) A sample sequence of the simple random walk process can be obtained by tossing a coin
every second and letting X_n increase by unity if a head appears, and decrease by unity if a
tail appears. Thus, for instance:

    n             0 1  2  3 4 5 6 7 8 9 10 ...
    Coin tossing    H  T  T H H H T H H T  ...
    X_n           0 1  0 −1 0 1 2 1 2 3 2  ...

(d) The transition probabilities are

    P(X_{n+1} = j | X_n = i) = p       if j = i + 1
                             = 1 − p   if j = i − 1
                             = 0       otherwise

With a fair coin, P(head) = p = 1/2 and P(tail) = 1/2, so each step is +1 or −1 with equal
probability, independently of the past.
Problem 4: (Solution)
(a) The state space is S = {S ("study"), T ("not study")}, and the transition matrix of this
process is

           S    T
    S  [  0.3  0.7 ]
    T  [  0.4  0.6 ]

(b) The transition matrix after 4 nights:

    P⁴ ≈ [ 0.36  0.64 ]
         [ 0.36  0.64 ]

(c) The probability that he didn't study in the second night:
π^(0) = [1, 0] is the initial probability distribution (he studied on the first night), so

    π^(1) = π^(0) P = [1, 0] [ 0.3  0.7 ]  = [0.3, 0.7]
                             [ 0.4  0.6 ]

The probability that he didn't study in the second night is 0.7.
Problem 5: (Solution)
(a) By definition, P(X_4 = 3 | X_3 = 2) = p_23, read from P.
(b) By definition, P(X_3 = 1 | X_2 = 1) = p_11, read from P.
(c) P(X_0 = 1, X_1 = 2, X_2 = 3)
      = P(X_0 = 1) P(X_1 = 2 | X_0 = 1) P(X_2 = 3 | X_0 = 1, X_1 = 2)
      = P(X_0 = 1) P(X_1 = 2 | X_0 = 1) P(X_2 = 3 | X_1 = 2)    (by the Markov property)
      = P(X_0 = 1) p_12 p_23.
Problem 6: (Solution)
The state space is S = {R ("right"), L ("left")}, and the transition matrix of this process is

           R    L
    R  [  0.8  0.2 ]
    L  [  0.6  0.4 ]

The probability distribution for the first trial is π^(1) = [0.5, 0.5].
(a) To compute the probability distribution for the next step, i.e. the second trial, multiply
π^(1) by the transition matrix P:

    [0.5, 0.5] [ 0.8  0.2 ]  = [0.7, 0.3]
               [ 0.6  0.4 ]

Thus, on the second trial he predicts that 70% of the mice will go right and 30% will go
left.
(b) To compute the probability distribution for the third trial, multiply that of the second
trial by P:

    [0.7, 0.3] [ 0.8  0.2 ]  = [0.74, 0.26]
               [ 0.6  0.4 ]

Thus, on the third trial he predicts that 74% of the mice will go right and 26% will go left.
(c) We assume that the probability distribution for the thousandth trial is essentially the
stationary probability distribution of the Markov chain, which we compute from

    [x, 1 − x] [ 0.8  0.2 ]  = [x, 1 − x],  giving  x = 0.75.
               [ 0.6  0.4 ]

Thus, on the thousandth trial he predicts that 75% of the mice will go right and 25% will go left.
Problem 7: (Solution)
(a) The required probability is read directly from the appropriate entry of the transition matrix.
(b) The required probability is obtained by multiplying the corresponding transition
probabilities along the path.
Problem 9: (Solution)
(a) P^(2) = P², obtained by squaring the transition matrix.
(b) π^(2) = π^(0) P².
(c) π^(3) = π^(0) P³.
Problem 11: (Solution)
(b) Solving π̂ P = π̂ gives π̂ = [0.40, 0.45, 0.15].
Thus in the long run he sells 40% of the time in city A, 45% of the time in city B, and
15% of the time in city C.
Problem 12: (Solution)
(a) There are 3 states, described by the following diagrams:

    a_0: urn A = {2W},      urn B = {3R}
    a_1: urn A = {1W, 1R},  urn B = {1W, 2R}
    a_2: urn A = {2R},      urn B = {2W, 1R}

For example, if the process is in state a_0 then a white ball must be selected from urn A and
a red ball from urn B, so the process must move to state a_1. Accordingly, the first row of
the transition matrix is [0, 1, 0]. To move from a_1 to a_0, a red ball must be selected
from urn A and a white ball from urn B, and the remaining rows are filled in the same way.
Problem 13: (Solution)
(a) The distribution of X_1 is π^(1) = π^(0) P.
(b) The distribution of X_n when n → ∞ is the stationary distribution π̂, obtained by
solving π̂ P = π̂.
Problem 14: (Solution)
(a) P(X_1 = 1 | X_0 = 0) = p_01, read from P.
(b) P(X_2 = 1 | X_0 = 0) = p_01^(2), the (0, 1) entry of P².
(c) P(X_3 = 1 | X_0 = 0) = p_01^(3), the (0, 1) entry of P³.
Problem 19: (Solution)
(c) No, because it is not irreducible (not connected). Also, if you multiply the matrix by
itself over and over it will still contain zeros, so no power of it has all entries positive.
Counting Processes
A counting process N(t) should satisfy:
1. N(t) ≥ 0 and N(0) = 0;
2. N(t) is integer valued, that is, N(t) ∈ {0, 1, 2, ...} for all t ∈ [0, ∞);
3. if s ≤ t, then N(s) ≤ N(t);
4. for s < t, N(t) − N(s) equals the number of events that have occurred on the
interval (s, t].
Since counting processes are used to model arrivals, we usually refer to the
occurrence of each event as an "arrival". For
example, if N(t) is the number of accidents in a city up to time t, we still refer to each
accident as an arrival. A realization of a counting process is a non-decreasing step
function that jumps by 1 at each arrival time.
By the above definition, the only sources of randomness are the arrival times.
Definition: (Independent Increments)
Let {N(t), t ∈ [0, ∞)} be a continuous-time random process. We say that N(t) has
independent increments if, for all 0 ≤ t_1 < t_2 < ... < t_n, the random variables

    N(t_2) − N(t_1), N(t_3) − N(t_2), ..., N(t_n) − N(t_{n−1})

are independent.
Note that for a counting process, N(t_2) − N(t_1) is the number of arrivals in the
interval (t_1, t_2]. Thus, a counting process has independent increments if the numbers of
arrivals in non-overlapping (disjoint) intervals

    (t_1, t_2], (t_2, t_3], ..., (t_{n−1}, t_n]

are independent.
Recall the Poisson distribution: X ~ Poisson(μ) if

    P(X = k) = e^{−μ} μ^k / k!,  k = 0, 1, 2, ...

- If X ~ Poisson(μ), then E[X] = μ and Var[X] = μ.
- If X_i ~ Poisson(μ_i) for i = 1, ..., n and the X_i are independent, then
  X_1 + ... + X_n ~ Poisson(μ_1 + ... + μ_n).
Definition (Poisson Process)
The counting process {N(t), t ∈ [0, ∞)} is said to be a Poisson process with rate
(intensity) λ > 0 if:
1. N(0) = 0;
2. N(t) has independent increments;
3. the number of arrivals in any interval of length τ > 0 has a Poisson(λτ)
distribution with mean λτ.
It follows from condition 3 that a Poisson process has stationary increments and that

    E[N(t)] = λt  and  Var[N(t)] = λt.

We conclude that in a Poisson process, the distribution of the number of arrivals in any
interval depends only on the length of the interval and not on the exact location of the
interval on the real line.
Result. Let P_n(t) be the probability that exactly n events occur in an interval of length t,
namely P_n(t) = P(N(t) = n). We have, for each t ∈ [0, ∞),

    P_n(t) = P(N(t) = n) = e^{−λt} (λt)^n / n!
Example.
The number of customers arriving at a grocery store can be modelled by a Poisson process
with intensity λ = 10 customers per hour.
1. Find the probability that there are 2 customers between 10:00 and 10:20.
2. Find the probability that there are 3 customers between 10:00 and 10:20
and 7 customers between 10:20 and 11:00.
Solution
1. Here λ = 10 and the interval between 10:00 and 10:20 has length τ = 1/3 hours. Thus
N(1/3) ~ Poisson(10/3). Therefore,

    P(N(1/3) = 2) = e^{−10/3} (10/3)² / 2! ≈ 0.2

2. The two intervals are disjoint, so by independent increments the two counts are
independent; the second interval has length 2/3, so its count is Poisson(20/3). Therefore,

    P = [e^{−10/3} (10/3)³ / 3!] · [e^{−20/3} (20/3)⁷ / 7!]
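These pmf values are easy to evaluate with scipy's poisson.pmf, which takes the count and the mean λτ; a minimal check:

from scipy.stats import poisson

lam = 10.0
p1 = poisson.pmf(2, lam/3)                            # 2 customers in 20 minutes
p2 = poisson.pmf(3, lam/3) * poisson.pmf(7, lam*2/3)  # independent increments
print(p1, p2)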
Example.
Suppose the process {N(t), t ∈ [0, ∞)} is a Poisson process with rate λ. Find
P(N(t_1) = n_1, N(t_2) = n_2, N(t_3) = n_3) for t_1 < t_2 < t_3 and n_1 ≤ n_2 ≤ n_3.
Solution:
We have

    P(N(t_1) = n_1, N(t_2) − N(t_1) = n_2 − n_1, N(t_3) − N(t_2) = n_3 − n_2).

From the independent-increments property we notice that the random variables
N(t_1), N(t_2) − N(t_1), N(t_3) − N(t_2) are independent; by the stationarity
property they follow Poisson distributions with the parameters
λt_1, λ(t_2 − t_1), λ(t_3 − t_2) respectively. Therefore,

    P(N(t_1) = n_1, N(t_2) = n_2, N(t_3) = n_3)
      = [e^{−λt_1} (λt_1)^{n_1} / n_1!]
        · [e^{−λ(t_2−t_1)} (λ(t_2−t_1))^{n_2−n_1} / (n_2−n_1)!]
        · [e^{−λ(t_3−t_2)} (λ(t_3−t_2))^{n_3−n_2} / (n_3−n_2)!]
For small Δ > 0, expand e^{−λΔ} as a Taylor series:

    P(N(Δ) = 0) = e^{−λΔ} = 1 − λΔ + (λΔ)²/2! − ...

Note that if Δ is small, the terms that include second or higher powers of Δ are negligible
compared to Δ. We write this as

    P(N(Δ) = 0) = 1 − λΔ + o(Δ),

where P(N(Δ) = 0) is the probability that no event occurs in the interval of length Δ, and
o(Δ) is a function of Δ which goes to zero faster than Δ does; that is,

    lim_{Δ→0} o(Δ)/Δ = 0.

(The letter omicron, o, was originally used in mathematics as a symbol for this order
notation, which describes the asymptotic rate of growth of a function.)
Now, let us look at the probability of having one arrival in an interval of length Δ:

    P(N(Δ) = 1) = λΔ e^{−λΔ}
                = λΔ (1 − λΔ + (λΔ)²/2! − ...)    (Taylor series)
                = λΔ + o(Δ).

We conclude that

    P(N(Δ) = 1) = λΔ + o(Δ).

Similarly,

    P(N(Δ) ≥ 2) = o(Δ).
Definition
The counting process {N(t), t ∈ [0, ∞)} is said to be a Poisson process with rate
(intensity) λ > 0 if:
1. N(0) = 0;
2. N(t) has independent and stationary increments;
3. we have

    P(N(Δ) = 0) = 1 − λΔ + o(Δ)
    P(N(Δ) = 1) = λΔ + o(Δ)
    P(N(Δ) ≥ 2) = o(Δ)
Distribution of Interarrival Times
Exponential Distribution
The exponential distribution is often used to model the time elapsed between events. A
continuous random variable X is said to have an exponential distribution with parameter
λ > 0, written X ~ Exponential(λ), if its probability density function is of the form

    f(x) = λe^{−λx} for x ≥ 0,  and f(x) = 0 otherwise.

Its CDF is given by

    F(x) = P(X ≤ x) = 1 − e^{−λx} for x ≥ 0,  which implies  P(X > x) = e^{−λx}.

- If X ~ Exponential(λ), then E[X] = 1/λ and Var[X] = 1/λ².

Now consider a Poisson process N(t) with rate λ, and let T_1 be the time of the first arrival.
Then

    P(T_1 > t) = P(no arrival in (0, t]) = P(N(t) = 0) = e^{−λt}.

We conclude

    P(T_1 ≤ t) = 1 − e^{−λt}.

Therefore T_1 ~ Exponential(λ). Let Z_2 be the time elapsed between the first and the
second arrival.

      Z1    Z2    Z3    ...    Zn
    0----T1----T2----T3-- ... --T(n-1)----Tn----> t

Figure: the random variables Z_1, Z_2, ... are the interarrival times of the counting
process N(t).
Let t, s > 0. Note that the numbers of arrivals in the two intervals (0, s] and (s, s + t] are
independent. We can write

    P(Z_2 > t | T_1 = s) = P(no arrivals in (s, s + t] | T_1 = s)
                         = P(no arrivals in (s, s + t]) = e^{−λt}.

We conclude that Z_2 ~ Exponential(λ), and that Z_1 and Z_2 are independent. The random
variables Z_1, Z_2, Z_3, ... are called the interarrival times of the counting process N(t).
Similarly, we can argue that all Z_i are independent and Z_i ~ Exponential(λ) for i = 1, 2, 3, ...
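Running the construction forwards, summing iid Exponential(λ) gaps generates the arrival times of a Poisson process; a simulation sketch checking E[N(t)] = λt (λ = 2 and t = 5 are arbitrary choices):

import numpy as np

rng = np.random.default_rng(1)
lam, t, runs = 2.0, 5.0, 20_000

counts = np.empty(runs)
for r in range(runs):
    arrivals, s = 0, rng.exponential(1/lam)   # first arrival time T_1
    while s <= t:
        arrivals += 1
        s += rng.exponential(1/lam)           # next interarrival time Z_i
    counts[r] = arrivals

print(counts.mean(), lam*t)   # both ~ 10
print(counts.var(), lam*t)    # Poisson count: variance also ~ 10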
Example
Let N(t) be a Poisson process with intensity λ, and let Z_1, Z_2, ... be the corresponding
interarrival times.
1. Find the probability that the first arrival occurs after time t, i.e., P(Z_1 > t).
2. Given that we have had no arrivals before t = 1, find P(Z_1 > 3).
3. Given that the third arrival occurred at time t = 2, find the probability that the
fourth arrival occurs more than t time units later.
Solution
1. Since Z_1 ~ Exponential(λ), we can write

    P(Z_1 > t) = e^{−λt}.

Another way to solve this is to note that

    P(Z_1 > t) = P(no arrivals in (0, t]) = P(N(t) = 0) = e^{−λt}.

2. We can write

    P(Z_1 > 3 | Z_1 > 1) = P(Z_1 > 2)    (memoryless property)
                         = e^{−2λ}.

Another way to solve this is to note that the number of arrivals in (1, 3] is independent of
the arrivals before t = 1. Thus

    P(Z_1 > 3 | Z_1 > 1) = P(no arrivals in (1, 3] | no arrivals in (0, 1])
                         = P(no arrivals in (1, 3]) = e^{−2λ}.

3. The time between the third and the fourth arrival is Z_4 ~ Exponential(λ). We can write

    P(Z_4 > t) = e^{−λt}    (independent of the previous Z's).
Useful facts:
- If X ~ Exponential(λ), then E[X] = 1/λ and Var[X] = 1/λ².
- The exponential distribution is memoryless: P(X > s + t | X > s) = P(X > t).
- The interarrival times Z_i of a Poisson process are iid Exponential(λ).

Arrival (waiting) times
We know that T_n = Z_1 + Z_2 + ... + Z_n, where T_n denotes the time from the beginning
until the occurrence of the nth event. Thus {T_n, n = 1, 2, ...} is called an arrival process,
and T_n is the sum of n independent Exponential(λ) random variables.
Theorem. If T_n = Z_1 + ... + Z_n, where the Z_i's are independent Exponential(λ) random
variables, then T_n ~ Gamma(n, λ).
The gamma distribution with integer shape parameter is also called the Erlang distribution,
i.e. we can write

    T_n ~ Gamma(n, λ) = Erlang(n, λ).

The PDF of T_n is given by

    f_{T_n}(t) = λⁿ t^{n−1} e^{−λt} / (n − 1)!,  t ≥ 0,

with E[T_n] = n/λ and Var[T_n] = n/λ².
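A quick simulation check that the sum of n Exponential(λ) interarrival times has the Erlang mean n/λ and variance n/λ² (λ = 2 and n = 5 are arbitrary choices):

import numpy as np

rng = np.random.default_rng(2)
lam, n = 2.0, 5
Tn = rng.exponential(1/lam, size=(100_000, n)).sum(axis=1)   # T_n = Z_1 + ... + Z_n

print(Tn.mean(), n/lam)       # ~ 2.5
print(Tn.var(), n/lam**2)     # ~ 1.25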
Problem 1
Suppose we know that a receptionist receives an average of 15 phone calls per hour.
a) What is the probability that he will receive at least two calls between 8:00 and 8:12 am?
b) If the receptionist is absent for 10 minutes, what is the probability that no call has
been lost?
Problem 2
Consider the failures of a link in a communication network. Failures occur according to a
Poisson process with rate 2.4 per day. Find:
(i) the probability that the time between failures is greater than a given value t;
(ii) the probability that the time between failures is less than a given value t;
(iii) the probability of a given number of failures in an interval of given length;
(iv) the probability of 0 failures in the next day.
Problem 3
Damages occur in a connection wire under the ground following a Poisson process at a
rate of λ per mile.
a) What is the probability of no damage in the first 2 miles?
b) Given no damage in the first 2 miles, what is the probability of no
damage between the 2nd and 3rd miles?
Problem 4
Suppose the process {N(t), t ≥ 0} is a Poisson process with rate λ. Find:
1. P(N(t_1) = n_1, N(t_2) = n_2) for t_1 < t_2.
2. P(N(t_1) = n_1 | N(t_2) = n_2) for t_1 < t_2.
Solutions (3)
Problem 1: (Solution)
(a) Suppose X is the random variable for the number of calls received between
8:00 and 8:12 am. Since 12 minutes = 1/5 hour, X follows a Poisson distribution with mean
15 × (1/5) = 3. Then

    P(X ≥ 2) = 1 − (P(X = 0) + P(X = 1)) = 1 − e^{−3} − 3e^{−3} = 1 − 4e^{−3} ≈ 0.80.

(b) Suppose Y is the random variable for the number of calls received within 10
minutes, so Y follows a Poisson distribution with mean 15 × (1/6) = 2.5. Then the
probability that no call has been lost is

    P(Y = 0) = e^{−2.5} ≈ 0.082.
Problem 2: (Solution)
The time T between failures is Exponential(2.4).
(i) P(T > t) = e^{−2.4t}.
(ii) P(T < t) = 1 − e^{−2.4t}.
(iii) The number of failures in an interval of length t is Poisson(2.4t), so
      P(N(t) = n) = e^{−2.4t} (2.4t)ⁿ / n!.
(iv) P(0 failures in the next day) = P(N(1) = 0) = e^{−2.4} ≈ 0.091.
Problem 3: (Solution)
Suppose N(t) is the number of damages occurring up to mile t. Then:
a) The random variable N(2) follows a Poisson distribution with parameter 2λ, hence

    P(N(2) = 0) = e^{−2λ}.

b) Since the two random variables N(2) − N(0) and N(3) − N(2) are independent,
the conditional probability and the unconditional probability are equivalent, so

    P(N(3) − N(2) = 0 | N(2) = 0) = P(N(3) − N(2) = 0) = e^{−λ}.
Problem 4: (Solution)
1. For t_1 < t_2, the event {N(t_1) = n_1, N(t_2) = n_2} is
equivalent to {N(t_1) = n_1, N(t_2) − N(t_1) = n_2 − n_1}, and the two increments are
independent, so

    P(N(t_1) = n_1, N(t_2) = n_2)
      = [e^{−λt_1} (λt_1)^{n_1} / n_1!]
        · [e^{−λ(t_2−t_1)} (λ(t_2−t_1))^{n_2−n_1} / (n_2−n_1)!].

2. By the definition of conditional probability,

    P(N(t_1) = n_1 | N(t_2) = n_2) = P(N(t_1) = n_1, N(t_2) = n_2) / P(N(t_2) = n_2),

with the numerator given in part 1 and P(N(t_2) = n_2) = e^{−λt_2} (λt_2)^{n_2} / n_2!.
Definition:
A branching process is defined as follows.
- Start with a single individual at time n = 0.
- Every individual lives exactly one unit of time, then produces Y offspring and dies.
- The number of offspring Y takes values 0, 1, 2, ..., and the probability of
producing k offspring is P(Y = k) = p_k.
- All individuals reproduce independently. Individuals 1, 2, ..., n have family sizes
Y_1, Y_2, ..., Y_n, where each Y_i has the same distribution as Y.
- Let Z_n be the number of individuals born at time n, for n = 0, 1, 2, ... Interpret Z_n
as the size of generation n.
- Then the branching process is {Z_n : n = 0, 1, 2, ...} = {Z_0, Z_1, Z_2, ...}.
Definition:
The state of the branching process at time n is Z_n, where each Z_n can take values
0, 1, 2, ... Note that Z_0 = 1 always. Z_n represents the size of the population at time n.
Branching Process PGF
Let G(s) = E(s^Y) be the probability generating function of the family size Y. Then the PGF
of Z_n is the n-fold composition of G with itself:

    G_{Z_1}(s) = G(s)
    G_{Z_2}(s) = G(G(s))
    ...
    G_{Z_n}(s) = G(G(...G(s)...))    (n times)
Mean of Z_n
Theorem. Let {Z_n : n = 0, 1, 2, ...} be a branching process with Z_0 = 1 (start with a single
individual). Let Y denote the family size distribution, and suppose that E(Y) = μ. Then

    E(Z_n) = μⁿ.

Proof. We know that Z_n is a randomly stopped sum:

    Z_n = Σ_{i=1}^{Z_{n−1}} Y_i.

Therefore

    E(Z_n) = E(Σ_{i=1}^{Z_{n−1}} Y_i) = E(Z_{n−1}) E(Y) = μ E(Z_{n−1}).

Iterating,

    E(Z_n) = μ E(Z_{n−1}) = μ² E(Z_{n−2}) = ... = μⁿ E(Z_0) = μⁿ.
Variance of Z_n
Theorem. Let {Z_n : n = 0, 1, 2, ...} be a branching process with Z_0 = 1 (start with a single
individual). Let Y denote the family size distribution, and suppose that E(Y) = μ and
Var(Y) = σ². Then

    Var(Z_n) = σ² μ^{n−1} (μⁿ − 1)/(μ − 1)   if μ ≠ 1,
    Var(Z_n) = n σ²                           if μ = 1.

Proof sketch. Using the randomly stopped sum Z_n = Σ_{i=1}^{Z_{n−1}} Y_i,

    Var(Z_n) = E(Z_{n−1}) Var(Y) + [E(Y)]² Var(Z_{n−1})
             = σ² μ^{n−1} + μ² Var(Z_{n−1}).

Iterating this recursion with Var(Z_0) = 0: when μ = 1 each step adds σ², giving n σ²;
when μ ≠ 1 the geometric sum σ² μ^{n−1} (1 + μ + ... + μ^{n−1}) collapses to
σ² μ^{n−1} (μⁿ − 1)/(μ − 1).
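A simulation check of E(Z_n) = μⁿ; the offspring law (p_0, p_1, p_2) = (0.2, 0.5, 0.3), with mean μ = 1.1, is an arbitrary choice for the demo:

import numpy as np

rng = np.random.default_rng(3)
p = np.array([0.2, 0.5, 0.3])        # P(Y = 0), P(Y = 1), P(Y = 2)
mu = np.dot(np.arange(3), p)         # mu = 1.1
n, runs = 8, 50_000

Zn = np.empty(runs)
for r in range(runs):
    z = 1                            # Z_0 = 1
    for _ in range(n):
        # Z_k is the sum of the family sizes of the Z_{k-1} current individuals.
        z = rng.choice(3, size=z, p=p).sum() if z else 0
    Zn[r] = z

print(Zn.mean(), mu**n)              # both ~ 1.1^8 = 2.14

The sample variance of Zn can likewise be compared with σ² μ^{n−1} (μⁿ − 1)/(μ − 1) from the theorem above.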