Stochastic Process
Random Variable:
A random variable is a real-valued function defined on the sample space of a random experiment.
Stochastic Process:
A stochastic process is a family of random variables that describes the evolution of a process through time. It is denoted by {X(t); t ∈ T} or {X_n; n ≥ 0}.
Explanation:
Let the stochastic process be {X(t); t ∈ T}; then for each value of t, X(t) is a random variable. Here t is often termed the time, and X(t) the state, of the process.
Example:
i) Suppose X(t) is a random variable representing the number of motor cars passing the Prantik gate of JU during a particular time t. Then {X(t); t ∈ T} is a family of random variables and hence a stochastic process.
ii) Consider a simple experiment such as throwing a fair die. Suppose X_n is the outcome of the nth throw, n ≥ 1. Then {X_n; n ≥ 1} is a family of random variables such that for each distinct value of n = 1, 2, ... one gets a distinct random variable X_n; thus {X_n; n ≥ 1} constitutes a stochastic process.
iii) Consider a random event occurring in time, such as the number of telephone calls received at a switchboard. Suppose X(t) is the random variable representing the number of incoming calls in an interval (0, t) of duration t units. The number of calls within a fixed interval of specified duration, say one unit of time, is a random variable X(1), and the family {X(t); t ∈ T}, T = [0, ∞), constitutes a stochastic process.
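Example (ii) can be sketched in code. A minimal Python simulation follows; the function name die_process and the seeded generator are illustrative choices, not part of the notes:

```python
import random

def die_process(n_throws, seed=0):
    """One realisation (sample path) of the process {X_n; n >= 1},
    where X_n is the outcome of the n-th throw of a fair die."""
    rng = random.Random(seed)
    return [rng.randint(1, 6) for _ in range(n_throws)]

path = die_process(10)
print(path)  # ten outcomes, each between 1 and 6
```

A single run produces one sample path; each index n corresponds to a distinct random variable X_n.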
State Space:
The set of possible values of a single random variable X(t) of a stochastic process {X(t); t ∈ T} is known as its state space. If the possible values of X(t) are discrete, the process has a discrete state space; if the possible values of X(t) form an interval of the real line, the state space is continuous.
Time Space: If the index set T is discrete, {X(t); t ∈ T} is a discrete time process; if T is an interval of the real line, the process is known as a continuous time process. Example: Suppose X(t) represents the number of students waiting for the bus at any time of day, i.e., T = [0, t); then the stochastic process {X(t); t ∈ T} is a continuous time process.
Thus a stochastic process may be classified into four types:
stochastic process with discrete state space & discrete time space,
stochastic process with discrete state space & continuous time space,
stochastic process with continuous state space & discrete time space,
stochastic process with continuous state & time space.
1. Stochastic Process with Discrete State Space & Discrete Time Space: In a stochastic process {X(t); t ∈ T}, if the state space of X(t) is discrete and the index set T is discrete, then the process is known as a process with discrete state space & discrete time space. Example: Consumer preferences observed on a monthly basis; the number of defective items in an acceptance sampling scheme, the number of items inspected being the indexing parameter.
2. Stochastic Process with Discrete State Space & Continuous Time Space: In a stochastic process {X(t); t ∈ T}, if the index set T is an interval of the real line, T = [0, t), and the state space of X(t) is discrete, then the process is known as a stochastic process with discrete state space & continuous time space. Example: The number of students waiting for a bus at the Prantik gate in (0, t); the process has a discrete state space and continuous time space.
3. Stochastic Process with Continuous State Space & Discrete Time Space: In a stochastic process {X(t); t ∈ T}, if the set of possible values of X(t) is an interval of the real line, (0, ∞), and the index set T is discrete, then the process is known as a process with continuous state space and discrete time space.
4. Stochastic Process with Continuous State & Time Space: In a stochastic process {X(t); t ∈ T}, if the set of possible values of X(t) is continuous and the index set T is an interval of the real line, T = [0, t), then the process is known as a process with continuous state & time space. Example: Suppose that X(t) represents the maximum temperature at a particular place in (0, t).
Markov Process:
If {X(t); t ∈ T} is a stochastic process such that, given the value of X(s), the values of X(t) for t > s do not depend on the values of X(u) for u < s, then the process is said to be a Markov process. It can be written as: for t_1 < t_2 < ... < t_n < t,
    Pr{a ≤ X(t) ≤ b | X(t_1) = x_1, ..., X(t_n) = x_n} = Pr{a ≤ X(t) ≤ b | X(t_n) = x_n}.
Markov Chain:
The stochastic process {X_n, n ≥ 0} is called a Markov chain if, for j, k, j_1, ..., j_{n-1} ∈ N (or I),
    Pr{X_n = k | X_{n-1} = j, X_{n-2} = j_1, ..., X_0 = j_{n-1}} = Pr{X_n = k | X_{n-1} = j} = P_jk,
whenever the first member is defined.
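The defining property above, that the next state depends only on the current one, translates directly into a simulation loop. A minimal sketch, assuming the one-step probabilities are stored as a row-wise matrix P (the helper name simulate_chain and the two-state matrix are illustrative):

```python
import random

def simulate_chain(P, x0, n_steps, seed=0):
    """Simulate a Markov chain: each new state is drawn using only the
    current state's row of the one-step transition matrix P."""
    rng = random.Random(seed)
    states = [x0]
    for _ in range(n_steps):
        row = P[states[-1]]  # Pr{X_n = k | X_{n-1} = current state}
        states.append(rng.choices(range(len(row)), weights=row)[0])
    return states

P = [[0.5, 0.5],
     [0.2, 0.8]]  # illustrative two-state chain
path = simulate_chain(P, x0=0, n_steps=10)
print(path)
```

Note that the loop never consults earlier states, which is exactly the Markov property.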
The transition probability P_jk refers to the probability that the process is in state j and will be in state k at the next step. Here
(i) P_jk ≥ 0 and (ii) Σ_k P_jk = 1, for all j.
The m-step transition probabilities satisfy the same conditions: (i) P_jk^(m) ≥ 0 and (ii) Σ_k P_jk^(m) = 1, for all j.
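Conditions (i) and (ii) say that every row of a transition matrix is a probability distribution, i.e. the matrix is stochastic. A small sanity-check sketch (the function name is_stochastic is our choice):

```python
def is_stochastic(P, tol=1e-12):
    """Check (i) P_jk >= 0 and (ii) sum_k P_jk = 1 for every row j."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

print(is_stochastic([[0.5, 0.5], [0.2, 0.8]]))  # True
print(is_stochastic([[0.5, 0.6], [0.2, 0.8]]))  # False: first row sums to 1.1
```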
If {X_n, n ≥ 0} is a Markov chain with transition probabilities P_jk, then the transition probability matrix is given by

        | P_11  P_12  ...  P_1n |
    P = | P_21  P_22  ...  P_2n |  = (P_ij);  i, j = 1, 2, ..., n,
        | ...   ...   ...  ...  |
        | P_n1  P_n2  ...  P_nn |

where P_jk ≥ 0 and Σ_k P_jk = 1 for all j.

Example: Consider the Markov chain {X_n, n ≥ 0} with state space {0, 1, 2}, initial distribution Pr{X_0 = i} = 1/3 for i = 0, 1, 2, and transition probability matrix

        | 3/4  1/4   0  |
    P = | 1/4  1/2  1/4 |
        |  0   3/4  1/4 |

Find:
1. P², and hence p_01^(2) and p_02^(2).
2. p(x_2 = 1, x_0 = 0).
3. p(0,1,1) = p(x_0 = 0, x_1 = 1, x_2 = 1).
4. p(0,0,1,1) = p(x_0 = 0, x_1 = 0, x_2 = 1, x_3 = 1).
Solution:
1. P² = P · P

      | 3/4  1/4   0  |   | 3/4  1/4   0  |     | 5/8   5/16  1/16 |
    = | 1/4  1/2  1/4 | × | 1/4  1/2  1/4 |  =  | 5/16  1/2   3/16 |
      |  0   3/4  1/4 |   |  0   3/4  1/4 |     | 3/16  9/16  1/4  |

Hence p_01^(2) = P(x_2 = 1 | x_0 = 0) = 5/16 and p_02^(2) = P(x_2 = 2 | x_0 = 0) = 1/16.

2. We know,
    P(x_2 = 1, x_0 = 0) = P(x_2 = 1 | x_0 = 0) P(x_0 = 0) = (5/16)(1/3) = 5/48.

3. We know,
    p(0,1,1) = P(x_0 = 0, x_1 = 1, x_2 = 1)
             = P(x_1 = 1 | x_0 = 0) P(x_2 = 1 | x_1 = 1) P(x_0 = 0)
             = (1/4)(1/2)(1/3) = 1/24.

4. We know,
    p(0,0,1,1) = P(x_0 = 0, x_1 = 0, x_2 = 1, x_3 = 1)
               = P(x_1 = 0 | x_0 = 0) P(x_2 = 1 | x_1 = 0) P(x_3 = 1 | x_2 = 1) P(x_0 = 0)
               = (3/4)(1/4)(1/2)(1/3) = 1/32.
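These hand computations can be checked mechanically with exact rational arithmetic. A short verification sketch (matmul is a small helper written here for the purpose, not a library call):

```python
from fractions import Fraction as F

# Transition matrix of the example, with exact fractions
P = [[F(3, 4), F(1, 4), F(0)],
     [F(1, 4), F(1, 2), F(1, 4)],
     [F(0),    F(3, 4), F(1, 4)]]

def matmul(A, B):
    """Exact product of two square matrices of Fractions."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)                 # 1. the two-step matrix P^2
p0 = F(1, 3)                      # Pr{X_0 = 0} from the initial distribution

print(P2[0][1])                   # p_01^(2) = 5/16
print(P2[0][1] * p0)              # 2. P(x2=1, x0=0) = 5/48
print(p0 * P[0][1] * P[1][1])     # 3. p(0,1,1) = 1/24
print(p0 * P[0][0] * P[0][1] * P[1][1])  # 4. p(0,0,1,1) = 1/32
```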
i) Accessible State:
State j of a Markov chain is said to be accessible from state i if P_ij^(n) > 0 for some n ≥ 1. That is, state j is accessible from state i if and only if, starting in i, it is possible that the process will ever enter state j. The relation is denoted by i → j.
ii) Non-Accessible State:
If P_ij^(n) = 0 for all n, then j is not accessible from i, and this is denoted by i ↛ j. That is, a process started in i can never enter state j.
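Accessibility can be decided without computing matrix powers: j is accessible from i exactly when j is reachable in the directed graph whose edges are the positive one-step transitions. A sketch under that observation (the function name accessible and the two-state example matrix are our choices):

```python
from collections import deque

def accessible(P, i, j):
    """True if state j is accessible from state i,
    i.e. P_ij^(n) > 0 for some n >= 1 (breadth-first search)."""
    n = len(P)
    seen, queue = set(), deque([i])
    while queue:
        s = queue.popleft()
        for t in range(n):
            if P[s][t] > 0 and t not in seen:
                if t == j:
                    return True  # positive-probability path i -> ... -> j found
                seen.add(t)
                queue.append(t)
    return False

P = [[1.0, 0.0],            # state 0 is absorbing
     [0.5, 0.5]]
print(accessible(P, 1, 0))  # True:  0 is accessible from 1 (1 -> 0)
print(accessible(P, 0, 1))  # False: 1 is not accessible from 0 (0 -/-> 1)
```

Starting the search from i (rather than marking i as visited immediately) also handles the case j = i, which requires a return path of length n ≥ 1.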