
Stochastic Process

Random Variable:
A variable whose value is determined by the outcome of a random experiment, i.e., a real-valued function defined on the sample space, is called a random variable.

Stochastic Process:
A stochastic process is a family of random variables that describes the evolution through time of a process. It is denoted by {X(t); t ∈ T} or {X_n; n ≥ 0}.
Explanation:

Let {X(t); t ∈ T} be a stochastic process; then for each value of t, X(t) is a random variable. Here t is often termed the time, and X(t) the state of the process at time t.

Example:
i) Suppose X(t) is a random variable representing the number of motor cars passing the Prantik gate of JU during a particular time t; then {X(t); t ∈ T} is a family of random variables and hence a stochastic process.

ii) Consider a simple experiment like throwing a fair die. Suppose X_n is the outcome of the nth throw, n ≥ 1. Then {X_n; n ≥ 1} is a family of random variables such that for each distinct value of n = 1, 2, … one gets a distinct random variable X_n; thus {X_n; n ≥ 1} constitutes a stochastic process.

iii) Consider a random event occurring in time, such as the number of telephone calls received at a switchboard. Suppose X(t) is the random variable representing the number of incoming calls in an interval (0, t] of duration t units. The number of calls within a fixed interval of specified duration, say one unit of time, is a random variable X(1), and the family {X(t); t ∈ T} constitutes a stochastic process with T = [0, ∞).

Specification of Stochastic Processes:

Index Set or Parameter Space:


The set T of a stochastic process {X(t); t ∈ T} is known as the index set or parameter space of the process.

State Space:
The set of possible values of a single random variable X(t) of a stochastic process {X(t); t ∈ T} is known as its state space. In other words, the state space of the process {X(t); t ∈ T} is the set of all possible values that the random variable X(t) can take. A state space may be discrete or continuous:
Discrete State Space:
In a stochastic process {X(t); t ∈ T}, if the set of possible values of X(t) is countable, then the process is said to have a discrete state space. For example, if X_n is the total number of sixes appearing in the first n throws of a die, the set of possible values of X_n is the finite set of non-negative integers {0, 1, 2, …, n}; here the state space of X_n is discrete. Likewise, the number of incoming calls at a telephone exchange in (0, t] has a discrete state space.

Continuous State Space:


The state space of a stochastic process {X(t); t ∈ T} is called continuous if the set of possible values of X(t) is an interval of the real line, e.g. [0, ∞). For example, suppose X(t) represents the maximum temperature at a particular place in (0, t]; then the set of possible values of X(t) is continuous.

Types of Stochastic Process Depending on Time:


There are two types of stochastic process depending on time:
 Discrete time process
 Continuous time process

Discrete Time Process:


In a stochastic process {X(t); t ∈ T}, if T is discrete (countable), then it is known as a discrete time process. Example: suppose X_n represents the number of customers who have entered a supermarket by the end of the nth hour of the day; then {X_n; n = 1, 2, …} is a discrete time process.

Continuous Time Process:


In a stochastic process {X(t); t ∈ T}, if the parameter space T is an interval of the real line, T = [0, t], then the process is known as a continuous time process. Example: suppose X(t) represents the number of students waiting for the bus at any time of the day, i.e., T = [0, t]; then the stochastic process {X(t); t ∈ T} is a continuous time process.

Types of Stochastic Process Depending on Time and State:


Depending on state space & time space, stochastic processes may be divided into four kinds:
 Stochastic process with discrete state space & discrete time space
 Stochastic process with discrete state space & continuous time space
 Stochastic process with continuous state space & discrete time space
 Stochastic process with continuous state space & continuous time space


1. Stochastic Process with Discrete State & Time Space: In a stochastic process {X(t); t ∈ T}, if the state space of X(t) is discrete and the index set T is discrete, then the process is known as one with discrete state space & discrete time space.

Example:
 Consumer preferences observed on a monthly basis.
 The number of defective items in an acceptance sampling scheme. The number of items inspected is the
indexing parameter.

2. Stochastic Process with Discrete State Space & Continuous Time Space: In a stochastic process {X(t); t ∈ T}, if the index set T is an interval of the real line, T = [0, t], and the state space of X(t) is discrete, then the process is known as a stochastic process with discrete state space & continuous time space. Example: the number of students waiting for a bus at the Prantik gate in (0, t]; this process has a discrete state space and a continuous time space.

3. Stochastic Process with Continuous State Space & Discrete Time Space: In a stochastic process {X(t); t ∈ T}, if the set of possible values of X(t) is an interval of the real line [0, ∞) and the index set T is discrete, then the process is known as one with continuous state space and discrete time space. Example: suppose X(t) represents the average temperature at a particular place recorded every hour of the day.

4. Stochastic Process with Continuous State & Time Space: In a stochastic process {X(t); t ∈ T}, if the set of possible values of X(t) is continuous and the index set T is an interval of the real line, T = [0, t], then the process is known as one with continuous state & time space. Example: suppose X(t) represents the maximum temperature at a particular place in (0, t].

Markov Process:
If X t ; t T  is a stochastic process such that, given the X s  , the values X t  , t  s , do not depend on
value of
the values of
X u , u  s , then the process is said to be a Markov process. It can be written as, if for
t1  t2    tn  t

Pra  X t   b | X t1   x1 , , X t n   x n   Pra  X t   b | X t n   x n 


then the process X t ; t T  is a Markov process. A discrete parameter Markov process is known as a Markov
chain.

Markov Chain:
The stochastic process {X_n; n ≥ 0} is called a Markov chain if, for all states j, k, j_1, …, j_{n-1},

Pr{X_n = k | X_{n-1} = j, X_{n-2} = j_1, …, X_0 = j_{n-1}} = Pr{X_n = k | X_{n-1} = j} = P_jk,

whenever the first member is defined.

The outcomes are called the states of the Markov chain; if X_n has the outcome j, i.e., X_n = j, the process is said to be in state j at the nth trial. P_jk denotes the transition probability.
Transition Probability:
The conditional probability Pr{X_{n+1} = j | X_n = i} = P_ij is known as the transition probability. That is, the transition probability refers to the probability that the process, being in state i, will be in state j at the next step. Here the transition is of one step, and P_ij is called the one-step transition probability. Transition probabilities must satisfy

(i) P_ij ≥ 0 and
(ii) Σ_j P_ij = 1.

The conditional probability Pr{X_{n+m} = k | X_n = j} = P_jk^(m) is known as the m-step transition probability. That is, it refers to the probability that the process, being in state j, will be in state k after m steps. Here the transition is of m steps, and P_jk^(m) is called the m-step transition probability. The m-step transition probabilities must satisfy

(i) P_jk^(m) ≥ 0 and
(ii) Σ_k P_jk^(m) = 1.

Transition Probability Matrix (TPM):


The matrix of 1st-order transition probabilities of a Markov chain is called the transition probability matrix.

If {X_n; n ≥ 0} is a Markov chain with transition probabilities P_jk, then the transition probability matrix is given by

        | P_11  P_12  ...  P_1n |
    P = | P_21  P_22  ...  P_2n |  = [P_jk]; j, k = 1, 2, ..., n
        | ...   ...   ...  ...  |
        | P_n1  P_n2  ...  P_nn |

where P_jk ≥ 0 and Σ_{k=1}^{n} P_jk = 1 for all j.

Example:

Consider a Markov chain with the following TPM

        | 3/4  1/4   0  |
    P = | 1/4  1/2  1/4 |
        |  0   3/4  1/4 |

and the initial distribution Pr{x_0 = i} = 1/3; i = 0, 1, 2. Find

1. p_01^(2), p_02^(2)
2. Pr{x_2 = 1, x_0 = 0}
3. p(0, 1, 1) = Pr{x_0 = 0, x_1 = 1, x_2 = 1}
4. p(0, 0, 1, 1) = Pr{x_0 = 0, x_1 = 0, x_2 = 1, x_3 = 1}

Solution:
1. P^(2) = P · P

        | 3/4  1/4   0  |   | 3/4  1/4   0  |   | 5/8   5/16  1/16 |
      = | 1/4  1/2  1/4 | · | 1/4  1/2  1/4 | = | 5/16  1/2   3/16 |
        |  0   3/4  1/4 |   |  0   3/4  1/4 |   | 3/16  9/16  1/4  |

   Therefore p_01^(2) = Pr{x_2 = 1 | x_0 = 0} = 5/16 and p_02^(2) = Pr{x_2 = 2 | x_0 = 0} = 1/16.

2. We know,
   Pr{x_2 = 1, x_0 = 0} = Pr{x_2 = 1 | x_0 = 0} Pr{x_0 = 0} = (5/16)(1/3) = 5/48.

3. We know,
   p(0, 1, 1) = Pr{x_0 = 0, x_1 = 1, x_2 = 1}
              = Pr{x_0 = 0} Pr{x_1 = 1 | x_0 = 0} Pr{x_2 = 1 | x_1 = 1}
              = (1/3)(1/4)(1/2) = 1/24.

4. We know,
   p(0, 0, 1, 1) = Pr{x_0 = 0, x_1 = 0, x_2 = 1, x_3 = 1}
                 = Pr{x_0 = 0} Pr{x_1 = 0 | x_0 = 0} Pr{x_2 = 1 | x_1 = 0} Pr{x_3 = 1 | x_2 = 1}
                 = (1/3)(3/4)(1/4)(1/2) = 1/32.

Classification of States According to Communication:


According to communication, states may be classified as follows:
i) Accessible state
ii) Non-accessible state
iii) Communicating states

i) Accessible State:
State j of a Markov chain is said to be accessible from state i if P_ij^(n) > 0 for some n ≥ 1. That is, state j is accessible from state i if and only if, starting in i, it is possible that the process will ever enter state j. The relation is denoted by i → j.

ii) Non-Accessible State:
If P_ij^(n) = 0 for all n, then j is not accessible from i; this is denoted by i ↛ j. That is, a process started in i can never enter state j.

iii) Communicating States:

Two states i and j are said to communicate if each is accessible from the other; this is denoted by i ↔ j. In that case there exist integers m and n such that P_ij^(n) > 0 and P_ji^(m) > 0.
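Accessibility and communication can be checked mechanically: draw a directed edge u → v whenever P_uv > 0 and test reachability. A minimal sketch (function names are illustrative; for simplicity every state is treated as reachable from itself):

```python
from collections import deque

def accessible(P, i, j):
    """Return True if state j is reachable from state i in the directed
    graph with an edge u -> v whenever P[u][v] > 0 (breadth-first search)."""
    seen, queue = {i}, deque([i])
    while queue:
        u = queue.popleft()
        if u == j:
            return True
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                queue.append(v)
    return False

def communicate(P, i, j):
    """States i and j communicate if each is accessible from the other."""
    return accessible(P, i, j) and accessible(P, j, i)

# A chain with an absorbing state 0: state 1 is not accessible from state 0
P = [[1.0, 0.0],
     [0.5, 0.5]]
print(accessible(P, 0, 1))   # -> False
print(communicate(P, 1, 0))  # -> False (1 -> 0, but not 0 -> 1)
```

In the worked TPM example above, every entry of some power of P is positive, so all three states communicate with one another.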
