
Chapter 1

Introduction

1.1 Stochastic process


A stochastic process is a set of random variables indexed by time or space. Stochastic modelling is an interesting and
challenging area of probability and statistics that is widely used in the applied sciences. Some examples of stochastic
processes used in real-world applications are as follows.
• Poisson processes: for dealing with waiting times and queues.

• Random walk and Brownian motion processes: used in algorithmic trading, i.e., executing orders with automated, pre-programmed trading instructions that account for variables such as price, timing and volume.

• Markov decision processes: commonly used in Computational Biology and Reinforcement Learning.

• Gaussian processes: used in regression and optimisation problems (e.g., hyperparameter tuning and Automated Machine Learning).

• Auto-regressive and moving average processes: employed in time-series analysis (e.g., ARIMA models).

Definition (Random variable). A random variable is a variable whose possible values are numerical outcomes of a random phenomenon. There are two types: discrete r.v. and continuous r.v.
Definition. A stochastic process {X(t), t ∈ T} is a collection of random variables. That is, for each t ∈ T, X(t) is a random variable. The index t is often interpreted as time and, as a result, we refer to X(t) as the state of the process at time t. The set T is called the index set of the process.
Examples:
• The total number of customers that have entered a bank by time t

• The number of phone calls occurring in a certain period of time t

• The total amount of sales that have been recorded in the market by time t

1.1.1 Discrete-time stochastic process


When T is a countable set, the stochastic process is said to be a discrete-time stochastic process.
e.g., {Xn, n = 0, 1, ...} is a discrete-time stochastic process indexed by the nonnegative integers.
Examples:
• The number of occupied channels in a telephone link at the arrival time of the nth customer, n = 1, 2, ...

• The number of packets in the buffer of a statistical multiplexer at the arrival time of the nth customer, n = 1, 2, ...


1.1.2 Continuous-time stochastic process


If T is an interval of the real line, the stochastic process is said to be a continuous-time process.
e.g., {X(t), t ≥ 0} is a continuous-time stochastic process indexed by the nonnegative real numbers.

Examples:

• The number of occupied channels in a telephone link at time t > 0

• The number of packets in the buffer of a statistical multiplexer at time t > 0

1.1.3 The state space of a stochastic process


The state space of a stochastic process is defined as the set of all possible values that the random variables X(t) can
assume. Thus, a stochastic process is a family of random variables that describes the evolution through time of some
(physical) process.

1.2 Markov Chains


Markov models describe many real-world systems realistically while remaining tractable, and are a useful tool for making long-term predictions about a system or process.

Definition. Let {Xn, n = 0, 1, 2, ...} be a stochastic process that takes on a finite or countable number of possible values. Unless otherwise mentioned, this set of possible values of the process will be denoted by the set of nonnegative integers {0, 1, 2, . . .}. If Xn = i, then the process is said to be in state i at time n.

We suppose that whenever the process is in state i, there is a fixed probability Pij that it will next be in state j. That is, we suppose that

\[
P\{X_{n+1} = j \mid X_n = i,\ X_{n-1} = i_{n-1},\ \ldots,\ X_1 = i_1,\ X_0 = i_0\} = P_{ij} \tag{1.2.1}
\]

for all states i0, i1, . . . , in−1, i, j and all n ≥ 0. Such a stochastic process is known as a Markov chain.

For a Markov chain, the conditional distribution of any future state Xn+1, given the past states X0, X1, . . . , Xn−1 and the present state Xn, is independent of the past states and depends only on the present state Xn. The value Pij represents the probability that the process will, when in state i, next make a transition into state j. Since probabilities are nonnegative and since the process must make a transition into some state, we have

\[
P_{ij} \ge 0, \quad i, j \ge 0; \qquad \sum_{j=0}^{\infty} P_{ij} = 1, \quad i = 0, 1, \ldots
\]

1.2.1 Markov Chain Applications


The following is a list of real-world applications of Markov chains.

• Google PageRank: The algorithm Google uses to determine the order of search results, called PageRank, is a type of Markov chain. The entire web can be thought of as a Markov model, where every web page is a state and the links or references between pages are transitions with associated probabilities. In the long run, irrespective of which web page you start surfing on, the chance of being on a particular web page is a fixed probability.

• Typing word prediction: Markov chains are used to predict upcoming words. They can also be used in auto-completion and suggestions.

• Text generation: Markov chains are commonly used to generate dummy text, produce extensive essays, and compile speeches. They are also used in the name generators you see on the web.


• To predict market trends: Markov chains and their respective diagrams can be used to model the probabilities of certain financial market climates and thus predict the likelihood of future market conditions.

• A game of snakes and ladders or any other game whose moves are determined entirely by dice is a Markov chain.

Let P denote the matrix of one-step transition probabilities Pij, so that

\[
P = \begin{pmatrix}
P_{00} & P_{01} & P_{02} & \cdots \\
P_{10} & P_{11} & P_{12} & \cdots \\
\vdots & \vdots & \vdots & \\
P_{i0} & P_{i1} & P_{i2} & \cdots \\
\vdots & \vdots & \vdots &
\end{pmatrix}
\]
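To make the definitions concrete, here is a minimal Python sketch of simulating a Markov chain from its one-step transition matrix; the two-state matrix used is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(0)

# An illustrative 2-state transition matrix: row i holds the
# probabilities P_i0, P_i1 of moving from state i to each state.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def simulate(P, start, n_steps):
    """Return a sample trajectory X_0, X_1, ..., X_{n_steps}."""
    states = [start]
    for _ in range(n_steps):
        # The next state depends only on the current state (Markov
        # property): sample it from row states[-1] of P.
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return states

print(simulate(P, start=0, n_steps=10))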

Example 1. (Forecasting the Weather)


Suppose that the chance of rain tomorrow depends on previous weather conditions only through whether or not it is
raining today and not on past weather conditions. Suppose also that if it rains today, then it will rain tomorrow with
probability α; and if it does not rain today, then it will rain tomorrow with probability β . Find the transition probability
matrix.
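For reference, labelling the states 0 = rain and 1 = no rain (an assumed convention), the transition probability matrix is

\[
P = \begin{pmatrix} \alpha & 1-\alpha \\ \beta & 1-\beta \end{pmatrix}
\]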

Example 2. On any given day Gary is either cheerful (C), so-so (S), or glum (G). If he is cheerful today, then he will
be C, S, or G tomorrow with respective probabilities 0.5, 0.4, 0.1. If he is feeling so-so today, then he will be C, S, or
G tomorrow with probabilities 0.3, 0.4, 0.3. If he is glum today, then he will be C, S, or G tomorrow with probabilities
0.2, 0.3, 0.5. Find the transition probability matrix.
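Ordering the states as C, S, G (an assumed convention), the rows of the transition probability matrix read off directly from the statement:

\[
P = \begin{pmatrix} 0.5 & 0.4 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}
\]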

Example 3. (Transforming a Process into a Markov Chain)


Suppose that whether or not it rains today depends on previous weather conditions only through the last two days. Specifically, suppose that if it has rained for the past two days, then it will rain tomorrow with probability 0.7; if it rained today but not yesterday, then it will rain tomorrow with probability 0.5; if it rained yesterday but not today, then it will rain tomorrow with probability 0.4; and if it has not rained in the past two days, then it will rain tomorrow with probability 0.2.

1. If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not
a Markov chain. (why not?)

2. However, we can transform this model into a Markov chain by saying that the state at any time is determined by
the weather conditions during both that day and the previous day.
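One consistent labelling of the enlarged state space (assumed here) is: state 0 if it rained both today and yesterday, state 1 if it rained today but not yesterday, state 2 if it rained yesterday but not today, and state 3 if it rained on neither day. The statement then gives the transition probability matrix

\[
P = \begin{pmatrix}
0.7 & 0 & 0.3 & 0 \\
0.5 & 0 & 0.5 & 0 \\
0 & 0.4 & 0 & 0.6 \\
0 & 0.2 & 0 & 0.8
\end{pmatrix}
\]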

1.2.2 A Random Walk Model


A random walk is any sequence of discrete steps (each of the same length) taken in random directions. Random walks can take place in any type of dimensional space (e.g., 1D, 2D, nD).

Definition. A Markov chain whose state space is given by the integers i = 0, ±1, ±2, ... is said to be a random walk if, for some number 0 < p < 1,

Pi,i+1 = p = 1 − Pi,i−1,    i = 0, ±1, ±2, ...

The preceding Markov chain is called a random walk for we may think of it as being a model for an individual walking
on a straight line who at each point of time either takes one step to the right with probability p or one step to the left
with probability 1 − p.
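A short simulation sketch of this walk (the values of p and the number of steps are arbitrary illustrative choices):

import numpy as np

rng = np.random.default_rng(1)

def random_walk(p, n_steps, start=0):
    # One step right (+1) with probability p, left (-1) with
    # probability 1 - p, starting from position `start`.
    steps = rng.choice([1, -1], size=n_steps, p=[p, 1 - p])
    return start + np.cumsum(steps)

print(random_walk(p=0.5, n_steps=20))  # positions X_1, ..., X_20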


Examples:
• Tracing the path taken by molecules moving through a gas during the diffusion process.

• Sports event predictions.

• The wanderings of a drunken man as he walks along a straight line.

• The winnings of a gambler who on each play of the game either wins or loses one dollar.

1.3 Chapman–Kolmogorov Equations


Define the n-step transition probabilities P^n_{ij} to be the probability that a process in state i will be in state j after n additional transitions. That is,

\[
P^{n}_{ij} = P\{X_{n+k} = j \mid X_k = i\}, \quad n \ge 0,\ i, j \ge 0 \tag{1.3.1}
\]

with P^{1}_{ij} = P_{ij}.

The Chapman–Kolmogorov equations provide a method for computing these n-step transition probabilities. These equations are

\[
P^{n+m}_{ij} = \sum_{k=0}^{\infty} P^{n}_{ik} P^{m}_{kj} \quad \text{for all } n, m \ge 0 \text{ and all } i, j \tag{1.3.2}
\]

The term P^{n}_{ik} P^{m}_{kj} represents the probability that, starting in i, the process will go to state j in n + m transitions through a path which takes it into state k at the nth transition.
If we let P^{(n)} denote the matrix of n-step transition probabilities P^{n}_{ij}, then Equation (1.3.2) asserts that

\[
P^{(n+m)} = P^{(n)} \cdot P^{(m)}
\]

Example.

\[
P^{(2)} = P^{(1+1)} = P \cdot P = P^2, \qquad P^{(n)} = P^{(n-1+1)} = P^{(n-1)} \cdot P = P^n
\]

That is, the n-step transition matrix may be obtained by multiplying the matrix P by itself n times.
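A quick numerical check of this identity (the two-state matrix is an arbitrary illustrative choice):

import numpy as np
from numpy.linalg import matrix_power

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Chapman-Kolmogorov in matrix form: P^(n+m) = P^(n) P^(m).
n, m = 2, 3
print(np.allclose(matrix_power(P, n + m),
                  matrix_power(P, n) @ matrix_power(P, m)))  # True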

Example 4. Consider the previous Example in which the weather is considered as a two-state Markov chain. If α = 0.7
and β = 0.4, then calculate the probability that it will rain four days from today given that it is raining today.
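A sketch of the computation, with the labelling state 0 = rain, state 1 = no rain assumed as before:

import numpy as np
from numpy.linalg import matrix_power

# alpha = 0.7 (rain after rain), beta = 0.4 (rain after no rain).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# P(rain in four days | rain today) is the (0, 0) entry of P^4.
print(matrix_power(P, 4)[0, 0])  # ~ 0.5749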

Example 5. Consider Example 3. Given that it rained on Monday and Tuesday, what is the probability that it will
rain on Thursday?
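Using the four-state chain constructed for Example 3 (with the state labelling assumed there), rain on both Monday and Tuesday puts the chain in state 0 on Tuesday, and it rains on Thursday exactly when the state two transitions later is 0 or 1. A sketch:

import numpy as np
from numpy.linalg import matrix_power

# States: 0 = rain today & yesterday, 1 = rain today only,
#         2 = rain yesterday only,   3 = rain on neither day.
P = np.array([[0.7, 0.0, 0.3, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.4, 0.0, 0.6],
              [0.0, 0.2, 0.0, 0.8]])

P2 = matrix_power(P, 2)
print(P2[0, 0] + P2[0, 1])  # 0.61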
Example 6. A Markov chain {Xn, n ≥ 0} with states 0, 1, 2 has the transition probability matrix

\[
P = \begin{pmatrix}
\tfrac{1}{2} & \tfrac{1}{3} & \tfrac{1}{6} \\
0 & \tfrac{1}{3} & \tfrac{2}{3} \\
\tfrac{1}{2} & 0 & \tfrac{1}{2}
\end{pmatrix}
\]

If P{X0 = 0} = P{X0 = 1} = 1/4, find E(X3).
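A sketch of the computation; the remaining initial mass P{X0 = 2} = 1/2 is implied by the statement:

import numpy as np
from numpy.linalg import matrix_power

P = np.array([[1/2, 1/3, 1/6],
              [0.0, 1/3, 2/3],
              [1/2, 0.0, 1/2]])

pi0 = np.array([1/4, 1/4, 1/2])    # initial distribution of X0
pi3 = pi0 @ matrix_power(P, 3)     # distribution of X3
print(pi3 @ np.array([0, 1, 2]))   # E(X3) = 53/54 ~ 0.9815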

Example 7. Suppose that coin 1 has probability 0.7 of coming up heads, and coin 2 has probability 0.6 of coming up heads. If the coin flipped today comes up heads, then we select coin 1 to flip tomorrow, and if it comes up tails, then we select coin 2 to flip tomorrow.

1. If the coin initially flipped is equally likely to be coin 1 or coin 2, then what is the probability that the coin flipped on the third day after the initial flip is coin 1?

2. Suppose that the coin flipped on Monday comes up heads. What is the probability that the coin flipped on Friday of the same week also comes up heads?
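One natural formulation (assumed here) takes the state to be the coin flipped today: state 0 for coin 1 and state 1 for coin 2. Heads selects coin 1 for tomorrow and tails selects coin 2, which gives the transition matrix below; a sketch of both parts:

import numpy as np
from numpy.linalg import matrix_power

# Row 0: coin 1 today -> heads (0.7) keeps coin 1, tails (0.3) gives coin 2.
# Row 1: coin 2 today -> heads (0.6) gives coin 1, tails (0.4) keeps coin 2.
P = np.array([[0.7, 0.3],
              [0.6, 0.4]])

# Part 1: equally likely starting coin; three transitions later.
pi0 = np.array([0.5, 0.5])
print((pi0 @ matrix_power(P, 3))[0])   # 0.6665

# Part 2: heads on Monday -> coin 1 on Tuesday; Friday is 3 transitions later.
friday = matrix_power(P, 3)[0]         # distribution of Friday's coin
print(friday @ np.array([0.7, 0.6]))   # P(heads on Friday) ~ 0.6667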

Dr. L.S. Nawarathna, Department of Statistics & Computer Science, UoP
