1 Introduction
Random walk and Brownian motion processes: used in algorithmic trading, i.e., executing orders with automated, pre-programmed trading instructions that account for variables such as price, timing and volume.
Markov decision processes: commonly used in computational biology and reinforcement learning.
Gaussian processes: used in regression and optimisation problems (e.g., hyper-parameter tuning and automated machine learning).
Auto-regressive and moving average processes: employed in time-series analysis (e.g., ARIMA models).
Definition. A random variable is a variable whose possible values are numerical outcomes of a random phenomenon. There are two types: discrete random variables and continuous random variables.
Definition. A stochastic process {X(t), t ∈ T} is a collection of random variables. That is, for each t ∈ T, X(t) is a
random variable. The index t is often interpreted as time and, as a result, we refer to X(t) as the state of the process at
time t. The set T is called the index set of the process.
Examples:
Total number of customers that have entered a bank by time t
The total amount of sales that have been recorded in the market by time t
The number of packets in the buffer of a statistical multiplexer at the arrival time of the nth customer, n = 1, 2, . . .
Definition. Let {Xn, n = 0, 1, 2, . . .} be a stochastic process that takes on a finite or countable number of possible
values. Unless otherwise mentioned, this set of possible values of the process will be denoted by the set of nonnegative
integers {0, 1, 2, . . .}. If Xn = i, then the process is said to be in state i at time n.
We suppose that whenever the process is in state i, there is a fixed probability Pij that it will next be in state j. That is, we suppose that
$$P\{X_{n+1} = j \mid X_n = i,\ X_{n-1} = i_{n-1},\ \ldots,\ X_1 = i_1,\ X_0 = i_0\} = P_{ij}$$
for all states i0, i1, . . . , in−1, i, j and all n ≥ 0. Such a stochastic process is known as a Markov chain.
For a Markov chain, the conditional distribution of any future state Xn+1, given the past states X0, X1, . . . , Xn−1
and the present state Xn, is independent of the past states and depends only on the present state Xn. The
value Pij represents the probability that the process will, when in state i, next make a transition into state j . Since
probabilities are nonnegative and since the process must make a transition into some state, we have
$$P_{ij} \ge 0, \quad i, j \ge 0; \qquad \sum_{j=0}^{\infty} P_{ij} = 1, \quad i = 0, 1, \ldots$$
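These two conditions say that each row of the transition probability matrix is itself a probability distribution. A minimal sketch in Python (not from the notes; the matrix entries are arbitrary placeholders) that checks both conditions and samples one transition:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state transition matrix: P[i, j] = P(next state = j | current state = i).
P = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.7, 0.1],
    [0.0, 0.3, 0.7],
])

# Entries are nonnegative and every row sums to 1.
assert (P >= 0).all() and np.allclose(P.sum(axis=1), 1.0)

def step(state):
    # Sample the next state using row `state` of P as the distribution.
    return rng.choice(len(P), p=P[state])

print(step(0))  # a possible next state after state 0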
Google PageRank: The algorithm Google uses to determine the order of search results, called PageRank, is a type of Markov chain. The entire web can be thought of as a Markov model, where every web page is a state and the links between pages are transitions with associated probabilities. So, irrespective of which web page you start surfing on, the chance of reaching a particular web page is a fixed probability.
Typing word prediction: Markov chains are used to predict upcoming words; they also underlie auto-completion and suggestion features.
Text generation: Markov chains are commonly used to generate dummy text, produce extensive essays and compile speeches. They are also used in the name generators you see on the web.
Predicting market trends: Markov chains and their transition diagrams can be used to model the probabilities of certain financial market climates and thus predict the likelihood of future market conditions.
A game of snakes and ladders or any other game whose moves are determined entirely by dice is a Markov chain.
Example 2. On any given day Gary is either cheerful (C), so-so (S), or glum (G). If he is cheerful today, then he will
be C, S, or G tomorrow with respective probabilities 0.5, 0.4, 0.1. If he is feeling so-so today, then he will be C, S, or
G tomorrow with probabilities 0.3, 0.4, 0.3. If he is glum today, then he will be C, S, or G tomorrow with probabilities
0.2, 0.3, 0.5. Find the transition probability matrix.
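As a worked answer, order the states C, S, G; each row of the matrix can then be read directly from the statement (and each row sums to 1):
$$P = \begin{pmatrix} 0.5 & 0.4 & 0.1 \\ 0.3 & 0.4 & 0.3 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}$$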
1. If we let the state at time n depend only on whether or not it is raining at time n, then the preceding model is not
a Markov chain. (why not?)
2. However, we can transform this model into a Markov chain by saying that the state at any time is determined by the weather conditions during both that day and the previous day. For instance, one possible encoding uses four states: 0 if it rained both today and yesterday, 1 if it rained today but not yesterday, 2 if it rained yesterday but not today, and 3 if it rained on neither day.
Definition. A Markov chain whose state space is given by the integers i = 0, ±1, ±2, . . . is said to be a random walk if, for some number 0 < p < 1,
$$P_{i,i+1} = p = 1 - P_{i,i-1}, \qquad i = 0, \pm 1, \pm 2, \ldots$$
The preceding Markov chain is called a random walk for we may think of it as being a model for an individual walking
on a straight line who at each point of time either takes one step to the right with probability p or one step to the left
with probability 1 − p.
Examples:
Tracing the path taken by molecules when moving through a gas during the diffusion process.
The winnings of a gambler who on each play of the game either wins or loses one dollar.
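A minimal simulation sketch (not part of the notes; the step count and p = 0.5 are arbitrary choices):

import numpy as np

def simulate_random_walk(n_steps, p=0.5, seed=0):
    # At each step the walker moves +1 with probability p and -1 with
    # probability 1 - p, matching the transition probabilities above.
    rng = np.random.default_rng(seed)
    steps = rng.choice([1, -1], size=n_steps, p=[p, 1 - p])
    return np.concatenate(([0], np.cumsum(steps)))  # positions X_0, ..., X_n

print(simulate_random_walk(10))  # e.g. the gambler's fortune over 10 plays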
We define the n-step transition probability $P^{n}_{ij}$ as the probability that a process in state i will be in state j after n additional transitions, $P^{n}_{ij} = P\{X_{n+k} = j \mid X_k = i\}$ for n ≥ 0 and i, j ≥ 0; in particular, $P^{1}_{ij} = P_{ij}$.
The Chapman–Kolmogorov equations provide a method for computing these n-step transition probabilities. These equations are
$$P^{n+m}_{ij} = \sum_{k=0}^{\infty} P^{n}_{ik} P^{m}_{kj} \qquad \text{for all } n, m \ge 0, \text{ all } i, j \tag{1.3.2}$$
$P^{n}_{ik} P^{m}_{kj}$ represents the probability that, starting in i, the process will go to state j in n + m transitions through a path which takes it into state k at the nth transition.
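These equations follow by conditioning on the state at time n and applying the Markov property (a short derivation, not spelled out in the notes):
$$P^{n+m}_{ij} = \sum_{k=0}^{\infty} P\{X_{n+m} = j, X_n = k \mid X_0 = i\} = \sum_{k=0}^{\infty} P\{X_{n+m} = j \mid X_n = k\}\, P\{X_n = k \mid X_0 = i\} = \sum_{k=0}^{\infty} P^{m}_{kj} P^{n}_{ik}.$$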
If we let $P^{(n)}$ denote the matrix of n-step transition probabilities $P^{n}_{ij}$, then Equation (1.3.2) asserts that
$$P^{(n+m)} = P^{(n)} \cdot P^{(m)}.$$
Example. $P^{(2)} = P^{(1+1)} = P \cdot P = P^{2}$, and by induction
$$P^{(n)} = P^{(n-1+1)} = P^{(n-1)} \cdot P = P^{n-1} \cdot P = P^{n}.$$
That is, the n-step transition matrix may be obtained by multiplying the matrix P by itself n times.
Example 4. Consider the previous Example in which the weather is considered as a two-state Markov chain. If α = 0.7
and β = 0.4, then calculate the probability that it will rain four days from today given that it is raining today.
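A computational sketch for Example 4, assuming the usual two-state convention for this weather chain (state 0 = rain, state 1 = no rain, with α the probability of rain tomorrow given rain today and β the probability of rain tomorrow given no rain today); the answer is the (0, 0) entry of $P^4$:

import numpy as np

alpha, beta = 0.7, 0.4
P = np.array([
    [alpha, 1 - alpha],   # raining today
    [beta,  1 - beta],    # not raining today
])
P4 = np.linalg.matrix_power(P, 4)   # four-step transition matrix P^4
print(P4[0, 0])                     # P(rain in four days | raining today) = 0.5749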
Example 5. Consider Example 3. Given that it rained on Monday and Tuesday, what is the probability that it will
rain on Thursday?
Example 6. A Markov chain $\{X_n, n \ge 0\}$ with states 0, 1, 2 has the transition probability matrix
$$P = \begin{pmatrix} \frac{1}{2} & \frac{1}{3} & \frac{1}{6} \\[2pt] 0 & \frac{1}{3} & \frac{2}{3} \\[2pt] \frac{1}{2} & 0 & \frac{1}{2} \end{pmatrix}.$$
If $P\{X_0 = 0\} = P\{X_0 = 1\} = \frac{1}{4}$, find $E(X_3)$.
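A computational sketch for Example 6 (left as an exercise in the notes): since states 0 and 1 each carry initial mass 1/4, the remaining mass gives $P\{X_0 = 2\} = \frac{1}{2}$, and $E(X_3)$ is the dot product of the distribution of $X_3$ with the state values:

import numpy as np

P = np.array([
    [1/2, 1/3, 1/6],
    [0,   1/3, 2/3],
    [1/2, 0,   1/2],
])

pi0 = np.array([1/4, 1/4, 1/2])            # P(X0 = 2) = 1 - 1/4 - 1/4 = 1/2
pi3 = pi0 @ np.linalg.matrix_power(P, 3)   # distribution of X3
print(pi3 @ np.array([0, 1, 2]))           # E(X3) = 53/54 ≈ 0.9815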
Example 7. Suppose that coin 1 has probability 0.7 of coming up heads, and coin 2 has probability 0.6 of coming up heads. If the coin flipped today comes up heads, then we select coin 1 to flip tomorrow, and if it comes up tails, then we select coin 2 to flip tomorrow.
1. If the coin initially flipped is equally likely to be coin 1 or coin 2, then what is the probability that the coin flipped on the third day after the initial flip is coin 1?
2. Suppose that the coin flipped on Monday comes up heads. What is the probability that the coin flipped on Friday of the same week also comes up heads?
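A computational sketch for Example 7, encoding the state as which coin is flipped on a given day (state 0 = coin 1, state 1 = coin 2, an encoding implied by the statement):

import numpy as np

# Heads selects coin 1 for tomorrow; tails selects coin 2.
P = np.array([
    [0.7, 0.3],   # today coin 1: P(heads) = 0.7
    [0.6, 0.4],   # today coin 2: P(heads) = 0.6
])
P3 = np.linalg.matrix_power(P, 3)

# Part 1: equally likely initial coin, three transitions later.
pi0 = np.array([0.5, 0.5])
print((pi0 @ P3)[0])                # P(coin 1 on day 3) = 0.6665

# Part 2: heads on Monday means coin 1 is flipped on Tuesday; Friday is
# three transitions after Tuesday, and each coin shows heads with its own bias.
heads = np.array([0.7, 0.6])
print(P3[0] @ heads)                # P(heads on Friday) = 0.6667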