Semester 191
Week-2, Lecture-2
P_n(k) = C(n, k) p^k q^(n−k) ≈ (1/√(2π npq)) e^{−(k−np)²/(2npq)}   (DeMoivre–Laplace approximation)
Binomial Random Variable - The Normal Approximation
Example: A fair coin is tossed 5,000 times. Find the probability that
the number of heads is between 2,475 and 2,525.
Solution: We need P(2,475 ≤ X ≤ 2,525). Here n is large, so we can use the normal approximation. In this case p = 1/2, so that np = 2,500 and √(npq) ≈ 35. Since np − √(npq) ≈ 2,465 and np + √(npq) ≈ 2,535, the approximation is valid for k₁ = 2,475 and k₂ = 2,525. Thus

P(k₁ ≤ X ≤ k₂) ≈ ∫_{x₁}^{x₂} (1/√(2π)) e^{−y²/2} dy.

Here

x₁ = (k₁ − np)/√(npq) = −5/7 ≈ −0.71428,   x₂ = (k₂ − np)/√(npq) = 5/7 ≈ 0.71428.
We define

erf(x) = (1/√(2π)) ∫_{0}^{x} e^{−y²/2} dy = G(x) − 1/2,   x ≥ 0,

where

G(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−y²/2} dy.
Since x₁ < 0, the above probability is given by

P(2,475 ≤ X ≤ 2,525) = erf(x₂) − erf(x₁) = erf(x₂) + erf(|x₁|) = 2 erf(5/7) ≈ 0.516,

where we have used the table value erf(0.7) ≈ 0.258.
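As a quick numerical check (a sketch, not part of the lecture), the approximation above can be compared with the exact binomial probability. Note that Python's `math.erf` is the standard error function, related to the lecture's definition by erf_lecture(x) = 0.5 · math.erf(x/√2).

```python
from math import comb, erf, sqrt

# Exact binomial probability P(2475 <= X <= 2525) for n = 5000, p = 1/2,
# using Python's big-integer arithmetic.
n = 5000
exact = sum(comb(n, k) for k in range(2475, 2526)) / 2**n

# Normal approximation with the slide's rounding sqrt(npq) ~ 35, so x2 = -x1 = 5/7.
# The lecture's erf(x) equals 0.5 * math.erf(x / sqrt(2)).
x2 = 5 / 7
approx = 2 * 0.5 * erf(x2 / sqrt(2))

print(exact, approx)  # both close to 0.52
```

The small remaining gap between the two numbers comes from rounding √(npq) to 35 and from the absence of a continuity correction.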
[Figure: the standard normal density (1/√(2π)) e^{−x²/2}, with the shaded area between x₁ and x₂, for (a) x₁ < 0, x₂ > 0 and (b) x₁ > 0, x₂ > 0.]
Binomial Random Variable - The Poisson Approximation

[Figure: n arrival points 1, 2, …, n on the interval (0, T).]
P_n(k) = [n(n−1)⋯(n−k+1)/n^k] · ((np)^k / k!) · (1 − np/n)^(n−k)
       = (1 − 1/n)(1 − 2/n)⋯(1 − (k−1)/n) · (λ^k / k!) · (1 − λ/n)^n / (1 − λ/n)^k,   with λ = np.

Thus

lim_{n→∞, p→0, np=λ} P_n(k) = e^{−λ} λ^k / k!,

since the finite products (1 − 1/n)(1 − 2/n)⋯(1 − (k−1)/n) as well as (1 − λ/n)^k tend to unity as n → ∞, and

lim_{n→∞} (1 − λ/n)^n = e^{−λ}.
The right side of the above represents the Poisson p.m.f., and the Poisson approximation to the binomial r.v. is valid in situations where the binomial parameters n and p diverge to two extremes (n → ∞, p → 0) such that their product λ = np stays constant.
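The limit can be illustrated numerically. A minimal sketch (not from the slides), comparing the binomial p.m.f. with p = λ/n against the Poisson p.m.f. for increasing n; the values λ = 3 and k = 2 are chosen arbitrarily.

```python
from math import comb, exp, factorial

lam, k = 3.0, 2  # lambda = np held fixed; illustrative values

def binom_pmf(n, p, k):
    # P_n(k) = C(n, k) p^k (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

poisson = exp(-lam) * lam**k / factorial(k)  # e^{-lam} lam^k / k!

for n in (10, 100, 1000, 10000):
    print(n, binom_pmf(n, lam / n, k))
print("Poisson limit:", poisson)
```

As n grows with np fixed, the printed binomial values converge to the Poisson value, in line with the derivation above.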
Here n = 100, and the number of winning tickets X among the n purchased tickets has an approximate Poisson distribution with parameter

λ = np = 100 × 5 × 10⁻⁵ = 0.005.

Thus

P(X = k) = e^{−λ} λ^k / k!,

and

(a) Probability of winning = P(X ≥ 1) = 1 − P(X = 0) = 1 − e^{−λ} ≈ 0.005.

(b) In this case we need P(X ≥ 1) ≥ 0.95. P(X ≥ 1) = 1 − e^{−λ} ≥ 0.95 implies λ ≥ ln 20 ≈ 3.
Similarly, for a Poisson r.v. with parameter λ = 2,

P(X > 4) = 1 − e^{−2} (1 + 2 + 2²/2! + 2³/3! + 2⁴/4!) = 1 − e^{−2} (1 + 2 + 2 + 4/3 + 2/3) ≈ 0.052.
Conditional Probability Density Function
We may define the conditional distribution of the r.v. X given the event B as

F_X(x | B) = P{X(ξ) ≤ x | B} = P{X(ξ) ≤ x, B} / P(B).
In terms of the conditional p.d.f. f_X(x | B),

P(x₁ < X(ξ) ≤ x₂ | B) = ∫_{x₁}^{x₂} f_X(x | B) dx.
Example: Let B = {a < X(ξ) ≤ b}. Find F_X(x | B).

Solution:

F_X(x | B) = P{X(ξ) ≤ x | B} = P{X(ξ) ≤ x, a < X(ξ) ≤ b} / P{a < X(ξ) ≤ b}
           = P{X(ξ) ≤ x, a < X(ξ) ≤ b} / (F_X(b) − F_X(a)).

For x < a, we have {X(ξ) ≤ x} ∩ {a < X(ξ) ≤ b} = ∅, and hence F_X(x | B) = 0.
[Figure: f_X(x | B) compared with f_X(x); the conditional density is concentrated on the interval (a, b].]
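For a concrete (and purely hypothetical) instance, take X exponential with rate 1 and B = {1 < X ≤ 3}. For a ≤ x < b the same reasoning gives F_X(x | B) = (F_X(x) − F_X(a)) / (F_X(b) − F_X(a)), and for x ≥ b it equals 1:

```python
from math import exp

a, b, lam = 1.0, 3.0, 1.0    # hypothetical choices for illustration

def F(x):
    # unconditional cdf of an exponential r.v. with rate lam
    return 1.0 - exp(-lam * x) if x > 0 else 0.0

def F_cond(x):
    # F_X(x | B) with B = {a < X <= b}
    if x < a:
        return 0.0               # {X <= x} and B are disjoint
    if x >= b:
        return 1.0               # {X <= x} contains all of B
    return (F(x) - F(a)) / (F(b) - F(a))

print(F_cond(0.5), F_cond(2.0), F_cond(5.0))
```

The conditional cdf rises from 0 at x = a to 1 at x = b, exactly as in the figure: all the conditional probability mass lives on (a, b].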
Conditional pdf, Bayes' theorem, a-priori probability and a-posteriori probability

We can use the conditional p.d.f. together with Bayes' theorem to update our a-priori knowledge about the probability of events in the presence of new observations. Ideally, any new information should be used to update our knowledge. As we see in the next example, the conditional p.d.f. together with Bayes' theorem allows systematic updating. For any two events A and B, Bayes' theorem gives

P(A | B) = P(B | A) P(A) / P(B).
Let B = {x₁ < X(ξ) ≤ x₂}, so that the above becomes

P(A | x₁ < X(ξ) ≤ x₂) = P(x₁ < X(ξ) ≤ x₂ | A) P(A) / P(x₁ < X(ξ) ≤ x₂)
  = [F_X(x₂ | A) − F_X(x₁ | A)] / [F_X(x₂) − F_X(x₁)] · P(A)
  = [∫_{x₁}^{x₂} f_X(x | A) dx / ∫_{x₁}^{x₂} f_X(x) dx] · P(A).
Setting x₁ = x, x₂ = x + Δ and letting Δ → 0, we obtain

lim_{Δ→0} P(A | x < X(ξ) ≤ x + Δ) = P(A | X(ξ) = x) = [f_X(x | A) / f_X(x)] P(A),

or

f_X(x | A) = P(A | X = x) f_X(x) / P(A).

Integrating both sides over all x, and using ∫_{−∞}^{∞} f_X(x | A) dx = 1, we also get

P(A) = ∫_{−∞}^{∞} P(A | X = x) f_X(x) dx.

Thus we get the desired result

f_X(x | A) = P(A | X = x) f_X(x) / ∫_{−∞}^{∞} P(A | X = x) f_X(x) dx.
In the coin-tossing experiment, with a uniform a-priori p.d.f. for the unknown probability p and A = "k heads in n tosses", this gives the a-posteriori p.d.f.

f_{P|A}(p | A) = P(A | P = p) f_P(p) / P(A) = [(n + 1)! / ((n − k)! k!)] p^k q^(n−k),   0 < p < 1.

Notice that the a-posteriori p.d.f. of p is not a uniform distribution but a beta distribution β(n, k).
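The a-posteriori p.d.f. above should integrate to 1 over [0, 1]. A sketch (not from the slides) checking this with a midpoint rule, using n = 10 and k = 6 as in the prediction worked out later in the lecture:

```python
from math import factorial

n, k = 10, 6   # the values used later in the lecture

def posterior(p):
    # f_{P|A}(p | A) = (n+1)!/((n-k)! k!) * p^k * (1-p)^(n-k)
    c = factorial(n + 1) // (factorial(n - k) * factorial(k))
    return c * p**k * (1 - p)**(n - k)

# midpoint-rule check that the a-posteriori p.d.f. integrates to 1
m = 100_000
total = sum(posterior((i + 0.5) / m) for i in range(m)) / m
print(total)  # ~1.0
```

The normalizing constant (n + 1)!/((n − k)! k!) is exactly what makes the beta density integrate to 1.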
We can use this a-posteriori p.d.f. to make further predictions. For example, in the light of the above experiment, what can we say about the probability of a head occurring in the next, (n+1)th, toss?

Let B = "head occurring in the (n+1)th toss, given that k heads have occurred in the n previous tosses".

Clearly P(B | P = p) = p, and

P(B) = ∫_{0}^{1} P(B | P = p) f_P(p | A) dp.

Notice that we have used the a-posteriori p.d.f. to reflect our knowledge about the experiment already performed. We get

P(B) = ∫_{0}^{1} p · [(n + 1)! / ((n − k)! k!)] p^k q^(n−k) dp = (k + 1) / (n + 2).
Thus, if n = 10 and k = 6, then

P(B) = 7/12 ≈ 0.58,

which is more realistic compared with the a-priori value p = 0.5.
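As a numerical cross-check (a sketch, not part of the lecture), the integral for P(B) can be evaluated with a midpoint rule and compared against the closed form (k + 1)/(n + 2):

```python
from math import factorial

n, k = 10, 6

def posterior(p):
    # a-posteriori p.d.f.: (n+1)!/((n-k)! k!) * p^k * (1-p)^(n-k)
    c = factorial(n + 1) // (factorial(n - k) * factorial(k))
    return c * p**k * (1 - p)**(n - k)

# P(B) = integral over [0, 1] of p * posterior(p) dp, by the midpoint rule
m = 100_000
p_b = sum(((i + 0.5) / m) * posterior((i + 0.5) / m) for i in range(m)) / m

print(p_b, (k + 1) / (n + 2))  # both ~0.5833 = 7/12
```

The agreement confirms the (k + 1)/(n + 2) formula, known as Laplace's rule of succession.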