LECTURE NOTES 2 - Normal Distribution
SPECIFIC OBJECTIVES
Example: toss two fair coins and let X be the number of tails.

Outcome:  HH  HT  TH  TT
X:         0   1   1   2
► We will denote by RX the range of X. RX is
the set of all possible values of X.
► Discrete Random Variable
► If RX the range of X is finite or countably
infinite X is called a discrete random
variable.
► Probability function of a Discrete
Random Variable
► Let X be a discrete random variable with
each possible outcome in the set
► RX = {x1, x2, x3, ...}.
► The probability function (or probability mass function) of X assigns to each xi the number p(xi) = P(X = xi), where p(xi) ≥ 0 for every i and Σi p(xi) = 1.
EXAMPLE
► E: Toss two fair dice and observe the total
of the numbers showing.
► RX = { 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.
► Find the probability distribution.
TABLE OF OUTCOMES
 + | 1  2  3  4  5  6
 1 | 2  3  4  5  6  7
 2 | 3  4  5  6  7  8
 3 | 4  5  6  7  8  9
 4 | 5  6  7  8  9 10
 5 | 6  7  8  9 10 11
 6 | 7  8  9 10 11 12
SOLUTION
x        2     3     4     5     6     7     8     9    10    11    12
P(X=x)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
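As a quick check, the distribution can be computed by enumerating all 36 equally likely outcomes; a minimal Python sketch:

```python
from fractions import Fraction
from collections import Counter

# Tally the total shown by two fair dice over all 36 equally likely outcomes.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
dist = {total: Fraction(n, 36) for total, n in sorted(counts.items())}

for total, p in dist.items():
    print(total, p)
```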
Expected Value (or Expectation) of a random variable
► If X is a discrete random variable with probability function p(x), the expected value of X is E(X) = Σi xi p(xi).
Properties of Expected Value
[Figure: bar chart of the probability distribution of the total, x = 2, 3, ..., 12]
► If a and b are constants, then E(aX + b) = a E(X) + b; in particular E(b) = b and E(aX) = a E(X).
► For any two random variables X and Y, E(X + Y) = E(X) + E(Y).
BINOMIAL DISTRIBUTION
► A binomial experiment is one that possesses the
following properties:
1. The trials are carried out a fixed number of times
n.
2. The outcome of each trial can be classified into
two ‘types’ conveniently named ‘success’ or
‘failure’.
3. The probability p of a success remains constant
for each trial.
4. The individual trials are independent of each
other.
In general:
P(X = x) = C(n, x) p^x (1 − p)^{n−x}
Number line:
0 1 2 3 4 5 6 7 8 9 10
P(X = x) = C(n, x) p^x (1 − p)^{n−x}
► P(X = 1) = C(10, 1) (0.6)^1 (1 − 0.6)^{10−1} = 0.002
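This value can be verified directly (the exact probability is about 0.0016, which rounds to 0.002):

```python
from math import comb

# P(X = 1) for X ~ B(10, 0.6), as in the worked example above.
p = comb(10, 1) * 0.6**1 * (1 - 0.6)**9
print(round(p, 3))  # 0.002
```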
P(X = x) = C(n, x) p^x (1 − p)^{n−x}, where x = 0, 1, 2, ..., n
► P(X = 0) = C(9, 0) (0.4)^0 (1 − 0.4)^{9−0} = 0.01
► OR
The complement rule
► P(X ≥ 2) = 1 − P(X < 2) = 1 − [P(X = 0) + P(X = 1)]
ANSWER
► P( X ≥2) = 1-[P(X=0)+P(X=1)]
► = 1-(0.01+0.06)
► = 1-0.07
► = 0.93
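The complement-rule calculation can be reproduced numerically; a short sketch with n = 9 and p = 0.4 as in the example:

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# P(X >= 2) = 1 - [P(X = 0) + P(X = 1)] for X ~ B(9, 0.4)
p_ge_2 = 1 - (binom_pmf(0, 9, 0.4) + binom_pmf(1, 9, 0.4))
print(round(p_ge_2, 2))  # 0.93
```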
EXPECTATION AND VARIANCE
► If the random variable X is such that X ~ B(n, p), then
► E(X) = np
► Var(X) = np(1-p)
Proof:
► E(X) = Σ_{x=0}^{n} x C(n, x) p^x (1 − p)^{n−x}
► Using x C(n, x) = n C(n−1, x−1), this becomes
E(X) = np Σ_{x=1}^{n} C(n−1, x−1) p^{x−1} (1 − p)^{n−x} = np [p + (1 − p)]^{n−1} = np.
► Similarly, E[X(X − 1)] = n(n − 1)p² Σ_{x=2}^{n} C(n−2, x−2) p^{x−2} (1 − p)^{n−x} = n(n − 1)p², so
Var(X) = E[X(X − 1)] + E(X) − [E(X)]² = n(n − 1)p² + np − (np)² = np(1 − p).
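The formulas E(X) = np and Var(X) = np(1 − p) can also be checked numerically against the pmf; here with n = 9 and p = 0.4, the values from the earlier example:

```python
from math import comb

n, p = 9, 0.4
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

# Mean and variance computed directly from the pmf.
mean = sum(x * q for x, q in enumerate(pmf))
var = sum((x - mean) ** 2 * q for x, q in enumerate(pmf))

print(mean, n * p)           # both ~3.6
print(var, n * p * (1 - p))  # both ~2.16
```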
A Poisson experiment is one that possesses
the following properties:
► The average number of successes, obtained in the
given time interval(or specified region) is known.
► The probability that a single success will occur
during a very short time interval or in a very small
region is proportional to the length of the time
interval or the size of the region and does not
depend on the number of successes occurring
outside this time interval or region.
► The probability of obtaining more than a single
success during such a short time interval or small
region is negligible.
The Poisson Distribution
► For µ = 2:
P(X = 0) = e^{−2} 2^0 / 0! = 0.1353
P(X = 1) = e^{−2} 2^1 / 1! = 0.2707
P(X = 2) = e^{−2} 2^2 / 2! = 0.2707
P(X = 3) = e^{−2} 2^3 / 3! = 0.1804
► For µ = 4:
P(X = 0) = e^{−4} 4^0 / 0! = 0.0183
P(X = 1) = e^{−4} 4^1 / 1! = 0.0733
P(X = 2) = e^{−4} 4^2 / 2! = 0.1465
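These Poisson probabilities are easy to reproduce; a minimal sketch:

```python
from math import exp, factorial

def pois_pmf(x, mu):
    # P(X = x) = e^{-mu} mu^x / x!
    return exp(-mu) * mu**x / factorial(x)

for mu in (2, 4):
    print(mu, [round(pois_pmf(x, mu), 4) for x in range(4)])
```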
SOLUTION(CONT’D)
► X = the number of claims it receives on a given day
► The mean, µ = 2/5 = 0.4
► P(X = 0) = e^{−0.4} (0.4)^0 / 0! = 0.6703
EXPECTATION AND VARIANCE
► If the random variable X ~ Po(µ), then
► E(X)=µ
► Var(X) = µ
Proof:
► E(X) = Σ_{x=0}^{∞} x e^{−µ} µ^x / x! = µ e^{−µ} Σ_{x=1}^{∞} µ^{x−1} / (x − 1)! = µ e^{−µ} e^{µ} = µ.
► E[X(X − 1)] = Σ_{x=2}^{∞} x(x − 1) e^{−µ} µ^x / x! = µ² e^{−µ} Σ_{x=2}^{∞} µ^{x−2} / (x − 2)! = µ².
► Var(X) = E[X(X − 1)] + E(X) − [E(X)]² = µ² + µ − µ² = µ.
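E(X) = Var(X) = µ can be checked numerically by truncating the series; here with µ = 0.4 from the claims example (truncation at 50 terms is an arbitrary but safe cutoff):

```python
from math import exp, factorial

mu = 0.4
# Truncate the series at 50 terms; the remaining tail is negligible.
pmf = [exp(-mu) * mu**x / factorial(x) for x in range(50)]

mean = sum(x * p for x, p in enumerate(pmf))
var = sum((x - mean) ** 2 * p for x, p in enumerate(pmf))
print(mean, var)  # both ~0.4
```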
Moment Generating function
EXAMPLE
► 1. Let X be the life length of an electron
tube and suppose that X is a continuous
random variable with p.d.f f(x) = b e^{−bx},
x ≥ 0. Let Pj = P(j ≤ X ≤ j + 1). Find Pj.
SOLUTION
P(j ≤ X ≤ j + 1) = ∫_j^{j+1} f(x) dx = ∫_j^{j+1} b e^{−bx} dx
= [−e^{−bx}]_j^{j+1}
= 1/e^{bj} − 1/e^{b(j+1)}
= (e^{−b})^j (1 − e^{−b}) = a^j (1 − a), where a = e^{−b}
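The result Pj = a^j (1 − a) can be sanity-checked numerically; the rate b = 0.5 below is an arbitrary choice:

```python
from math import exp

b = 0.5            # arbitrary rate for the check
a = exp(-b)

# P_j = a^j (1 - a) should match the integral e^{-bj} - e^{-b(j+1)},
# and the P_j should sum to 1 over j = 0, 1, 2, ...
P = [a**j * (1 - a) for j in range(200)]
print(round(sum(P), 6))  # 1.0
```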
Cumulative Distribution Function
► For a continuous random variable X with p.d.f f(x), the cumulative distribution function is F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt.
► Where the derivative exists, f(x) = F′(x).
Expected Value (or Expectation), E(X) of a
Random Variable
► For a continuous random variable X with p.d.f f(x), E(X) = ∫_{−∞}^{∞} x f(x) dx.
Expected value, E(X)
► For the electron-tube example with f(x) = b e^{−bx}, x ≥ 0:
E(X) = ∫_0^∞ x b e^{−bx} dx = 1/b (by integration by parts).
Variance, V(X)
► E(X²) = ∫_0^∞ x² b e^{−bx} dx = 2/b², so
V(X) = E(X²) − [E(X)]² = 2/b² − 1/b² = 1/b².
Probability Density Function of the
Normal Distribution
f(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)}, for −∞ < x < ∞

[Figure: normal density curves with Mean = 5 and Mean = 6]
[Figure: normal density curves with Variance = 0.0625 and Variance = 1]
where µ and σ² are the mean and variance. Show
that
(i) ∫_{−∞}^{∞} f(x) dx = 1
(ii) E(X) = µ
(iii) V(X) = σ²
Proof (i)
► The random variable X has a p.d.f defined by
f(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)}, −∞ < x < ∞,
where µ and σ are parameters.
Let t = (x − µ)/σ; then
I = ∫_{−∞}^{∞} f(x) dx = (1/√(2π)) ∫_{−∞}^{∞} e^{−t²/2} dt
Proof: (cont’d)
I² = [(1/√(2π)) ∫_{−∞}^{∞} e^{−x²/2} dx] [(1/√(2π)) ∫_{−∞}^{∞} e^{−y²/2} dy]
= (1/(2π)) ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{−(x² + y²)/2} dx dy
Letting x = r cos θ, y = r sin θ, so that x² + y² = r², the Jacobian of the transformation is r, and we have
I² = (1/(2π)) ∫_0^{2π} ∫_0^∞ e^{−r²/2} r dr dθ = (1/(2π)) ∫_0^{2π} [−e^{−r²/2}]_0^∞ dθ = (1/(2π)) ∫_0^{2π} 1 dθ = 1.
Since f(x) > 0, I > 0, and therefore I = 1.
Result to be noted
That is,
(1/√(2π)) ∫_{−∞}^{∞} e^{−t²/2} dt = 1
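This integral can be confirmed numerically; a sketch using the trapezoidal rule over [−8, 8] (the tails beyond contribute negligibly):

```python
from math import exp, pi, sqrt

def phi(t):
    # Standard normal density: (1/sqrt(2*pi)) e^{-t^2/2}
    return exp(-t * t / 2) / sqrt(2 * pi)

# Composite trapezoidal rule on [-8, 8].
n, lo, hi = 100_000, -8.0, 8.0
h = (hi - lo) / n
approx = h * ((phi(lo) + phi(hi)) / 2 + sum(phi(lo + i * h) for i in range(1, n)))
print(round(approx, 6))  # 1.0
```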
Proof (ii & iii)
► If the p.d.f of the random variable X is
f(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)}, −∞ < x < ∞,
then
E(X) = ∫_{−∞}^{∞} x f(x) dx = (1/(σ√(2π))) ∫_{−∞}^{∞} x e^{−(x−µ)²/(2σ²)} dx
SOLUTION
Let t = (x − µ)/σ; then dx/dt = σ, so dx = σ dt.
SOLUTION
E(X) = (1/√(2π)) ∫_{−∞}^{∞} (σt + µ) e^{−t²/2} dt
= (σ/√(2π)) ∫_{−∞}^{∞} t e^{−t²/2} dt + (µ/√(2π)) ∫_{−∞}^{∞} e^{−t²/2} dt
SOLUTION
t e^{−t²/2} is an odd function, therefore
∫_{−∞}^{∞} t e^{−t²/2} dt = 0
SOLUTION
(1/√(2π)) ∫_{−∞}^{∞} e^{−t²/2} dt = 1
SOLUTION
► Hence E(X) = σ·0 + µ·1 = µ, which proves (ii).
► For (iii): V(X) = E[(X − µ)²] = (σ²/√(2π)) ∫_{−∞}^{∞} t² e^{−t²/2} dt = σ², since integration by parts gives ∫_{−∞}^{∞} t² e^{−t²/2} dt = √(2π).
The Moment Generating Function
► The moment generating function of a random variable X is M_X(t) = E(e^{tX}); for a continuous random variable, M_X(t) = ∫_{−∞}^{∞} e^{tx} f(x) dx.
EXAMPLE
► e.g. Let X be a random variable such that
X ~ N(µ, σ²). Find the m.g.f of X.
SOLUTION
M_X(t) = ∫_{−∞}^{∞} e^{tx} f(x) dx = (1/(σ√(2π))) ∫_{−∞}^{∞} e^{tx} e^{−(x−µ)²/(2σ²)} dx

Let s = (x − µ)/σ, so that x = σs + µ and dx = σ ds. Then
SOLUTION
M_X(t) = (1/√(2π)) ∫_{−∞}^{∞} e^{σst + µt − s²/2} ds
= (e^{µt}/√(2π)) ∫_{−∞}^{∞} e^{σst − s²/2} ds
= (e^{µt}/√(2π)) ∫_{−∞}^{∞} e^{−½(s − σt)² + σ²t²/2} ds
= e^{µt + σ²t²/2} · (1/√(2π)) ∫_{−∞}^{∞} e^{−½(s − σt)²} ds
SOLUTION
Letting y = s − σt,
M_X(t) = e^{µt + σ²t²/2} · (1/√(2π)) ∫_{−∞}^{∞} e^{−y²/2} dy
and since (1/√(2π)) ∫_{−∞}^{∞} e^{−y²/2} dy = 1,
M_X(t) = e^{µt + σ²t²/2}
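The closed form can be compared against a direct numerical evaluation of E[e^{tX}]; the values µ = 1, σ = 2, t = 0.3 below are arbitrary choices:

```python
from math import exp, pi, sqrt

mu, sigma, t = 1.0, 2.0, 0.3   # arbitrary parameters for the check

# Closed form derived above: M_X(t) = e^{mu t + sigma^2 t^2 / 2}
closed = exp(mu * t + sigma**2 * t**2 / 2)

# Direct numerical evaluation of E[e^{tX}] by the trapezoidal rule.
def integrand(x):
    return exp(t * x) * exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

n, lo, hi = 200_000, mu - 12 * sigma, mu + 12 * sigma
h = (hi - lo) / n
numeric = h * ((integrand(lo) + integrand(hi)) / 2
               + sum(integrand(lo + i * h) for i in range(1, n)))
print(round(closed, 6), round(numeric, 6))
```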
THE GAMMA FUNCTION, denoted by Γ
Γ(p) = ∫_0^∞ x^{p−1} e^{−x} dx, p > 0
(i) Γ(p + 1) = p Γ(p)
(ii) If n is a positive integer, then Γ(n + 1) = n!
(iii) Γ(1/2) = √π.
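Properties (i)–(iii) can be checked with the standard library's gamma function:

```python
from math import gamma, factorial, pi, sqrt, isclose

# (i) Gamma(p + 1) = p * Gamma(p), checked at an arbitrary p = 4.3
assert isclose(gamma(5.3), 4.3 * gamma(4.3))

# (ii) Gamma(n + 1) = n! for positive integers n
for n in range(1, 10):
    assert isclose(gamma(n + 1), factorial(n))

# (iii) Gamma(1/2) = sqrt(pi)
print(gamma(0.5), sqrt(pi))
```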
D. PALMEER 113
THE GAMMA DISTRIBUTION
Let X be a continuous random variable with p.d.f given
by
f(x) = (λ^r / Γ(r)) x^{r−1} e^{−λx}, x > 0
     = 0, elsewhere
where λ > 0 and r > 0.
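That this is a valid p.d.f can be checked numerically; λ (written lam) and r below are arbitrary choices, assuming the rate parametrization of the density:

```python
from math import exp, gamma

lam, r = 2.0, 3.0   # arbitrary parameter choices for the check

def f(x):
    # Gamma p.d.f: lam^r x^{r-1} e^{-lam x} / Gamma(r), for x > 0
    return lam**r * x**(r - 1) * exp(-lam * x) / gamma(r)

# Composite trapezoidal rule on [0, 40]; the tail beyond is negligible.
n, hi = 200_000, 40.0
h = hi / n
total = h * ((f(0.0) + f(hi)) / 2 + sum(f(i * h) for i in range(1, n)))
print(round(total, 6))  # 1.0
```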