LECTURE NOTES 2 - Normal Distribution

PROBABILITY DISTRIBUTIONS

SPECIFIC OBJECTIVES

► Upon completion of the Unit, the students should be able to:
► 1.3 Define a discrete random variable.
► 1.4 Evaluate expected values and variances.
► 1.5 Define Bernoulli trials, Binomial and Poisson distributions.
► 1.6 Determine expected value and variance from the moment generating function.
► 1.7 Define the probability density function and cumulative distribution function involving normal, exponential, gamma and Weibull distributions.
RANDOM VARIABLE
► Definition: Let E be an experiment and S a sample space associated with E. A random variable, X, is a real-valued function defined on S.
► e.g. E: Toss two fair coins
S = { HH, HT, TH, TT }
For s in S, let X(s) = no. of heads in s.
X is a random variable defined on S.

HH → 2
HT → 1
TH → 1
TT → 0
► We will denote by RX the range of X. RX is the set of all possible values of X.
► Discrete Random Variable
► If RX, the range of X, is finite or countably infinite, X is called a discrete random variable.
► Probability function of a Discrete Random Variable
► Let X be a discrete random variable with the set of possible outcomes
► RX = { x1, x2, x3, ... }.
► The function p with p(xi) = P(X = xi), where p(xi) ≥ 0 and Σ p(xi) = 1, is called the probability function of X.
EXAMPLE
► E: Toss two fair dice and observe the total
of the numbers showing.
► RX = { 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}.
► Find the probability distribution.
TABLE OF OUTCOMES (sum of the numbers on the two dice)

 +  | 1   2   3   4   5   6
----+------------------------
 1  | 2   3   4   5   6   7
 2  | 3   4   5   6   7   8
 3  | 4   5   6   7   8   9
 4  | 5   6   7   8   9   10
 5  | 6   7   8   9   10  11
 6  | 7   8   9   10  11  12
SOLUTION

The probability distribution

x      :  2     3     4     5     6     7     8     9     10    11    12
P(X=x) : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
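A quick way to check this table is to enumerate the 36 equally likely outcomes directly; the following Python sketch (an illustration, not part of the original solution) does exactly that:

```python
from fractions import Fraction
from collections import Counter

# Enumerate all 36 equally likely outcomes of two fair dice
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

# Each outcome has probability 1/36; print P(X = x) for x = 2, ..., 12
for total in range(2, 13):
    print(total, Fraction(counts[total], 36))
```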
Expected Value (or Expectation) of a random variable

► For a discrete random variable X, E(X) = Σ xi P(X = xi).

Properties of Expected Value

► If X = c (where c is a constant), E(X) = c.
► E(kX) = kE(X), where k is a constant.
► For any two random variables X and Y we have
► E(X + Y) = E(X) + E(Y)
► E(X1 + X2 + … + Xn) = E(X1) + E(X2) + … + E(Xn)
► where X1, X2, …, Xn are random variables on the same sample space.
► If (X, Y) is a random variable with X and Y independent, then
► E(XY) = E(X)E(Y).
The Variance of a random variable, V(X)
► Definition: Let X be a random variable. The variance of X (denoted by V(X) or σ²) is given by
► V(X) = E[(X − E(X))²]
► The standard deviation of X = √V(X) = σ
► The variance is a measure of the “spread” or “dispersion” of the distribution of X relative to its expected value. The spread increases as the variance increases. E(X) can be thought of as the location of the centre of the distribution.
Effects of 2 on the Probability Density Function
of a Normal Random Variable
(Figure 6.9 (b))

0.4 Variance =
0.0625
0.3

0.2
Variance =
0.1 1

0.0

1.5 2.5 3.5 4.5 5.5 6.5 7.5 8.5


x
Properties of the variance

► V(X + c) = V(X), where c is a constant.
► V(cX) = c²V(X)
► If X and Y are independent, then
V(X + Y) = V(X) + V(Y)
► If X1, X2, …, Xn are n independent random variables, then
V(X1 + X2 + … + Xn) = V(X1) + V(X2) + … + V(Xn)
EXAMPLE
► Let X be the random variable ‘the sum of the numbers showing on the two dice’. Find the expectation and variance of X.
SOLUTION

The probability distribution

x      :  2     3     4     5     6     7     8     9     10    11    12
P(X=x) : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

E(X) = Σ x P(X = x) = 252/36 = 7
E(X²) = Σ x² P(X = x) = 1974/36 = 54.83
V(X) = E(X²) − [E(X)]² = 54.83 − 49 = 35/6 ≈ 5.83
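The expectation and variance can also be computed mechanically from the table; a minimal Python sketch (illustrative only) using exact fractions:

```python
from fractions import Fraction

# P(X = x) for the sum of two fair dice: counts 1, 2, ..., 6, ..., 2, 1 out of 36
pmf = {x: Fraction(6 - abs(x - 7), 36) for x in range(2, 13)}

EX = sum(x * p for x, p in pmf.items())        # E(X)
EX2 = sum(x * x * p for x, p in pmf.items())   # E(X^2)
VX = EX2 - EX**2                               # V(X) = E(X^2) - [E(X)]^2

print(EX, VX)   # 7 and 35/6 (about 5.83)
```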


BINOMIAL DISTRIBUTION
► A binomial experiment is one that possesses the following properties:
1. The trials are carried out a fixed number of times
n.
2. The outcome of each trial can be classified into
two ‘types’ conveniently named ‘success’ or
‘failure’.
3. The probability p of a success remains constant
for each trial.
4. The individual trials are independent of each
other.
In general:

► If the probability that an experiment results in a successful outcome is p, the probability that the outcome is a failure is 1 − p, and X is the random variable ‘the number of successful outcomes in n independent trials’, then the p.d.f. of X is given by

P(X = x) = C(n, x) p^x (1 − p)^(n − x),   where x = 0, 1, 2, ..., n


Proof:

Proof :(cont’d)

EXAMPLE
► In a box of floppy discs it is known that
60% will work. A sample of 10 of the discs
is selected at random. Find the probability
that
► (a) none of the sample will work,
► (b) less than 2 will work,
► (c) at least 3 will work.
ANSWER

► Let X = the number of floppy discs that will work
► n = 10, p = 0.60

P(X = x) = C(n, x) p^x (1 − p)^(n − x),   where x = 0, 1, 2, ..., n
Number line

► ______________________________________
► 0   1   2   3   4   5   6   7   8   9   10

P(X = x) = C(n, x) p^x (1 − p)^(n − x),   where x = 0, 1, 2, ..., n

► P(X = 0) = C(10, 0) (0.6)^0 (1 − 0.6)^10 = 0.0001
► P(X = 1) = C(10, 1) (0.6)^1 (1 − 0.6)^9 = 0.002
ANSWER
► P(X < 2) = P(X = 0) + P(X = 1)
► = 0.0001 + 0.002
► = 0.0021
► P(X ≥ 3) = P(X = 3) + P(X = 4) + P(X = 5) + P(X = 6) + P(X = 7) + P(X = 8) + P(X = 9) + P(X = 10)
► P(X ≥ 3) = 0.042 + 0.111 + 0.201 + 0.251 + 0.215 + 0.121 + 0.040 + 0.006 = 0.987
► OR
The complement rule

P(X = x) = C(n, x) p^x (1 − p)^(n − x),   where x = 0, 1, 2, ..., n

► P(X = 1) = C(10, 1) (0.6)^1 (1 − 0.6)^9 = 0.002
► P(X = 2) = C(10, 2) (0.6)^2 (1 − 0.6)^8 = 0.011
ANSWER

► P(X ≥ 3) = 1 − [P(X = 0) + P(X = 1) + P(X = 2)]
► = 1 − (0.0001 + 0.002 + 0.011)
► = 1 − 0.0131 = 0.9869
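These binomial probabilities can be checked with scipy.stats.binom (a sketch assuming SciPy is available; it is not part of the lecture notes):

```python
from scipy.stats import binom

n, p = 10, 0.60                 # 10 discs, each works with probability 0.6

print(binom.pmf(0, n, p))       # (a) P(X = 0)                ≈ 0.0001
print(binom.cdf(1, n, p))       # (b) P(X < 2) = P(X ≤ 1)     ≈ 0.0017
print(binom.sf(2, n, p))        # (c) P(X ≥ 3) = 1 - P(X ≤ 2) ≈ 0.9877
```

The small differences from the slide values (0.0021 and 0.987) come from rounding each term to three or four decimal places before adding.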
Example
►A study has shown that 40% of all families
in a large neighbourhood have at least one
computer. Find the probability that, among 9
families randomly selected in the
neighbourhood,
► fewer than four have at least one computer
► at least two have at least one computer.
Number line

► 0   1   2   3   4   5   6   7   8   9

P(X = x) = C(n, x) p^x (1 − p)^(n − x),   where x = 0, 1, 2, ..., n

► P(X = 0) = C(9, 0) (0.4)^0 (1 − 0.4)^9 = 0.010
► P(X = 1) = C(9, 1) (0.4)^1 (1 − 0.4)^8 = 0.060
► P(X = 2) = C(9, 2) (0.4)^2 (1 − 0.4)^7 = 0.161
► P(X = 3) = C(9, 3) (0.4)^3 (1 − 0.4)^6 = 0.251
ANSWER
► Let X = the number of families that have at least one computer
► n = 9, p = 0.40
► P(X < 4) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3)
► P(X < 4) = 0.010 + 0.060 + 0.161 + 0.251
► P(X < 4) = 0.482
ANSWER
► P(X ≥ 2) = P(X = 2) + P(X = 3) + P(X = 4) + P(X = 5) + P(X = 6) + P(X = 7) + P(X = 8) + P(X = 9)
► P(X ≥ 2) = 0.161 + 0.251 + 0.251 + 0.167 + 0.074 + 0.021 + 0.004 + 0.000
► P(X ≥ 2) = 0.929

► OR
The complement rule

ANSWER

► P( X ≥2) = 1-[P(X=0)+P(X=1)]
► = 1-(0.01+0.06)
► = 1-0.07
► = 0.93
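A corresponding SciPy check (illustrative only) of both answers, using the cumulative distribution and survival functions:

```python
from scipy.stats import binom

n, p = 9, 0.40                 # 9 families, each has a computer with probability 0.4

print(binom.cdf(3, n, p))      # P(X < 4) = P(X ≤ 3)     ≈ 0.483
print(binom.sf(1, n, p))       # P(X ≥ 2) = 1 - P(X ≤ 1) ≈ 0.930
```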
EXPECTATION AND VARIANCE
► If the random variable X is such that
X ~ B(n, p), then
► E(X) = np
► Var(X) = np(1-p)
Proof:

Proof(cont’d)

Proof: (cont’d)

Proof: (cont’d)


The Poisson Distribution

► Experiments yielding values of a random variable X, the number of successes obtained during a given time interval or in a given region, are often called Poisson experiments. A Poisson experiment is one which possesses the following properties:
► The average number of successes obtained in the given time interval (or specified region) is known.
► The probability that a single success will occur during a very short time interval or in a very small region is proportional to the length of the time interval or the size of the region, and does not depend on the number of successes occurring outside this time interval or region.
► The probability of obtaining more than a single success during such a short time interval or small region is negligible.
The p.d.f of the Poisson Distribution
► If X is the random variable ‘the number of successes obtained in a given time interval or space region’, the p.d.f. of X is given by

P(X = x) = e^(−µ) µ^x / x!,   where µ is the mean and x is the number of successes.
EXAMPLE
► An insurance company receives on average 2
claims per week from a certain factory. Assuming
that the number of claims follows a Poisson
distribution, find the probability that
► It receives more than three claims in a given week
► It receives more than 2 claims in a given fortnight
► It receives no claims on a given day
► (Assuming that the factory operates on a 5-day week.)
THE COMPLEMENT RULE



SOLUTION
► X = the number of claims it receives in a given week
► The mean, µ = 2

P(X = x) = e^(−µ) µ^x / x!,   where µ is the mean and x is the number of successes.

P(X > 3) = 1 − [P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3)]

► P(X = 0) = e^(−2) 2^0 / 0! = 0.1353
► P(X = 1) = e^(−2) 2^1 / 1! = 0.2707
► P(X = 2) = e^(−2) 2^2 / 2! = 0.2707
► P(X = 3) = e^(−2) 2^3 / 3! = 0.1804

P(X > 3) = 1 − [0.1353 + 0.2707 + 0.2707 + 0.1804]
         = 1 − 0.8571
P(X > 3) = 0.1429
► X = the number of claims it receives in a given fortnight
► The mean, µ = 2 × 2 = 4

P(X = x) = e^(−µ) µ^x / x!,   where µ is the mean and x is the number of successes.

► P(X = 0) = e^(−4) 4^0 / 0! = 0.0183
► P(X = 1) = e^(−4) 4^1 / 1! = 0.0733
► P(X = 2) = e^(−4) 4^2 / 2! = 0.1465

P(X > 2) = 1 − [0.0183 + 0.0733 + 0.1465] = 1 − 0.2381 = 0.7619
SOLUTION (CONT’D)

► X = the number of claims it receives on a given day
► The mean, µ = 2/5 = 0.4
► P(X = 0) = e^(−0.4) (0.4)^0 / 0! = 0.6703
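All three Poisson answers can be reproduced with scipy.stats.poisson (a sketch assuming SciPy; the means 2, 4 and 0.4 are taken from the solution above):

```python
from scipy.stats import poisson

print(poisson.sf(3, 2))        # (a) P(X > 3), mean 2 per week       ≈ 0.1429
print(poisson.sf(2, 4))        # (b) P(X > 2), mean 4 per fortnight  ≈ 0.7619
print(poisson.pmf(0, 0.4))     # (c) P(X = 0), mean 0.4 per day      ≈ 0.6703
```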
EXAMPLE

►A certain city has, on the average, 12


traffic deaths every three months.
Assuming that the number of traffic
deaths follows a Poisson distribution,
what is the probability that, in any
month, there will be
► fewer than three traffic deaths?
► more than two traffic deaths?
ANSWER
►X = the number of traffic deaths per
month
► The mean number of accidents per
month, µ=12/3 =4
► ___X__________________________
► 0   1   2   3   4   5   6 …
THE COMPLEMENT RULE

ANSWER
► P(X < 3) = P(X = 0) + P(X = 1) + P(X = 2) = e^(−4)[1 + 4 + 8] = 0.0183 + 0.0733 + 0.1465 = 0.2381
► P(X > 2) = 1 − P(X ≤ 2) = 1 − 0.2381 = 0.7619
EXPECTATION AND VARIANCE
► If the random variable X ~ Po(µ), then
► E(X)=µ
► Var(X) = µ
Proof:

Proof:(cont’d)

Proof:(cont’d)

Proof:(cont’d)

Proof:(cont’d)

Moment Generating function

► Definition: The moment generating


function of a distribution of a random
variable, X, is defined as follows.
► M(t) = E(e^(tX))
► In the discrete case
► M(t) = Σ e^(tx) f(x) = Σ e^(tx) P(X = x)

► where f(x) = P(X = x) is the p.m.f of X.


Properties of the Moment
Generating Function
► 1. Uniqueness of the m.g.f. If X and Y are two random variables with m.g.f.’s MX(t) and MY(t) respectively, then MX(t) = MY(t) if and only if X and Y have the same probability distribution.
► 2. d^n/dt^n [MX(t)] evaluated at t = 0 = MX^(n)(0) = E(X^n),
i.e. M^(n)(0) is the nth moment of X about zero.
Properties of the Moment
generating function
► 3. If X and Y are independent random
variables with m.g.f’s, MX(t) and MY(t)
respectively, then the m.g.f. of Z = X+Y is
given by MZ(t) = MX(t) MY(t)
EXAMPLE
► Let X be a random variable such that X ~ B(n, p). Find the m.g.f. of X.
SOLUTION
► Given X ~ B(n, p)
► The p.m.f is P(X = x) = C(n, x) p^x (1 − p)^(n − x),   where x = 0, 1, 2, ..., n
SOLUTION

SOLUTION

SOLUTION

SOLUTION(CONT’D)
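For X ~ B(n, p) the m.g.f. works out to MX(t) = (1 − p + p e^t)^n. A short SymPy sketch (illustrative; the concrete values n = 10 and p = 3/5 are assumptions made only for this check) confirms the form and recovers E(X) = np and V(X) = np(1 − p) from the derivatives at t = 0, as in property 2 above:

```python
import sympy as sp

t = sp.symbols('t')
n, p = 10, sp.Rational(3, 5)            # assumed concrete example: X ~ B(10, 0.6)

# M(t) = sum over x of e^(t x) C(n, x) p^x (1 - p)^(n - x)
M = sum(sp.exp(t*k) * sp.binomial(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# The sum collapses to the binomial-theorem form (1 - p + p e^t)^n
print(sp.expand(M - (1 - p + p*sp.exp(t))**n))     # 0

# Moments from derivatives at t = 0 (property 2 above)
EX  = sp.diff(M, t).subs(t, 0)
EX2 = sp.diff(M, t, 2).subs(t, 0)
print(sp.simplify(EX), sp.simplify(EX2 - EX**2))   # 6 and 12/5, i.e. np and np(1-p)
```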

Continuous Random Variable

► If RX, the range of X, is of the form (c, d) where c and/or d may be infinite, we call X a continuous random variable, e.g. RX = (−1, 1) or RX = (−∞, 0).
► A continuous random variable can assume any one of an infinitely large number of values, within certain limitations.
Probability Density Function (p.d.f)

A p.d.f. is a function f(x) ≥ 0 with ∫ f(x) dx = 1 such that P(a ≤ X ≤ b) = ∫ from a to b of f(x) dx.

Shaded Area is the Probability That X is Between a and b (Figure 6.3): the area under the density curve between x = a and x = b.
EXAMPLE
► 1. Let X be the life length of an electron tube and suppose that X is a continuous random variable with p.d.f. f(x) = b e^(−bx), x ≥ 0. Let Pj = P(j ≤ X ≤ j + 1). Find Pj.
SOLUTION

P(j ≤ X ≤ j + 1) = ∫_{j}^{j+1} f(x) dx = ∫_{j}^{j+1} b e^(−bx) dx
                 = [−e^(−bx)] evaluated from j to j + 1
                 = e^(−bj) − e^(−b(j+1))
                 = (e^(−b))^j (1 − e^(−b))
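A numerical check (illustrative; the values b = 0.5 and j = 3 are assumed examples) that the closed form agrees with direct integration of the p.d.f.:

```python
import math
from scipy.integrate import quad

b, j = 0.5, 3                      # assumed example values for the parameter and interval

# Direct numerical integration of f(x) = b e^(-bx) over [j, j+1]
numeric, _ = quad(lambda x: b * math.exp(-b * x), j, j + 1)

# Closed form derived above: P_j = (e^(-b))^j (1 - e^(-b))
closed = math.exp(-b)**j * (1 - math.exp(-b))

print(numeric, closed)             # both ≈ 0.0878
```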
Cumulative Distribution Function

The cumulative distribution function, F(x), for a


continuous random variable X expresses the
probability that X does not exceed the value of x,
as a function of x
F(x) = P(X ≤ x)
The Exponential Distribution
The exponential random variable T (t > 0) has probability density function

f(t) = λ e^(−λt)   for t > 0

where λ is the mean number of occurrences per unit time, t is the number of time units until the next occurrence, and e = 2.71828... Then T is said to follow an exponential probability distribution.
The cumulative distribution function is

F(t) = 1 − e^(−λt)   for t > 0

The distribution has mean 1/λ and variance 1/λ².
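In SciPy the exponential distribution is parameterised by scale = 1/λ rather than by λ itself; a short sketch (assuming SciPy, with λ = 2 as an arbitrary example) showing the density, the c.d.f., the mean and the variance:

```python
from scipy.stats import expon

lam = 2.0                          # assumed rate: 2 occurrences per unit time
T = expon(scale=1/lam)             # SciPy uses scale = 1/lambda

print(T.pdf(0.5))                  # f(0.5) = lam * e^(-lam*0.5) ≈ 0.7358
print(T.cdf(0.5))                  # F(0.5) = 1 - e^(-lam*0.5)   ≈ 0.6321
print(T.mean(), T.var())           # 1/lam = 0.5 and 1/lam^2 = 0.25
```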
The cumulative distribution function



Expected Value (or Expectation), E(X) of a Random Variable


Expected value, E(t)

Expected value, E(t)

Expected value, E(t) (cont’d)

Variance, V(t)

Variance, V(t)(cont’d)

Probability Density Function of the Normal Distribution

The probability density function for a normally distributed random variable X is

f(x) = (1 / √(2πσ²)) e^(−(x − µ)² / (2σ²))   for −∞ < x < ∞

where µ and σ² are any numbers such that −∞ < µ < ∞ and 0 < σ² < ∞, and where e and π are the mathematical constants e = 2.71828... and π = 3.14159...
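The density can be evaluated either from the formula or with scipy.stats.norm; a small sketch (the values µ = 5 and σ² = 1 are assumed for illustration) comparing the two:

```python
import math
from scipy.stats import norm

mu, sigma2 = 5.0, 1.0                          # assumed example mean and variance
sigma = math.sqrt(sigma2)

def f(x):
    # The normal p.d.f. exactly as written above
    return math.exp(-(x - mu)**2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

print(f(6.0))                                  # ≈ 0.2420
print(norm.pdf(6.0, loc=mu, scale=sigma))      # same value from SciPy
```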
Effects of µ on the Probability Density Function of a Normal Random Variable (Figure 6.9 (a)): two normal curves with the same variance, one with mean = 5 and one with mean = 6; changing µ shifts the curve along the x-axis without changing its shape.
Effects of 2 on the Probability Density Function of a
Normal Random Variable
(Figure 6.9 (b))

0.4 Variance =
0.0625
0.3

0.2
Variance =
0.1 1

0.0

1.5 2.5 3.5 4.5 5.5 6.5 7.5 8.5


x
EXAMPLE
The probability density function for a normally distributed random variable X is

f(x) = (1 / √(2πσ²)) e^(−(x − µ)² / (2σ²))   for −∞ < x < ∞

where µ and σ² are the mean and variance. Show that
(i) ∫ f(x) dx = 1
(ii) E(X) = µ
(iii) V(X) = σ²
Proof (i)
► The random variable X has a p.d.f defined by

f(x) = (1 / (σ√(2π))) e^(−(x − µ)² / (2σ²)),   −∞ < x < ∞

where µ and σ are parameters.

Show that

I = ∫_{−∞}^{∞} (1 / (σ√(2π))) e^(−(x − µ)² / (2σ²)) dx = 1.

Let

t = (x − µ) / σ

Then

I = ∫_{−∞}^{∞} (1 / √(2π)) e^(−t²/2) dt
Proof: (cont’d)

I² = [ (1/√(2π)) ∫_{−∞}^{∞} e^(−x²/2) dx ] [ (1/√(2π)) ∫_{−∞}^{∞} e^(−y²/2) dy ]

   = (1/(2π)) ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^(−(x² + y²)/2) dx dy

Letting

x = r cos θ,  y = r sin θ,  x² + y² = r²,

we have (the Jacobian of the transformation is r)

Proof: (cont’d)

I² = (1/(2π)) ∫_{0}^{2π} ∫_{0}^{∞} e^(−r²/2) r dr dθ = (1/(2π)) ∫_{0}^{2π} 1 dθ = 1,  so I = 1.
Result to be noted

That is

∫_{−∞}^{∞} (1/√(2π)) e^(−t²/2) dt = 1
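A numerical sanity check of this result (a sketch using scipy.integrate.quad; not part of the proof):

```python
import math
from scipy.integrate import quad

# Integrate the standard normal density over the whole real line
value, err = quad(lambda t: math.exp(-t**2 / 2) / math.sqrt(2 * math.pi),
                  -math.inf, math.inf)
print(value)        # 1.0 up to numerical error
```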
Proof (ii & iii)
► If the p.d.f of the random variable X is

f(x) = (1 / (σ√(2π))) e^(−(x − µ)² / (2σ²)),   −∞ < x < ∞

then

E(X) = ∫_{−∞}^{∞} x f(x) dx

E(X) = ∫_{−∞}^{∞} x (1 / (σ√(2π))) e^(−(x − µ)² / (2σ²)) dx
SOLUTION

Let
t = (x − µ) / σ

dx/dt = σ,  so dx = σ dt
SOLUTION

E(X) = (1/√(2π)) ∫_{−∞}^{∞} (σt + µ) e^(−t²/2) dt

     = (σ/√(2π)) ∫_{−∞}^{∞} t e^(−t²/2) dt + (µ/√(2π)) ∫_{−∞}^{∞} e^(−t²/2) dt
SOLUTION

t e^(−t²/2) is an odd function, therefore

∫_{−∞}^{∞} t e^(−t²/2) dt = 0
SOLUTION

(1/√(2π)) ∫_{−∞}^{∞} e^(−t²/2) dt = 1
SOLUTION

Hence E(X) = σ(0) + µ(1) = µ. A similar substitution in V(X) = E[(X − µ)²] = ∫ (x − µ)² f(x) dx gives V(X) = σ².
The Moment Generating Function


EXAMPLE
► e.g. Let X be a random variable such that X ~ N(µ, σ²). Find the m.g.f of X.
SOLUTION

MX(t) = ∫_{−∞}^{∞} e^(tx) f(x) dx

MX(t) = ∫_{−∞}^{∞} e^(tx) (1 / (σ√(2π))) e^(−(x − µ)²/(2σ²)) dx

      = (1 / (σ√(2π))) ∫_{−∞}^{∞} e^(tx − (x − µ)²/(2σ²)) dx
Let

s = (x − µ) / σ,  then

x = σs + µ,  dx/ds = σ
SOLUTION

Then MX(t) = (1/√(2π)) ∫_{−∞}^{∞} e^(σst + µt − s²/2) ds

           = (e^(µt)/√(2π)) ∫_{−∞}^{∞} e^(σst − s²/2) ds

           = (e^(µt)/√(2π)) ∫_{−∞}^{∞} e^(−(1/2)(s − σt)² + σ²t²/2) ds

           = (e^(µt + σ²t²/2)/√(2π)) ∫_{−∞}^{∞} e^(−(1/2)(s − σt)²) ds
SOLUTION

MX(t) = e^(µt + σ²t²/2) (1/√(2π)) ∫_{−∞}^{∞} e^(−y²/2) dy

      = e^(µt + σ²t²/2) (1)

MX(t) = e^(µt + σ²t²/2)
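A numerical check of this closed form (a sketch using SciPy with assumed example values µ = 1, σ = 2 and t = 0.3):

```python
import math
from scipy.integrate import quad

mu, sigma, t = 1.0, 2.0, 0.3            # assumed example values

# E[e^(tX)] by numerical integration of e^(tx) f(x) over the real line
mgf_numeric, _ = quad(
    lambda x: math.exp(t*x) * math.exp(-(x - mu)**2 / (2*sigma**2))
              / (sigma * math.sqrt(2*math.pi)),
    -math.inf, math.inf)

# Closed form just derived: M_X(t) = e^(mu*t + sigma^2 t^2 / 2)
mgf_closed = math.exp(mu*t + sigma**2 * t**2 / 2)

print(mgf_numeric, mgf_closed)          # both ≈ 1.616
```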
THE GAMMA FUNCTION, denoted by Γ

Γ(p) = ∫_{0}^{∞} x^(p−1) e^(−x) dx,   p > 0

(i) Γ(p + 1) = pΓ(p)
(ii) If n is a positive integer, then Γ(n + 1) = n!
(iii) Γ(1/2) = √π.
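Python's math.gamma implements Γ, so properties (i)–(iii) are easy to verify numerically (an illustrative sketch):

```python
import math

print(math.gamma(5), math.factorial(4))          # Γ(n + 1) = n!  ->  24.0 and 24
print(math.gamma(0.5), math.sqrt(math.pi))       # Γ(1/2) = √π    ->  both ≈ 1.7725
print(math.gamma(3.5), 2.5 * math.gamma(2.5))    # Γ(p + 1) = pΓ(p)
```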
THE GAMMA DISTRIBUTION
Let X be a continuous random variable with parameters λ > 0 and r > 0, and p.d.f. given by

f(x) = (λ^r / Γ(r)) x^(r−1) e^(−λx),   x > 0
f(x) = 0 elsewhere
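In SciPy this density is scipy.stats.gamma with shape a = r and scale = 1/λ; a sketch with assumed values r = 3 and λ = 2, comparing the formula above with the library:

```python
import math
from scipy.stats import gamma

r, lam = 3, 2.0                          # assumed shape and rate parameters
X = gamma(a=r, scale=1/lam)              # SciPy parameterises by scale = 1/lambda

x = 1.5
pdf_formula = lam**r / math.gamma(r) * x**(r - 1) * math.exp(-lam * x)

print(pdf_formula)                       # ≈ 0.4481
print(X.pdf(x))                          # same value from SciPy
print(X.mean(), X.var())                 # r/lambda = 1.5 and r/lambda^2 = 0.75
```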

