
Unit-9 IGNOU STATISTICS

- The document discusses three standard probability distributions: the geometric, negative binomial, and Poisson distributions.
- It defines the geometric distribution as the number of Bernoulli trials needed for the first success, where the probability of success on each trial is p. The mean and variance of the geometric distribution are 1/p and (1-p)/p^2, respectively.
- The document also provides examples of calculating probabilities using the geometric distribution and derives the moment generating function of the geometric distribution.



UNIT 9 STANDARD PROBABILITY DISTRIBUTIONS : PART II

Structure
9.1 Introduction
Objectives
9.2 The Geometric Distribution
9.3 The Negative Binomial Distribution
9.4 The Poisson Distribution
9.5 Summary
9.6 Solutions and Answers

9.1 INTRODUCTION

The standard probability distributions that we studied in Unit 8 are all distributions of r.vs. which assume a finite number of values. However, there are many situations of practical as well as theoretical interest which require the use of r.vs. whose values can be arranged in an unending sequence. The simplest such cases are of those r.vs. which assume the values 0, 1, 2, . . ., i.e., those which are non-negative, integer-valued r.vs.
The usual coin tossing experiment provides an example of this type. Suppose we toss a coin until a head turns up, and denote by X the number of tosses required for the purpose. Then X = 1, 2, . . ., and, in general, we cannot specify an upper bound k such that P[X ≤ k] = 1.
An obvious extension of the above example is the following. Suppose we decide to toss the
coin until a specified number, r say, of heads turn up. In this situation, the number X of
tosses required is r, r + 1, r + 2, . . . .
Although both these illustrations seem mainly to be of theoretical interest, they are useful in
many statistical and probabilistic problems of an advanced nature. Since they are concerned
with waiting times (number of trials) required for the first or r-th occurrence of a specific
event, the associated distributions are called waiting time distributions. We shall discuss
two simple waiting time distributions in this unit : the geometric distribution and the
negative binomial distribution.
The situation described below is of a different type. Nevertheless, it also leads to a r.v. with
infinitely many values.
A radioactive substance emits particles called α-particles. The number of α-particles emitted during a time interval of one hour, say, can be recorded by an instrument. The number X of such particles can be 0, 1, 2, . . . . The r.v. in this case follows the Poisson distribution.
In this unit we shall also be discussing the properties of the Poisson distribution.

Objectives
After reading this unit you should be able to :
define the geometric, negative binomial and Poisson distributions
calculate the mean and variance of these distributions
compute probabilities of events associated with these standard distributions.

9.2 THE GEOMETRIC DISTRIBUTION

In this section we'll discuss the geometric distribution. Let us see first how such a distribution arises.
Let p denote the probability of a success in a Bernoulli trial, 0 < p < 1. Consider independent repetitions of such a trial. Denote by X the number of trials required for the first success. Then the r.v. X takes the values 1, 2, 3, . . . and by definition

P[X = 1] = P[success at the first trial] = p.
In order to obtain P[X = j] for j ≥ 2, observe that the event [X = j] occurs iff the first j − 1 trials result in failures and the j-th trial is a success. The probability that we have the first (j − 1) failures followed by a success is

P[X = j] = (1 − p)^(j−1) p,
by virtue of independence of the repeated Bernoulli trials. The r.v. X here is said to have a geometric distribution. Here is the formal definition.

Definition 1 : A r.v. X is said to have the geometric distribution with parameter p, 0 < p < 1, if its p.m.f. is given by

P[X = j] = p(1 − p)^(j−1), j = 1, 2, . . .     . . . (1)
The distribution derives its name from the fact that P[X = j] is the j-th term of the geometric series

p + p(1 − p) + p(1 − p)^2 + . . .

For p ∈ ]0, 1[, the above infinite series is convergent and its sum is

p/[1 − (1 − p)] = 1,

which is what is required.


A slightly different way of arriving at the geometric distribution is to consider a sequence {Yn, n ≥ 1} of independent and identically distributed Bernoulli r.vs., such that

P[Yn = 1] = p, P[Yn = 0] = 1 − p

for all n ≥ 1. Identify Yn = 1 with success at the n-th Bernoulli trial and Yn = 0 with failure at the n-th trial. Then the event [X = j] is the same as the event [Y1 = 0, . . . , Yj−1 = 0, Yj = 1], and hence, by virtue of independence of the Yn s,

P[X = j] = P[Y1 = 0] P[Y2 = 0] . . . P[Yj−1 = 0] P[Yj = 1] = (1 − p)^(j−1) p.

(Recall that if a + ar + ar^2 + . . . is a convergent geometric series with |r| < 1, then its sum is a/(1 − r).)
Now let's see some examples of this distribution.

Example 1 : The probability is 0.70 that a candidate will pass an examination. Suppose we want to find the probability that he will pass the examination at the fourth attempt. Assuming that the successive attempts of the candidate are independent repetitions of a Bernoulli trial with p = 0.70, the required probability is

P[X = 4] = 0.70 (1 − 0.70)^3 = 0.0189.

Actually, the assumptions made in Example 1 are not very realistic. In particular, they imply that the candidate learns nothing from his first three failures.
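As a quick numerical check of Example 1, the p.m.f. (1) can be evaluated directly; this is only an illustrative sketch, and the function name geometric_pmf is our own choice.

```python
# Evaluating the geometric p.m.f. P[X = j] = p(1 - p)^(j-1) of Example 1.
def geometric_pmf(j, p):
    return p * (1 - p) ** (j - 1)

p = 0.70
print(geometric_pmf(4, p))                               # 0.70 * 0.30**3 = 0.0189
print(sum(geometric_pmf(j, p) for j in range(1, 200)))   # ~1.0: the p.m.f. sums to one
```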
Example 2 : Let {Yn, n = 1, 2, . . .} be a sequence of independent and identically distributed r.vs. (i.i.d. r.vs.), such that for all n ≥ 1,

P[Yn = 0] = 1/4, P[Yn = 1] = 1/4, P[Yn = 2] = 1/2.

So, each Yn can take the values 0, 1, 2. Consider a sequence of observed values of Y1, Y2, . . . Let X be the number of Yn s that need to be observed to obtain the first 0 in this sequence. Let us find the probability that X > 4.
We say that a success occurs at trial number n if Yn = 0. In view of the identical nature of the distribution of the Yn, the probability of a success at any trial is p = 1/4. The r.v. X therefore has the geometric distribution with p = 1/4. We need to compute

P[X > 4] = (1 − p)^4 = (3/4)^4 = 81/256 ≈ 0.32.

Now here are some simple exercises for you to solve.

E l ) Obtain the probability that in independent tosses of a balanced die, we will have to
wait for at least 5 tosses to obtain the first six.
E2) Cards are drawn at random and with replacement from a well-shuffled pack of 52
playing cards. Find the probability that the first ace will appear before the fifth
selection.

We shall now study the properties of the probability distribution of X specified by (1).
The following theorem gives the mean and variance of X.

Theorem 1 : If the r.v. X has geometric distribution with p.m.f. specified by (1), its mean and variance are

E(X) = 1/p,  Var(X) = (1 − p)/p^2.
Proof : By definition,

E(X) = Σ_(j=1)^∞ j p(1 − p)^(j−1) = p Σ_(j=1)^∞ j q^(j−1), where q = 1 − p.

To sum the infinite series S1 = Σ_(j=1)^∞ j q^(j−1), note that

S1 = d/dq ( Σ_(j=1)^∞ q^j ) = d/dq ( q/(1 − q) ) = 1/(1 − q)^2.

Hence S1 = 1/p^2 and therefore, E(X) = p S1 = 1/p.

The variance, Var(X), will be obtained by employing the familiar technique of writing

Var(X) = E[X(X − 1)] + E(X) − {E(X)}^2.

It is, therefore, enough to compute

E[X(X − 1)] = Σ_(j=2)^∞ j(j − 1) p q^(j−1) = pq S2, where S2 = Σ_(j=2)^∞ j(j − 1) q^(j−2).

In order to sum the infinite series S2, observe that

S2 = d/dq ( Σ_(j=1)^∞ j q^(j−1) ) = d/dq S1 = d/dq ( 1/(1 − q)^2 ) = 2/(1 − q)^3,

since we have already seen that S1 = 1/p^2 as a function of q.

Therefore, S2 = 2/p^3.
Thus, finally,

E[X(X − 1)] = pq S2 = 2(1 − p)/p^2,

and therefore,

Var(X) = 2(1 − p)/p^2 + 1/p − 1/p^2 = (1 − p)/p^2,

which completes the proof of the theorem.
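The conclusions of Theorem 1 are easy to check numerically by truncating the defining series at a large j; the snippet below is only an illustrative sketch with an arbitrarily chosen p.

```python
# Truncated-series check of E(X) = 1/p and Var(X) = (1 - p)/p^2.
p, J = 0.25, 2000                          # J is the truncation point; the tail is negligible
pmf = [p * (1 - p) ** (j - 1) for j in range(1, J + 1)]
mean = sum(j * pmf[j - 1] for j in range(1, J + 1))
var = sum(j * j * pmf[j - 1] for j in range(1, J + 1)) - mean ** 2
print(mean, 1 / p)                         # both ~4.0
print(var, (1 - p) / p ** 2)               # both ~12.0
```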


We now obtain the moment generating function of the geometric distribution. We'll use it in the next section while discussing the so-called negative binomial distribution.
Theorem 2 : Let X be a r.v. with geometric distribution specified by the p.m.f. (1). Its m.g.f. is

M_X(t) = p e^t/(1 − (1 − p) e^t),

valid for all t such that t < ln (1/(1 − p)).
Proof : By definition,

M_X(t) = E[e^(tX)] = Σ_(j=1)^∞ e^(tj) p(1 − p)^(j−1) = p e^t Σ_(j=1)^∞ [(1 − p) e^t]^(j−1)

= p e^t/(1 − (1 − p) e^t),

which is valid only if (1 − p)e^t < 1, or t < ln (1/(1 − p)). This is so, because only when t < ln (1/(1 − p)) is the infinite series defining M_X(t) absolutely convergent.
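Theorem 2 can also be verified numerically, comparing the truncated series for E[e^(tX)] with the closed form for a t inside the region of convergence; p and t below are arbitrary choices.

```python
import math

# Truncated series for E[e^{tX}] versus p*e^t / (1 - (1 - p)*e^t).
p, t = 0.4, 0.2                            # admissible since t < ln(1/0.6) ~ 0.51
series = sum(math.exp(t * j) * p * (1 - p) ** (j - 1) for j in range(1, 500))
closed = p * math.exp(t) / (1 - (1 - p) * math.exp(t))
print(series, closed)                      # the two values agree closely
```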

We conclude this section with an interesting property of the geometric distribution.


Let X be a geometric r.v. with parameter p. Then for any positive integer j,

P[X > j] = (1 − p)^j,

since X > j iff the first j trials all result in failures. Consider the event [X > j + k], where k is also a positive integer. Since X > j + k implies that X > j,

[X > j + k] ∩ [X > j] = [X > j + k].

Let us now evaluate the conditional probability that the waiting time for first success exceeds j + k, given that it exceeds j; i.e. we wish to evaluate P[X > j + k | X > j]. By definition,

P[X > j + k | X > j] = P[X > j + k]/P[X > j] = (1 − p)^(j+k)/(1 − p)^j = (1 − p)^k

= P[X > k].


Thus, we have shown that for all positive integers j and k

P[X > j + k | X > j] = P[X > k],

i.e., the conditional probability that the waiting time to first success exceeds j + k, given that it exceeds j, is the same as the probability that it exceeds k. In other words, the fact that we have waited for at least j trials for the first success does not affect the probability that we will have to wait for a further k trials. This property is therefore called the lack of memory property of the geometric distribution, or its forgetfulness property. In fact, the geometric distribution is the only distribution on the set of non-negative integers with the lack of memory property. This has important consequences in the study of more complicated systems of r.vs. called Markov chains. But we cannot go into its details in this course.
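Since P[X > n] = (1 − p)^n, the lack of memory property reduces to a one-line identity, which the following sketch (with arbitrary p, j, k) illustrates numerically.

```python
# Lack of memory: P[X > j + k | X > j] = P[X > k] for the geometric law.
p, j, k = 1 / 6, 3, 5

def tail(n):
    return (1 - p) ** n                    # P[X > n]: n initial failures

print(tail(j + k) / tail(j))               # conditional probability, ~0.4019
print(tail(k))                             # same value: (5/6)**5
```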
In this section we have seen how geometric distribution arises. We have also derived some
properties of this distribution. In particular, we have noted that this is the only distribution
with the forgetfulness property.
We'll take up the study of the negative binomial distribution in the next section.

9.3 THE NEGATIVE BINOMIAL DISTRIBUTION

This section discusses the properties of the so-called negative binomial distribution which is
a generalisation of the geometric distribution. You know that the geometric distribution
gives the distribution of the number of trials required to obtain the first success in
independent repetitions of a Bernoulli trial. Now suppose we want to find the distribution of
the number of trials required to obtain the r-th success in independent repetitions of a
Bernoulli trial with probability p of success at every trial. If X denotes this r.v., can you list
the values taken by X? X takes the values r, r + 1, . . . We wish to obtain P[X = j] for j ≥ r.
The event [X = j] occurs iff there are (r − 1) successes in the first (j − 1) trials and the j-th trial results in a success. In view of independence of the successive trials,
P[X = j] = P[There are (r − 1) successes in the first (j − 1) trials and the j-th trial results in a success]

= P[There are (r − 1) successes in the first (j − 1) trials] × P[The j-th trial results in a success].
Now, recall the argument which we used to find the probabilities related to the binomial distribution (Sec. 8.3). By a similar argument we get

P[There are (r − 1) successes in the first (j − 1) trials] = C(j − 1, r − 1) p^(r−1) (1 − p)^(j−r), j ≥ r.

Moreover,
P[The j-th trial results in a success] = p.
Hence,

P[X = j] = C(j − 1, r − 1) p^(r−1) (1 − p)^(j−r) p = C(j − 1, r − 1) p^r (1 − p)^(j−r), j = r, r + 1, . . .

This leads us to the following definition.


Definition 2 : A r.v. X has negative binomial distribution with parameters (r, p), r a positive integer and 0 < p < 1, if the p.m.f. of X is given by

f(j; r, p) = C(j − 1, r − 1) p^r (1 − p)^(j−r), j = r, r + 1, . . .     . . . (4)
Now let us verify that

Σ_(j=r)^∞ f(j; r, p) = 1

for all positive integral r and 0 < p < 1. We do this in Theorem 3. But before that we need some preparation.

You know that

C(n, j) = n(n − 1) . . . (n − j + 1)/j!     . . . (5)

stands for the number of ways of choosing j objects out of n, when n is a positive integer and j is a non-negative integer. We want to extend this definition to the case when n is replaced by any real number a, say, −∞ < a < ∞.

You will agree that the right side of (5) makes sense even if n is not a positive integer. We therefore define

C(a, j) = a(a − 1) . . . (a − j + 1)/j!, −∞ < a < ∞, j being a non-negative integer.     . . . (6)

The advantage of this extension is that we can write down the expansion

(1 + t)^a = Σ_(j=0)^∞ C(a, j) t^j,     . . . (7)

which is valid for all real a and −1 < t < 1. Formula (7) is known as Newton's binomial formula.

If a is a positive integer n, the right side of (7) consists of (n + 1) terms, since C(n, j) is zero for j > n. In fact, in this case, (7) is the usual binomial expansion of (1 + t)^n and is valid for all real t.

If a is not a positive integer, the right side of (7) is an infinite series which is convergent only for −1 < t < 1.
Now we first note that

C(j − 1, r − 1) = C(j − 1, j − r).

In this relation put k = j − r, so that k = 0, 1, 2, . . . and we have

C(r + k − 1, k) = (r + k − 1)(r + k − 2) . . . r/k! = (−1)^k (−r)(−r − 1) . . . (−r − k + 1)/k! = (−1)^k C(−r, k),     . . . (8)

writing the terms in the numerator in the reverse order.

We shall use this result in the proof of the following theorem.
We shall use this result in the proof of the following theorem.

Theorem 3 : The sum of the negative binomial probabilities f(j; r, p) is one, i.e.

Σ_(j=r)^∞ f(j; r, p) = 1.

Proof : Write q = 1 − p and j − r = k. Then using (8) we get

Σ_(j=r)^∞ f(j; r, p) = Σ_(k=0)^∞ C(r + k − 1, k) p^r q^k = p^r Σ_(k=0)^∞ C(−r, k) (−q)^k

= p^r (1 − q)^(−r), using (7)

= p^r p^(−r) = 1, since 1 − q = p.

The above discussion also brings out the fact that the negative binomial probabilities f(j; r, p), j ≥ r, are terms of the binomial expansion of p^r (1 − q)^(−r), which has a negative exponent. It is for this reason that the probability distribution specified by (4) is called the negative binomial distribution. It is also known as the Pascal distribution, after Blaise Pascal (1623-1662).
In the following examples you will see some situations where the r.v. has negative binomial distribution.

Example 3 : A proof-reader catches a misprint with probability 0.60. Let us find the probability that a total of ten misprints have occurred before our proof-reader catches his third misprint.
If our proof-reader catches a misprint, we'll term it a success! Here we want to find the probability that the third success occurs at the tenth trial, when p = 0.60. Hence with r = 3 and j = 10, the required probability is

f(10; 3, 0.60) = C(9, 2) (0.60)^3 (0.40)^7 ≈ 0.0127.
Example 4 : The probabilities of having a male or a female child are both 0.50. Can you find the probability that a family's fourth child is their second daughter?
Let us term the birth of a daughter a success. We have p = 1/2, and we need

f(4; 2, 1/2) = C(3, 1) (1/2)^2 (1/2)^2 = 3/16 ≈ 0.19.
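Both examples use the p.m.f. (4); the small helper below (our own naming, not from the unit) reproduces the two answers.

```python
from math import comb

# Negative binomial p.m.f. f(j; r, p) = C(j-1, r-1) p^r (1-p)^(j-r).
def neg_binomial_pmf(j, r, p):
    return comb(j - 1, r - 1) * p ** r * (1 - p) ** (j - r)

print(neg_binomial_pmf(10, 3, 0.60))       # Example 3: ~0.0127
print(neg_binomial_pmf(4, 2, 0.50))        # Example 4: 3/16 = 0.1875
```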

In the following discussion we evaluate the mean and variance of the negative binomial distribution with parameters (r, p).
Notice that the number X of trials required for the r-th success is the sum of r r.vs., Y1, Y2, . . ., Yr, where Y1 is the number of trials required for the first success, Y2 is the number of trials required, after the first success, to obtain the second success, and so on. In general, Yj is the number of trials between the (j − 1)-th and j-th success. Do you agree that Y1, Y2, . . . , Yr are independent r.vs. and that each has the geometric distribution with the same parameter p?
It follows from Theorem 1 that

E(Yj) = 1/p, Var(Yj) = (1 − p)/p^2.

Hence,

E(X) = E(Y1 + Y2 + . . . + Yr) = r/p,     . . . (10)

and

Var(X) = Var(Y1 + . . . + Yr) = r(1 − p)/p^2.     . . . (11)

Caution : The above discussion only indicates a method of derivation of E(X) and Var(X), and is not a formal proof of (10) and (11).
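A simulation sketch (illustrative only, with arbitrary r and p) supports the sum-of-geometrics argument:

```python
import random

# X = Y1 + ... + Yr, each Yi geometric(p); check mean r/p, variance r(1-p)/p^2.
def geometric_sample(p):
    n = 1
    while random.random() >= p:            # each trial fails with probability 1 - p
        n += 1
    return n

r, p, N = 3, 0.25, 100_000
xs = [sum(geometric_sample(p) for _ in range(r)) for _ in range(N)]
mean = sum(xs) / N
var = sum((x - mean) ** 2 for x in xs) / N
print(mean, r / p)                         # both ~12
print(var, r * (1 - p) / p ** 2)           # both ~36
```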
We are sure you will be able to solve the following exercises on the basis of our discussion
in this section.

E3) Find the probability that a person tossing an unbiased coin gets the fourth head on
the seventh toss.
E4) Find the probability that a person rolling an unbiased die gets his third six on the
eighth roll.
E5) A scientist inoculates several mice, one at a time, with a virus which produces a
disease in them. If each mouse has probability 1/4 of developing the disease, find the
expected number of mice required for an experiment in which the scientist stops after
obtaining the second mouse with the disease.
E6) Compute the moment generating function of the negative binomial distribution.
E7) Let X and Y be two independent r.vs. with negative binomial distributions and
parameters (r, p) and (s, p), respectively. Find the m.g.f. of X + Y.

So far, we have seen that the geometric distribution can be applied to situations where we
are interested in the number of trials needed for the first success. On the other hand, the
negative binomial distribution applies to situations in which our interest lies in the number
of trials required for r successes, where r is a positive integer. So what happens if we take
r = 1 in the negative binomial distribution? We get the geometric distribution, of course.
In the next section we take up one last discrete probability distribution - the Poisson distribution.

9.4 THE POISSON DISTRIBUTION

We describe below three real-life situations from three different areas. The first case is from meteorology, in which we are concerned with the frequency with which rainstorms occur. The second case is related to the frequency of wrong telephone connections, and the third is related to bacterial counts in different areas of a dish, called the Petri plate, which biologists use. We shall then describe their common features. These can be used to develop a probability distribution, called the Poisson distribution, in honour of the French mathematician Simeon D. Poisson (1781-1840), who studied it for the first time.
Case 1 : The table below is based on the records of 10 rainfall stations over a period of 33 years. Thus we have records for 10 × 33 = 330 station-years. This table gives the number of rainstorms, i.e. the number of 10-minute periods with more than 1 cm of rain.
Table 1 : Rainstorms

x          0     1     2    3    4    5
Frequency  102   114   74   28   10   2

Source : E.L. Grant (1964), Statistical Quality Control.


Here x is the number of rainstorms in a station-year and the corresponding frequency is the
number of station-years with x rainstorms.
(The concept of a station-year is similar to that of a man-hour: if three men work for 8 hours each, we say that they have worked for 3 × 8 = 24 man-hours.)

Case 2 : Table 2 shows the frequency distribution of telephone connections to a wrong number. A total of 267 telephones were observed.

Table 2 : Connections to wrong numbers

x          2   3   4    5    6    7    8    9    10
Frequency  1   5   11   14   22   43   31   40   35

x          11   12   13   14   15   16 or more
Frequency  20   18   12   7    6    2

Source : W. Feller (1972), An Introduction to Probability Theory and its Applications, Vol. 1.
Here x is the number of wrong telephone connections and the frequency gives the number of
telephones with x wrong connections.
Case 3 :Bacterial colonies develop over the surface of a Petri plate. The plate is divided
into a large number of small squares of equal area and observed under a microscope. The
bacterial colonies are visible as dark spots. The following table gives the observed number
(frequency) of squares with exactly x dark spots.
Table 3 : Bacterial Counts

x          0   1    2    3    4    5    6 or more
Frequency  5   19   26   26   21   13   8

Source : W. Feller (1972), An Introduction to Probability Theory and its Applications, Vol. 1.
On the face of it, there is very little similarity between these three cases. However, notice that in each case we have counted the number of times an event has occurred. The event concerned has many opportunities or trials when it could have occurred, but it had a very small chance of actually occurring at any single trial. Thus,

there are many 10-minute periods in a year, but it is very unlikely that any specific 10-minute interval would have a rainstorm.
there are many occasions when any one of the 267 telephones would be used, but the chance of a wrong connection can be expected to be small.
a Petri plate has a large number of small squares and it would be rare to find a bacterial colony in a specified square.
In other words, we can think of a large number, n, of independent Bernoulli trials with a small probability p of 'success' at each trial. Although n is large and p is small, we can expect the mean number np of successes to be a finite number. Thus, we are interested in the probability distribution of the number of successes in a large number n of independent Bernoulli trials, each with the same small chance p of success, such that np remains finite.
We know that the number of successes in n such independent trials follows a binomial distribution. So, the probability b(r; n, p) of r successes in n independent Bernoulli trials with constant probability p of success is

b(r; n, p) = C(n, r) p^r (1 − p)^(n−r).
But we want to find what happens when n is large and p is small. That is, we want to find the limit of b(r; n, p) as n → ∞ and p → 0, such that np equals m, say, where m is a positive number.
We can do this as follows :
We have p = m/n, and

b(r; n, p) = C(n, r) (m/n)^r (1 − m/n)^(n−r)

= (m^r/r!) [1 (1 − 1/n) (1 − 2/n) . . . (1 − (r − 1)/n)] (1 − m/n)^n (1 − m/n)^(−r).

The factor 1 (1 − 1/n) . . . [1 − (r − 1)/n] converges to 1 as n → ∞. Moreover, the term

(1 − m/n)^n → e^(−m), since lim_(n→∞) (1 − m/n)^n = e^(−m)

(Recall Unit 5, MTE-01), and

(1 − m/n)^(−r) → 1,

as n → ∞, r being kept fixed. The conclusion is that

b(r; n, p) → e^(−m) m^r/r! = p(r, m), say.     . . . (12)
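The convergence in (12) can be watched numerically; the sketch below (with arbitrary m and r) prints b(r; n, m/n) for growing n against the Poisson limit.

```python
from math import comb, exp, factorial

# b(r; n, m/n) approaches e^{-m} m^r / r! as n grows with m = np fixed.
m, r = 5.0, 2
for n in (10, 100, 1000, 10000):
    p = m / n
    print(n, comb(n, r) * p ** r * (1 - p) ** (n - r))
print("limit:", exp(-m) * m ** r / factorial(r))   # p(2, 5) ~ 0.0842
```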

There are two ways of looking at (12).

One is to treat p(r, m) as an approximation to b(r; n, p). In fact, we call p(r, m) the Poisson approximation to b(r; n, p).
Another way is to regard

p(r, m) = e^(−m) m^r/r!, r = 0, 1, 2, . . .

as the p.m.f. of a r.v. It is easy to verify that p(r, m) has all the qualifications to be a p.m.f., since

p(r, m) ≥ 0 and Σ_(r=0)^∞ p(r, m) = e^(−m) Σ_(r=0)^∞ m^r/r! = e^(−m) e^m = 1.

In this case we give the following definition.


Definition 3 : A r.v. X is said to have Poisson distribution with parameter m > 0, if its p.m.f. is

p(r, m) = e^(−m) m^r/r!, r = 0, 1, 2, . . .

Now let us compare the probabilities obtained by applying the binomial and Poisson distributions with the help of an example.

Example 5 : There are few printing mistakes in the material printed at a good press. In fact, the probability of a printing mistake is 0.01. Let us find the probability that in a text with 500 words, there are no mistakes.

Assuming that the conditions for the binomial distribution hold, the required probability is

b(0; 500, 0.01) = (0.99)^500 ≈ 0.0066.

Suppose we use the Poisson approximation for b(0; 500, 0.01). Since n = 500 and p = 0.01, we may take m = np = 5. Here,

p(0, 5) = e^(−5) ≈ 0.0067.

Notice that the difference is only in the fourth place of decimal.

The following table gives the values of b(r; 500, 0.01) for r = 0, 1, 2, 3, 4 and those of the corresponding Poisson approximations p(r, 5), for the same values of r.

Table 4 : Probability of r printing mistakes

r                                       0        1        2        3        4
Binomial distribution b(r; 500, 0.01)   0.0066   0.0335   0.0840   0.1408   0.1768
Poisson approximation p(r, 5)           0.0067   0.0335   0.0838   0.1396   0.1745

You would notice that the Poisson approximation is quite satisfactory. In fact, it would improve with larger values of n and smaller values of p. Generally speaking, the Poisson approximation to the binomial probabilities is satisfactory if n ≥ 20 and p ≤ 0.05.
In calculating the above probabilities we have used the recurrence relation for binomial probabilities. We have also used the following recurrence relation for the Poisson probabilities. We have

p(r + 1, m) = m p(r, m)/(r + 1),

which is valid for all r = 0, 1, 2, . . .
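Started from p(0, m) = e^(−m), this recurrence generates the whole Poisson column of Table 4; a sketch (note that Table 4's printed entries follow from starting the recurrence at the rounded value 0.0067):

```python
from math import exp

# Poisson probabilities via p(r + 1, m) = m * p(r, m) / (r + 1).
m = 5.0
prob = exp(-m)                             # p(0, m) = e^{-m}
for r in range(5):
    print(r, round(prob, 4))               # 0.0067, 0.0337, 0.0842, 0.1404, 0.1755
    prob = m * prob / (r + 1)
```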


Let us now obtain the mean and variance of the Poisson distribution.
Theorem 4 : If the r.v. X has Poisson distribution with parameter m, then

E(X) = m and Var(X) = m.

Proof : We have by definition

E(X) = Σ_(r=0)^∞ r e^(−m) m^r/r! = m e^(−m) Σ_(r=1)^∞ m^(r−1)/(r − 1)!.

Since Σ_(t=0)^∞ m^t/t! = e^m, it follows that

E(X) = m e^(−m) e^m = m.

The first step in the calculation of the variance is to compute

E[X(X − 1)] = Σ_(r=0)^∞ r(r − 1) e^(−m) m^r/r! = m^2 e^(−m) Σ_(r=2)^∞ m^(r−2)/(r − 2)!

= m^2 e^(−m) e^m = m^2.

Now recall that

Var(X) = E[X(X − 1)] + E(X) − {E(X)}^2

= m^2 + m − m^2 = m.

Thus, the results of the theorem are established.

So, the mean and variance of the Poisson distribution are always equal.
The next theorem gives us the m.g.f.

Theorem 5 : The moment generating function of the Poisson distribution is

M_X(t) = exp{m(e^t − 1)},

valid for all real t.

Proof : We have

M_X(t) = E[e^(tX)] = Σ_(r=0)^∞ e^(tr) e^(−m) m^r/r! = e^(−m) Σ_(r=0)^∞ (me^t)^r/r!

= exp{−m + me^t}

= exp{m(e^t − 1)},

as required.
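A truncated-series check of Theorem 5 (m and t below are arbitrary choices):

```python
from math import exp, factorial

# Truncated E[e^{tX}] versus the closed form exp{m(e^t - 1)}.
m, t = 2.5, 0.7
series = sum(exp(t * r) * exp(-m) * m ** r / factorial(r) for r in range(100))
print(series, exp(m * (exp(t) - 1)))       # the two values agree closely
```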

We can use this theorem to prove the additive property of variables with Poisson distribution.

Corollary : If X1 and X2 are independent Poisson r.vs with parameters m1 and m2, respectively, then X1 + X2 has Poisson distribution with parameter m1 + m2.

Proof : We first determine the p.m.f. of X1 + X2, i.e., P[X1 + X2 = k]. The event [X1 + X2 = k] is the union of the mutually exclusive events

[X1 = 0, X2 = k], [X1 = 1, X2 = k − 1], . . . , [X1 = k, X2 = 0].

Therefore, by independence of X1 and X2,

P[X1 + X2 = k] = Σ_(j=0)^k P[X1 = j] P[X2 = k − j] = Σ_(j=0)^k [e^(−m1) m1^j/j!] [e^(−m2) m2^(k−j)/(k − j)!]

= [e^(−(m1+m2))/k!] Σ_(j=0)^k C(k, j) m1^j m2^(k−j) = e^(−(m1+m2)) (m1 + m2)^k/k!,

using the binomial theorem. This shows that the p.m.f. of X1 + X2 is that of a Poisson r.v. with parameter (m1 + m2) and hence, X1 + X2 has a Poisson distribution with parameter (m1 + m2).
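The convolution in this proof is easy to confirm numerically for particular values (arbitrary m1, m2, k below):

```python
from math import exp, factorial

# sum_j p(j, m1) p(k - j, m2) should equal p(k, m1 + m2).
def poisson_pmf(r, m):
    return exp(-m) * m ** r / factorial(r)

m1, m2, k = 1.5, 2.0, 4
conv = sum(poisson_pmf(j, m1) * poisson_pmf(k - j, m2) for j in range(k + 1))
print(conv, poisson_pmf(k, m1 + m2))       # identical up to rounding
```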

We can easily extend this result to more than two variables.

Corollary : If X1, X2, . . . , Xn are independent Poisson variates with parameters m1, m2, . . . , mn, respectively, then the r.v. X1 + X2 + . . . + Xn has Poisson distribution with parameter m1 + m2 + . . . + mn.

We have seen that the Poisson distribution gives a very good approximation to the binomial distribution. The Poisson distribution can also arise in situations which have no direct connection with the binomial distribution. But we shall not discuss such situations here.

See if you can solve these exercises now.

E8) Records show that the probability that a train has an accident between two specific
stations is 0.0004. Use the Poisson approximation to the binomial probabilities to
obtain the probability that in its 700 trips during the year, the train would have at most
one accident.
E9) It is known that the number of imperfections per metre of a certain variety of cloth is a
Poisson r.v. with m = 0.12. Find the probability that ten metres of this cloth will have
a) four imperfections
b) at most three imperfections.
Hint :Use the second corollary to Theorem 5, assuming that imperfections over
non-overlapping portions of the cloth are independent.
E10) a) Compute the mean, m, for the frequency distribution given in Table 1.
b) Use this value of m to calculate the Poisson probabilities of 0, 1, . . . , 6 rainstorms.
c) The product N × p(r, m), where N = 330 is the total number of observations, gives the expected frequency based on the assumption that the numbers of rainstorms occur according to a Poisson distribution. Fill in the blanks in the following table :

Table 5 : Number of Rainstorms

x                    0     1     2    3    4    5
Observed frequency   102   114   74   28   10   2
Expected frequency   . . . . . . . . . . . . . .
E11) Fill in the blanks in the following tables on the assumption that the variables have Poisson distribution.

a)
Table 6 : Number of wrong connections

x        Observed frequency    Expected frequency
. . .    . . .                 . . .
Total    267

b)
Table 7 : Counts of bacteria

x                    0   1    2    3    4    5    6 or more
Observed frequency   5   19   26   26   21   13   8
Expected frequency   . . .

When you have done E10 and E11, you will find that the agreement between observed and
expected frequencies is quite good in all the three cases. You will study the methods of
comparing the observed and expected frequencies in more detail in Block 4 under the topic
"chi-square tests of goodness of fit".
Now let us summarise what we have done in this unit.

9.5 SUMMARY

In this unit we have covered the following main points :


1) The geometric distribution and the negative binomial distribution are two examples of
waiting time distributions. They are the distributions of the number of trials required
for the first and the r-th success in independent repetitions of Bernoulli trials. Thus,
the geometric distribution is a particular case (when r = 1) of the negative binomial
distribution. Moreover, the negative binomial distribution can be regarded as the
distribution of the sum of r independent and identically distributed geometric r.vs.
with parameter p.
2) The Poisson distribution is in a different class. It can be regarded as the limiting form
of a binomial distribution obtained by allowing n → ∞ and p → 0 such that np is
finite. This approach enables us to compute approximately the binomial probabilities.
3) The negative binomial and the Poisson distributions also possess the so-called
reproductive property :
If X1 and X2 are independent r.vs having negative binomial (Poisson) distributions with parameters (r1, p) and (r2, p) (respectively m1 and m2), then X1 + X2 also has the negative binomial (Poisson) distribution with parameters (r1 + r2, p) (respectively m1 + m2).
The standard distributions which we described in Units 8 and 9 are not the only discrete
distributions. There are many others with interesting properties which we hope you would
feel inclined to study in the future.

9.6 SOLUTIONS AND ANSWERS

E1) We need to obtain P[X ≥ 5] when X has the geometric distribution with p = 1/6. The required probability is

P[X ≥ 5] = (5/6)^4 ≈ 0.482.
E2) The required probability is

P[X ≤ 4] = 1/13 + (1/13)(12/13) + (1/13)(12/13)^2 + (1/13)(12/13)^3 ≈ 0.274.
E3) We need
f(7; 4, 1/2) = C(6, 3) (1/2)^7 = 20/128 ≈ 0.156.

E4) The required probability is

f(8; 3, 1/6) = C(7, 2) (1/6)^3 (5/6)^5 ≈ 0.039.
E5) We need E(X) when X has negative binomial distribution with r = 2, p = 1/4. The
answer is E(X) = r/p = 8.

E6) If X has the negative binomial distribution with parameters r and p, its m.g.f. is

M_X(t) = {p e^t/(1 − q e^t)}^r,

where q = 1 − p and t < ln{(1 − p)^(−1)}.
E7) The m.g.f. of X + Y is

M_(X+Y)(t) = M_X(t) M_Y(t) = {p e^t/(1 − q e^t)}^(r+s),

provided t < ln{(1 − p)^(−1)}. Thus X + Y has the negative binomial distribution with parameters (r + s, p).


E8) We have m = 0.0004 × 700 = 0.28 and we need

P[X ≤ 1] = e^(−0.28) (1 + 0.28) ≈ 0.967.
E9) In view of the corollary, the number X of imperfections in ten metres of the cloth has Poisson distribution with m = 10 × 0.12 = 1.2. Hence

a) p(4, 1.2) = e^(−1.2) (1.2)^4/4! ≈ 0.026,
b) P[X ≤ 3] = e^(−1.2) [1 + 1.2 + (1.2)^2/2 + (1.2)^3/6] ≈ 0.966.

E10) a) The mean is

m = (0 × 102 + 1 × 114 + 2 × 74 + 3 × 28 + 4 × 10 + 5 × 2)/330 = 396/330 = 1.2.

b), c) Number of Rainstorms

Number of rainstorms   0      1      2      3      4      5      6
Probability p(x, 1.2)  0.301  0.361  0.217  0.087  0.026  0.006  0.001
Expected frequency     99.4   119.3  71.6   28.6   8.6    2.1    0.4

E11) a)
Number of wrong connections

x        Observed frequency    Expected frequency
0-2      1                     . . .
. . .    . . .                 10.39
. . .    . . .                 . . .
Total    267                   267.00

b)
Counts of bacteria

x                    . . .
Observed frequency   . . .
Expected frequency   . . .
