
Probability

Introduction

Probability is defined as the likelihood of an event occurring. It is measured as the ratio of the number of favourable outcomes to the total number of possible outcomes. Note that the probability of an event always satisfies 0 ≤ P(x) ≤ 1.

Conditional probability
If E and F are two events associated with the same sample space of a
random experiment, the conditional probability of the event E given that
F has occurred, i.e. P (E|F) is given by

P(E|F) = P(E ∩ F)/P(F), provided P(F) ≠ 0
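
As an illustration, here is a minimal Python sketch (the helper name cond_prob and the die events are my own, not from the text) that computes P(E|F) by counting equally likely outcomes, using the fact that P(E ∩ F)/P(F) reduces to |E ∩ F|/|F|:

def cond_prob(E, F):
    # P(E|F) = P(E ∩ F)/P(F); with equally likely outcomes this is |E ∩ F|/|F|
    assert F, "P(F) must be nonzero"
    return len(E & F) / len(F)

# One roll of a fair die: E = 'even number', F = 'number greater than 3'
E = {2, 4, 6}
F = {4, 5, 6}
print(cond_prob(E, F))  # 0.666..., i.e. 2/3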

Properties of conditional probability


Property 1:
P(S|F) = P(F|F) = 1

Property 2:
If A and B are any two events of a sample space S and F is an event
of S such that P(F) is not equal to zero, then
P((A ∪ B)|F) = P(A|F) + P(B|F) – P((A ∩ B)|F)

Property 3:
P(E′|F) = 1 − P(E|F)

Example:

A family has two children. What is the probability that both the children are boys, given that at least one of them is a boy?

Solution

Let b stand for boy and g for girl. The sample space of the

experiment is

S = {(b, b), (g, b), (b, g), (g, g)}

Let E and F denote the following events :

E : ‘both the children are boys’

F : ‘at least one of the children is a boy’

Then E = {(b,b)} and F = {(b,b), (g,b), (b,g)}

Now

E ∩ F = {(b, b)}

Thus P(F) = 3/4 and P(E ∩ F) = 1/4

Therefore P(E|F) = P(E ∩ F)/P(F) = (1/4)/(3/4) = 1/3
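
The same answer can be checked by direct enumeration; a small Python sketch (the set representation is my own):

from fractions import Fraction

S = {("b", "b"), ("g", "b"), ("b", "g"), ("g", "g")}  # equally likely outcomes
E = {("b", "b")}                                      # both children are boys
F = {s for s in S if "b" in s}                        # at least one boy

print(Fraction(len(E & F), len(F)))  # 1/3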

Multiplication Theorem on Probability


Let E and F be two events associated with a sample space S.
Clearly, the set E ∩ F denotes the event that both E and F have
occurred. In other words, E ∩ F denotes the simultaneous
occurrence of the events E and F.
The event E ∩ F is also written as EF. According to this rule, if E
and F are events in a sample space, then
P(E ∩ F) = P(E) P(F|E) = P(F) P(E|F)
where P(E) ≠ 0 and P(F) ≠ 0


For more than two events, say three events E, F and G:

P(E ∩ F ∩ G) = P(E) P(F|E) P(G|(E ∩ F)) = P(E) P(F|E) P(G|EF)

Similarly, the multiplication rule of probability can be extended for

four or more events.
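
To see the extended rule in action, here is a brute-force check on a tiny four-card 'deck' (the deck and events are illustrative assumptions, not from the text):

from fractions import Fraction
from itertools import permutations

deck = ["K1", "K2", "A1", "A2"]            # two kings, two aces
S = list(permutations(deck, 3))            # all ordered three-card draws

E = {s for s in S if s[0][0] == "K"}       # first card is a king
F = {s for s in S if s[1][0] == "K"}       # second card is a king
G = {s for s in S if s[2][0] == "A"}       # third card is an ace

P = lambda A: Fraction(len(A), len(S))
# P(E) P(F|E) P(G|E ∩ F), with each conditional probability taken from counts
chain = P(E) * Fraction(len(E & F), len(E)) * Fraction(len(E & F & G), len(E & F))
print(P(E & F & G) == chain, chain)        # True 1/6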

Example:

Three cards are drawn successively, without replacement, from a pack of 52

well shuffled cards. What is the probability that first two cards are kings and

the third card drawn is an ace?

Solution
Let K denote the event that the card drawn is king and A be the event that

the card drawn is an ace. Clearly, we have to find P (KKA)


Now P(K) = 4/52

Also, P(K|K) is the probability of the second card being a king, given that one king has already been drawn. Now there are three kings in the remaining (52 − 1) = 51 cards.

Therefore P(K|K) = 3/51
Lastly, P(A|KK) is the probability of the third drawn card being an ace, given that two kings have already been drawn. Now there are four aces in the remaining 50 cards.

Therefore P(A|KK) = 4/50

By the multiplication law of probability, we have

P(KKA) = P(K) P(K|K) P(A|KK) = (4/52) × (3/51) × (4/50) = 2/5525
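
The arithmetic can be confirmed with exact fractions in Python:

from fractions import Fraction

p = Fraction(4, 52) * Fraction(3, 51) * Fraction(4, 50)
print(p)  # 2/5525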

Independent Events
Let E and F be two events associated with the same random experiment, then E

and F are said to be independent if

P(E ∩ F) = P(E) · P(F)
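
A quick numeric check of this condition (the die events are my own illustration):

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}       # one roll of a fair die
E = {2, 4, 6}                # even number
F = {1, 2, 3, 4}             # number at most 4

P = lambda A: Fraction(len(A), len(S))
print(P(E & F) == P(E) * P(F))  # True: E and F are independent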

Partition of a sample space


A set of events E1 , E2 , ..., En is said to represent a partition of the sample

space S if

(a) Ei ∩ Ej = φ, i ≠ j, i, j = 1, 2, 3, ..., n
(b) E1 ∪ E2 ∪ ... ∪ En = S and

(c) P(Ei ) > 0 for all i = 1, 2, ..., n.

In other words, the events E1, E2, ..., En represent a partition of the sample

space S if they are pairwise disjoint, exhaustive and have nonzero probabilities.

Theorem of total probability


Let {E1 , E2 ,...,En } be a partition of the sample space S, and suppose that each

of the events E1 , E2 ,..., En has nonzero probability of occurrence. Let A be any

event associated with S, then

P(A) = P(E1 ) P(A|E1 ) + P(E2 ) P(A|E2 ) + ... + P(En ) P(A|En )

Example:
A person has undertaken a construction job. The probabilities are 0.65 that

there will be strike, 0.80 that the construction job will be completed on time if

there is no strike, and 0.32 that the construction job will be completed on time

if there is a strike. Determine the probability that the construction job will be

completed on time.

Solution

Let A be the event that the construction job will be completed on time, and B

be the event that there will be a strike. We have to find P(A).

We have

P(B) = 0.65, P(no strike) = P(B′) = 1 − P(B) = 1 − 0.65 = 0.35

P(A|B) = 0.32, P(A|B′) = 0.80


Since events B and B′ form a partition of the sample space S, therefore, by

theorem on total probability, we have

P(A) = P(B) P(A|B) + P(B′) P(A|B′)

= 0.65 × 0.32 + 0.35 × 0.8

= 0.208 + 0.28 = 0.488

Thus, the probability that the construction job will be completed in time is

0.488.
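
A one-line check of the computation (variable names are illustrative):

p_strike = 0.65
p = p_strike * 0.32 + (1 - p_strike) * 0.80   # P(A) = P(B) P(A|B) + P(B') P(A|B')
print(round(p, 3))  # 0.488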

Bayes’ Theorem
If E1 , E2 ,..., En are n non empty events which constitute a partition of sample

space S, i.e. E1 , E2 ,..., En are pairwise disjoint and E1 ∪ E2 ∪ ... ∪ En = S and

A is any event of nonzero probability, then

P(Ei|A) = P(Ei) P(A|Ei) / [P(E1) P(A|E1) + P(E2) P(A|E2) + ... + P(En) P(A|En)], for any i = 1, 2, 3, ..., n
Proof:
By the formula of conditional probability, we know that

P(Ei|A) = P(A ∩ Ei)/P(A)

= P(Ei) P(A|Ei)/P(A) (by the multiplication rule of probability)

= P(Ei) P(A|Ei) / [P(E1) P(A|E1) + ... + P(En) P(A|En)] (by the theorem of total probability)
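
A minimal sketch of the theorem as code, assuming the hypotheses are given as lists of priors P(Ej) and likelihoods P(A|Ej) (the function name bayes and its arguments are my own, not from the text):

def bayes(priors, likelihoods, i):
    # P(Ei|A) = P(Ei) P(A|Ei) / sum over j of P(Ej) P(A|Ej)
    total = sum(p * l for p, l in zip(priors, likelihoods))  # P(A), by total probability
    return priors[i] * likelihoods[i] / total

# Two equally likely hypotheses with likelihoods 0.2 and 0.6:
print(bayes([0.5, 0.5], [0.2, 0.6], 1))  # 0.75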
Remark

The following terminology is generally used when Bayes' theorem is applied.

The events E1 , E2 , ..., En are called hypotheses.

The probability P(Ei) is called the a priori probability of the hypothesis Ei.

The conditional probability P(Ei|A) is called the a posteriori probability of the

hypothesis Ei .

Example:
Bag I contains 3 red and 4 black balls while another Bag II contains 5 red

and 6 black balls. One ball is drawn at random from one of the bags and it is

found to be red. Find the probability that it was drawn from Bag II.

Solution

Let E1 be the event of choosing the bag I, E2 the event of choosing the bag II

and A be the event of drawing a red ball.

Then P(E1) = P(E2) = 1/2

Also P(A|E1) = P(drawing a red ball from Bag I) = 3/7

and P(A|E2 ) = P(drawing a red ball from Bag II) = 5/11

Now, the probability of drawing a ball from Bag II, being given that it is red, is

P(E2 |A)
By using Bayes' theorem, we have
P(E2|A) = P(E2) P(A|E2) / [P(E1) P(A|E1) + P(E2) P(A|E2)]

= (1/2 × 5/11) / (1/2 × 3/7 + 1/2 × 5/11)

= 35/68
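
Verifying the answer with exact fractions:

from fractions import Fraction

prior = Fraction(1, 2)
post = (prior * Fraction(5, 11)) / (prior * Fraction(3, 7) + prior * Fraction(5, 11))
print(post)  # 35/68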

Example:
Suppose that the reliability of an HIV test is specified as follows: of people having HIV, the test detects the disease in 90% of cases, while 10% go undetected. Of people free of HIV, 99% are judged HIV−ive by the test, but 1% are diagnosed as HIV+ive. From a large population of which only 0.1% have HIV, one person is selected at random, given the HIV test, and the pathologist reports him/her as HIV+ive. What is the probability that the person actually has HIV?

Solution
Let E denote the event that the person selected is actually having HIV and A the

event that the person's HIV test is diagnosed as +ive. We need to find P(E|A).

Also E′ denotes the event that the person selected is actually not having HIV.

Clearly, {E, E′} is a partition of the sample space of all people in the population.

We are given that

P(E) = 0.1% = 0.001


P(E′) = 1 – P(E) = 0.999
P(A|E) = P(Person tested as HIV+ive given that he/she is actually having HIV)
= 90% = 0.9
P(A|E′) = P(Person tested as HIV +ive given that he/she is actually not having HIV)

= 1% = 0.01

Now, by Bayes' theorem

P(E|A) = P(E) P(A|E) / [P(E) P(A|E) + P(E′) P(A|E′)]

= (0.001 × 0.9) / (0.001 × 0.9 + 0.999 × 0.01)

= 0.083 approx.

Thus, the probability that a person selected at random is actually having HIV

given that he/she is tested HIV+ive is 0.083.
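
The same computation in Python:

num = 0.001 * 0.9                      # P(E) P(A|E)
post = num / (num + 0.999 * 0.01)      # divide by P(A)
print(round(post, 3))  # 0.083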

Random Variable
A random variable is a real valued function whose domain is the sample space of

a random experiment.

For example, let us consider the experiment of tossing a coin two times in

succession.

The sample space of the experiment is S = {HH, HT, TH, TT}.

If X denotes the number of heads obtained, then X is a random variable and for

each outcome, its value is as given below :

X(HH) = 2, X (HT) = 1, X (TH) = 1, X (TT) = 0.


More than one random variable can be defined on the same sample space. For

example, let Y denote the number of heads minus the number of tails for each

outcome of the above sample space S. Then Y(HH) = 2, Y (HT) = 0, Y (TH) = 0, Y

(TT) = – 2. Thus, X and Y are two different random variables defined on the

same sample space S.
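
As a sketch, both random variables can be written as functions on S (the dict representation is my own illustration):

S = ["HH", "HT", "TH", "TT"]
X = {s: s.count("H") for s in S}                  # number of heads
Y = {s: s.count("H") - s.count("T") for s in S}   # heads minus tails
print(X)  # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
print(Y)  # {'HH': 2, 'HT': 0, 'TH': 0, 'TT': -2}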

Probability Distribution of a Random Variable


The probability distribution of a random variable X is the system of numbers

X :    x1   x2   x3   ...   xn
P(X) : p1   p2   p3   ...   pn

where pi > 0 for all i = 1, 2, ..., n, and p1 + p2 + ... + pn = 1.

The real numbers x1, x2, x3,…xn are the possible values of the random variable X,

and p1, p2, p3, …pn are the probabilities of the random variable X that takes the

value xi.

Therefore, P(X = xi) = pi.

(Note: The sum of all the probabilities in the probability distribution should be

equal to 1)
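
For the coin-tossing example above, the distribution of X can be tabulated in Python (a sketch; the dict layout is my own):

from collections import Counter
from fractions import Fraction

S = ["HH", "HT", "TH", "TT"]                       # equally likely outcomes
counts = Counter(s.count("H") for s in S)
dist = {x: Fraction(c, len(S)) for x, c in counts.items()}
print(dist)                      # P(X=2) = 1/4, P(X=1) = 1/2, P(X=0) = 1/4
print(sum(dist.values()) == 1)   # True: the probabilities sum to 1
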
Mean of a Random Variable
If X is a random variable, and its possible values are x1, x2, x3,…xn associated with

the probabilities p1, p2, p3, ..., pn, respectively, then the mean of the random

variable X is given by the formula:


E(X) = μ = Σ xi pi, where the sum runs over i = 1, 2, ..., n

The mean µ of the random variable X is also called the expectation of X, written E(X).

The mean of the random variable X can also be represented by

E(X) = x1p1 + x2p2 + x3p3 + ... + xnpn

Thus, the mean or the expectation of the random variable X is defined as the

sum of the products of all possible values of X by their respective probability

values.
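
For the two-toss example above, this works out as a short Python check:

from fractions import Fraction

values = [0, 1, 2]                                    # values of X
probs = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]  # their probabilities
mean = sum(x * p for x, p in zip(values, probs))      # E(X) = sum of xi * pi
print(mean)  # 1: on average one head in two tosses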
