
UNIT 7: PROBABILITY THEORY

7.1. Introduction.

The origin of probability theory lies in physical observations associated with games of chance. It was found that if an “unbiased” coin is tossed independently n times, where n is very large, the relative frequency of heads, that is, the ratio of the number of heads to the total number of tosses, is very likely to be very close to ½. Similarly, if a card is drawn from a perfectly shuffled deck and then replaced, the deck is reshuffled, and the process is repeated over and over again, there is (in some sense) convergence of the relative frequency of spades to ¼.
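
To make this frequency stabilisation concrete, here is a small Python simulation (an illustrative sketch; the function name and sample sizes are arbitrary choices, not part of the course text):

```python
import random

def relative_frequency_of_heads(n, seed=0):
    """Toss a fair coin n times and return the fraction of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n))
    return heads / n

for n in (10, 100, 10_000, 1_000_000):
    print(n, relative_frequency_of_heads(n))
# As n grows, the printed ratios tend to stabilise near 0.5.
```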
Ancient accounts suggest that probability theory goes back to primitive forms of gambling and gambling games. Girolamo Cardano (1501-1576) claimed that, nearly 2000 years earlier, Roman soldiers had invented many of our current games of chance as a pastime during their campaigns to conquer most of the civilized world. Other authors place the origins of the theory of probability in 17th-century France, in gambling problems and in the curiosity of players who put their questions to friends in the world of mathematics (the correspondence between Pascal and Fermat). In the 17th century Jacob Bernoulli, a member of a Swiss family of mathematicians, established many of the basic laws of modern probability. Thomas Bayes (1702-1761) and Joseph Lagrange are also counted among the pioneers of the theory of probability.
The so-called “classical definition” is due to Laplace, who in his book "Théorie analytique des probabilités" (1812) defines the probability of an event as the number of outcomes favorable to the event divided by the total number of outcomes, where all outcomes are equally likely. This definition is, first of all, restrictive (it considers only experiments with a finite number of outcomes) and, more seriously, circular (no matter how you look at it, “equally likely” means “equally probable”, so we are using the concept of probability to define probability itself). The axiomatic construction of the calculus of probabilities, on the other hand, is due to Kolmogorov (1933).
Today the theory of probability plays an important role in many business matters. Insurance and actuarial practice have a strong foundation in its principles. For example, life insurance premiums depend on mortality tables, which in turn are based on the probability of death at a specific age. Probability is also applied to estimating the number of defective units in manufacturing processes, the likelihood of collecting accounts receivable, sales potential, and so on.
Probability theory is now a part of mathematics, analogous to algebra or geometry, and its construction is therefore similar. The construction of a mathematical theory starts from a set of assertions, called axioms, from which a succession of statements, called theorems, is deduced by logic. The axioms are chosen so as to idealize reality. The phenomena that can be studied, associated with the realization of an experiment (a well-defined action that produces a unique and well-defined result), are of varied types, but a simple classification of them, which is also of great interest for statistics, distinguishes between deterministic and random phenomena.
Probability measures the amount of uncertainty of an event: a fact whose
occurrence is uncertain.
Consider, as an example, the event R “Tomorrow, January 16th, it will rain in
Cartagena”. The occurrence of R is difficult to predict — we have all been victims of
wrong forecasts made by the “weather channel” — and we quantify this uncertainty
with a number P(R), called the probability of R. It is common to assume that this
number is non-negative and it cannot exceed 1. The two extremes are interpreted
as the probability of the impossible event: P(R) = 0, and the probability of the sure
event: P(R) = 1. Thus, P(R) = 0 asserts that the event R will not occur while, on
the other hand, P(R) = 1 asserts that R will occur with certainty.

Suppose now that you are asked to quote the probability of R, and your answer is P(R) = 0.7.
There are two main interpretations of this number. Under the first, the ratio 0.7 represents the odds in favor of R: this is the subjective probability, which measures your personal belief in R. The objective probability interprets P(R) = 0.7 as a relative frequency. Suppose, for instance, that in the last ten years it rained 7 times on January 16th. Then 0.7 = 7/10 is the relative frequency of occurrences of R, also given by the ratio between the favorable cases (7) and all possible cases (10).
There are other interpretations of P(R) = 0.7 arising, for instance, from logic or psychology.

7.2 Terminology for probability theory:

_ Experiment: process of observation or measurement; e.g., coin flip;

_ Outcome: result obtained through an experiment; e.g., coin shows tails;

_ Sample space: set of all possible outcomes of an experiment; e.g., sample space for a coin flip: Ω = {H, T}.

In general, the only physical requirement on Ω is that a given performance of the experiment must produce a result corresponding to exactly one of the points of Ω. We have as yet no mathematical requirements on Ω; it is simply a set of points.

Sample spaces can be finite or infinite.

Example: Finite Sample Space

Roll two dice, each with numbers 1–6. Sample space:

S1 = {(x, y) / x ∈ {1, 2, …, 6}, y ∈ {1, 2, …, 6}}

Alternative sample space for this experiment (the sum of the dice):

S2 = {x + y / x ∈ {1, 2, …, 6}, y ∈ {1, 2, …, 6}} = {z / z ∈ {2, …, 12}}

Example: Infinite Sample Space

Flip a coin until heads appears for the first time:

S3 = {H, TH, TTH, TTTH, TTTTH, …}

Next we come to the notion of event. An event associated with a random experiment corresponds to a question about the experiment that has a yes or no answer, and this in turn is associated with a subset of the sample space. Often we are not interested in individual outcomes, but in events. An event is a subset of the sample space.

Example

With respect to S1, describe the event B of rolling a total of 7 with the two dice.

B = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}
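
These finite sample spaces are small enough to enumerate directly; the following Python sketch (an illustration, with variable names of our own choosing) builds S1 and the event B:

```python
from itertools import product

# Sample space S1 for rolling two dice: all ordered pairs (x, y).
S1 = set(product(range(1, 7), repeat=2))     # 36 outcomes

# Event B: "the total of the two dice is 7", a subset of S1.
B = {(x, y) for (x, y) in S1 if x + y == 7}

print(len(S1))     # 36
print(sorted(B))   # [(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)]
```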


Often we are interested in combinations of two or more events.

This can be represented using set theoretic operations.

Assume a sample space Ω and two events A and B:

_ Complement Ā (also written Aᶜ): all elements of Ω that are not in A;

Ā ≡ Aᶜ = {ω ∈ Ω / ω ∉ A}

_ Subset A ⊂ B: all elements of A are also elements of B;

_ Union A ∪ B: all elements of Ω that are in A or in B;

A ∪ B = {ω ∈ Ω / ω ∈ A or ω ∈ B}

_ Intersection A ∩ B: all elements of Ω that are in A and in B;

A ∩ B = {ω ∈ Ω / ω ∈ A and ω ∈ B}

These operations can be represented graphically using Venn diagrams:


There are two singular events: the impossible event ∅, which never occurs, and the sure event Ω, which always occurs (P(Ω) = 1).

De Morgan's laws show the relationship between these operations:

* (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ

* (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ

_ Difference of two events A and B (A − B): A occurs and B does not;

A − B = {ω ∈ Ω / ω ∈ A and ω ∉ B} = A ∩ Bᶜ

[Venn diagram: the shaded region represents A − B]

_ Mutually exclusive (disjoint) events: A ∩ B = ∅;

[Venn diagram: disjoint events A and B]
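
All of these operations have direct counterparts for finite sets; a minimal Python sketch (our own example, assuming a single-die sample space) illustrates them and checks De Morgan's laws:

```python
# Assumed example: Ω is the sample space for a single die roll.
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # "even number"
B = {4, 5, 6}   # "number greater than 3"

print(A | B)          # union A ∪ B
print(A & B)          # intersection A ∩ B
print(omega - A)      # complement Aᶜ
print(A - B)          # difference A − B = A ∩ Bᶜ

# De Morgan's laws hold for these sets:
assert omega - (A | B) == (omega - A) & (omega - B)
assert omega - (A & B) == (omega - A) | (omega - B)

# Mutually exclusive (disjoint) events have empty intersection:
print({1, 3} & {2, 4})   # set(), i.e. the impossible event ∅
```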


7.3 Sigma Field

We now consider the assignment of probabilities to events. A technical complication arises here: it may not always be possible to regard all subsets of Ω as events. We may discard or fail to measure some of the information in the outcome corresponding to the point ω ∈ Ω, so for a given subset A of Ω it may not be possible to give a yes or no answer to the question “Is ω ∈ A?”. For example, if the experiment involves tossing a coin five times, we may record the results of only the first three tosses, so that A = (at least four heads) will not be “measurable”; that is, membership of ω in A cannot be determined from the given information about ω.
Often we are not interested in every event of an experiment, but only in some of them. For instance, two players betting on odd and even outcomes of a die are not interested in the outcome “3” as such, but only in whether the result is odd or even.
In a given problem there will be a particular class of subsets of Ω, called the “class of events”. For reasons of mathematical consistency we require that the event class S form a sigma field, which is a collection of subsets of Ω satisfying the following three requirements:

a) Ω ∈ S;

b) S is closed under complementation: A ∈ S ⇒ Aᶜ ∈ S;

c) S is closed under finite or countable unions: if Aᵢ ∈ S for i = 1, 2, …, then ∪ᵢ Aᵢ ∈ S.
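
For a finite sample space these requirements can be checked mechanically; the following Python sketch (an illustration assuming Ω is finite, with a helper function of our own naming) tests whether a collection of subsets is a sigma field:

```python
from itertools import combinations

def is_sigma_field(omega, subsets):
    """Check requirements a)-c) for a collection of subsets of a finite omega.

    For a finite sample space, closure under countable unions reduces to
    closure under pairwise unions.
    """
    S = {frozenset(a) for a in subsets}
    if frozenset(omega) not in S:                          # a) Ω ∈ S
        return False
    if any(frozenset(omega - a) not in S for a in S):      # b) closed under complement
        return False
    return all(a | b in S for a, b in combinations(S, 2))  # c) closed under unions

omega = {1, 2, 3, 4, 5, 6}
odd_even = [set(), {1, 3, 5}, {2, 4, 6}, omega]    # the "odd / even" class of events
print(is_sigma_field(omega, odd_even))             # True
print(is_sigma_field(omega, [set(), {1}, omega]))  # False: complement of {1} is missing
```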
We are now ready to talk about the assignment of probabilities to events. If A ∈ S, the probability P(A) should somehow reflect the long-run relative frequency of A in a large number of independent repetitions of the experiment. Thus P(A) should be a number between 0 and 1, and P(Ω) = 1.
Now if A and B are disjoint events, the number of occurrences of A ∪ B in n performances of the experiment is obtained by adding the number of occurrences of A to the number of occurrences of B. Thus we should have

P(A ∪ B) = P(A) + P(B) if A and B are disjoint

and similarly,

P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = P(A₁) + P(A₂) + … + P(Aₙ) = Σᵢ P(Aᵢ) if the Aᵢ, i = 1, 2, …, n, are disjoint.

For mathematical convenience we require countable additivity: if the events Aᵢ ∈ S, i = 1, 2, …, are pairwise disjoint (Aᵢ ∩ Aⱼ = ∅ for all i ≠ j), then

P(∪ᵢ Aᵢ) = Σᵢ P(Aᵢ)

7.4. Axiomatic definition of Probability.

This definition is due to the Russian mathematician A.N. Kolmogorov.

A function that assigns a number P(A) to each set A in the sigma field S is called a probability measure on S, provided that the following conditions (the Kolmogorov axioms) are satisfied:

i) P(A) ≥ 0 for all A ∈ S;

ii) P(Ω) = 1;

iii) if the Aᵢ ∈ S, i = 1, 2, …, are pairwise disjoint (Aᵢ ∩ Aⱼ = ∅ for all i ≠ j), then P(∪ᵢ Aᵢ) = Σᵢ P(Aᵢ).
7.5. Probability space.

A probability space is a triple (Ω, S, P), where Ω is a set, S is a sigma field of subsets of Ω, and P is a probability measure on S.
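
As a concrete illustration (our own example), the two-dice experiment yields a finite probability space: Ω has 36 points, S may be taken as all subsets of Ω, and P is the uniform measure P(A) = |A|/|Ω|. A minimal Python sketch:

```python
from fractions import Fraction
from itertools import product

# Ω: the 36 ordered pairs for two fair dice; S: all subsets of Ω;
# P: the uniform probability measure P(A) = |A| / |Ω|.
omega = set(product(range(1, 7), repeat=2))

def P(event):
    """Uniform probability measure on the two-dice sample space."""
    assert event <= omega, "events must be subsets of the sample space"
    return Fraction(len(event), len(omega))

B = {w for w in omega if sum(w) == 7}   # total is 7
print(P(B))       # 1/6
print(P(omega))   # 1, as required by axiom ii)
print(P(set()))   # 0 (see Theorem 1 below)
```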

7.6. Elementary theorems, or consequences of the axioms.

We shall not, at this point, embark on a general study of probability measures. However, we shall establish a few basic facts from the definition (all sets in the arguments that follow are assumed to belong to S).

Theorem 1.- An event that never occurs (the impossible event) has
probability 0: P( ∅ ) = 0.

Proof.
Consider the collection of events Aᵢ = ∅ ∈ S for all i = 1, 2, … These sets are disjoint, so by iii),

P(∪ᵢ Aᵢ) = P(∅) = Σᵢ P(Aᵢ) = Σᵢ P(∅)

and consequently

P(∅) = Σᵢ P(∅)

By i), the unique solution is P(∅) = 0.

Theorem 2.- If A, B ∈ S and A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B).

Proof.

Consider the following collection of sets: A₁ = A; A₂ = B; Aᵢ = ∅ for all i > 2. This collection satisfies the conditions of axiom iii), so

P(∪ᵢ Aᵢ) = Σᵢ P(Aᵢ)

and, applying Theorem 1 to the terms P(∅),

P(A ∪ B) = P(A₁ ∪ A₂) = P(A₁) + P(A₂)

Generalization: let Aᵢ ∈ S, i = 1, 2, …, n, with Aᵢ ∩ Aⱼ = ∅ for all i ≠ j. Then:

P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = Σᵢ P(Aᵢ)

Theorem 3.- Let A, B ∈ S with A ⊆ B. Then P(A) ≤ P(B).

Proof.
Since A ⊆ B, we can write B = A ∪ (B − A). The events A and B − A are disjoint, so applying Theorem 2 we obtain:

P(B) = P(A) + P(B − A)

On the other hand, by i), P(B − A) ≥ 0, so:

P(A) ≤ P(B)


Intuitively, if the occurrence of A always implies the occurrence of B, then B must occur at least as often as A in any sequence of performances of the experiment.

Theorem 4.- Let A, B ∈ S. Then P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

Proof.

We can write A ∪ B = (A − B) ∪ (B − A) ∪ (A ∩ B), where these three events are disjoint, and also

A = (A − B) ∪ (A ∩ B)
B = (B − A) ∪ (A ∩ B)

Applying Theorem 2 (and its generalization) we obtain:

P(A ∪ B) = P(A − B) + P(B − A) + P(A ∩ B)    (1)

and we also know (2):

P(A) = P(A − B) + P(A ∩ B)
P(B) = P(B − A) + P(A ∩ B)

Substituting (2) into (1) we obtain P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

Theorem 5.- (Inclusion-exclusion) Let A₁, A₂, …, Aₙ ∈ S. Then:

P(A₁ ∪ A₂ ∪ … ∪ Aₙ) = Σᵢ P(Aᵢ) − Σ_{k1<k2} P(Ak1 ∩ Ak2) + Σ_{k1<k2<k3} P(Ak1 ∩ Ak2 ∩ Ak3) − … + (−1)ⁿ⁺¹ P(A₁ ∩ A₂ ∩ … ∩ Aₙ)

Theorem 6.- Let A ∈ S. Then P(A) = 1 − P(Aᶜ).

Proof.: A and Aᶜ are disjoint and A ∪ Aᶜ = Ω, so applying Theorem 2:

P(A ∪ Aᶜ) = P(A) + P(Aᶜ) = P(Ω) = (by ii) = 1. So P(A) = 1 − P(Aᶜ).
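
These elementary theorems can be verified numerically on any finite probability space; the sketch below (our own illustration, reusing the uniform two-dice measure) checks Theorems 3, 4 and 6:

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))            # two fair dice
P = lambda event: Fraction(len(event), len(omega))     # uniform measure

A = {w for w in omega if w[0] % 2 == 0}   # first die shows an even number
B = {w for w in omega if sum(w) == 7}     # total is 7

# Theorem 4 (addition rule):
assert P(A | B) == P(A) + P(B) - P(A & B)

# Theorem 6 (complement rule):
assert P(omega - A) == 1 - P(A)

# Theorem 3 (monotonicity): A ∩ B ⊆ B, so P(A ∩ B) ≤ P(B).
assert P(A & B) <= P(B)
print("Theorems 3, 4 and 6 hold on this finite space")
```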

Remarks: The basic difficulty with the classical and frequency definitions of probability is that their approach is to try somehow to prove mathematically that, for example, the probability of picking a heart from a perfectly shuffled deck is ¼, or that the probability of an unbiased coin coming up heads is ½. This cannot be done. All we can say is that if a card is picked at random and then replaced, and the process is repeated over and over again, the result that the ratio of hearts to the total number of drawings will be close to ¼ is in accord with our intuition and our physical experience. For this reason we should assign a probability of ¼ to the event of obtaining a heart, and similarly we should assign a probability of 1/52 to each possible outcome of the experiment. The only reason for doing this is that the consequences agree with our experience. If you decided that some mysterious force caused the ace of spades to be more likely than any other card, you could incorporate this factor by assigning a higher probability to the ace of spades. The mathematical development of the theory would not be affected; however, the conclusions you might draw from this assumption would be at variance with experimental results.
In probability theory we are faced with situations in which our intuition or some physical experiments we have carried out suggest certain results. Intuition and experience lead us to an assignment of probabilities to events. As far as the mathematics is concerned, any assignment of probabilities will do, subject to the rules of mathematical consistency. However, our hope is to develop mathematical results that, when interpreted and related to physical experience, will help to make precise
such notions as “the ratio of the number of heads to the total number of
observations in a very large number of independent tosses of an unbiased coin is
very likely to be very close to ½”.

7.7. Independence.

Consider the following experiment. A person is selected at random and his height is recorded. After this, the last digit of the license number of the next car to pass is noted. If A is the event that the height is over 6 feet, and B is the event that the digit is > 7, then, intuitively, A and B are “independent” in the sense that knowledge about the occurrence or nonoccurrence of one of the events should not influence the odds about the other. For example, say that P(A) = 0.2 and P(B) = 0.3. In a long sequence of trials we would expect the following situation:

(Roughly) 20% of the time A occurs; of those cases in which A occurs:
    30% B occurs
    70% B does not occur
80% of the time A does not occur; of these cases:
    30% B occurs
    70% B does not occur

Thus, if B is independent of A, it appears that P(A ∩ B) should be 0.2 × 0.3 = 0.06 = P(A)P(B), and P(Aᶜ ∩ B) should be 0.8 × 0.3 = 0.24 = P(Aᶜ)P(B).
Conversely, if P(A ∩ B) = P(A)P(B) = 0.06 and P(Aᶜ ∩ B) = P(Aᶜ)P(B) = 0.24, then A occurs roughly 20% of the time and, if we look only at the cases in which A occurs, B must occur in roughly 30% of these cases in order to have A ∩ B occur 6% of the time. Similarly, if we look at the cases in which A does not occur (80%), then, since we are assuming that Aᶜ ∩ B occurs 24% of the time, we must have B occurring in 30% of these cases.
Thus the odds about B are not changed by specifying the occurrence or non-occurrence of A.
It appears that we should say that event B is independent of A if and only if P(A ∩ B) = P(A)P(B) and P(Aᶜ ∩ B) = P(Aᶜ)P(B). However, the second condition is already implied by the first:
If P(A ∩ B) = P(A)P(B), then P(Aᶜ ∩ B) = P(B − A) = P(B − (A ∩ B)) = P(B) − P(A ∩ B), since A ∩ B is a subset of B; hence P(Aᶜ ∩ B) = P(B) − P(A)P(B) = (1 − P(A))P(B) = P(Aᶜ)P(B).
Thus B is independent of A (that is, knowledge of A does not influence the odds about B) if and only if P(A)P(B) = P(A ∩ B). But this condition is perfectly symmetrical; in other words, B is independent of A if and only if A is independent of B.
Thus we are led to the following definition:

Two events A and B are independent if and only if P(A∩B)= P(A)P(B)

Generalization: a family of events {Aᵢ} is independent if and only if, for every finite subcollection Ai1, …, Aik,

P(Ai1 ∩ … ∩ Aik) = P(Ai1) · … · P(Aik)
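
On a finite space the definition can be checked by direct computation; in the following sketch (our own choice of events on the two-dice space) A and B turn out to be independent while A and C are not:

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))
P = lambda e: Fraction(len(e), len(omega))

A = {w for w in omega if w[0] > 4}          # first die shows 5 or 6
B = {w for w in omega if w[1] % 2 == 0}     # second die is even
C = {w for w in omega if sum(w) >= 10}      # total is at least 10

print(P(A & B) == P(A) * P(B))   # True: A and B are independent
print(P(A & C) == P(A) * P(C))   # False: a high first die makes a high total more likely
```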

7.8. Conditional probability.

If A and H are independent events, the occurrence or nonoccurrence of H does not influence the odds concerning the occurrence of A. If A and H are not independent, it would be desirable to have some way of measuring exactly how much the occurrence of one of the events changes the odds about the other.
In a long sequence of independent repetitions of the experiment, P(H) measures the fraction of the trials on which H occurs. If we look only at the trials on which H occurs (say there are nH of these) and record those trials on which A also occurs (there are nAH of these, where nAH is the number of trials on which both A and H occur), the ratio nAH/nH is a measure of P(A/H), the conditional probability of A given H, that is, the fraction of the time that A occurs, looking only at trials producing an
occurrence of H. Comparing P(A/H) with P(A) will indicate the difference between
the odds about A when H is known to have occurred, and the odds about A before
any information about H is revealed.
The above discussion suggests that we define the conditional probability of A given H as

P(A / H) = P(A ∩ H) / P(H)

This makes sense if P(H) > 0.

Example: Throw two unbiased dice independently. Let H = (sum of faces = 8) and A = (faces are equal). Then

P(A / H) = P(A ∩ H) / P(H) = (1/36) / (5/36) = 1/5

since H = {(2,6), (3,5), (4,4), (5,3), (6,2)} contains five equally likely outcomes and A ∩ H = {(4,4)} contains only one.
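
The same computation can be done mechanically on the two-dice space; a minimal Python sketch (with a helper function of our own naming) gives:

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))
P = lambda e: Fraction(len(e), len(omega))

def P_given(A, H):
    """Conditional probability P(A / H) = P(A ∩ H) / P(H); requires P(H) > 0."""
    return P(A & H) / P(H)

H = {w for w in omega if sum(w) == 8}     # sum of faces is 8
A = {w for w in omega if w[0] == w[1]}    # faces are equal
print(P_given(A, H))   # 1/5
print(P(A))            # 1/6: knowing H has changed the odds about A
```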

Consequences and theorems derived from the definition.

Theorem 7.- Let A, B ∈ S. Then:

If P(B) ≠ 0, then P(A ∩ B) = P(B) P(A / B)

If P(A) ≠ 0, then P(A ∩ B) = P(A) P(B / A)

Generalization: Theorem 8.- (Multiplication rule)

Let A₁, A₂, …, Aₙ ∈ S with P(A₁ ∩ A₂ ∩ … ∩ Aⱼ) > 0 for j = 1, 2, …, n − 1. Then:

P(A₁ ∩ A₂ ∩ … ∩ Aₙ) = P(A₁) P(A₂ / A₁) P(A₃ / A₁ ∩ A₂) … P(Aₙ / A₁ ∩ … ∩ Aₙ₋₁)

Theorem 9.- (Theorem of total probability)

Let A₁, A₂, …, Aₙ ∈ S satisfy:

i) P(Aᵢ) > 0, i = 1, …, n; ii) A₁ ∪ A₂ ∪ … ∪ Aₙ = Ω; iii) Aᵢ ∩ Aⱼ = ∅ for all i ≠ j;

i.e. a finite or countably infinite family of mutually exclusive and exhaustive events (they are disjoint and their union is the sample space).

Let A ∈ S be any event. Then:

P(A) = Σᵢ P(Aᵢ) P(A / Aᵢ)

Proof. We can write:

A = A ∩ Ω = (by ii)) = A ∩ (A₁ ∪ … ∪ Aₙ) = (A ∩ A₁) ∪ … ∪ (A ∩ Aₙ)

Then:

P(A) = P((A ∩ A₁) ∪ … ∪ (A ∩ Aₙ)) = (by iii)) = Σᵢ P(A ∩ Aᵢ) = (by Theorem 7) = Σᵢ P(Aᵢ) P(A / Aᵢ)
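
As an illustration (our own example), partitioning the two-dice space by the value of the first die and applying the theorem recovers P(sum = 7) = 1/6:

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))
P = lambda e: Fraction(len(e), len(omega))
P_given = lambda A, H: P(A & H) / P(H)

# Partition of Ω by the value of the first die: mutually exclusive and exhaustive.
partition = [{w for w in omega if w[0] == i} for i in range(1, 7)]

A = {w for w in omega if sum(w) == 7}   # total is 7

total = sum(P(Ai) * P_given(A, Ai) for Ai in partition)
print(total, P(A))   # both equal 1/6, as the theorem predicts
```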


Notice that under the above assumptions we have:

Bayes' Theorem.
Let A₁, A₂, …, Aₙ ∈ S with:

i) P(Aᵢ) > 0, i = 1, …, n;

ii) A₁ ∪ A₂ ∪ … ∪ Aₙ = Ω;

iii) Aᵢ ∩ Aⱼ = ∅ for all i ≠ j;

i.e. a finite or countably infinite family of mutually exclusive and exhaustive events.

Let B ∈ S with P(B) > 0. Then:

P(Aⱼ / B) = P(Aⱼ) P(B / Aⱼ) / Σₖ P(Aₖ) P(B / Aₖ)

Proof.:

P(Aⱼ / B) = P(Aⱼ ∩ B) / P(B) = (multiplication rule) = P(Aⱼ) P(B / Aⱼ) / P(B) = (theorem of total probability) = P(Aⱼ) P(B / Aⱼ) / Σₖ P(Aₖ) P(B / Aₖ)
The probabilities P(Aⱼ / B) are sometimes called “a posteriori probabilities”. The reason for this terminology may be seen in the example below:

Example: Two coins are available, one unbiased and the other two-
headed. Choose a coin at random and toss it once; assume that the unbiased
coin is chosen with probability ¾. Given that the result is heads, find the
probability that the two-headed coin was chosen.

The experiment can be represented by a “tree diagram”: the first branching corresponds to the coin chosen (unbiased with probability 3/4, two-headed with probability 1/4) and the second to the result of the toss.


We may take Ω to consist of the four possible paths through the tree, with each path assigned a probability equal to the product of the probabilities assigned to its branches. Notice that we are given the probabilities of the events B1 = (unbiased coin chosen) and B2 = (two-headed coin chosen), as well as the conditional probabilities P(A/Bi), where A = (coin comes up heads). This is sufficient to determine the probabilities of all events.
Now we can compute P(B2/A) using Bayes' Theorem; this is facilitated if, instead of trying to identify the individual terms in the formula, we simply look at the tree and write:

P(B2/A) = P(B2 ∩ A)/P(A) = P(two-headed coin chosen and coin comes up heads) / P(coin comes up heads)

= (1/4)(1) / [(3/4)(1/2) + (1/4)(1)] = 2/5
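
The same calculation, written as exact fractions in Python (an illustrative sketch; the dictionary keys are labels of our own choosing):

```python
from fractions import Fraction

# Priors for the coin chosen and likelihoods of heads, as exact fractions.
prior = {"unbiased": Fraction(3, 4), "two_headed": Fraction(1, 4)}
p_heads_given = {"unbiased": Fraction(1, 2), "two_headed": Fraction(1, 1)}

# Theorem of total probability: P(heads) = Σ P(Bi) P(heads / Bi).
p_heads = sum(prior[b] * p_heads_given[b] for b in prior)

# Bayes' theorem: P(two-headed coin / heads).
posterior = prior["two_headed"] * p_heads_given["two_headed"] / p_heads
print(p_heads)    # 5/8
print(posterior)  # 2/5
```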

There are many situations in which an experiment consists of a sequence of steps, and the conditional probabilities of events happening at step n+1, given the outcomes at step n, are specified. In such cases a description by means of a tree diagram may be very convenient.

Bibliography

* Mª Ángeles Palacios, Fernando A. López Hernández, José García Córdoba and Manuel Ruiz Marín. "Introducción a la Estadística para la Empresa". Librería Escarabajal.

* Hermoso Gutiérrez, J. A. and Hernández Bastida, A. (1997). Curso Básico de Estadística Descriptiva y Probabilidad. Ed. Némesis.
