Probability

• Probability is the chance that something will happen.
• In probability theory, an event is one or more of the possible outcomes of doing something.
• In probability theory, the activity that produces an event is referred to as an experiment.
• In a coin-toss experiment, getting a head is one event and getting a tail is another event.
Using this parlance, we ask,
• “In a coin-toss experiment, what is the probability of the event head?”

• P(H) = 1/2
Sample Space
• The set of all possible outcomes of an experiment is called a sample space.
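As an illustrative aside (not part of the original slides), a short Python sketch showing the sample spaces of the coin-toss and die-roll experiments as plain sets:

# Illustrative sketch: sample spaces as Python sets
coin_sample_space = {"H", "T"}          # coin toss: head or tail
die_sample_space = {1, 2, 3, 4, 5, 6}   # rolling one die

print(len(coin_sample_space))  # 2 possible outcomes
print(len(die_sample_space))   # 6 possible outcomes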
Mutually Exclusive Events
• Events are mutually exclusive if only one can take place at a time.
• Consider again the example of the coin toss.
• There are two possible outcomes: getting a head or getting a tail.
• On any single toss, either a head or a tail will turn up, but not both.
• Thus, the events head and tail on a single toss are said to be mutually exclusive.
The crucial question we ask to determine whether events are really mutually exclusive is:
• Can two or more events happen or
occur together at one time?
• If the answer is yes, the events are
not mutually exclusive.
Three Types of Probability
• Classical approach
• Relative frequency approach
• Subjective approach
Classical approach (a priori)
• Defines the probability that an event will occur as:

• P(E) = number of favorable outcomes
         total number of possible outcomes
• P(H) = 1/2
• What is the probability of rolling a 5 on one die?
• P(5) = 1/6
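A short Python sketch, added for illustration, of the classical definition: count the favorable outcomes and divide by the total number of equally likely outcomes.

from fractions import Fraction

# Classical (a priori) probability: favorable outcomes / total possible outcomes
def classical_probability(favorable, total):
    return Fraction(favorable, total)

print(classical_probability(1, 2))  # P(H) on a single coin toss -> 1/2
print(classical_probability(1, 6))  # P(5) on a single die roll  -> 1/6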
Relative frequency approach
• Defines the probability as either:
1. The proportion of times that an event
occurs in the long run when conditions are
stable, or
2. The observed relative frequency of an
event in a very large number of trials
Example
The Registrar’s records, based on past data, show that about 50 of every 1,000 entering freshmen leave school for academic reasons by the end of the first semester.
If Juan is admitted as a freshman, what is the probability that he will leave school for academic reasons by the end of the first semester?
P(leaving school) = 50/1,000 = .05 or 5%
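The same relative-frequency calculation, as a brief Python sketch added for illustration:

# Relative-frequency estimate from the Registrar's figures in the example
leavers = 50
entering_freshmen = 1000
p_leaving = leavers / entering_freshmen
print(p_leaving)  # 0.05, i.e. 5%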
Addition Rule for Mutually Exclusive
Events

P (A or B) = P(A) + P(B)
Example
Years of Experience    Frequency    Probability
0–2                     5            .10
3–5                    10            .20
6–8                    15            .30
More than 8            20            .40
Total                  50           1.00
What is the probability that an employee with 6 or more years of experience will get the promotion?
P( 6 or more) = P( 6 to 8) + P(more than 8)
= .30 + .40
= .70
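A brief Python sketch (added for illustration, using the probabilities from the table above) of the addition rule for mutually exclusive events:

# Addition rule for mutually exclusive events: P(A or B) = P(A) + P(B)
# Probabilities taken from the years-of-experience table above
p = {"0-2": 0.10, "3-5": 0.20, "6-8": 0.30, "more than 8": 0.40}
p_six_or_more = p["6-8"] + p["more than 8"]
print(p_six_or_more)  # 0.7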
Addition Rule for Non-Mutually Exclusive Events

P (A or B) = P(A) + P (B) – P (A and B)


Example
Person    Sex    Age
1         M      31
2         M      33
3         F      46
4         F      29
5         M      41
What is P(F or more than 35)?
= P(F) + P(more than 35) – P(F and more than 35)
= 2/5 + 2/5 – 1/5
= 4/5 – 1/5
= 3/5 or .60 or 60%
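A brief Python sketch (added for illustration, using the five-person table above) of the addition rule for non-mutually exclusive events:

from fractions import Fraction

# Addition rule for non-mutually exclusive events:
# P(A or B) = P(A) + P(B) - P(A and B)
people = [("M", 31), ("M", 33), ("F", 46), ("F", 29), ("M", 41)]
n = len(people)
p_f = Fraction(sum(1 for sex, age in people if sex == "F"), n)                  # 2/5
p_over_35 = Fraction(sum(1 for sex, age in people if age > 35), n)              # 2/5
p_both = Fraction(sum(1 for sex, age in people if sex == "F" and age > 35), n)  # 1/5
print(p_f + p_over_35 - p_both)  # 3/5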
Probabilities under statistical
independence
1. Marginal probability (unconditional) – the simple probability of the occurrence of an event.
   P(A) = P(A)
2. Joint probability – the probability that two or more events will occur together or in succession.
   P(AB) = P(A) x P(B)
3. Conditional probability – the probability that a second event (B) will occur given that a first event (A) has already occurred.
   P(B/A) = P(B)
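An illustrative Python sketch (not from the slides) of the three probabilities under independence, using two fair coin tosses as the assumed events A and B:

# Probabilities under statistical independence, illustrated with two fair coin tosses
p_a = 0.5           # marginal: P(A) = P(head on the first toss)
p_b = 0.5           # marginal: P(B) = P(head on the second toss)
p_ab = p_a * p_b    # joint: P(AB) = P(A) x P(B)
p_b_given_a = p_b   # conditional: P(B/A) = P(B); A tells us nothing about B
print(p_ab, p_b_given_a)  # 0.25 0.5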
Probabilities under statistical dependence
Statistical dependence exists when the probability
of some event is dependent upon or affected by
some other event.
1. Marginal probability under statistical dependence
   - is computed exactly the same way as under statistical independence: P(A) = P(A)
2. Joint probability under statistical dependence
   P(AB) = P(A/B) x P(B)
   The joint probability of events A and B happening together or in succession is equal to the probability of A given that event B has already occurred, times the probability that event B will happen.
3. Conditional probability under statistical dependence
   • Conditional probability will be treated first because the concept of joint probability is best illustrated using conditional probability as a basis.
   P(A/B) = P(AB) / P(B)
Example:
• Assume that we have one urn containing 10 balls
distributed as follows:
Two separate categories: Color (red or gray) and Pattern (dotted or striped)
3 are red and dotted
1 is red and striped
2 are gray and dotted
4 are gray and striped
Example:
1. Suppose someone draws a ball from the urn and tells us it is red. What is the probability that it is dotted?
P(D/R) = P(DR)/P(R)
       = (3/10)/(4/10) = 3/4 or .75
2. What is P(S/R)?
P(S/R) = P(SR)/P(R)
       = (1/10)/(4/10) = 1/4 or .25
Total: 1.00
Example:
3. What are P(D/G) and P(S/G)?
P(D/G) = P(DG)/P(G) = (2/10)/(6/10) = 2/6 = 1/3 or .33
P(S/G) = P(SG)/P(G) = (4/10)/(6/10) = 4/6 = 2/3 or .67
Total: 1.00
Example:
4. What are P(R/D) and P(G/D)?
P(R/D) = P(RD)/P(D) = (3/10)/(5/10) = 3/5 or .60
P(G/D) = P(GD)/P(D) = (2/10)/(5/10) = 2/5 or .40
Total: 1.00
Example:
5. What are P(R/S) and P(G/S)?
P(R/S) = P(RS)/P(S) = (1/10)/(5/10) = 1/5 or .20
P(G/S) = P(GS)/P(S) = (4/10)/(5/10) = 4/5 or .80
Total: 1.00
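An illustrative Python sketch (added, not part of the slides) that recomputes one of the conditional probabilities above by counting balls in the urn:

from fractions import Fraction

# Conditional probability under dependence: P(A/B) = P(AB)/P(B),
# computed by counting balls in the urn from the example
urn = (["red dotted"] * 3 + ["red striped"] * 1 +
       ["gray dotted"] * 2 + ["gray striped"] * 4)
n = len(urn)  # 10 balls in total

p_red = Fraction(sum(1 for ball in urn if "red" in ball), n)   # P(R) = 4/10
p_red_dotted = Fraction(urn.count("red dotted"), n)            # P(DR) = 3/10
p_dotted_given_red = p_red_dotted / p_red
print(p_dotted_given_red)  # 3/4, matching P(D/R) = .75 above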
Joint Probabilities under statistical
dependence
P(AB) = P(A/B) x P(B)
P(RD) = P(R/D) x P(D) = .6 x .5 = .30
P(RS) = P(R/S) x P(S) = .2 x .5 = .10
P(GD) = P(G/D) x P(D) = .4 x .5 = .20
P(GS) = P(G/S) x P(S) = .8 x .5 = .40
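An illustrative Python sketch (added, not part of the slides) of the joint-probability formula under dependence, using the urn's counts:

from fractions import Fraction

# Joint probability under dependence: P(AB) = P(A/B) x P(B)
p_dotted = Fraction(5, 10)           # 5 of the 10 balls are dotted
p_red_given_dotted = Fraction(3, 5)  # 3 of the 5 dotted balls are red
p_red_and_dotted = p_red_given_dotted * p_dotted
print(p_red_and_dotted)  # 3/10, i.e. .30, the same value shown for P(RD) above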
