
Introduction to Probability

and Statistics
Twelfth Edition

Chapter 4
Probability and Probability
Distributions
Some graphic screen captures from Seeing Statistics®
Some images © 2001-(current year) www.arttoday.com
Copyright ©2006 Brooks/Cole, A division of Thomson Learning, Inc.
The Role of Probability in Statistics
• Probability and statistics are related in an important way.
Probability is used as a tool; it allows you to evaluate the
reliability of your conclusions about the population when you
have only sample information.
• Example: toss a single coin. Is the coin fair or biased?
• Statisticians use probability in two ways.
– When the population is known, probability is used to describe
the likelihood of observing a particular sample outcome.
– When the population is unknown and only a sample from that
population is available, probability is used in making
statements about the makeup of the population—that is, in
making statistical inferences.

What is Probability?
• In Chapters 2 and 3, we used graphs and numerical
measures to describe data sets which were usually
samples.
• We measured “how often” using
Relative frequency = f/n
• As n gets larger,
Sample → Population
Relative frequency → Probability
Events and the Sample Space
• Def.: An experiment is the process by which an
observation (or measurement) is obtained.
• The observation or measurement generated by an
experiment may or may not produce a numerical value.
Examples of experiments are:
– Recording an age / a test grade / a GPA or CGPA
– Tossing a die or a coin and observing the face
– Recording an opinion (yes, no)
– Tossing two coins or dice and observing the faces
– Measuring daily rainfall
– Interviewing a householder regarding dependents
Basic Concepts
• Def.: A simple event is the outcome that is observed
on a single repetition of the experiment.
– The basic element to which probability is applied.
– One and only one simple event can occur when the
experiment is performed.
• A simple event is denoted by E with a subscript.
• Each simple event will be assigned a probability,
measuring “how often” it occurs.
• Def.: The set of all simple events of an experiment is
called the sample space, S.

Example 4.1
• Toss a die and record the number that appears on the upper face.
• Simple events: E1 = observe a 1, E2 = observe a 2, E3 = observe a 3, E4 = observe a 4, E5 = observe a 5, E6 = observe a 6.
• Sample space: S = {E1, E2, E3, E4, E5, E6}
Basic Concepts
• Def.: An event is a collection of one or more simple
events, denoted by a capital letter.
• The die toss:
– A: observe an odd number, so A = {E1, E3, E5}
– B: observe a number greater than 2, so B = {E3, E4, E5, E6}
Basic Concepts
• Def.: Two events are mutually exclusive if, when
one event occurs, the other cannot, and vice versa.

• Experiment: toss a die.
– A: observe an odd number
– B: observe a number greater than 2
– C: observe a 6
– D: observe a 3
• A and B are not mutually exclusive (both occur if a 3 or a 5 appears).
• C and D are mutually exclusive.
• Are B and C mutually exclusive? B and D?
Venn Diagram
• Sometimes it helps to visualize an experiment using a
picture called a Venn diagram.
• The white box represents the sample space, which
contains all of the simple events, represented by
labeled points.
• S = {1, 2, 3, 4, 5, 6}.
• The events A = {1, 3, 5} and B = {1, 2, 3}

Example 4.2 and 4.3

Tree Diagram
• Some experiments can be generated in stages, and the
sample space can be displayed in a tree diagram.
• Each successive level of branching on the tree
corresponds to a step required to generate the final
outcome.
• Example 4.4: A medical technician records a
person’s blood type and Rh factor. List the simple
events in the experiment.
• Sol: For each person, a two-stage procedure is needed to record the two variables of interest. The eight simple events in the tree diagram form the sample space, S = {A+, A-, B+, B-, AB+, AB-, O+, O-}.
Probability Table
• An alternative way to display the simple events is to
use a probability table.
• The columns and rows show the possible outcomes at
the first and second stages, respectively, and the
simple events are shown in the cells of the table.

Calculating Probabilities using
Simple Events
• The probability of an event A measures “how often”
(relative frequency) we think A will occur.
• Suppose that an experiment is performed n times. The
relative frequency for an event A is
Relative frequency = (Number of times A occurs)/n = f/n

• If n, the number of repetitions of the experiment, becomes larger and larger (n → ∞), it will eventually generate the entire population. In this population, the relative frequency of the event A is defined as the probability of event A:

P(A) = lim (n → ∞) f/n
Requirements for Simple
Event Probabilities
• Since the simple events are mutually exclusive, their
probabilities must satisfy two conditions.
• 1. Since P(A) behaves like a relative frequency, P(A)
must be between 0 and 1.
– If event A can never occur, P(A) = 0.
– If event A always occurs when the experiment is
performed, P(A) = 1.
– The closer P(A) is to 1, the more likely it is that A
will occur.
• 2. The sum of the probabilities for all simple events
in S (sample space) equals 1.
The Probability of an Event
• When it is possible to write down the simple events
associated with an experiment and to determine their
respective probabilities, we can find the probability
of an event
• Def.: The probability of an event A is equal to the
sum of the probabilities of the simple events
contained in A.
• Example: 4.5, 4.6, 4.7

Finding Probabilities
• Probabilities can be found using
– Estimates from empirical studies
– Common sense estimates based on equally
likely events.

•Examples:
–Toss a fair coin. P(Head) = 1/2
–10% of the U.S. population has red hair.
Select a person at random. P(Red hair) = .10
Example
• Toss a fair coin twice. What is the probability of
observing at least one head?

1st Coin   2nd Coin   Ei    P(Ei)
H          H          HH    1/4
H          T          HT    1/4
T          H          TH    1/4
T          T          TT    1/4

P(at least 1 head) = P(E1) + P(E2) + P(E3)
                   = 1/4 + 1/4 + 1/4 = 3/4
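As a quick cross-check (the slides contain no code, so this Python sketch is an addition), the sample space above can be enumerated directly:

```python
from itertools import product

# Enumerate the sample space for two tosses of a fair coin.
sample_space = list(product("HT", repeat=2))            # HH, HT, TH, TT

# Each simple event has probability 1/4; sum over the event of interest.
at_least_one_head = [e for e in sample_space if "H" in e]
print(len(at_least_one_head) / len(sample_space))       # 0.75
```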
Example
• A bowl contains three M&Ms®, one red, one blue and
one green. A child selects two M&Ms at random.
What is the probability that at least one is red?

1st M&M   2nd M&M   Ei    P(Ei)
R         B         RB    1/6
R         G         RG    1/6
B         R         BR    1/6
B         G         BG    1/6
G         B         GB    1/6
G         R         GR    1/6

P(at least 1 red) = P(RB) + P(BR) + P(RG) + P(GR)
                  = 4/6 = 2/3
The Probability of an Event
• In calculating probabilities, you must always be careful to:
– Include all simple events in the sample space.
– Assign realistic probabilities to the simple events.
• When the sample space is large, it is easy to
unintentionally omit some of the simple events. If this
happens, or if the assigned probabilities are wrong, the
answers will not be useful in practice.
• One way to determine the required number of simple
events is to use the counting rules that can be used to
solve more complex problems, which generally involve a
large number of simple events.
Useful Counting Rules
• Suppose that an experiment involves a large number N of
simple events and you know that all the simple events are
equally likely. Then each simple event has probability 1/N,
and the probability of an event A can be calculated as

P(A) = nA/N = (number of simple events in A) / (total number of simple events)

• You can use counting rules to find nA and N.
• Then, without actually listing all the simple events, we can calculate P(A).
The mn Rule
• If an experiment is performed in two stages, with m
ways to accomplish the first stage and n ways to
accomplish the second stage, then there are mn ways
to accomplish the experiment.
• Suppose that you can order a car in one of three styles
and in one of four paint colors. To find available
options, you can think of first picking one of the m =
3 styles and then selecting one of the n = 4 paint
colors. So, mn = (3)(4) = 12 possible options.

• Example: Toss two coins. The total number of simple events is 2 × 2 = 4.
Examples
• Example: Toss three coins. The total number of simple events is 2 × 2 × 2 = 8.
• Example: Toss two dice. The total number of simple events is 6 × 6 = 36.
• Example: Two M&Ms are drawn in order from a dish containing two red and two blue candies. The total number of simple events is 4 × 3 = 12.
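The mn Rule counts above can be verified against brute-force enumeration; this short Python sketch (an addition, not from the slides) does exactly that:

```python
from itertools import permutations, product

# mn Rule: a multi-stage experiment with m, n, ... ways per stage has
# m * n * ... outcomes in total. Check each example by enumeration.
assert 3 * 4 == len(list(product(range(3), range(4))))      # 12 car options
assert 2 * 2 * 2 == len(list(product("HT", repeat=3)))      # 8 coin outcomes
assert 6 * 6 == len(list(product(range(1, 7), repeat=2)))   # 36 dice outcomes
assert 4 * 3 == len(list(permutations(range(4), 2)))        # 12 ordered M&M draws
print("all mn Rule counts confirmed")
```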
The Extended mn Rule

• Example 4.11
• A second useful counting rule follows from the mn Rule
and involves orderings or permutations.
Permutations
• Example: How many 3-digit lock combinations can we make from the numbers 1, 2, 3, and 4?
The order of the choice is important!
P(4,3) = 4!/(4 − 3)! = 4!/1! = 4(3)(2) = 24
Examples
• Example: A lock consists of five parts and can be assembled in any order. A quality control engineer wants to test each order for efficiency of assembly. How many orders are there?
The order of the choice is important!
P(5,5) = 5!/0! = 5(4)(3)(2)(1) = 120
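Python's standard library exposes these permutation counts directly; the sketch below (an addition to the slides) reproduces both examples:

```python
import math

# P(n, r) = n! / (n - r)!: ordered arrangements of r objects chosen from n.
print(math.perm(4, 3))   # 24 three-digit codes from the digits 1, 2, 3, 4
print(math.perm(5, 5))   # 120 assembly orders for the five lock parts

# The same values from the factorial definition.
assert math.perm(4, 3) == math.factorial(4) // math.factorial(4 - 3)
assert math.perm(5, 5) == math.factorial(5)
```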
Combinations
• Sometimes the ordering or arrangement of the objects
is not important, but only the objects that are chosen.
• In this case, you can use a counting rule for
combinations.
• For example, you may not care in what order the
books are placed on the shelf, but only which books you
are able to shelve.
• When a five-person committee is chosen from a group
of 12 students, the order of choice is unimportant
because all five students will be equal members of the
committee.
Combinations
• Book Example: 4.14, 4.15
• Example: Three members of a 5-person committee must be chosen to form a subcommittee. How many different subcommittees could be formed?
The order of the choice is not important!
C(5,3) = 5!/(3!(5 − 3)!) = 5(4)/2(1) = 10
Example
• A box contains six M&Ms®, four red and two green. A child selects two M&Ms at random. What is the probability that exactly one is red?
The order of the choice is not important!
C(6,2) = 6!/(2!4!) = 6(5)/2(1) = 15 ways to choose 2 M&Ms.
C(2,1) = 2!/(1!1!) = 2 ways to choose 1 green M&M.
C(4,1) = 4!/(1!3!) = 4 ways to choose 1 red M&M.
4 × 2 = 8 ways to choose 1 red and 1 green M&M.
P(exactly one red) = 8/15
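The combination counts and the exactly-one-red probability can be reproduced with `math.comb`; this sketch is an addition, not part of the original slides:

```python
import math

# C(n, r) = n! / (r!(n - r)!): unordered selections of r objects from n.
print(math.comb(5, 3))                           # 10 possible subcommittees

# M&M example: choose 2 of 6 candies (4 red, 2 green).
total = math.comb(6, 2)                          # 15 equally likely pairs
one_red = math.comb(4, 1) * math.comb(2, 1)      # 8 pairs with 1 red, 1 green
print(one_red, "/", total)                       # 8 / 15
```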
Event Relations
• Sometimes the event of interest can be formed as a combination of
several other events.
• Let A and B be two events defined on the sample space S.
• Def.: The union of events A and B, denoted by A ∪ B, is the event that either A or B or both occur when the experiment is performed.
Event Relations
• Def.: The intersection of two events, A and B, denoted by A ∩ B, is the event that both A and B occur when the experiment is performed.
• If two events A and B are mutually exclusive, then P(A ∩ B) = 0.
Event Relations
• Def.: The complement of an event A consists of all outcomes of the experiment that do not result in event A. We write A^C.
• Any simple event in the shaded area of the Venn diagram is a possible outcome resulting in the appropriate event.
• To find the probability of the union, the intersection, or the complement, sum the probabilities of all the associated simple events.
Example
• Select a student from the classroom and record his/her hair color and gender.
– A: student has brown hair
– B: student is female
– C: student is male
• What is the relationship between events B and C? They are mutually exclusive, and B = C^C.
• A^C: student does not have brown hair.
• B ∩ C: student is both male and female = ∅
• B ∪ C: student is either male or female = all students = S
Example
Toss a coin twice.
– A: at least one head {HH, HT, TH}
– B: exactly one head {HT, TH}
– C: at least one tail {HT, TH, TT}
– D: exactly one tail {TH, HT}
• A^C: {TT}, no head
• A ∩ C: {HT, TH}, exactly one head
• A ∪ C: {HH, HT, TH, TT} = S, the sample space
• B ∩ D? B ∪ D?
Event Relations Extended
• The concept of unions and intersections can be
extended to more than two events.
• For example, the union of three events A, B, and
C, which is written as A  B  C, is the set of
simple events that are in A or B or C or in any
combination of those events.
• Similarly, the intersection of three events A, B,
and C, which is written as A  B  C, is the
collection of simple events that are common to the
three events A, B, and C.

Calculating Probabilities for
Unions and Complements
• There are special rules that will allow you to calculate probabilities
for composite events.
• The Additive Rule for Unions:
• For any two events, A and B, the probability of their union is

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

• In the Venn diagram, the sum P(A) + P(B) double-counts the simple events that are common to both A and B. Subtracting P(A ∩ B) gives the correct result.
Example: Additive Rule
• Example: Suppose that there were 120 students in the classroom, and that they could be classified as follows:

          Brown   Not Brown
Male        20        40
Female      30        30

A: brown hair, P(A) = 50/120
B: female, P(B) = 60/120

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
         = 50/120 + 60/120 − 30/120
         = 80/120 = 2/3
Check: P(A ∪ B) = (20 + 30 + 30)/120 = 80/120
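The Additive Rule computation can be checked from the table counts; this Python sketch is an addition to the slides:

```python
from fractions import Fraction

# The 120-student table: rows are gender, columns are hair color.
table = {("male", "brown"): 20, ("male", "other"): 40,
         ("female", "brown"): 30, ("female", "other"): 30}
n = sum(table.values())                                       # 120 students

p_A = Fraction(table["male", "brown"] + table["female", "brown"], n)   # brown hair
p_B = Fraction(table["female", "brown"] + table["female", "other"], n) # female
p_AB = Fraction(table["female", "brown"], n)                  # brown-haired female

print(p_A + p_B - p_AB)   # 2/3, matching the Additive Rule
```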
A Special Case
When two events A and B are mutually exclusive, P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B).

          Brown   Not Brown
Male        20        40
Female      30        30

A: male with brown hair, P(A) = 20/120
B: female with brown hair, P(B) = 30/120

A and B are mutually exclusive, so that
P(A ∪ B) = P(A) + P(B) = 20/120 + 30/120 = 50/120
Example: Additive Rule
• Example: Suppose that there were 1000 students in a college, and that they could be classified as follows:

                 Male (B)   Female
Colorblind (A)      40         2
Not Colorblind     470       488

A: colorblind, P(A) = 42/1000 = .042
B: male, P(B) = 510/1000 = .51

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
         = 42/1000 + 510/1000 − 40/1000
         = 512/1000 = .512
Check: P(A ∪ B) = (40 + 2 + 470)/1000 = .512
A Special Case
When two events A and B are mutually exclusive, P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B).

                 Male   Female
Colorblind        40       2
Not Colorblind   470     488

A: male and colorblind, P(A) = 40/1000
B: female and colorblind, P(B) = 2/1000

A and B are mutually exclusive, so that
P(A ∪ B) = P(A) + P(B) = 40/1000 + 2/1000 = 42/1000 = .042
Calculating Probabilities for Complements
• We know that for any event A:
– P(A ∩ A^C) = 0
• Since either A or A^C must occur, P(A ∪ A^C) = 1,
• so that P(A ∪ A^C) = P(A) + P(A^C) = 1

P(A^C) = 1 − P(A)
Example
• Select a student at random from the classroom. Define:
A: male, P(A) = 60/120
B: female

          Brown   Not Brown
Male        20        40
Female      30        30

A and B are complementary, so that
P(B) = 1 − P(A) = 1 − 60/120 = 60/120

• Book Example: 4.17, 4.18
Independence, Conditional Probability
and the Multiplication Rule
• The Additive Rule can be used to calculate P(A ∪ B) directly if we can find P(A ∩ B) from the probability table. But sometimes the intersection probability is unknown.
• In this situation, there is a probability rule that can be used to calculate the probability of the intersection of several events. This rule depends on the important statistical concept of independent or dependent events.

Two events, A and B, are said to be independent if and only if the probability of event B is not influenced or changed by the occurrence of event A, or vice versa.

• Examples: colorblindness and tossing dice.
Conditional Probabilities
• The probability of an event A, given that the event B
has occurred, is called the conditional probability of
A, given that B has occurred, denoted by P(A|B).
P( A  B)
P( A | B)  if P( B)  0
P( B)

“given”

What is highest and lowest conditional probability?


Example 4.19 Copyright ©2006 Brooks/Cole
A division of Thomson Learning, Inc.
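The conditional-probability formula can be illustrated with the earlier die-toss events; this Python sketch is an addition to the slides:

```python
from fractions import Fraction

# Die toss: A = odd number, B = number greater than 2; faces equally likely.
S = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}
B = {3, 4, 5, 6}

p_B = Fraction(len(B), len(S))        # P(B) = 4/6
p_AB = Fraction(len(A & B), len(S))   # A ∩ B = {3, 5}, so P(A ∩ B) = 2/6
print(p_AB / p_B)                     # 1/2 = P(A | B)
```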
Example 1
• Toss a fair coin twice. Define
– A: head on second toss
– B: head on first toss

HH 1/4   HT 1/4   TH 1/4   TT 1/4

P(A|B) = 1/2    P(B|A) = ?
P(A|not B) = 1/2    P(B|not A) = ?

P(A) does not change whether B happens or not: A and B are independent!
• What if you toss a fair coin three times, with the same events A and B?
Example 2
• A bowl contains five M&Ms®, two red and three blue. Randomly select two candies, and define
– A: second candy is red.
– B: first candy is blue.

P(A|B) = P(2nd red | 1st blue) = 2/4 = 1/2
P(A|not B) = P(2nd red | 1st red) = 1/4

P(A) does change, depending on whether B happens or not: A and B are dependent!
• Colorblindness example, continued.
Defining Independence
• We can redefine independence in terms of conditional probabilities:
• When two events are independent (that is, when the probability of event A is the same whether or not event B has occurred), event B does not affect event A.

Two events A and B are independent if and only if
P(A|B) = P(A) or P(B|A) = P(B)
Otherwise, they are dependent.

• Once you've decided whether or not two events are independent, you can use the following rule to calculate their intersection.
Everyday English vs.
Probability Independence
The Multiplicative Rule for
Intersections
• For any two events, A and B, the probability that both A and B occur is

P(A ∩ B) = P(A) × P(B given that A occurred) = P(A)P(B|A)

• If the events A and B are independent, then the probability that both A and B occur is

P(A ∩ B) = P(A)P(B)
• Example: coin tosses at football games.
Example 1
In a certain population, 10% of the people can be classified as being high risk for a heart attack. Three people are randomly selected from this population. What is the probability that exactly one of the three is high risk?
Define H: high risk, N: not high risk
P(exactly one high risk) = P(HNN) + P(NHN) + P(NNH)
= P(H)P(N)P(N) + P(N)P(H)P(N) + P(N)P(N)P(H)
= (.1)(.9)(.9) + (.9)(.1)(.9) + (.9)(.9)(.1) = 3(.1)(.9)² = .243
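The sum over the three arrangements can be automated; this sketch (an addition, not from the slides) enumerates every H/N sequence and keeps those with exactly one H:

```python
from itertools import product

# Three independent selections, each high risk (H) with probability .1.
p = {"H": 0.1, "N": 0.9}
prob = sum(p[a] * p[b] * p[c]
           for a, b, c in product("HN", repeat=3)
           if (a, b, c).count("H") == 1)
print(round(prob, 3))   # 0.243
```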
Example 2
Suppose we have additional information in the
previous example. We know that only 49% of the
population are female. Also, of the female patients, 8%
are high risk. A single person is selected at random. What is the probability that the person is a high-risk female?
Define H: high risk, F: female
From the example, P(F) = .49 and P(H|F) = .08.
Use the Multiplicative Rule:
P(high-risk female) = P(H ∩ F) = P(F)P(H|F) = .49(.08) = .0392
Example 3
• 2 green and 4 red M&Ms are in a box; two of them are selected at random.
A: First is green
B: Second is red
• Find P(A ∩ B).
Method 1
• Choose 2 M&Ms out of 6, with the order recorded. (Total number of ways, i.e., the size of the sample space S.)
The order of the choice is important! Permutation:
P(6,2) = 6!/(6 − 2)! = 6!/4! = 6(5) = 30
• Event A ∩ B: first green, second red.
First green: C(2,1) = 2 ways. Second red: C(4,1) = 4 ways.
2 × 4 = 8 ways to choose first green and second red (mn Rule).
P(A ∩ B) = #(A ∩ B)/#S = 8/30
Method 2
A: First is green; B: Second is red; A ∩ B: first green, second red.

P(A) = 2/6
P(B|A) = P(second red | first green) = 4/5
P(A ∩ B) = P(A)P(B|A) = (2/6)(4/5) = 8/30

P(B) = 4/6
P(A|B) = P(first green | second red) = 2/5
P(A ∩ B) = P(B)P(A|B) = (4/6)(2/5) = 8/30

• Can you draw the tree diagram?
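Both methods can be confirmed in a few lines of Python (an addition to the slides): the Multiplicative Rule on one hand, and a full enumeration of the 30 ordered draws on the other:

```python
from fractions import Fraction
from itertools import permutations

# Multiplicative Rule: P(A ∩ B) = P(A) P(B|A) for first-green, second-red.
p = Fraction(2, 6) * Fraction(4, 5)
print(p)                                     # 4/15

# Cross-check by listing all 30 ordered draws from the six candies.
candies = "GGRRRR"
draws = list(permutations(range(6), 2))      # 6 × 5 = 30 ordered pairs
fav = sum(1 for i, j in draws if candies[i] == "G" and candies[j] == "R")
assert Fraction(fav, len(draws)) == p        # 8/30 reduces to 4/15
```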
Question?
• Q1. 2 green and 4 red M&Ms are in a box; Three of
them are selected at random.
A: First is green; B: Second is red; C: Third is green
• Find P(A  B  C)
• Q2. 2 green, 4 red and 5 blue M&Ms are in a box; three of them are selected at random.
A: First is red; B: Second is green; C: Third is blue
• Find P(A ∩ B ∩ C), or the probability of any combination of A, B and C.
Independent Events

• Example 4.20
• Example 4.21

Example 4
Select a student at random from the college. Define:
A: male
B: colorblind

                 Male (A)   Female
Colorblind (B)      40         2
Not Colorblind     470       488

Find P(A) and P(A|B). Are A and B independent?
P(A) = 510/1000 = .51
P(B) = 42/1000 = .042
P(A ∩ B) = 40/1000 = .040
P(A|B) = P(A ∩ B)/P(B) = .040/.042 = .95
P(A|B) and P(A) are not equal, so A and B are dependent.
Mutually Exclusive Vs Independent
Events

• Example 4.22
Probability Rules & Relations of Events

Complement event:          P(A^C) = 1 − P(A)
Additive Rule:             P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Multiplicative Rule:       P(A ∩ B) = P(A)P(B|A)
Conditional probability:   P(A|B) = P(A ∩ B)/P(B)
Mutually exclusive events: P(A ∩ B) = 0, so P(A ∪ B) = P(A) + P(B)
Independent events:        P(A ∩ B) = P(A)P(B); P(A|B) = P(A); P(A ∩ B ∩ C) = P(A)P(B)P(C)
The Law of Total Probability
• Let S1 , S2 , S3 ,..., Sk be mutually exclusive and
exhaustive events (that is, one and only one must
happen). Then the probability of another event A can
be written as
P(A) = P(A ∩ S1) + P(A ∩ S2) + … + P(A ∩ Sk)
= P(S1)P(A|S1) + P(S2)P(A|S2) + … + P(Sk)P(A|Sk)

• The result is known as the Law of Total Probability.


The Law of Total Probability
• (Venn diagram: the sample space is partitioned into S1, S2, …, Sk, and event A is cut into the pieces A ∩ S1, A ∩ S2, …, A ∩ Sk.)

P(A) = P(A ∩ S1) + P(A ∩ S2) + … + P(A ∩ Sk)
= P(S1)P(A|S1) + P(S2)P(A|S2) + … + P(Sk)P(A|Sk)
• Example 4.23: Sneakers
Conditional Probability
• Often we need to find the conditional probability of an event B, given that an event A has occurred.
• The experiment involves selecting a sample from one of k
subpopulations that are mutually exclusive and exhaustive;
that is, taken together they make up the entire sample space.
Each of these subpopulations, denoted by S1, S2, . . . , Sk, has
a selection probability P(S1), P(S2), P(S3), . . . , P(Sk), called
prior probabilities.
• An event A is observed in the selection. What is the
probability that the sample came from subpopulation Si,
given that A has occurred?

Bayes’ Rule
• Let S1, S2, S3, …, Sk be mutually exclusive and exhaustive events with prior probabilities P(S1), P(S2), …, P(Sk). If an event A occurs, the posterior probability of Si, given that A occurred, is

P(Si | A) = P(Si)P(A|Si) / Σj P(Sj)P(A|Sj),  for i = 1, 2, …, k

• Example 4.24
Example
From a previous example, we know that 49% of the population are female. Of the female patients, 8% are high risk for heart attack, while 12% of the male patients are high risk. A single person is selected at random and found to be high risk. What is the probability that it is a male?
Define H: high risk, F: female, M: male.

We know: P(F) = .49, P(M) = .51, P(H|F) = .08, P(H|M) = .12

P(M|H) = P(M)P(H|M) / [P(M)P(H|M) + P(F)P(H|F)]
       = .51(.12) / [.51(.12) + .49(.08)] = .61
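Bayes' Rule here is a one-liner once the priors and conditionals are written down; this Python sketch is an addition to the slides:

```python
# Bayes' Rule with the numbers from the heart-attack example.
p_M, p_F = 0.51, 0.49          # priors: male, female
p_H_M, p_H_F = 0.12, 0.08      # P(high risk | male), P(high risk | female)

p_H = p_M * p_H_M + p_F * p_H_F    # Law of Total Probability: P(H)
posterior = p_M * p_H_M / p_H      # P(M | H)
print(round(posterior, 2))         # 0.61
```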
Random Variables
• Variables are defined as characteristics that change or
vary over time and/or for different individuals or
objects under consideration.
• Quantitative variables generate numerical data,
whereas qualitative variables generate categorical data.
• However, even qualitative variables can generate
numerical data if the categories are numerically coded
to form a scale.
• For example, if you toss a single coin, the qualitative
outcome could be recorded as “0” if a head and “1” if a
tail.
Random Variables
• Def.: A quantitative variable x is a random variable if the value that it assumes, corresponding to the outcome of an experiment, is a chance or random event.
• Random variables can be discrete or continuous.
• Toss a die and measure x, the number observed on the
upper face. The variable x can take on any of six values
—1, 2, 3, 4, 5, 6—depending on the random outcome
of the experiment.
• Examples:
x = SAT score for a randomly selected student
x = number of people in a room at a randomly selected time of day
x = number of defects on a randomly selected piece
Random Variables

A random variable is a numerical description of the


outcome of an experiment.

A discrete random variable may assume either a


finite number of values or an infinite sequence of
values.

A continuous random variable may assume any


numerical value in an interval or collection of
intervals.

Example: JSL Appliances
• Discrete random variable with a finite number of values

Let x = number of TVs sold at the store in one day,


where x can take on 5 values (0, 1, 2, 3, 4)
• Discrete random variable with an infinite sequence of
values
Let x = number of customers arriving in one day,
where x can take on the values 0, 1, 2, . . .
• We can count the customers arriving, but there is no
finite upper limit on the number that might arrive.
Probability Distributions
• The relative frequency distribution for a set of numerical
measurements on a variable x gives information about x:
– What values of x occurred
– How often each value of x occurred
• We define the probability distribution for a random
variable x as the relative frequency distribution
constructed for the entire population of measurements.

Probability Distributions for
Discrete Random Variables
• The probability distribution for a discrete random variable x is
a graph, table or formula that gives the possible values of x and
the probability p(x) associated with each value of x.
• The values of x represent mutually exclusive numerical events.
Summing p(x) over all values of x is equivalent to adding the
probabilities of all simple events and therefore equals 1.

• We must have 0 ≤ p(x) ≤ 1 and Σ p(x) = 1.
• Example: 4.25
Discrete Uniform Probability
Distributions

Example
• Toss a fair coin three times and
define x = number of heads.
Simple events (each with probability 1/8) and the corresponding value of x:

HHH → x = 3        HTT → x = 1
HHT → x = 2        THT → x = 1
HTH → x = 2        TTH → x = 1
THH → x = 2        TTT → x = 0

P(x = 0) = 1/8, P(x = 1) = 3/8, P(x = 2) = 3/8, P(x = 3) = 1/8

Probability distribution (probability histogram) for x:

x      p(x)
0      1/8
1      3/8
2      3/8
3      1/8
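The table above can be reproduced by enumerating all eight equally likely outcomes. A short Python sketch (not from the text):

```python
from itertools import product
from collections import Counter

# All 2^3 = 8 equally likely outcomes of three fair-coin tosses.
outcomes = list(product("HT", repeat=3))

# Tally x = number of heads over the outcomes, then convert to relative frequency.
counts = Counter(seq.count("H") for seq in outcomes)
p = {x: counts[x] / len(outcomes) for x in sorted(counts)}
print(p)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```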
The Mean and Standard Deviation for
Discrete Random Variable
• The probability distribution for a discrete random
variable looks very similar to the relative frequency
distribution discussed in Chapter 1.
• The difference is that the relative frequency distribution
describes a sample of n measurements, whereas the
probability distribution is constructed as a model for the
entire population of measurements.
• Just as the sample mean x̄ and the sample standard deviation s
measured the center and spread of the sample data, you
can calculate similar measures to describe the center and
spread of the population.
Probability Distributions
• Probability distributions can be used to describe the
population, just as we described samples in Chapter 1.
– Shape: Symmetric, skewed, mound-shaped.
– Outliers: unusual or unlikely measurements
– Center and spread: mean and standard deviation. A
population mean is called μ (mu) and a population
standard deviation is called σ (sigma).
• The population mean, which measures the average
value of x in the population, is also called the expected
value of the random variable x and is written as E(x).
• It is the value that you would expect to observe on
average if the experiment is repeated over and over
again.
The Mean
and Standard Deviation
• For a sample of n measurements, the sample variance is

s² = Σ(xᵢ - x̄)² / (n - 1) = [Σxᵢ² - (Σxᵢ)²/n] / (n - 1)

• Example: 2.26, 2.27, 2.28
The Mean
and Standard Deviation
• Let x be a discrete random variable with probability
distribution p(x). Then the mean, variance and
standard deviation of x are given as
Mean: μ = Σ x p(x)
Variance: σ² = Σ (x - μ)² p(x)
Standard deviation: σ = √σ²
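These definitions translate directly into code. A minimal Python sketch (the function names and dictionary layout are my own):

```python
from math import sqrt

def mean(p):
    """mu = sum over x of x * p(x)."""
    return sum(x * px for x, px in p.items())

def variance(p):
    """sigma^2 = sum over x of (x - mu)^2 * p(x)."""
    mu = mean(p)
    return sum((x - mu) ** 2 * px for x, px in p.items())

# x = number of heads in three tosses of a fair coin
p = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
print(mean(p))            # 1.5
print(variance(p))        # 0.75
print(sqrt(variance(p)))  # 0.866...
```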
Example
• Toss a fair coin 3 times and
record x the number of heads.
x      p(x)     x p(x)    (x - μ)² p(x)
0      1/8      0         (-1.5)²(1/8)
1      3/8      3/8       (-0.5)²(3/8)
2      3/8      6/8       (0.5)²(3/8)
3      1/8      3/8       (1.5)²(1/8)

μ = Σ x p(x) = 12/8 = 1.5
σ² = Σ (x - μ)² p(x) = .28125 + .09375 + .09375 + .28125 = .75
σ = √.75 ≈ .866
Example
• Toss a fair coin 3 times and record x, the number of heads.
• Find the variance by the definition formula.

x      p(x)     (x - μ)² p(x)
0      1/8      (0 - 1.5)²(1/8) = .28125
1      3/8      (1 - 1.5)²(3/8) = .09375
2      3/8      (2 - 1.5)²(3/8) = .09375
3      1/8      (3 - 1.5)²(1/8) = .28125
Total           .75

μ = E(x) = Σ x p(x) = 1.5
σ² = Σ (x - μ)² p(x) = 0.75
Example
• Toss a fair coin 3 times and record x, the number of heads.
• Find the variance by the computational formula.

x      p(x)     x² p(x)
0      1/8      0²(1/8) = 0
1      3/8      1²(3/8) = 0.375
2      3/8      2²(3/8) = 1.5
3      1/8      3²(1/8) = 1.125
Total           3

μ = E(x) = Σ x p(x) = 1.5
σ² = Σ x² p(x) - μ² = 3 - 1.5² = 0.75
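The definition and computational formulas always give the same answer; a small Python check (not from the text):

```python
def var_definition(p):
    """sigma^2 = sum of (x - mu)^2 * p(x)  (definition formula)."""
    mu = sum(x * px for x, px in p.items())
    return sum((x - mu) ** 2 * px for x, px in p.items())

def var_shortcut(p):
    """sigma^2 = sum of x^2 * p(x) - mu^2  (computational formula)."""
    mu = sum(x * px for x, px in p.items())
    return sum(x ** 2 * px for x, px in p.items()) - mu ** 2

# x = number of heads in three tosses of a fair coin
p = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
print(var_definition(p), var_shortcut(p))  # 0.75 0.75
```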
Example
• The probability distribution for x the
number of heads in tossing 3 fair coins.
• Shape? Symmetric, mound-shaped
• Outliers? None
• Center? μ = 1.5
• Spread? σ ≈ .866
Example
• Book Example: 4.27, 4.28
• In a lottery, 8,000 tickets are sold at $5 each. The prize is
a $12,000 automobile and only one ticket will be the
winner. If you purchased two tickets, what is your
expected gain?
Define x = your gain: x = -$10 (neither ticket wins) or $11,990 (the $12,000 prize minus the $10 paid for tickets).

x          p(x)
-$10       7998/8000
$11,990    2/8000

μ = E(x) = Σ x p(x) = (-10)(7998/8000) + (11,990)(2/8000) = -$7
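The same expected-gain calculation in Python (a sketch; the dictionary layout is mine):

```python
# Gain from two $5 tickets out of 8,000 sold: -$10 if neither wins,
# $12,000 - $10 = $11,990 if one of them wins.
p = {-10: 7998/8000, 11990: 2/8000}
expected_gain = sum(x * px for x, px in p.items())
print(round(expected_gain, 2))  # -7.0
```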
Example
• For a casino game, you win $5 with probability .2 and win nothing with probability .8.
• Let x be the money won in one game.
• Calculate the mean and variance of x.

Expected value: μ = 0(0.8) + 5(0.2) = 1

By the definition formula:
x      p(x)     (x - μ)² p(x)
0      0.8      (0 - 1)²(0.8) = 0.8
5      0.2      (5 - 1)²(0.2) = 3.2
Total           4

σ² = Σ (x - μ)² p(x) = 4

By the computational formula:
x      p(x)     x² p(x)
0      0.8      0²(0.8) = 0
5      0.2      5²(0.2) = 5
Total           5

σ² = Σ x² p(x) - μ² = 5 - 1² = 4
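As a cross-check not in the text, a short Monte Carlo simulation of this game should produce estimates near μ = 1 and σ² = 4 (the seed and sample size below are arbitrary choices):

```python
import random

random.seed(1)
n = 100_000
# Each play wins $5 with probability 0.2, otherwise $0.
wins = [5 if random.random() < 0.2 else 0 for _ in range(n)]

mu_hat = sum(wins) / n
var_hat = sum((w - mu_hat) ** 2 for w in wins) / n
print(round(mu_hat, 2), round(var_hat, 2))  # close to 1 and 4
```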
Key Concepts
I. Experiments and the Sample Space
1. Experiments, events, mutually exclusive events,
simple events
2. The sample space
3. Venn diagrams, tree diagrams, probability tables
II. Probabilities
1. Relative frequency definition of probability
2. Properties of probabilities
a. Each probability lies between 0 and 1.
b. Sum of all simple-event probabilities equals 1.
3. P(A), the sum of the probabilities for all simple events in A
Key Concepts
III. Counting Rules
1. mn Rule; extended mn Rule
2. Permutations: Pᵣⁿ = n!/(n - r)!
3. Combinations: Cᵣⁿ = n!/(r!(n - r)!)
IV. Event Relations
1. Unions and intersections
2. Events
a. Disjoint or mutually exclusive: P(A ∩ B) = 0
b. Complementary: P(A) = 1 - P(Aᶜ)
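Python's standard library implements the permutation and combination counts directly (in versions 3.8+), which makes the counting-rule formulas above easy to check:

```python
from math import comb, perm

# Permutations: P(n, r) = n!/(n - r)!  -- ordered arrangements
print(perm(5, 3))  # 60

# Combinations: C(n, r) = n!/(r!(n - r)!)  -- unordered selections
print(comb(5, 3))  # 10

# Complement rule: P(A) = 1 - P(A^c)
p_no_heads = (1 / 2) ** 3  # P(no heads in 3 tosses of a fair coin)
print(1 - p_no_heads)      # P(at least one head) = 0.875
```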
Key Concepts
P( A  B)
P( A | B) 
3. Conditional probability: P( B)
4. Independent and dependent events
5. Additive Rule of Probability:
P( A  B)  P( A)  P( B)  P( A  B)
6. Multiplicative Rule of Probability:
P(A ∩ B) = P(A) P(B | A)
7. Law of Total Probability
8. Bayes’ Rule
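The conditional, additive, and multiplicative rules can all be verified numerically on a small sample space. A sketch using two fair dice (my own example, not from the text):

```python
from fractions import Fraction
from itertools import product

# The 36 equally likely outcomes of rolling two fair dice.
space = list(product(range(1, 7), repeat=2))

def P(event):
    """Exact probability of an event, given as a predicate on outcomes."""
    return Fraction(sum(1 for s in space if event(s)), len(space))

A = lambda s: s[0] == 6          # first die shows 6
B = lambda s: s[0] + s[1] == 8   # the sum is 8
A_and_B = lambda s: A(s) and B(s)
A_or_B = lambda s: A(s) or B(s)

# Additive rule: P(A or B) = P(A) + P(B) - P(A and B)
assert P(A_or_B) == P(A) + P(B) - P(A_and_B)

# Conditional probability and the multiplicative rule
P_B_given_A = P(A_and_B) / P(A)
assert P(A_and_B) == P(A) * P_B_given_A
print(P(A), P(B), P(A_and_B), P_B_given_A)  # 1/6 5/36 1/36 1/6
```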
Key Concepts
V. Discrete Random Variables and Probability
Distributions
1. Random variables, discrete and continuous
2. Properties of probability distributions
0 ≤ p(x) ≤ 1 and Σ p(x) = 1
3. Mean or expected value of a discrete random
variable: μ = E(x) = Σ x p(x)
4. Variance and standard deviation of a discrete
random variable: σ² = Σ (x - μ)² p(x)
Standard deviation: σ = √σ²