Chapter 2:
PROBABILITY
LEARNING OBJECTIVES
1. Sample spaces and events
2. Interpretations of Probability
3. Addition rules
4. Conditional probability
5. Multiplication and total probability rules
6. Independence
7. Bayes’ theorem
8. Random variables
  Sample spaces and events
Example
When a six-sided die is rolled, the sample space is
    S = {1, 2, 3, 4, 5, 6}.
The event A that an even number is obtained is A = {2, 4, 6}.
The event B that a number greater than 2 is obtained is B = {3, 4, 5, 6}.
  Sample spaces and events
Definition
• An experiment that can result in different outcomes, even though it is
  repeated in the same manner every time, is called a random experiment.
• The set of all possible outcomes of a random experiment is called the
  sample space (denoted as S).
• An event is a subset of the sample space of a random experiment.
Sample spaces and events
Example 1: A probability experiment consists of tossing a coin
and then rolling a six-sided die. Describe the sample space.
Tree diagram: branch first on the coin (H or T), then on the die (1–6), giving the leaves
    H1 H2 H3 H4 H5 H6    T1 T2 T3 T4 T5 T6
   The sample space has 12 outcomes:
   S = {H1, H2, H3, H4, H5, H6, T1, T2, T3, T4, T5, T6}
Sample spaces and events
Example 2: Each message in a digital communication system is
classified as to whether it is received within the time specified by
the system design. If three messages are classified, use a tree
diagram to represent the sample space of possible outcomes.
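Not part of the original slides: a minimal Python sketch that enumerates this sample space, assuming each message is simply labeled on time ("o") or late ("l").

```python
from itertools import product

# Each of the three messages is either received on time ("o") or late ("l").
labels = ["o", "l"]
sample_space = ["".join(outcome) for outcome in product(labels, repeat=3)]

print(sample_space)       # ['ooo', 'ool', 'olo', 'oll', 'loo', 'lol', 'llo', 'lll']
print(len(sample_space))  # 8 outcomes, one per leaf of the tree diagram
```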
  Sample spaces and events
  Basic Set Operations
The union of two events is the event that consists of all outcomes that are
contained in either of the two events. We denote the union as E1 ∪ E2.
The intersection of two events is the event that consists of all outcomes that
are contained in both of the two events. We denote the intersection as E1 ∩ E2.
The complement of an event in a sample space is the set of outcomes in the
sample space that are not in the event. We denote the complement of the
event E as E’.
Sample spaces and events
Venn Diagrams
[Figure: Venn diagrams illustrating union, intersection, complement, and two mutually exclusive (disjoint) events.]
 Sample spaces and events
Important properties:
       A ∪ (B ∪ C) = (A ∪ B) ∪ C
       A ∩ (B ∩ C) = (A ∩ B) ∩ C
       A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
       (A ∪ B)’ = A’ ∩ B’
       (A ∩ B)’ = A’ ∪ B’
       A = (A ∩ B) ∪ (A ∩ B’)
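Not part of the original slides: a quick sanity check of these identities with Python sets, reusing the die-roll events A and B from the first example.

```python
# Die-roll sample space and events from the earlier example
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # even number
B = {3, 4, 5, 6}     # greater than 2

def complement(E):
    # Complement of E relative to the sample space S
    return S - E

# De Morgan's laws
assert complement(A | B) == complement(A) & complement(B)
assert complement(A & B) == complement(A) | complement(B)

# A = (A ∩ B) ∪ (A ∩ B')
assert A == (A & B) | (A & complement(B))
print("all identities hold for these events")
```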
Interpretations of Probability
Introduction
 There are different approaches to assessing the probability of
    an uncertain event:
 1. a priori classical probability: the probability of an event
    is based on prior knowledge of the process involved.
 2. empirical classical probability: the probability of an event
     is based on observed data.
 Interpretations of Probability
 Equally Likely Outcomes
 Whenever a sample space consists of N possible outcomes that
 are equally likely, the probability of each outcome is 1/N.
1. A priori classical probability: each of the N equally likely outcomes has
   probability 1/N.
2. Empirical classical probability:
       Probability of Occurrence = (number of favorable outcomes observed) / (total number of outcomes observed)
Interpretations of Probability
Example: Find the probability of selecting a face card
(Jack, Queen, or King) from a standard deck of 52 cards.
     Probability of Face Card = X / T = (number of face cards) / (total number of cards)
                              = 12 / 52 = 3/13
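Not part of the original slides: a small simulation sketch contrasting the empirical estimate with the a priori value 12/52; the deck layout is standard, and the 10,000 draws are an arbitrary choice.

```python
import random

# A standard 52-card deck: 13 ranks in each of 4 suits; J, Q, K are face cards
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(rank, suit) for rank in ranks for suit in suits]

draws = 10_000
hits = sum(random.choice(deck)[0] in {"J", "Q", "K"} for _ in range(draws))

print("empirical estimate:", hits / draws)   # typically close to 0.23
print("a priori value    :", 12 / 52)        # 3/13 ≈ 0.2308
```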
  Interpretations of Probability
Example: Find the probability of selecting a male taking statistics from the
population described in the following table:
[Table omitted in this extract; it is the same population used in the Addition
rules example below: 229 males, 160 statistics students, and 84 male statistics
students out of 439 people in total, so the required probability is 84/439.]
 Interpretations of Probability
Axioms of Probability
Probability is a number that is assigned to each member of a
collection of events from a random experiment that satisfies the
following properties:
(S is the sample space and E is any event)
1. P(S) = 1
2. 0 ≤ P(E) ≤ 1
3. For two events E1 and E2 with E1 ∩ E2 = Ø,
                   P(E1 ∪ E2) = P(E1) + P(E2)
 Interpretations of Probability
 For a discrete sample space, the probability of an event E, denoted
 as P(E ), equals the sum of the probabilities of the outcomes in E.
Example 1: A random experiment can result in one of the outcomes
S = {a, b, c, d} with probabilities 0.1, 0.3, 0.5, and 0.1, respectively.
  Let A = {a, b}, B = {b, c, d}, A’= S \ A, B’ = S \ B.
Find P(A); P(B); P(A’); P(B’); P(A∩ B); P(A∪ B).
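Not part of the original slides: a minimal sketch that sums outcome probabilities to answer Example 1 (the event names mirror the text).

```python
# Outcome probabilities for the discrete sample space S = {a, b, c, d}
p = {"a": 0.1, "b": 0.3, "c": 0.5, "d": 0.1}

def prob(event):
    # P(E) is the sum of the probabilities of the outcomes in E
    return sum(p[outcome] for outcome in event)

S = set(p)
A = {"a", "b"}
B = {"b", "c", "d"}

print(prob(A), prob(B))            # P(A) = 0.4, P(B) = 0.9
print(prob(S - A), prob(S - B))    # P(A') = 0.6, P(B') = 0.1
print(prob(A & B), prob(A | B))    # P(A ∩ B) = 0.3, P(A ∪ B) = 1.0
```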
Example 2: A visual inspection of a location on wafers from a semiconductor
manufacturing process resulted in the following table. What is the probability
that a wafer contains three or more particles in the inspected location?

    Number of contamination particles    Proportion of wafers
    0                                    0.40
    1                                    0.15
    2                                    0.20
    3                                    0.10
    4 or more                            0.15
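Completing the example (the answer is not on the extracted slide): P(3 or more particles) = P(3) + P(4 or more) = 0.10 + 0.15 = 0.25.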
  Complement rule
If the complement of A, denoted by A’, consists of all the outcomes in which
the event A does not occur, then we have:
                    P(A) + P(A’) = 1
Remark: Depending on the problem, it may be easier to find P(A’) and then use
the above equation to find P(A).
Example: A number is chosen at random from a set of whole
numbers from 1 to 50. Calculate the probability that the chosen
number is not a perfect square.
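One way to work it (solution not on the extracted slide): the perfect squares from 1 to 50 are 1, 4, 9, 16, 25, 36, 49, so P(not a perfect square) = 1 – 7/50 = 43/50 = 0.86.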
Addition rules
1. If A and B are mutually exclusive events,
                     P(A ∪ B) = P(A) + P(B)
2. A collection of events E1, E2, …, Ek is said to be mutually exclusive if for
all pairs,
                            Ei ∩ Ej = Ø
For a collection of mutually exclusive events,
      P(E1 ∪ E2 ∪ … ∪ Ek) = P(E1) + P(E2) + … + P(Ek)
3. General: If A and B are any events,
           P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
  Addition rules
Example: Find the probability of selecting a male or a statistics student from
the population described in the following table:
[Table omitted in this extract: 229 males, 160 statistics students, 84 male
statistics students, and 439 people in total.]
  P(Male or Stat) = P(Male) + P(Stat) – P(Male and Stat)
                  = 229/439 + 160/439 – 84/439 = 305/439
Conditional Probability
The conditional probability of an event B, given that an
event A has already occurred, is denoted by P(B|A).
  Conditional Probability
Definition
The conditional probability of an event B given an event A,
denoted as P(B|A), is computed as
              P(B|A) = P(A ∩ B) / P(A)    for P(A) > 0.
Special case: if all outcomes are equally likely,
              P(B|A) = (number of outcomes in A ∩ B) / (number of outcomes in A).
  Conditional Probability
Example: Of the cars on a used car lot, 70% have air
conditioning (AC) and 40% have a CD player (CD). 20% of the
cars have both.
What is the probability that a car has a CD player, given that it
has AC ?
                  P(CD | AC) = P(CD and AC) / P(AC) = 0.2 / 0.7 ≈ 0.2857
Multiplication rule
               P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A)
Example: The probability that an automobile battery subject to high
engine compartment temperature suffers low charging current is 0.7.
The probability that a battery is subject to high engine compartment
temperature is 0.05.
Let C = {a battery suffers low charging current} and T = {a battery is subject
to high engine compartment temperature}, so P(C|T) = 0.7 and P(T) = 0.05.
The probability that a battery is subject to low charging current and high
engine compartment temperature is P(C ∩ T), found with the multiplication rule.
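Completing the calculation (the final number is not on the extracted slide): P(C ∩ T) = P(C|T)P(T) = 0.7 × 0.05 = 0.035.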
Total Probability Rule
[Figures: partitioning an event B into two mutually exclusive subsets (B ∩ A
and B ∩ A’), and partitioning an event into several mutually exclusive subsets.]
   Total Probability Rule
Total Probability Rule: two events
    P(B) = P(B ∩ A) + P(B ∩ A’) = P(B|A)P(A) + P(B|A’)P(A’)
Total Probability Rule: multiple events
Assume E1, E2, …, Ek are mutually exclusive and exhaustive events. Then:
    P(B) = P(B ∩ E1) + P(B ∩ E2) + … + P(B ∩ Ek)
         = P(B|E1)P(E1) + P(B|E2)P(E2) + … + P(B|Ek)P(Ek)
  Total Probability Rule
Example: The information from the contamination discussion is summarized in
the following table:

    Level of Contamination    Probability of Failure    Probability of Level
    High                      0.100                     0.2
    Not high                  0.005                     0.8

Let F denote the event that the product fails. Find P(F).
Hint:
Let H denote the event that the chip is exposed to a high level of contamination.
So P(H) = 0.2 and P(H’) = 0.8.
Moreover, P(F|H) = 0.1 and P(F|H’) = 0.005.
Thus, P(F) = P(F|H)P(H) + P(F|H’)P(H’) = 0.1 × 0.2 + 0.005 × 0.8 = 0.024.
  Independence
Definition
Two events are called independent if the occurrence of one event does not
change the probability of the other event. Equivalently, A and B are
independent if and only if any one of the following statements is true:
(1) P(A|B) = P(A)
(2) P(A ∩ B) = P(A)P(B)
(3) P(B|A) = P(B)
Remark: If A and B are independent events, then so are events
A and B’, events A’ and B, and events A’ and B’.
   Independence
Example: A day’s production of 850 manufactured parts contains 50 parts
that do not meet customer requirements. Two parts are selected at random,
without replacement, from the batch.
Let A ={the first part is defective}, and B ={the second part is defective}.
We suspect that these two events are not independent because knowledge that
the first part is defective suggests that it is less likely that the second part
selected is defective.
Hint:
P(B|A) = 49/849 ≈ 0.0577, whereas P(B) = 50/850 ≈ 0.0588.
Since P(B|A) ≠ P(B), the two events are not independent, as we suspected.
  Independence
Definition
The events E1, E2, …, Ek are independent if and only if, for any subset of
these events Ei1, Ei2, …, Eir,
       P(Ei1 ∩ Ei2 ∩ … ∩ Eir) = P(Ei1) P(Ei2) … P(Eir)
Question: Two coins are tossed.
Let A denote the event “at most one head on the two tosses”
and B denote the event “one head and one tail in the two tosses.” Are
A and B independent events?
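One way to check (answer not on the extracted slide): the four equally likely outcomes are HH, HT, TH, TT, so A = {HT, TH, TT} with P(A) = 3/4, B = {HT, TH} with P(B) = 1/2, and A ∩ B = B with P(A ∩ B) = 1/2. Since P(A)P(B) = 3/8 ≠ 1/2, A and B are not independent.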
  Independence
Example: The following circuit operates only if there is a path of functional
devices from left to right. The probability that each device functions is shown
on the graph. Assume that devices fail independently. What is the probability
that the circuit operates?
Let T and B denote the events that the top and bottom devices operate,
respectively (from the figure, each device functions with probability 0.95).
       P(T ∪ B) = 1 – P[(T ∪ B)’] = 1 – P(T’ ∩ B’)
       P(T’ ∩ B’) = P(T’)P(B’) = (1 – 0.95)² = 0.05² = 0.0025
So P(T ∪ B) = 1 – 0.0025 = 0.9975.
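Not part of the original slides: a minimal sketch, assuming independent devices in parallel; parallel_reliability is a made-up helper name, and 0.95 is the per-device probability used above.

```python
def parallel_reliability(probs):
    # P(at least one device works) = 1 - P(every device fails),
    # valid when the devices fail independently
    p_all_fail = 1.0
    for p in probs:
        p_all_fail *= (1.0 - p)
    return 1.0 - p_all_fail

print(parallel_reliability([0.95, 0.95]))  # ≈ 0.9975, matching the circuit example
```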
Bayes’ Theorem
If E1, E2, …, Ek are k mutually exclusive and exhaustive
events and B is any event,
   P(E1 | B) = P(B | E1)P(E1) / [P(B | E1)P(E1) + P(B | E2)P(E2) + … + P(B | Ek)P(Ek)]
                                                                 for P(B) > 0
A special case:
           P(A | B) = P(B | A)P(A) / P(B)                for P(B) > 0
  Bayes’ Theorem
Example 1: In a state where cars have to be tested for the
emission of pollutants, 25% of all cars emit excessive amount of
pollutants. When tested, 99% of all cars that emit excessive
amount of pollutants will fail, but 17% of all cars that do not
emit excessive amount of pollutants will also fail. What is the
probability that a car that fails the test actually emits excessive
amounts of pollutants?
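Not part of the original slides: a short numeric check of Example 1, plugging the stated rates into the total probability rule and Bayes’ theorem (the variable names are mine).

```python
# E = car emits excessive pollutants, F = car fails the test
p_E = 0.25
p_F_given_E = 0.99       # P(fail | excessive emissions)
p_F_given_notE = 0.17    # P(fail | acceptable emissions)

# Total probability rule: P(F) = P(F|E)P(E) + P(F|E')P(E')
p_F = p_F_given_E * p_E + p_F_given_notE * (1 - p_E)

# Bayes' theorem: P(E|F) = P(F|E)P(E) / P(F)
print(p_F_given_E * p_E / p_F)   # 0.2475 / 0.375 = 0.66
```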
Example 2: Two events A and B are such that P(A∩B) = 0.15,
P(A∪B) = 0.65, and P(A|B) = 0.5. Find P(B|A).
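A possible route for Example 2 (solution not on the extracted slide): P(B) = P(A ∩ B)/P(A|B) = 0.15/0.5 = 0.3; P(A) = P(A ∪ B) – P(B) + P(A ∩ B) = 0.65 – 0.3 + 0.15 = 0.5; hence P(B|A) = P(A ∩ B)/P(A) = 0.15/0.5 = 0.3.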
Random variables
Definition
A random variable is a function that assigns a real number to
  each outcome in the sample space of a random experiment.
Remark:
1. A random variable is denoted by an uppercase letter such as X. After an
experiment is conducted, the measured value of the random variable is denoted
by a lowercase letter such as x = 70 milliamperes.
2. A discrete random variable is a random variable with a finite (or countably
infinite) range.
3. A continuous random variable is a random variable with an interval of
real numbers for its range.