
Unit 4 Notes – Probability, Random Variables, and Probability Distributions
TPS - Chapter 5
Estimating Probabilities Using Simulation:
 A random process generates results that are determined by chance.
 An outcome is the result of a trial of a random process.
 An event is a collection of outcomes.
 Simulation is a way to model random events so that simulated outcomes closely
match real-world outcomes. You could use coins, dice, or a random number table,
for example.
 The relative frequency of an outcome or event in simulated or empirical data can be
used to estimate the probability of that outcome or event.
 The law of large numbers states that simulated (empirical) probabilities tend to get
closer to the true probability as the number of trials increases.
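
A minimal Python sketch (my own illustration; the notes themselves only mention coins, dice, and random number tables) of the law of large numbers: the relative frequency of rolling a 6 settles toward the true probability 1/6 as the number of simulated trials grows.

    import random

    random.seed(1)  # fixed seed so the run is reproducible
    for n in (100, 1_000, 10_000, 100_000):
        rolls = [random.randint(1, 6) for _ in range(n)]
        rel_freq = rolls.count(6) / n      # relative frequency of the event "roll a 6"
        print(n, round(rel_freq, 4))       # drifts toward the true probability 1/6 ≈ 0.1667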

Introduction to Probability:
 The sample space of a random process is the set of all possible non-overlapping
outcomes.
 If all outcomes in the sample space are equally likely, then the probability an event E will
occur is defined as the fraction:
(number of outcomes in event E) / (total number of outcomes in the sample space)
 The probability of an event is a number between 0 and 1, inclusive.
 The probability of the complement of an event E, written E′ or Eᶜ (not E), is equal to 1 − 𝑃(𝐸).
 Probabilities of events in repeatable situations can be interpreted as the relative
frequency with which the event will occur in the long run.

Mutually Exclusive Events:


 The probability that events A and B both will occur, sometimes called the joint
probability, is the probability of the intersection of A and B, denoted by 𝑃(𝐴 ∩ 𝐵).
 Two events are mutually exclusive or disjoint if they cannot occur at the same time. So
𝑃(𝐴 ∩ 𝐵) = 0.
Conditional Probability:

 The probability that event A will occur given that event B has occurred is called a
conditional probability and is denoted by 𝑃(𝐴|𝐵) = 𝑃(𝐴 ∩ 𝐵) / 𝑃(𝐵).
 The multiplication rule states that the probability that events A and B both will occur is
equal to the probability that event A will occur multiplied by the probability that event B
will occur, given that A has occurred. 𝑃(𝐴 ∩ 𝐵) = 𝑃(𝐴) ∙ 𝑃(𝐵|𝐴).
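
As a small worked sketch of these two rules (the probabilities are invented for illustration, not taken from the notes):

    # Hypothetical example: A = "student takes statistics", B = "student passes".
    p_A = 0.5            # P(A)
    p_B_given_A = 0.75   # P(B | A)

    # Multiplication rule: P(A and B) = P(A) * P(B | A)
    p_A_and_B = p_A * p_B_given_A
    print(p_A_and_B)            # 0.375, the joint probability

    # Conditional probability recovered from the joint: P(B | A) = P(A and B) / P(A)
    print(p_A_and_B / p_A)      # 0.75, matching P(B | A)
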
Independent Events and Unions of Events:

 Events A and B are independent if, and only if, knowing whether event A has occurred
(or will occur) does not change the probability that event B will occur.
 If, and only if, events A and B are independent, then 𝑃(𝐴|𝐵) = 𝑃(𝐴), 𝑃(𝐵|𝐴) = 𝑃(𝐵),
and 𝑃(𝐴 ∩ 𝐵) = 𝑃(𝐴) ∙ 𝑃(𝐵).
 The probability that event A or event B (or both) will occur is the probability of the
union of A and B, denoted by 𝑃(𝐴 ∪ 𝐵).
 The addition rule states that the probability that event A or event B or both will occur is
equal to the probability that event A will occur plus the probability that event B will
occur minus the probability that both events A and B will occur. This is denoted by
𝑃(𝐴 ∪ 𝐵) = 𝑃(𝐴) + 𝑃(𝐵) − 𝑃(𝐴 ∩ 𝐵).
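
A short numerical sketch of the addition rule and the independence check (numbers are illustrative only, chosen as exact binary fractions so the equality test below is safe):

    # Illustrative probabilities: these happen to make A and B independent.
    p_A, p_B, p_A_and_B = 0.5, 0.25, 0.125

    # Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
    p_A_or_B = p_A + p_B - p_A_and_B
    print(p_A_or_B)                    # 0.625

    # Independence check: A and B are independent iff P(A and B) = P(A) * P(B)
    print(p_A_and_B == p_A * p_B)      # True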

TPS - Chapter 6
Introduction to Random Variables and Probability Distributions:

 The values of a random variable are the numerical outcomes of random behavior.
 A discrete random variable is a variable that can take only a countable number of
values. Each value has a probability associated with it, and the probabilities of all
possible values must sum to 1.
 A probability distribution can be represented as a graph, table, or function showing the
probabilities associated with values of a random variable.
 A cumulative probability distribution can be represented as a table or function showing
the probability of being less than or equal to each value of the random variable.
 An interpretation of a probability distribution provides information about the shape,
center, and spread of a population and allows one to make conclusions about the
population of interest.
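
A quick sketch of a probability distribution stored as a table, plus its cumulative version (the distribution used here, number of heads in three fair coin flips, is my own example):

    # X = number of heads in three fair coin flips.
    dist = {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}   # probabilities sum to 1

    # Cumulative distribution: P(X <= x) for each value x
    running = 0.0
    cdf = {}
    for x in sorted(dist):
        running += dist[x]
        cdf[x] = running

    print(cdf)   # {0: 0.125, 1: 0.5, 2: 0.875, 3: 1.0}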

Mean and Standard Deviation of Random Variables:

 A numerical value measuring a characteristic of a population or the distribution of a
random variable is known as a parameter, which is a single, fixed value.
 The mean, or expected value, for a discrete random variable X is
𝜇𝑥 = 𝐸(𝑋) = Σ 𝑥𝑖 ∙ 𝑃(𝑥𝑖)
 The standard deviation for a discrete random variable X is 𝜎𝑥 = √(Σ (𝑥𝑖 − 𝜇𝑥)² ∙ 𝑃(𝑥𝑖))
 The variance is the standard deviation squared (𝜎²)
 Parameters for a discrete random variable should be interpreted using appropriate units
and within the context of a specific population.
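
A sketch applying the two formulas above to the same illustrative coin-flip distribution (my example, not from the notes):

    # X = number of heads in three fair coin flips.
    values = [0, 1, 2, 3]
    probs  = [0.125, 0.375, 0.375, 0.125]

    # Mean (expected value): sum of x_i * P(x_i)
    mean = sum(x * p for x, p in zip(values, probs))

    # Standard deviation: square root of sum of (x_i - mean)^2 * P(x_i)
    variance = sum((x - mean) ** 2 * p for x, p in zip(values, probs))
    sd = variance ** 0.5

    print(mean, round(sd, 3))   # 1.5 0.866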

Combining Random Variables:

 For random variables X and Y and real numbers a and b, the mean of 𝑎𝑋 + 𝑏𝑌 is
𝑎𝜇𝑋 + 𝑏𝜇𝑌 .
 Two random variables are independent if knowing information about one of them does
not change the probability distribution of the other.
 For independent random variables X and Y and real numbers a and b, the variance of
𝑎𝑋 + 𝑏𝑌 is 𝑎2 𝜎𝑋2 + 𝑏 2 𝜎𝑌2 . (Add Variances – not standard deviations)
 For 𝑌 = 𝑎 + 𝑏𝑋, the probability distribution of the transformed random variable, Y, has
the same shape as the probability distribution for X, so long as 𝑎 > 0 𝑎𝑛𝑑 𝑏 > 0. The
mean of Y is 𝜇𝑌 = 𝑎 + 𝑏𝜇𝑋 . The standard deviation of Y is 𝜎𝑌 = |𝑏|𝜎𝑋 . In other words,
the mean is affected by adding, subtracting, multiplying, or dividing the data by a
constant. The standard deviation is only affected by multiplying or dividing.
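
A simulation sketch checking the rules for combining independent random variables (the normal distributions and their parameters are purely illustrative assumptions):

    import random

    random.seed(1)
    N = 200_000
    a, b = 2, 3

    # Independent X and Y with chosen means and standard deviations
    # (mu_X = 10, sigma_X = 4, mu_Y = 20, sigma_Y = 5).
    xs = [random.gauss(10, 4) for _ in range(N)]
    ys = [random.gauss(20, 5) for _ in range(N)]
    combo = [a * x + b * y for x, y in zip(xs, ys)]

    mean = sum(combo) / N
    var = sum((c - mean) ** 2 for c in combo) / N

    print(round(mean, 1))   # close to a*mu_X + b*mu_Y = 2*10 + 3*20 = 80
    print(round(var, 1))    # close to a^2*sigma_X^2 + b^2*sigma_Y^2 = 4*16 + 9*25 = 289
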
Binomial Distribution:

 A binomial random variable, X, counts the number of successes in n repeated
independent trials, each trial having two possible outcomes (success or failure), with the
probability of success p and the probability of failure 1 − p.
 The probability that a binomial random variable, X, has exactly x successes for n
independent trials, when the probability of success is p, is calculated as
𝑃(𝑋 = 𝑥) = (n choose x) ∙ 𝑝^𝑥 ∙ (1 − 𝑝)^(𝑛−𝑥)
 Binomialpdf(n,p,x) calculates the probability of exactly x successes.
 Binomialcdf(n,p,x) calculates the probability of x or fewer successes.
 If a random variable is binomial, 𝜇𝑥 = 𝑛𝑝 and 𝜎𝑥 = √𝑛𝑝(1 − 𝑝).
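
If SciPy is available (my assumption; the notes only name the calculator commands), scipy.stats.binom gives equivalent results. A minimal sketch:

    from scipy.stats import binom   # assumes SciPy is installed

    n, p = 10, 0.3

    print(binom.pmf(4, n, p))       # like Binomialpdf(n, p, 4): P(X = 4)
    print(binom.cdf(4, n, p))       # like Binomialcdf(n, p, 4): P(X <= 4)

    # Mean np and standard deviation sqrt(np(1 - p))
    print(binom.mean(n, p), binom.std(n, p))   # 3.0 and about 1.449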

Geometric Distribution:

 For a sequence of independent trials, a geometric random variable, X, gives the number
of the trial on which the first success occurs. Each trial has two possible outcomes
(success or failure) with the probability of success p and the probability of failure 1-p.
 The probability that the first success for repeated independent trials with probability of
success p occurs on trial x is calculated as
𝑃(𝑋 = 𝑥) = (1 − 𝑝)𝑥−1 ∙ 𝑝

 If a random variable is geometric, 𝜇𝑥 = 1/𝑝 and 𝜎𝑥 = √(1 − 𝑝)/𝑝
 Geometpdf(p,x) finds the probability that the first success occurs on the xth trial.
 Geometcdf(p,x) finds the probability that the first success occurs on or before trial x.
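
A matching sketch with scipy.stats.geom (again assuming SciPy is available; its geometric distribution counts the trial of the first success, which matches the definition above):

    from scipy.stats import geom    # assumes SciPy is installed

    p = 0.25

    print(geom.pmf(3, p))           # like Geometpdf(p, 3): (1 - p)^2 * p ≈ 0.1406
    print(geom.cdf(3, p))           # like Geometcdf(p, 3): 1 - (1 - p)^3 ≈ 0.5781

    # Mean 1/p and standard deviation sqrt(1 - p)/p
    print(geom.mean(p), geom.std(p))   # 4.0 and about 3.46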
