
AP Stats Probability Cram Guide

This document provides an overview of key probability concepts and formulas for the AP Statistics exam, including:
1. Common probability rules such as the first, second, and third axioms of probability, the complement rule, and the additive law.
2. How to calculate probabilities of events using formulas like conditional probability, Bayes' rule, and the law of total probability.
3. Details on discrete random variables like the binomial and geometric distributions, and how to calculate their expected values and standard deviations.
4. How to transform random variables using linear transformations, and how this affects their expected values and standard deviations.


AP Stats Probability Guide

created by Sherrinford (r/APStudents) May 2019 | Reddit | Discord: Sherrinford#0290



Symbols

Symbol   Term           What it says
∪        Union          "Inclusive or" / "at least one event occurs"
∩        Intersection   "And"
A^C      Complement     "Not"
A | B    Conditional    A "given" B

Rules

**Can be derived using a tree diagram. The formula is listed for those who prefer the mathematical representation.

First Axiom of Probability: P(A) ≥ 0
The probability of any event A cannot be negative.

Second Axiom of Probability: P(S) = 1
The combined probabilities of all possible outcomes in the experiment (the sample space) must add up to 1.

Third Axiom of Probability: P(A ∪ B) = P(A or B) = P(A) + P(B) for disjoint A and B
When two events cannot occur at the same time (are disjoint/mutually exclusive), the probability that at least one occurs is the sum of the individual probabilities.

Complement Rule: P(A) + P(A^C) = 1
The probabilities that an event occurs or does not occur must add up to 1.

Conditional Probability: P(A | B) = P(A ∩ B) / P(B); P(B | A) = P(A ∩ B) / P(A); P(A ∩ B) = P(A | B)P(B) = P(B | A)P(A)
The probability of A given B is the probability that both A and B occur divided by the probability that B occurs.

Additive Law (Union): P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
The probability that at least one event occurs is the sum of the individual probabilities minus the intersection. We subtract the intersection because we can't count A and B occurring at the same time twice (it is counted in both P(A) and P(B)).

Independence: P(A ∩ B) = P(A)P(B); P(A | B) = P(A); P(B | A) = P(B)
Independence means that one event occurring doesn't affect the probability of the other. Equivalently, the probability of the intersection must be the product of the individual probabilities. If any of these conditions is violated, the events are dependent.

Mutually Exclusive/Disjoint: P(A ∩ B) = 0
When two events cannot occur at the same time, the probability of their intersection is zero. **EVENTS CANNOT BE BOTH MUTUALLY EXCLUSIVE AND INDEPENDENT (unless one of the events individually has probability zero, but we usually don't care much about those types of events).

Law of Total Probability**: P(B) = P(B | A)P(A) + P(B | A^C)P(A^C)

Bayes' Rule**: P(A | B) = P(B | A)P(A) / [P(B | A)P(A) + P(B | A^C)P(A^C)]
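The law of total probability and Bayes' rule can be checked numerically. The sketch below uses made-up numbers (a hypothetical test with 95% sensitivity, a 10% false-positive rate, and 2% prevalence, chosen only to illustrate the formulas):

```python
# All probabilities here are hypothetical, chosen only to exercise the rules.
p_A = 0.02                  # P(A): has the condition
p_Ac = 1 - p_A              # P(A^C), by the complement rule
p_B_given_A = 0.95          # P(B | A): test positive given condition
p_B_given_Ac = 0.10         # P(B | A^C): false positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|A^C)P(A^C)
p_B = p_B_given_A * p_A + p_B_given_Ac * p_Ac

# Bayes' rule: P(A|B) = P(B|A)P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B

print(round(p_B, 4))          # 0.117
print(round(p_A_given_B, 4))  # 0.1624
```

Note how a positive result only raises the probability of the condition from 2% to about 16%, because the false positives from the large P(A^C) group dominate P(B).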

Discrete Random Variables

Discrete Random Variable
Conditions: countable number of possible values (e.g. rolling a die)
Mean/Expected Value: E(X) = Σ xP(x)
Standard Deviation: σ = √[Σ (x - μ)² P(x)]; alternatively, σ = √[E(X²) - (E(X))²]

Binomial (number of successes in n trials)
Conditions:
1. Possible outcomes are "success" and "failure" (Bernoulli trials)
2. Fixed number of trials
3. Independent trials
4. Fixed probability of a success
Formula: P(X = k) = (n choose k) p^k (1-p)^(n-k)
Mean/Expected Value: E(X) = np
Standard Deviation: σ = √[np(1-p)]
Calculator: P(exactly k successes) -> 2nd -> VARS -> binompdf(); P(k or fewer successes) -> 2nd -> VARS -> binomcdf()

Geometric (number of the trial on which the first success occurs)
Conditions:
1. Possible outcomes are "success" and "failure" (Bernoulli trials)
2. Number of trials is NOT fixed
3. Independent trials
4. Fixed probability of a success
Formula: P(X = k) = (1-p)^(k-1) p
Mean/Expected Value: E(X) = 1/p
Standard Deviation: σ = √[(1-p)/p²]
Calculator: P(first success occurs on the kth trial) -> 2nd -> VARS -> geometpdf(); P(first success occurs on or before the kth trial) -> 2nd -> VARS -> geometcdf()
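The shortcut formulas E(X) = np and σ = √[np(1-p)] agree with the general definitions E(X) = Σ xP(x) and σ = √[Σ (x - μ)² P(x)]. The sketch below verifies this for an arbitrary illustrative binomial (n = 10, p = 0.3) and approximates the geometric mean 1/p by summing the series:

```python
from math import comb, sqrt

# Binomial pmf: P(X = k) = C(n,k) p^k (1-p)^(n-k); n and p are illustrative.
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))            # Σ x P(x)
var = sum((k - mean)**2 * pk for k, pk in enumerate(pmf)) # Σ (x-μ)² P(x)

print(round(mean, 6), n * p)                              # both np = 3.0
print(round(sqrt(var), 6), round(sqrt(n * p * (1 - p)), 6))

# Geometric pmf: P(X = k) = (1-p)^(k-1) p; summing the first 1000 terms
# approximates the infinite series E(X) = Σ k(1-p)^(k-1) p = 1/p.
geo_mean = sum(k * (1 - p)**(k - 1) * p for k in range(1, 1001))
print(round(geo_mean, 6))                                 # ≈ 1/p = 3.333333
```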

Transforming Random Variables

aX + b
Mean/Expected Value: aE(X) + b
Standard Deviation: |a|σ_X

aX + bY + c
Mean/Expected Value: aE(X) + bE(Y) + c
Standard Deviation: √[a²σ²_X + b²σ²_Y], when X and Y are independent (will always be true in this class; dependence requires covariance, which is not covered)
**The linear transformation of normal random variables is also normal.

X + X + … + X (a times) + Y + Y + … + Y (b times) + c (independent random variables)
Mean/Expected Value: aE(X) + bE(Y) + c
Standard Deviation: √[aσ²_X + bσ²_Y]
**CAUTION: NOT the same as scaling a single random variable by a factor of a or b (the situations above).
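The rules above can be checked exactly on a small example. The sketch below uses two independent fair dice as X and Y (an illustrative choice) and enumerates all 36 equally likely outcomes of aX + bY + c:

```python
from itertools import product
from math import sqrt

# Check E(aX + bY + c) = aE(X) + bE(Y) + c and σ = √(a²σ²_X + b²σ²_Y)
# for X, Y two independent fair dice; a, b, c are arbitrary constants.
a, b, c = 2, 3, 5
outcomes = [a * x + b * y + c for x, y in product(range(1, 7), repeat=2)]

E = sum(outcomes) / 36                    # each (x, y) pair has probability 1/36
var = sum((t - E)**2 for t in outcomes) / 36

die_var = 35 / 12                         # variance of one fair die
print(E)                                  # 2(3.5) + 3(3.5) + 5 = 22.5
print(round(sqrt(var), 4))                # √[(4 + 9)(35/12)] ≈ 6.1577
```

Replacing aX with X + X (a copies of independent rolls) would instead give variance aσ²_X rather than a²σ²_X, which is exactly the CAUTION noted above.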
