Binomial Random Variables and Repeated Trials: Scott Sheffield

The document summarizes binomial random variables and properties of expectation and variance for binomial random variables. It discusses how the number of heads from tossing a coin n times is a binomial random variable, and how to calculate the probability of k heads. It then shows that the expectation of a binomial random variable X with parameters (n, p) is np, using an identity related to Pascal's triangle.


18.600: Lecture 11
Binomial random variables and repeated trials

Scott Sheffield

MIT
Outline

Bernoulli random variables

Properties: expectation and variance

More problems
Bernoulli random variables

- Toss a fair coin n times. (Tosses are independent.) What is the
  probability of k heads?
- Answer: $\binom{n}{k}/2^n$.
- What if the coin has probability p of coming up heads?
- Answer: $\binom{n}{k} p^k (1-p)^{n-k}$.
- Writing q = 1 - p, we can write this as $\binom{n}{k} p^k q^{n-k}$.
- Can use the binomial theorem to show the probabilities sum to one:
  $1 = 1^n = (p+q)^n = \sum_{k=0}^{n} \binom{n}{k} p^k q^{n-k}$.
- The number of heads is a binomial random variable with
  parameters (n, p).
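The formulas above are easy to check numerically. A minimal sketch (not part of the lecture) using Python's `math.comb` for the binomial coefficient:

```python
from math import comb

def binom_pmf(n, k, p):
    """P(exactly k heads in n tosses of a p-coin): C(n,k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Fair coin: P(k heads) reduces to C(n,k) / 2^n.
n = 10
assert binom_pmf(n, 3, 0.5) == comb(n, 3) / 2**n

# Binomial theorem: the probabilities sum to (p + q)^n = 1.
p = 0.3
total = sum(binom_pmf(n, k, p) for k in range(n + 1))
assert abs(total - 1) < 1e-12
```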
Examples

- Toss 6 fair coins. Let X be the number of heads you see. Then X
  is binomial with parameters (n, p) given by (6, 1/2).
- The probability mass function for X can be computed using the
  6th row of Pascal's triangle.
- If the coin is biased (comes up heads with probability $p \neq 1/2$),
  we can still use the 6th row of Pascal's triangle, but the
  probability that X = i gets multiplied by $p^i (1-p)^{n-i}$.
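A quick illustration of both cases (a sketch, assuming Python's `math.comb`): the fair-coin pmf divides the 6th row of Pascal's triangle by $2^6$, and the biased-coin pmf reweights the same row entry by entry.

```python
from math import comb

row6 = [comb(6, i) for i in range(7)]   # 6th row of Pascal's triangle
assert row6 == [1, 6, 15, 20, 15, 6, 1]

# Fair coin: P(X = i) = row6[i] / 2^6.
fair_pmf = [c / 2**6 for c in row6]

# Biased coin (p != 1/2): same row, entry i weighted by p^i (1-p)^(6-i).
p = 0.7
biased_pmf = [c * p**i * (1 - p)**(6 - i) for i, c in enumerate(row6)]

assert abs(sum(fair_pmf) - 1) < 1e-12 and abs(sum(biased_pmf) - 1) < 1e-12
```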
Other examples

- A room contains n people. What is the probability that exactly
  i of them were born on a Tuesday?
- Answer: use the binomial formula $\binom{n}{i} p^i q^{n-i}$ with p = 1/7 and
  q = 1 - p = 6/7.
- Let n = 100. Compute the probability that nobody was born
  on a Tuesday.
- What is the probability that exactly 15 people were born on a
  Tuesday?
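The two n = 100 questions can be answered directly from the formula (a sketch, not from the lecture; note that the "nobody" case collapses to $(6/7)^{100}$ since $\binom{100}{0} = 1$):

```python
from math import comb

n, p, q = 100, 1/7, 6/7

def pmf(i):
    # P(exactly i of the n people were born on a Tuesday)
    return comb(n, i) * p**i * q**(n - i)

p_nobody = pmf(0)     # equals (6/7)^100, a tiny number
p_fifteen = pmf(15)   # near the mode, since E[X] = 100/7 is about 14.3

assert abs(p_nobody - q**100) < 1e-15
```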
Outline

Bernoulli random variables

Properties: expectation and variance

More problems
Expectation

- Let X be a binomial random variable with parameters (n, p).
- What is E[X]?
- Direct approach: by definition of expectation,
  $E[X] = \sum_{i=0}^{n} i \, P\{X = i\}$.
- What happens if we modify the nth row of Pascal's triangle by
  multiplying the ith term by i?
- For example, replace the 5th row (1, 5, 10, 10, 5, 1) by
  (0, 5, 20, 30, 20, 5). Does this remind us of an earlier row in
  the triangle?
- Perhaps the prior row (1, 4, 6, 4, 1), with each entry multiplied by 5?
Useful Pascal's triangle identity

- Recall that $\binom{n}{i} = \frac{n(n-1)\cdots(n-i+1)}{i(i-1)\cdots 1}$. This implies a simple
  but important identity: $i \binom{n}{i} = n \binom{n-1}{i-1}$.
- Using this identity (and q = 1 - p), we can write
  $E[X] = \sum_{i=0}^{n} i \binom{n}{i} p^i q^{n-i} = n \sum_{i=1}^{n} \binom{n-1}{i-1} p^i q^{n-i}$.
- Rewrite this as $E[X] = np \sum_{i=1}^{n} \binom{n-1}{i-1} p^{i-1} q^{(n-1)-(i-1)}$.
- Substitute j = i - 1 to get
  $E[X] = np \sum_{j=0}^{n-1} \binom{n-1}{j} p^j q^{(n-1)-j} = np(p+q)^{n-1} = np$.
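Both the row identity and the conclusion E[X] = np can be verified numerically (a sketch using Python's `math.comb`, not part of the lecture):

```python
from math import comb

# The modified 5th row (0, 5, 20, 30, 20, 5) is the 4th row (1, 4, 6, 4, 1)
# times 5, shifted by one: i * C(5, i) = 5 * C(4, i-1).
row = [i * comb(5, i) for i in range(6)]
assert row == [0, 5, 20, 30, 20, 5]
assert row[1:] == [5 * comb(4, j) for j in range(5)]

def expected_heads(n, p):
    # E[X] = sum_i i * P{X = i}, directly from the definition
    q = 1 - p
    return sum(i * comb(n, i) * p**i * q**(n - i) for i in range(n + 1))

for n, p in [(5, 0.5), (10, 0.3), (17, 0.9)]:
    assert abs(expected_heads(n, p) - n * p) < 1e-9
```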
Decomposition approach to computing expectation

- Let X be a binomial random variable with parameters (n, p).
  Here is another way to compute E[X].
- Think of X as representing the number of heads in n tosses of a
  coin that is heads with probability p.
- Write $X = \sum_{j=1}^{n} X_j$, where $X_j$ is 1 if the jth coin is heads, 0
  otherwise.
- In other words, $X_j$ is the number of heads (zero or one) on the
  jth toss.
- Note that $E[X_j] = p \cdot 1 + (1-p) \cdot 0 = p$ for each j.
- Conclude by additivity of expectation that
  $E[X] = \sum_{j=1}^{n} E[X_j] = \sum_{j=1}^{n} p = np$.
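The decomposition view also lends itself to simulation: generate X as a sum of indicator tosses and check that the empirical mean is near np. A sketch (parameter values are illustrative assumptions):

```python
import random

random.seed(0)
n, p, trials = 20, 0.4, 100_000

# X = X_1 + ... + X_n, where X_j is the indicator of heads on toss j.
def sample_X():
    return sum(random.random() < p for _ in range(n))

avg = sum(sample_X() for _ in range(trials)) / trials
assert abs(avg - n * p) < 0.05   # empirical mean should be near np = 8
```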
Interesting moment computation

- Let X be binomial (n, p) and fix $k \geq 1$. What is $E[X^k]$?
- Recall the identity: $i \binom{n}{i} = n \binom{n-1}{i-1}$.
- Generally, $E[X^k]$ can be written as
  $\sum_{i=0}^{n} i \binom{n}{i} p^i (1-p)^{n-i} \, i^{k-1}$.
- The identity gives
  $E[X^k] = np \sum_{i=1}^{n} \binom{n-1}{i-1} p^{i-1} (1-p)^{n-i} \, i^{k-1}
  = np \sum_{j=0}^{n-1} \binom{n-1}{j} p^j (1-p)^{n-1-j} (j+1)^{k-1}$.
- Thus $E[X^k] = np \, E[(Y+1)^{k-1}]$ where Y is binomial with
  parameters (n - 1, p).
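The moment identity can be checked for small k by computing both sides as finite sums over the pmf (a sketch, not part of the lecture):

```python
from math import comb

def moment(n, p, k):
    """E[X^k] for X binomial(n, p), by direct summation over the pmf."""
    return sum(i**k * comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1))

def shifted(n, p, k):
    """np * E[(Y+1)^(k-1)] for Y binomial(n-1, p)."""
    return n * p * sum((j + 1)**(k - 1) * comb(n - 1, j) * p**j * (1 - p)**(n - 1 - j)
                       for j in range(n))

n, p = 12, 0.35
for k in range(1, 5):
    assert abs(moment(n, p, k) - shifted(n, p, k)) < 1e-9
```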
Computing the variance

- Let X be binomial (n, p). What is Var[X]?
- We know E[X] = np.
- We computed the identity $E[X^k] = np \, E[(Y+1)^{k-1}]$ where Y is
  binomial with parameters (n - 1, p).
- In particular, $E[X^2] = np \, E[Y+1] = np[(n-1)p + 1]$.
- So $\mathrm{Var}[X] = E[X^2] - E[X]^2 = np(n-1)p + np - (np)^2 = np(1-p) = npq$,
  where q = 1 - p.
- Commit to memory: the variance of a binomial (n, p) random
  variable is npq.
- This is n times the variance you'd get with a single coin.
  Coincidence?
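A numeric sanity check of this chain of equalities (a sketch with illustrative parameter values):

```python
from math import comb

n, p = 30, 0.25
q = 1 - p

# E[X^2] from the moment identity: E[X^2] = np * E[Y + 1] = np[(n-1)p + 1].
EX2 = n * p * ((n - 1) * p + 1)

# Cross-check against direct summation over the pmf.
EX2_direct = sum(i * i * comb(n, i) * p**i * q**(n - i) for i in range(n + 1))
assert abs(EX2 - EX2_direct) < 1e-9

var = EX2 - (n * p)**2
assert abs(var - n * p * q) < 1e-9   # Var[X] = npq
```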
Compute variance with decomposition trick

- $X = \sum_{j=1}^{n} X_j$, so
  $E[X^2] = E[\sum_{i=1}^{n} X_i \sum_{j=1}^{n} X_j] = \sum_{i=1}^{n} \sum_{j=1}^{n} E[X_i X_j]$.
- $E[X_i X_j]$ is p if i = j, and $p^2$ otherwise.
- $\sum_{i=1}^{n} \sum_{j=1}^{n} E[X_i X_j]$ has n terms equal to p and $(n-1)n$
  terms equal to $p^2$.
- So $E[X^2] = np + (n-1)n p^2 = np + (np)^2 - np^2$.
- Thus
  $\mathrm{Var}[X] = E[X^2] - E[X]^2 = np - np^2 = np(1-p) = npq$.
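The double-sum bookkeeping above can be mirrored literally in code: n diagonal terms contribute p, and n(n-1) off-diagonal terms contribute $p^2$. A sketch (parameter values are illustrative):

```python
n, p = 8, 0.3

# E[X_i X_j] = p when i == j (since X_i^2 = X_i), and p^2 when i != j
# (by independence of the tosses).
EX2 = sum(p if i == j else p**2 for i in range(n) for j in range(n))

# n diagonal terms of p, plus n(n-1) off-diagonal terms of p^2.
assert abs(EX2 - (n * p + n * (n - 1) * p**2)) < 1e-12

var = EX2 - (n * p)**2
assert abs(var - n * p * (1 - p)) < 1e-12   # npq again
```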
Outline

Bernoulli random variables

Properties: expectation and variance

More problems
More examples

- An airplane seats 200, but the airline has sold 205 tickets.
  Each person, independently, has a .05 chance of not showing
  up for the flight. What is the probability that more than 200
  people will show up for the flight?
- Answer: $\sum_{j=201}^{205} \binom{205}{j} .95^j .05^{205-j}$.
- In a 100 person senate, forty people always vote for the
  Republicans' position, forty people always for the Democrats'
  position, and 20 people just toss a coin to decide which way to
  vote. What is the probability that a given vote is tied?
- Answer: $\binom{20}{10}/2^{20}$.
- You invite 50 friends to a party. Each one, independently, has
  a 1/3 chance of showing up. What is the probability that
  more than 25 people will show up?
- Answer: $\sum_{j=26}^{50} \binom{50}{j} (1/3)^j (2/3)^{50-j}$.

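All three answers are finite sums that can be evaluated exactly (a sketch, not part of the lecture, using Python's `math.comb`):

```python
from math import comb

def tail(n, p, lo, hi):
    """P(lo <= X <= hi) for X binomial(n, p)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(lo, hi + 1))

overbooked = tail(205, 0.95, 201, 205)   # more than 200 of 205 show up
tied_vote = comb(20, 10) / 2**20         # exactly 10 of 20 coin-tossers on each side
party = tail(50, 1/3, 26, 50)            # more than 25 of 50 friends show up

assert 0 < overbooked < 1 and 0 < party < 1
```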