
ECE 316 – Lecture Notes Chapters 4-7

Chapter 4: Discrete Random Variables


Random variables are real-valued functions defined on the outcomes of a sample space.

Example: Let Y denote the number of heads in 3 tossed coins. Then Y is a random variable with values:

Y = 0 with p = 1/8
Y = 1 with p = 3/8
Y = 2 with p = 3/8
Y = 3 with p = 1/8

Example: Independent trials of flipping a coin with probability p of getting heads are performed until
heads occurs. Let X denote the number of flips.

P{X = 1} = p,  P{X = 2} = (1 − p)p,  P{X = 3} = (1 − p)²p

So P{X = n} = (1 − p)^(n−1) p
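A quick sanity check of this formula is a short simulation. The following is a minimal sketch (assuming Python with numpy; the value p = 0.3 and the trial count are arbitrary illustration choices):

import numpy as np

# Sketch: estimate P{X = n} for the "flip until first heads" experiment
# and compare with the formula (1 - p)**(n - 1) * p.
rng = np.random.default_rng(0)
p = 0.3
trials = 100_000

# Number of flips up to and including the first head; numpy's
# Generator.geometric samples this distribution directly.
flips = rng.geometric(p, size=trials)

for n in range(1, 6):
    empirical = np.mean(flips == n)
    exact = (1 - p) ** (n - 1) * p
    print(f"P(X={n}): simulated {empirical:.4f}, formula {exact:.4f}")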

Discrete Random Variables


A random variable that can take on at most a countable number of possible values

Probability Mass Function


The probability mass function (PMF) of a discrete random variable X gives the probability that X is
equal to a given value a:

p(a) = P{X = a}

Example: The PMF of a random variable X is given by p(i) = c λ^i / i!, where i = 0, 1, 2, … and λ is a positive
number. Find:

a) P{X = 0}

∑_{i=0}^{∞} p(i) = 1  ⟹  ∑_{i=0}^{∞} c λ^i / i! = 1

c = 1 / ∑_{i=0}^{∞} λ^i / i! = 1 / e^λ = e^(−λ)

P{X = 0} = p(0) = e^(−λ) λ^0 / 0! = e^(−λ)

b) P{X > 2}

P{X > 2} = 1 − p(0) − p(1) − p(2) = 1 − ∑_{i=0}^{2} e^(−λ) λ^i / i! = 1 − e^(−λ)(1 + λ + λ²/2!)
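The closed form can be checked numerically against direct summation of the PMF. A minimal sketch (plain Python; λ = 2.5 is an arbitrary example value):

import math

# Sketch: check P{X > 2} = 1 - e^{-lam}(1 + lam + lam^2/2) for p(i) = e^{-lam} lam^i / i!
lam = 2.5

p = lambda i: math.exp(-lam) * lam**i / math.factorial(i)

closed_form = 1 - math.exp(-lam) * (1 + lam + lam**2 / 2)
by_summation = 1 - sum(p(i) for i in range(3))

print(closed_form, by_summation)   # both ~0.456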

Cumulative Distribution Function


The cumulative distribution function (CDF) of a discrete random variable X is the sum of the PMF
p(x) over all values x less than or equal to a:

F(a) = P{X ≤ a} = ∑_{x ≤ a} p(x)

Indicator Variable
An indicator variable for an event A is 1 if the event occurs, 0 otherwise:

I_A = 1 if A occurs,  0 if A^c occurs

Expectation
The expectation/expected value of a discrete random variable X with PMF p(x) is the sum of each
value of X multiplied by its corresponding PMF:

E[X] = ∑_{x: p(x) > 0} x p(x)

If X is a random variable, then so is any function of it, g(X). The expectation of g(X) is:

E[g(X)] = ∑_{x: p(x) > 0} g(x) p(x)

Example: X = 0 with p = 1/3, X = 1 with p = 2/3.

E[X] = 0 × (1/3) + 1 × (2/3) = 2/3

Example: A product sold seasonally yields a net profit of b dollars for each unit sold and a net loss of l
dollars for each unit left unsold when the season ends. The number of units of the product that are
ordered by customers is a random variable having PMF p(i), i ≥ 0.
Determine the number of units the store should stock to maximize its expected profit.

Let X denote the number of units ordered by customers.

Let s denote the number of units to be stocked.

The profit is

Q(s, X) = bs if s ≤ X;  bX − l(s − X) if s > X

E[Q(s, X)] = ∑_{x: p(x) > 0} Q(s, x) p(x) = ∑_{i=0}^{∞} Q(s, i) p(i)

Discrete Integer Optimization


Want to check whether E[Q(s+1, X)] − E[Q(s, X)] ≶ 0.

E[Q(s, X)] = ∑_{i=0}^{∞} Q(s, i) p(i)
           = ∑_{i=0}^{s−1} [bi − l(s − i)] p(i) + ∑_{i=s}^{∞} bs p(i)
           = ∑_{i=0}^{s−1} [bi + li − ls] p(i) + ∑_{i=s}^{∞} bs p(i)
           = (b + l) ∑_{i=0}^{s−1} i p(i) − ls ∑_{i=0}^{s−1} p(i) + bs (1 − ∑_{i=0}^{s−1} p(i))
           = (b + l) ∑_{i=0}^{s−1} i p(i) − (b + l)s ∑_{i=0}^{s−1} p(i) + bs
           = (b + l) ∑_{i=0}^{s−1} (i − s) p(i) + bs

E[Q(s+1, X)] − E[Q(s, X)]
  = [(b + l) ∑_{i=0}^{s} (i − s − 1) p(i) + b(s + 1)] − [(b + l) ∑_{i=0}^{s−1} (i − s) p(i) + bs]
  = [(b + l) ∑_{i=0}^{s} (i − s) p(i) − (b + l) ∑_{i=0}^{s} p(i) + bs + b] − [(b + l) ∑_{i=0}^{s−1} (i − s) p(i) + bs]
  = b − (b + l) ∑_{i=0}^{s} p(i)

(the i = s term of ∑_{i=0}^{s} (i − s) p(i) is zero, so the two (i − s) sums cancel)

We want to know when E[Q(s+1, X)] − E[Q(s, X)] ≶ 0:

b − (b + l) ∑_{i=0}^{s} p(i) ≶ 0

∑_{i=0}^{s} p(i) ≶ b/(b + l),  where ∑_{i=0}^{s} p(i) is the CDF F(s)

Stocking one more unit is profitable as long as F(s) < b/(b + l). Hence, the expected profit is maximized
by the smallest s for which F(s) ≥ b/(b + l).
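This stopping rule is easy to implement once a demand PMF is fixed. A minimal sketch (plain Python; the Poisson demand with mean 10 and the values b = 4, l = 1 are illustrative assumptions, not part of the original problem):

import math

# Sketch: find the stocking level s* as the smallest s with F(s) >= b/(b+l).
b, l = 4.0, 1.0
mean_demand = 10

def pmf(i):
    # Assumed demand distribution: Poisson with mean 10 (illustration only).
    return math.exp(-mean_demand) * mean_demand**i / math.factorial(i)

target = b / (b + l)
s, cdf = 0, pmf(0)
while cdf < target:
    s += 1
    cdf += pmf(s)

print(f"critical ratio b/(b+l) = {target:.3f}, optimal stock s* = {s}")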
Corollary:
For a random variable X , if a and b are constants then:

E [ aX +b ] =aE [ X ] +b
Proof:

E[aX + b] = ∑_{x: p(x) > 0} (ax + b) p(x) = a ∑_{x: p(x) > 0} x p(x) + b ∑_{x: p(x) > 0} p(x) = aE[X] + b(1) = aE[X] + b

The expectation of X , E [X ] is known as the first moment.


E[X^n] is the nth moment.
Variance
For a random variable X, the variance measures the expected squared deviation of X from its mean:

Var(X) = E[(X − μ)²],  where μ = E[X]

Var(X) = E[X²] − μ²

For constants a and b:

Var(aX + b) = Var(aX), since b is just a shift and does not affect the variance

Var(aX) = E[(aX)²] − (E[aX])² = a²E[X²] − a²(E[X])² = a²Var(X)
Example: Calculate Var ( X) if X is the outcome of a die.

X ∈ {1, 2, 3, 4, 5, 6}, each with p = 1/6

E[X] = 1×(1/6) + 2×(1/6) + 3×(1/6) + 4×(1/6) + 5×(1/6) + 6×(1/6) = 7/2

E[X²] = 1²×(1/6) + 2²×(1/6) + 3²×(1/6) + 4²×(1/6) + 5²×(1/6) + 6²×(1/6) = 91/6

Var(X) = E[X²] − μ² = 91/6 − (7/2)² = 35/12
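The arithmetic can be reproduced exactly with rational numbers. A minimal sketch (plain Python):

from fractions import Fraction

# Sketch: exact E[X], E[X^2], and Var(X) for a fair die, matching 7/2, 91/6, 35/12.
values = range(1, 7)
p = Fraction(1, 6)

mean = sum(x * p for x in values)              # 7/2
second_moment = sum(x**2 * p for x in values)  # 91/6
variance = second_moment - mean**2             # 35/12

print(mean, second_moment, variance)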

Common Discrete Random Variables


Bernoulli Random Variable
Success or failure

X = 1 with probability p,  0 with probability 1 − p

E [ X ]= p
2
Var ( X )= p− p = p(1− p)
Binomial Random Variable (n , p)
Number of successes in n independent trials, each with success probability p

X ∈ {0, 1, 2, …, n}

PMF: P{X = i} = (n choose i) p^i (1 − p)^(n−i)

E[X] = ∑_{i=0}^{n} i (n choose i) p^i (1 − p)^(n−i) = np
Geometric Random Variables
Recall: Geometric series (for |r| < 1)

∑_{i=0}^{∞} a₀ r^i = a₀ / (1 − r)
Geometric Random Variable


Independent trials until a success occurs, where the probability of success is p. The number of trials
that it takes to get a success is a geometric random variable X .
PMF: P{X = i} = p(i) = (1 − p)^(i−1) p,  i = 1, 2, …

Check that the PMF sums to 1:

∑_{i=1}^{∞} p(i) = ∑_{i=1}^{∞} (1 − p)^(i−1) p = p ∑_{i=1}^{∞} (1 − p)^(i−1) = p · 1/(1 − (1 − p)) = p · (1/p) = 1

E[X] = ∑_{i=1}^{∞} i p(i) = ∑_{i=1}^{∞} i (1 − p)^(i−1) p

Note that d/dq q^i = i q^(i−1). Let q = 1 − p:

E[X] = d/dq (∑_{i=0}^{∞} q^i) · p = d/dq (p(1 − q)^(−1)) = −p(1 − q)^(−2)(−1) = p/(1 − q)² = p/(1 − (1 − p))² = 1/p
Chapter 5: Continuous Random Variables
Continuous random variables take on an uncountable number of values

Probability Density Function


The probability density function (PDF) of a continuous random variable X is a function f such that, for
any region B ⊆ ℝ, the probability that X belongs to B is:

P{X ∈ B} = ∫_B f(x) dx

Common cases:

If B = [a, b], a < b:  P{X ∈ B} = ∫_a^b f(x) dx

If B = [a, a]:  P{X ∈ B} = ∫_a^a f(x) dx = 0

If B = (−∞, ∞):  P{X ∈ B} = ∫_{−∞}^{∞} f(x) dx = 1

Example: For a random variable X with PDF

f(x) = C(4x − 2x²) for 0 < x < 2,  0 otherwise

a) What is the value of C?

∫_{−∞}^{∞} f(x) dx = 1  ⟹  ∫_0^2 C(4x − 2x²) dx = 1

C(2x² − (2/3)x³)|_0^2 = 1  ⟹  C(2(2)² − (2/3)(2)³) = 1  ⟹  C = 3/8

b) Find P{X > 1}.

P{X > 1} = (3/8) ∫_1^2 (4x − 2x²) dx = (3/8)(2x² − (2/3)x³)|_1^2
         = (3/8)[(2(2)² − (2/3)(2)³) − (2(1)² − (2/3)(1)³)] = (3/8)(8/3 − 4/3) = 1/2
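Both results can be confirmed by numerical integration. A minimal sketch (assuming Python with scipy available):

from scipy.integrate import quad

# Sketch: verify C = 3/8 and P{X > 1} = 1/2 for f(x) = C(4x - 2x^2) on 0 < x < 2.
raw = lambda x: 4 * x - 2 * x**2

total, _ = quad(raw, 0, 2)       # integral without C, equals 8/3
C = 1 / total                    # 0.375 = 3/8

prob_tail, _ = quad(lambda x: C * raw(x), 1, 2)
print(C, prob_tail)              # 0.375, 0.5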
Cumulative Distribution Function
The cumulative distribution function (CDF) of a continuous random variable X is the probability that
X is less than or equal to a value a :
F(a) = P{X ≤ a} = ∫_{−∞}^{a} f(x) dx

d/da F(a) = d/da ∫_{−∞}^{a} f(x) dx = f(a)

Example: If X is a continuous random variable with CDF F X and PDF f X , find the probability density
function of Y =2 X .
F_X(x) = P{X ≤ x} = ∫_{−∞}^{x} f_X(s) ds,  F_Y(y) = P{Y ≤ y} = ∫_{−∞}^{y} f_Y(s) ds

F_Y(y) = P{Y ≤ y} = P{2X ≤ y} = P{X ≤ y/2} = ∫_{−∞}^{y/2} f_X(s) ds

f_Y(y) = d/dy F_Y(y) = (1/2) f_X(y/2)

In general, by the chain rule (Leibniz rule), for a differentiable upper limit β(x):

d/dx ∫_{−∞}^{β(x)} f(s) ds = f(β(x)) · dβ(x)/dx

Expectation
The expectation/expected value of a continuous random variable X is:

E[X] = ∫_{−∞}^{∞} x f(x) dx

Example: Find E[X] if f(x) = 2x for 0 ≤ x ≤ 1, 0 otherwise.

E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_0^1 x(2x) dx = (2/3)x³|_0^1 = 2/3

Example: Find E[e^X] for f(x) = 1 for 0 ≤ x ≤ 1, 0 otherwise.

E[e^X] = ∫_{−∞}^{∞} e^x f(x) dx = ∫_0^1 e^x dx = e^x|_0^1 = e − 1

Uniform Random Variables


A random variable is uniformly distributed over (α , β ) if:

PDF: f(x) = 1/(β − α) for α ≤ x ≤ β,  0 otherwise

CDF: F(x) = ∫_{−∞}^{x} f(s) ds = 0 for x < α;  (x − α)/(β − α) for α ≤ x ≤ β;  1 for x > β

Expectation: E[X] = (α + β)/2

Variance: Var(X) = (β − α)²/12
Example: A bus arrives at a stop at 7, 7:15, 7:30, 7:45, and so on. If a passenger arrives at the stop at a
time that is uniformly distributed between 7 and 7:30, find the probability that he waits:

a) Less than 5 minutes for a bus

Let X denote the time that the passenger arrives at the stop.

In order to wait less than 5 minutes for a bus, the passenger needs to arrive between 7:10 and
7:15 or between 7:25 and 7:30.

So, we want to find P { 7 :10 ≤ X ≤ 7 :15 } + P {7 :25 ≤ X ≤7 :30 }.


P{wait < 5 minutes} = ∫_{7:10}^{7:15} f(x) dx + ∫_{7:25}^{7:30} f(x) dx
  = (1/(30 minutes))(5 minutes) + (1/(30 minutes))(5 minutes) = 1/3

b) More than 10 minutes for a bus

Let X denote the time that the passenger arrives at the stop.

In order to wait more than 10 minutes for a bus, the passenger needs to arrive between 7:00
and 7:05 or between 7:15 and 7:20.
So, we want to find P { 7 :00< X ≤ 7 :05 } + P {7 :15< X ≤7 :20 }.
P{wait > 10 minutes} = ∫_{7:00}^{7:05} f(x) dx + ∫_{7:15}^{7:20} f(x) dx
  = (1/(30 minutes))(5 minutes) + (1/(30 minutes))(5 minutes) = 1/3
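Both answers are easy to confirm by simulation. A minimal sketch (assuming Python with numpy; arrival time X is measured in minutes after 7:00, buses leave at minutes 0, 15, 30, and the sample size is arbitrary):

import numpy as np

# Sketch: Monte Carlo check of the bus-waiting example; both probabilities ~1/3.
rng = np.random.default_rng(1)
x = rng.uniform(0, 30, size=200_000)

wait = np.where(x <= 15, 15 - x, 30 - x)   # time until the next bus

print("P(wait < 5)  ~", np.mean(wait < 5))    # ~1/3
print("P(wait > 10) ~", np.mean(wait > 10))   # ~1/3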

Normal/Gaussian Random Variable


X is a normal/Gaussian random variable, X ∼ N(μ, σ²), if its PDF is:

f(x) = (1/(√(2π) σ)) e^(−(x−μ)²/(2σ²)),  −∞ < x < ∞

Expected value: E[X] = μ
Variance: Var(X) = σ²

If X is normally distributed with parameters (μ, σ²), then Y = aX + b is normally distributed with parameters (aμ + b, a²σ²).

For the standard normal distribution, X ∼ N(0, 1):


PDF: f(x) = (1/√(2π)) e^(−x²/2)

CDF: ϕ(x) = ∫_{−∞}^{x} (1/√(2π)) e^(−y²/2) dy
For a standard normal distribution, ϕ (−x )=1−ϕ ( x ), since P { X ≤−x }=P { X ≥ x }

Expected value: E [ X ] =0
Variance :Var ( X )=1

For the general case, Z ∼ N(μ, σ²):

Given a random variable Y = σX + μ, where X ∼ N(0, 1), Y ∼ N(σ(0) + μ, σ²(1)²) = N(μ, σ²) = Z

Proof:

F_Y(y) = P{Y ≤ y} = P{σX + μ ≤ y} = P{X ≤ (y − μ)/σ} = F_X((y − μ)/σ)

f_Y(y) = (1/σ) f_X((y − μ)/σ) = (1/(√(2π) σ)) e^(−(y−μ)²/(2σ²))

CDF: F(x) = ∫_{−∞}^{x} (1/(√(2π) σ)) e^(−(y−μ)²/(2σ²)) dy = ϕ((x − μ)/σ)
Example: Gaussian noise in a communication channel

Let X be a random variable such that for bit 1, X = 2 is transmitted and for bit 0, X = −2 is transmitted.

Let Y be the received value: Y = X + Z, where Z ∼ N(0, 1). The received value is decoded into bit 1 if Y > 0.5
and bit 0 if Y < 0.5.

P { error|message is 1 }=P {2+ Z ≤ 0.5 }=P { Z ≤−1.5 } =ϕ (−1.5 )=1−ϕ (1.5)


P { error|message is 0 }=P {−2+Z ≥ 0.5 }=P { Z ≥ 2.5 }=1−P { Z ≤2.5 }=1−ϕ(2.5)
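The two error probabilities can be evaluated numerically with the standard normal CDF. A minimal sketch (assuming Python with scipy available):

from scipy.stats import norm

# Sketch: error probabilities for the +/-2 signalling example with threshold 0.5.
p_err_given_1 = norm.cdf(-1.5)       # P{2 + Z <= 0.5} = phi(-1.5) = 1 - phi(1.5)
p_err_given_0 = 1 - norm.cdf(2.5)    # P{-2 + Z >= 0.5} = 1 - phi(2.5)

print(p_err_given_1)   # ~0.0668
print(p_err_given_0)   # ~0.0062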

Exponential Random Variables


An exponential random variable with parameter λ> 0 has:

PDF: f(x) = λe^(−λx) for x ≥ 0,  0 for x < 0

CDF: F(x) = 1 − e^(−λx) for x ≥ 0

E[X] = 1/λ,  Var(X) = 1/λ²

The Gamma Distribution


A gamma distribution with parameters (α , λ)> 0 has:

PDF: f(x) = λe^(−λx)(λx)^(α−1) / Γ(α) for x ≥ 0,  0 for x < 0

where Γ(α) = ∫_0^∞ e^(−y) y^(α−1) dy, and for α = n an integer, Γ(n) = (n − 1)!

E[X] = α/λ,  Var(X) = α/λ²
Example: In a queuing system with arrival rate λ (verified in the sketch below),

 The total number of customers arriving in an interval of length t is Poisson(λt)

 The arrival time t₆ of the 6th customer is Gamma(6, λ)

 Each inter-arrival time t_i − t_{i−1} is Exponential(λ)
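These three relationships can be checked by simulating the arrival process directly. A minimal sketch (assuming Python with numpy; λ = 2 and the horizon T = 10 are arbitrary illustration values):

import numpy as np

# Sketch: simulate rate-lam arrivals and compare sample means with the
# Poisson(lam*T), Gamma(6, lam), and Exponential(lam) predictions.
rng = np.random.default_rng(2)
lam, T, runs = 2.0, 10.0, 20_000

counts, sixth_arrivals, gaps = [], [], []
for _ in range(runs):
    # Inter-arrival times are Exponential(lam); arrival times are their cumsum.
    inter = rng.exponential(1 / lam, size=60)
    arrivals = np.cumsum(inter)
    counts.append(np.sum(arrivals <= T))   # number of arrivals in [0, T]
    sixth_arrivals.append(arrivals[5])     # arrival time of the 6th customer
    gaps.append(inter[0])                  # one inter-arrival time

print("E[N(T)] ~", np.mean(counts), " (Poisson mean lam*T =", lam * T, ")")
print("E[t_6]  ~", np.mean(sixth_arrivals), " (Gamma(6, lam) mean 6/lam =", 6 / lam, ")")
print("E[gap]  ~", np.mean(gaps), " (Exponential mean 1/lam =", 1 / lam, ")")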
Chapter 6: Jointly Distributed Random Variables
Joint Distribution Functions
The joint cumulative distribution function (JCDF) of two random variables X and Y is the probability
that the two random variables are less than certain values:

F X ,Y ( a , b ) =P{ X ≤ a ,Y ≤ b }

In the case where one value is infinite:

F_{X,Y}(a, ∞) = P{X ≤ a, Y < ∞} = F_X(a)

F_X(a) is the marginal distribution function of X.

P{X > a, Y > b} = 1 − F_{X,Y}(a, ∞) − F_{X,Y}(∞, b) + F_{X,Y}(a, b) = 1 − F_X(a) − F_Y(b) + F_{X,Y}(a, b)

P{a₁ < X ≤ a₂, b₁ < Y ≤ b₂} = F_{X,Y}(a₂, b₂) − F_{X,Y}(a₁, b₂) − F_{X,Y}(a₂, b₁) + F_{X,Y}(a₁, b₁)

Joint Probability Mass Function (Discrete)


The joint probability mass function (JPMF) of two discrete random variables X and Y is the
probability that the two random variables take on certain values:

p_{X,Y}(x, y) = P{X = x, Y = y}

p_X(x) = P{X = x} = ∑_{y: p_{X,Y}(x,y) > 0} p_{X,Y}(x, y)

p_Y(y) = P{Y = y} = ∑_{x: p_{X,Y}(x,y) > 0} p_{X,Y}(x, y)

p_X(x) and p_Y(y) are the marginal PMFs of X and Y.

Example: 3 balls are selected from an urn containing 3 red, 4 white, and 5 blue balls. Let X and Y denote,
respectively, the number of red and white balls chosen. Then the JPMF of X and Y is

p_{X,Y}(x, y) = P{X = x, Y = y}

p_{X,Y}(0, 0) = (5 choose 3) / (12 choose 3)

p_{X,Y}(1, 0) = (3 choose 1)(5 choose 2) / (12 choose 3)

Joint Probability Density Function (Continuous)

Two continuous random variables X and Y are jointly continuous if there exists a joint probability
density function (JPDF) f_{X,Y}(x, y) such that, for every set C ⊆ ℝ²:

P{(X, Y) ∈ C} = ∬_C f_{X,Y}(x, y) dx dy

Similarly, for any set B ⊆ ℝ:

P{Y ∈ B} = ∫_B f_Y(y) dy = ∫_B ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy  ⟹  f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx

Joint Cumulative Distribution Function


The joint cumulative distribution function (JCDF) of two continuous random variables X and Y is the
probability that the two random variables are less than or equal to certain values:

F_{X,Y}(a, b) = P{X ≤ a, Y ≤ b}

For example, if A = (−∞, a] and B = (−∞, b]:

F_{X,Y}(a, b) = P{X ≤ a, Y ≤ b} = ∫_{−∞}^{b} ∫_{−∞}^{a} f_{X,Y}(x, y) dx dy

Conversely,

f_{X,Y}(a, b) = ∂²F_{X,Y}(a, b) / ∂a∂b

Example: The JPDF of X and Y is f(x, y) = 2e^(−x) e^(−2y) for 0 < x < ∞, 0 < y < ∞, and 0 otherwise.

a) P{X > 1, Y < 1}

P{X > 1, Y < 1} = ∫_0^1 ∫_1^∞ 2e^(−x) e^(−2y) dx dy = 2 ∫_0^1 e^(−2y) (∫_1^∞ e^(−x) dx) dy
  = 2 ∫_0^1 e^(−2y) (−e^(−x)|_1^∞) dy = 2e^(−1) ∫_0^1 e^(−2y) dy
  = 2e^(−1) (−(1/2) e^(−2y)|_0^1) = e^(−1)(1 − e^(−2))

b) P{X < Y}

P{X < Y} = ∬_{x < y} f_{X,Y}(x, y) dx dy

Ranges (over the support of f): x: (0, ∞) and y: (x, ∞), or y: (0, ∞) and x: (0, y)

P{X < Y} = ∫_0^∞ ∫_x^∞ 2e^(−x) e^(−2y) dy dx = 2 ∫_0^∞ e^(−x) (∫_x^∞ e^(−2y) dy) dx
  = 2 ∫_0^∞ e^(−x) (−(1/2) e^(−2y)|_x^∞) dx = 2 ∫_0^∞ e^(−x) (1/2) e^(−2x) dx
  = ∫_0^∞ e^(−3x) dx = −(1/3) e^(−3x)|_0^∞ = 1/3
Example: The joint density of X and Y is f(x, y) = e^(−(x+y)) for 0 < x < ∞, 0 < y < ∞, and 0 otherwise.
Find the density function of X/Y.

Let Z = X/Y. To solve this problem, first find F_Z(a), then differentiate to get f_Z(a).

F_Z(a) = P{Z ≤ a} = P{X/Y ≤ a} = P{X ≤ aY},  a > 0

Ranges (over the support of f): y: (0, ∞) and x: (0, ay)

F_Z(a) = P{X ≤ aY} = ∫_0^∞ ∫_0^{ay} e^(−(x+y)) dx dy = ∫_0^∞ e^(−y) (∫_0^{ay} e^(−x) dx) dy
  = ∫_0^∞ e^(−y) (−e^(−x)|_0^{ay}) dy = ∫_0^∞ e^(−y) (1 − e^(−ay)) dy
  = ∫_0^∞ (e^(−y) − e^(−(a+1)y)) dy = (−e^(−y) + (1/(a+1)) e^(−(a+1)y))|_0^∞ = 1 − 1/(a+1)

f_Z(a) = d/da F_Z(a) = d/da (1 − (a+1)^(−1)) = (a+1)^(−2) = 1/(a+1)²,  a > 0

Independent Random Variables


Two random variables X and Y are independent if for any two sets of real numbers A and B,

P { X ∈ A , Y ∈ B }=P { X ∈ A } P {Y ∈ B }
Because this is true for any A and B, we only need to consider A = (−∞, a], B = (−∞, b]:

P { X ≤ a ,Y ≤ b }=P { X ≤ a } P { Y ≤b }

Then, two random variables X and Y are independent if:

F X ,Y ( a , b ) =F X ( a ) F Y (b)

For discrete random variables, this means:

p X ,Y ( x , y )= p X ( x) p Y ( y)

For continuous random variables, this means:

f_{X,Y}(x, y) = f_X(x) f_Y(y)


Example: n+ m independent trials with a common success probability p. Let X be the number of
successes in the first n trials, Y be the number of successes in the last m trials, and Z be the total
number of successes in the n+ m trials.

a) Are X and Y independent?


Yes
b) Are X and Z independent?
No, since the value of X affects the distribution of Z (e.g. Z can take on the value 0 with some
probability, but if X > 0 then P{Z = 0} = 0).

Example: A man and a woman decide to meet at a certain location. If each person independently arrives
at a time uniformly distributed between 12pm and 1pm, find the probability that the first to arrive has
to wait longer than 10 minutes.

Let X be the arrival time of the man, and Y be the arrival time of the woman. Then X and Y are both
uniformly distributed on (0, 60), measured in minutes after 12pm.

P{wait > 10 minutes} = P{X − Y > 10} + P{Y − X > 10} = 2 P{X − Y > 10} (by symmetry)

Ranges: x: (10, 60) and y: (0, x − 10), or y: (10, 60) and x: (0, y − 10)

P{wait > 10 minutes} = 2 ∫_{10}^{60} ∫_0^{x−10} f_{X,Y}(x, y) dy dx = 2 ∫_{10}^{60} ∫_0^{x−10} (1/60)² dy dx

  = (1/1800) ∫_{10}^{60} ∫_0^{x−10} dy dx = (1/1800) ∫_{10}^{60} (x − 10) dx = (1/1800)((1/2)x² − 10x)|_{10}^{60}

  = (1/1800)[((1/2)(60)² − 10(60)) − ((1/2)(10)² − 10(10))] = 1250/1800 = 25/36
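The answer 25/36 ≈ 0.694 is simple to verify by simulation. A minimal sketch (assuming Python with numpy; sample size is arbitrary):

import numpy as np

# Sketch: Monte Carlo check that the first to arrive waits more than 10 minutes
# with probability 25/36 ~ 0.694.
rng = np.random.default_rng(4)
n = 500_000
x = rng.uniform(0, 60, size=n)   # man's arrival time (minutes after 12pm)
y = rng.uniform(0, 60, size=n)   # woman's arrival time

print(np.mean(np.abs(x - y) > 10))   # ~0.694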
Sums of Independent Random Variables
For two independent random variables X and Y , with PDFs f X and f Y , the PDF of the sum of X and
Y is the convolution of the individual density functions:

f_{X+Y}(a) = (f_X ∗ f_Y)(a) = ∫_{−∞}^{∞} f_X(x) f_Y(a − x) dx

Example: If X and Y are independent and both uniformly distributed on (0, 1), calculate the probability
density of X + Y.

f_X(x) = f_Y(x) = 1 for 0 < x < 1,  0 otherwise

f_{X+Y}(a) = ∫_{−∞}^{∞} f_X(x) f_Y(a − x) dx = ∫_0^1 f_Y(a − x) dx = ∫_a^{a−1} f_Y(y) (−dy), where y = a − x

  = ∫_{a−1}^{a} f_Y(y) dy

giving the triangular density

f_{X+Y}(a) = 0 for a ≤ 0;  a for 0 < a ≤ 1;  2 − a for 1 < a ≤ 2;  0 for a > 2

since for 0 < a ≤ 1:  ∫_{a−1}^{a} f_Y(y) dy = ∫_{a−1}^{0} f_Y(y) dy + ∫_0^{a} f_Y(y) dy = 0 + (a − 0) = a

and for 1 < a ≤ 2:  ∫_{a−1}^{a} f_Y(y) dy = ∫_{a−1}^{1} f_Y(y) dy + ∫_1^{a} f_Y(y) dy = (1 − (a − 1)) + 0 = 2 − a
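A histogram of simulated sums makes the triangular shape visible. A minimal sketch (assuming Python with numpy and matplotlib available; sample size is arbitrary):

import numpy as np
import matplotlib.pyplot as plt

# Sketch: compare a histogram of X + Y for independent Uniform(0,1) variables
# against the triangular density derived above (a on (0,1], 2-a on (1,2]).
rng = np.random.default_rng(5)
s = rng.uniform(0, 1, 500_000) + rng.uniform(0, 1, 500_000)

a = np.linspace(0, 2, 400)
triangular = np.where(a <= 1, a, 2 - a)

plt.hist(s, bins=100, density=True, alpha=0.5, label="simulated X+Y")
plt.plot(a, triangular, label="triangular density")
plt.legend()
plt.show()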

Gaussian Distributions
If X_i ∼ N(μ_i, σ_i²) and they are all independent, then the sum is Gaussian with the parameters summed:

∑_{i=1}^{n} X_i ∼ N(∑_{i=1}^{n} μ_i, ∑_{i=1}^{n} σ_i²)
Poisson Distributions
If X ∼ Poisson(λ₁) and Y ∼ Poisson(λ₂) are independent, then the parameters of the Poisson
distribution can be summed:

X + Y ∼ Poisson(λ₁ + λ₂)

Binomial Distributions
If X ∼ Binomial(n, p) and Y ∼ Binomial(m, p) are independent, then the first parameter of the
Binomial distribution can be summed:

X + Y ∼ Binomial(n + m, p)

Conditional Distributions: Discrete Case
If X and Y are discrete random variables, then the conditional PMF of X given Y = y is:

p_{X|Y}(x|y) = P{X = x | Y = y} = P{X = x, Y = y} / P{Y = y} = p_{X,Y}(x, y) / p_Y(y)

If X and Y are independent, then p_{X|Y}(x|y) = p_X(x).

If X and Y are discrete random variables, then the conditional CDF of X given Y = y is:

F_{X|Y}(x|y) = P{X ≤ x | Y = y} = ∑_{a ≤ x} P{X = a | Y = y} = ∑_{a ≤ x} p_{X|Y}(a|y)

Example: The JPMF of X and Y is p(0, 0) = 0.4, p(0, 1) = 0.2, p(1, 0) = 0.1, p(1, 1) = 0.3. Calculate the
conditional PMF of X given that Y = 1.

p_{X|Y}(x|1) = p_{X,Y}(x, 1) / p_Y(1)

p_Y(1) = ∑_x p_{X,Y}(x, 1) = p(0, 1) + p(1, 1) = 0.2 + 0.3 = 0.5

p_{X|Y}(0|1) = p_{X,Y}(0, 1) / p_Y(1) = 0.2/0.5 = 2/5

p_{X|Y}(1|1) = p_{X,Y}(1, 1) / p_Y(1) = 0.3/0.5 = 3/5
Conditional Distributions: Continuous Case
If X and Y are continuous random variables, then the conditional PDF of X given Y = y is:

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y)

and for any set A:

P{X ∈ A | Y = y} = ∫_A f_{X|Y}(x|y) dx

If X and Y are continuous random variables, then the conditional CDF of X given Y = y is:

F_{X|Y}(a|y) = P{X ≤ a | Y = y} = ∫_{−∞}^{a} f_{X|Y}(x|y) dx

f_{X|Y}(a|y) = d/da F_{X|Y}(a|y) = d/da ∫_{−∞}^{a} f_{X|Y}(x|y) dx

Example: The JPDF of X and Y is f(x, y) = e^(−x/y) e^(−y) / y for 0 < x < ∞, 0 < y < ∞, and 0 otherwise.
Find P{X > 1 | Y = y}.

P{X > 1 | Y = y} = P{X ∈ (1, ∞) | Y = y} = ∫_1^∞ f_{X|Y}(x|y) dx = ∫_1^∞ f_{X,Y}(x, y) / f_Y(y) dx

f_Y(y) = ∫_0^∞ f_{X,Y}(x, y) dx = (e^(−y)/y) ∫_0^∞ e^(−x/y) dx = (e^(−y)/y)(−y e^(−x/y)|_0^∞) = −e^(−y)(0 − 1) = e^(−y)

P{X > 1 | Y = y} = ∫_1^∞ (e^(−x/y) e^(−y) / y) / e^(−y) dx = (1/y) ∫_1^∞ e^(−x/y) dx = (1/y)(−y e^(−x/y)|_1^∞) = −1·(0 − e^(−1/y)) = e^(−1/y)

Chapter 7: Properties of Expectations


Expectations of Sums of Random Variables
E [ X 1 + X 2+ …+ X n ]= E [ X 1 ] + E [ X 2 ] +…+ E [ X n ]

For a function g of one random variable:

E[g(X)] = ∑_x g(x) p(x)  (discrete)  or  ∫_{−∞}^{∞} g(x) f(x) dx  (continuous)

For a function g of two random variables:

E[g(X, Y)] = ∑_x ∑_y g(x, y) p(x, y)  (discrete)  or  ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy  (continuous)

Let X₁, X₂, …, X_n be independent and identically distributed (iid) random variables with the same
distribution function F.

The sample mean is used to estimate E[X] = μ:

X̄ = (∑_{i=1}^{n} X_i) / n

Example: Expectation of a binomial random variable

Let X ∼ Binomial(n, p). Find E[X]. Write X = X₁ + X₂ + … + X_n, where

X_i = 1 with probability p,  0 with probability 1 − p,  for i = 1, 2, …, n

E[X] = E[X₁] + E[X₂] + … + E[X_n] = p + p + … + p = np

Covariance, Variance of Sums, and Correlations


If X and Y are independent, then for any functions g and h :

E [ g ( X ) h ( Y ) ] =E [ g ( X ) ] E[h ( Y ) ]

E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy = ∫_{−∞}^{∞} x f_X(x) dx ∫_{−∞}^{∞} y f_Y(y) dy = E[X] E[Y]
The covariance between X and Y is:

Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y]
  = 0 if X and Y are independent (they are uncorrelated)

Properties:

1. Cov ( X , Y ) =Cov ( Y , X )
2. Cov ( X , X )=Var (X )
3. Cov ( aX ,Y ) =aCov ( X , Y )

4. Cov(∑_{i=1}^{n} X_i, ∑_{j=1}^{m} Y_j) = ∑_{i=1}^{n} ∑_{j=1}^{m} Cov(X_i, Y_j)

Example: X ∼ Binomial(n, p)

X = X₁ + X₂ + … + X_n

Var(X) = Var(∑_{i=1}^{n} X_i) = ∑_{i=1}^{n} Var(X_i) = np(1 − p)

Conditional Expectations
The conditional expectation of X given Y = y is:

Discrete:

E[X | Y = y] = ∑_x x p_{X|Y}(x|y),  where p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y)

Continuous:

E[X | Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx,  where f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y)

Example: Suppose f(x, y) = e^(−x/y) e^(−y) / y, 0 < x, y < ∞. Compute E[X | Y = y].

f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = ∫_0^∞ (e^(−x/y) e^(−y) / y) dx = (e^(−y)/y) ∫_0^∞ e^(−x/y) dx = (e^(−y)/y)(−y e^(−x/y)|_0^∞) = (e^(−y)/y)(0 + y) = e^(−y)

E[X | Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx = ∫_0^∞ x (e^(−x/y)/y) dx

Integrate by parts with u = x → du = dx and dv = e^(−x/y) dx → v = −y e^(−x/y):

  = (1/y)(−xy e^(−x/y)|_0^∞ − ∫_0^∞ (−y e^(−x/y)) dx) = ∫_0^∞ e^(−x/y) dx = −y e^(−x/y)|_0^∞ = y

Computing Expectations by Conditioning


Sometimes it might be easier to compute the expectation of a random variable through its
conditional expectation:

E [ X ] =E [E [ X∨Y ]]

Example: A miner is trapped in a mine with three doors. The first door leads to safety after 2 hours; the
second returns him to the mine after 5 hours; the third returns him after 7 hours. If he picks a door
uniformly at random each time, what is the expected time until he gets out?

Let X be the time to get out. Let Y be the path picked; then Y ∈ {1, 2, 3}.

E[X | Y = 1] = 2,  E[X | Y = 2] = 5 + E[X],  E[X | Y = 3] = 7 + E[X]

E[X] = E[E[X | Y]] = (1/3) E[X | Y = 1] + (1/3) E[X | Y = 2] + (1/3) E[X | Y = 3]

E[X] = (1/3)(2) + (1/3)(5 + E[X]) + (1/3)(7 + E[X]) = 2/3 + 5/3 + 7/3 + (2/3) E[X]

(1/3) E[X] = 14/3  ⟹  E[X] = 14
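The answer 14 can also be checked by simulating the process directly. A minimal sketch (assuming Python with numpy; the run count is arbitrary):

import numpy as np

# Sketch: simulate the three-door miner problem and compare with E[X] = 14.
# Door 1 leads out in 2 hours; doors 2 and 3 return to the start after 5 and
# 7 hours respectively, each door chosen uniformly at random every time.
rng = np.random.default_rng(6)

def time_to_escape():
    total = 0.0
    while True:
        door = rng.integers(1, 4)       # 1, 2, or 3
        if door == 1:
            return total + 2
        total += 5 if door == 2 else 7

times = [time_to_escape() for _ in range(100_000)]
print(np.mean(times))   # ~14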
Moment-Generating Functions
The moment-generating function (MGF) is the expectation of e^(tX):

M(t) = E[e^(tX)]

Discrete:

M(t) = ∑_x e^(tx) p(x)

Continuous:

M(t) = ∫_{−∞}^{∞} e^(tx) f(x) dx

The MGF generates moments easily:

M′(t)|_{t=0} = E[X]

M″(t)|_{t=0} = E[X²]  →  Var(X) = E[X²] − (E[X])²

Example: Binomial(n, p)

M(t) = ∑_{i=0}^{n} e^(ti) (n choose i) p^i (1 − p)^(n−i) = ∑_{i=0}^{n} (n choose i) (p e^t)^i (1 − p)^(n−i) = (p e^t + 1 − p)^n

Example: Poisson(λ)

M(t) = ∑_{n=0}^{∞} e^(tn) e^(−λ) λ^n / n! = e^(−λ) ∑_{n=0}^{∞} (λ e^t)^n / n! = e^(−λ) e^(λ e^t) = e^(λ(e^t − 1))

Example: Gaussian, Z ∼ N(0, 1)

M(t) = (1/√(2π)) ∫_{−∞}^{∞} e^(tx) e^(−x²/2) dx = (1/√(2π)) ∫_{−∞}^{∞} e^(−(x² − 2tx)/2) dx = (1/√(2π)) ∫_{−∞}^{∞} e^(−((x − t)² − t²)/2) dx

  = e^(t²/2) (1/√(2π)) ∫_{−∞}^{∞} e^(−(x − t)²/2) dx = e^(t²/2)

since the remaining integral is the total probability of an N(t, 1) density, which is 1.

For X ∼ N(μ, σ²), X = σZ + μ:

M(t) = E[e^(tX)] = E[e^(t(σZ + μ))] = E[e^(tσZ + tμ)] = e^(tμ) E[e^(tσZ)] = e^(tμ) e^((tσ)²/2) = e^(σ²t²/2 + tμ)

Example: X ∼ Binomial(n, p), Y ∼ Binomial(m, p)

M_X(t) = (p e^t + 1 − p)^n,  M_Y(t) = (p e^t + 1 − p)^m

If X and Y are independent, then the MGF of X + Y is the product of their individual MGFs:

M_{X+Y}(t) = E[e^(t(X+Y))] = E[e^(tX) e^(tY)] = E[e^(tX)] E[e^(tY)] = M_X(t) M_Y(t) = (p e^t + 1 − p)^(n+m)

Thus, X + Y ∼ Binomial(n + m, p).
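Differentiating an MGF at t = 0 to recover moments can be done symbolically. A minimal sketch (assuming Python with sympy available), applied to the binomial MGF above:

import sympy as sp

# Sketch: recover E[X] and Var(X) for Binomial(n, p) from M(t) = (p e^t + 1 - p)^n
# by differentiating at t = 0.
t, n, p = sp.symbols('t n p', positive=True)
M = (p * sp.exp(t) + 1 - p) ** n

mean = sp.simplify(sp.diff(M, t).subs(t, 0))        # n*p
second = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # E[X^2]
var = sp.simplify(second - mean**2)                 # should simplify to n*p*(1 - p)

print(mean, var)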
