18.600: Lecture 24
Covariance and some conditional expectation exercises
Scott Sheffield
MIT
Outline
Defining covariance and correlation

  Cov(X, Y) := E[(X − E[X])(Y − E[Y])] = E[XY − X E[Y] − Y E[X] + E[X]E[Y]] = E[XY] − E[X]E[Y].

▶ Covariance formula E[XY] − E[X]E[Y], or "expectation of product minus product of expectations", is frequently useful.
▶ Note: if X and Y are independent then Cov(X, Y) = 0.
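As a quick sanity check of the formula Cov(X, Y) = E[XY] − E[X]E[Y] and of the independence remark, here is a small simulation sketch. It is not part of the lecture; the distributions, sample sizes, and the numpy-based approach are my own illustrative choices.

```python
# A minimal numerical sketch (not from the lecture): estimate Cov(X, Y) from the
# definition and from E[XY] - E[X]E[Y], then check that independent samples give
# covariance close to zero. All distributions below are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Dependent pair: Y = X + noise, so Cov(X, Y) should equal Var(X) = 1.
x = rng.normal(0.0, 1.0, n)
y = x + rng.normal(0.0, 0.5, n)

cov_from_definition = np.mean((x - x.mean()) * (y - y.mean()))
cov_from_formula = np.mean(x * y) - x.mean() * y.mean()   # E[XY] - E[X]E[Y]
print(cov_from_definition, cov_from_formula)              # both close to 1

# Independent pair: the covariance estimate should be close to 0.
u = rng.normal(0.0, 1.0, n)
v = rng.exponential(1.0, n)
print(np.mean(u * v) - u.mean() * v.mean())               # close to 0
```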
Basic covariance facts

▶ Using Cov(X, Y) = E[XY] − E[X]E[Y] as a definition, certain facts are immediate.
▶ Cov(X, Y) = Cov(Y, X)
▶ Cov(X, X) = Var(X)
▶ Cov(aX, Y) = a Cov(X, Y).
▶ Cov(X1 + X2, Y) = Cov(X1, Y) + Cov(X2, Y).
▶ General statement of bilinearity of covariance:

  Cov(∑_{i=1}^m a_i X_i, ∑_{j=1}^n b_j Y_j) = ∑_{i=1}^m ∑_{j=1}^n a_i b_j Cov(X_i, Y_j).

▶ Special case:

  Var(∑_{i=1}^n X_i) = ∑_{i=1}^n Var(X_i) + 2 ∑_{(i,j): i<j} Cov(X_i, X_j).
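The variance-of-a-sum special case is easy to test numerically. The following sketch is not from the slides; the three correlated variables and the use of numpy's sample covariance matrix are arbitrary illustrative choices.

```python
# A small numerical sketch (my own check, not the lecture's): verify
# Var(sum X_i) = sum Var(X_i) + 2 * sum_{i<j} Cov(X_i, X_j) on simulated data.
import numpy as np

rng = np.random.default_rng(1)
n = 10**6

# Three correlated variables built from shared sources of randomness.
z1, z2, z3 = rng.normal(size=(3, n))
xs = np.stack([z1 + z2, z2 + z3, z1 - z3])   # shape (3, n)

lhs = np.var(xs.sum(axis=0))                 # Var(X_1 + X_2 + X_3)

c = np.cov(xs, bias=True)                    # 3x3 sample covariance matrix
rhs = np.trace(c) + 2 * sum(c[i, j] for i in range(3) for j in range(i + 1, 3))
print(lhs, rhs)                              # the two agree up to float roundoff
```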
Defining correlation

  ρ(X, Y) := Cov(X, Y) / √(Var(X) Var(Y)).

▶ Correlation doesn't care what units you use for X and Y. If a > 0 and c > 0 then ρ(aX + b, cY + d) = ρ(X, Y).
▶ Satisfies −1 ≤ ρ(X, Y) ≤ 1.
▶ Why is that? Something to do with E[(X + Y)²] ≥ 0 and E[(X − Y)²] ≥ 0 (applied after rescaling X and Y to have variance one)?
▶ If a and b are constants and a > 0 then ρ(aX + b, X) = 1.
▶ If a and b are constants and a < 0 then ρ(aX + b, X) = −1.
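The listed properties of ρ can also be checked by simulation. This sketch is my own illustration, not part of the lecture; the constants and distributions below are arbitrary.

```python
# A quick numerical sketch (illustrative only): the correlation coefficient rho is
# unchanged by affine rescaling with positive slopes, and rho(aX + b, X) is +1 or -1
# depending on the sign of a.
import numpy as np

def rho(x, y):
    """Sample correlation: Cov(X, Y) / sqrt(Var(X) Var(Y))."""
    return np.mean((x - x.mean()) * (y - y.mean())) / np.sqrt(x.var() * y.var())

rng = np.random.default_rng(2)
n = 10**6
x = rng.normal(0.0, 1.0, n)
y = 0.3 * x + rng.normal(0.0, 1.0, n)

print(rho(x, y))                              # some value strictly between -1 and 1
print(rho(5 * x + 2, 0.1 * y - 7))            # same value: units don't matter (a, c > 0)
print(rho(3 * x + 1, x), rho(-3 * x + 1, x))  # +1.0 and -1.0 (up to roundoff)
```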
Important point