CH-3d (Cov, Corr, & Indep)
"The first step is to establish that something is possible; then probability will occur."
(Elon Musk)
CONTENTS:
§3.3.3 JOINT CENTRAL MOMENTS
Joint Central Moments :
COVARIANCE
Covariance – Second Joint Central Moments :
𝑛 = 1; 𝑘 = 1
Recall : Properties of Expected Value :
Covariance – Second Joint Central Moments :
$$C_{X_1X_2} = \mu_{11} = \overline{X_1 X_2} - \bar{X}_1\,\bar{X}_2 \quad\cdots\cdots (3.123)$$
$$= m_{11} - m_{X_1} m_{X_2} = R_{X_1X_2} - m_{X_1} m_{X_2}$$
Covariance, Correlation, and the first moments are related together!
RECALL: $\sigma_X^2 = m_2 - m_1^2 = \overline{X^2} - \bar{X}^2$
Variance and the first two moments are related together!
Covariance – Second Joint Central Moments :
$$C_{X_1X_1} = E\big[(X_1 - \bar{X}_1)(X_1 - \bar{X}_1)\big] = E\big[(X_1 - \bar{X}_1)^2\big] = \sigma_{X_1}^2$$
Covariance of a RV with itself is the Variance of that RV.
Covariance – Second Joint Central Moments :
$$C_{X_1X_2} = \mu_{11} = \overline{X_1 X_2} - \bar{X}_1\,\bar{X}_2 = R_{X_1X_2} - m_{X_1} m_{X_2}$$
Covariance and Correlation are related together!
For independent RVs, Covariance $\equiv 0$.
For uncorrelated RVs, Covariance $\equiv 0$.
For orthogonal RVs, $R_{X_1X_2} = 0$, so
$$C_{X_1X_2} = 0 - m_{X_1} m_{X_2} = -E[X_1]\,E[X_2]$$
Uncorrelatedness vs. Independence :
Ex. No. 5.23 [5] – Dependent but Uncorrelated RVs :
$$f_\Theta(\theta) = \frac{1}{\pi} \quad \text{for } \theta \in [0, \pi]$$
PYTHAGOREAN THEOREM OF STATISTICS
RECALL – Pythagorean Theorem of Geometry :
Pythagorean Theorem of Statistics:
This theorem states that, for INDEPENDENT RVs, the square of the standard deviation of their sum is the sum of the squares of their standard deviations.
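The theorem can be verified by simulation. A Python sketch (the two independent Uniform(0, 2) samples are an assumed setup):

```python
import random
import statistics

# Pythagorean theorem of statistics: for INDEPENDENT X and Y,
# sigma_{X+Y}^2 = sigma_X^2 + sigma_Y^2.
random.seed(3)
N = 200_000
x = [random.uniform(0, 2) for _ in range(N)]
y = [random.uniform(0, 2) for _ in range(N)]  # drawn independently of x
s = [a + b for a, b in zip(x, y)]

var_s = statistics.variance(s)
var_sum = statistics.variance(x) + statistics.variance(y)
print(var_s, var_sum)  # both close to 1/3 + 1/3 = 2/3
```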
Pythagorean Theorem of Statistics :
Joint Central Moments :
clear all; close all; clc;
% Variance of Sum and Difference
%=================================
N = 1e6;            % No. of values of r.v.
X = 2 * rand(1,N);
Y = 2 * rand(1,N);  % X and Y are Independent RVs
S = X + Y;
D = X - Y;
%
m_X = mean(X)
m_Y = mean(Y)
m_S = mean(S)
m_D = mean(D)
var_X = var(X)
var_Y = var(Y)
var_S = var(S)
var_D = var(D)
cov_XY = cov(X,Y)
SUM_C_XY = sum(sum(cov_XY))

Output:
m_X = 1.0006,  m_Y = 0.9992,  m_S = 1.9997,  m_D = 0.0014
var_X = 0.3330,  var_Y = 0.3332,  var_S = 0.6652,  var_D = 0.6673
cov_XY = [ 0.3330  -0.0005 ;  -0.0005  0.3332 ]
SUM_C_XY = 0.6652
Joint Central Moments :
clear all; close all; clc;
% Variance of Sum and Difference
%=================================
N = 1e6;        % No. of values of r.v.
X = 2 * rand(1,N);
Y = 2 * X - 7;  % X and Y are Dependent RVs
S = X + Y;
D = X - Y;
%
m_X = mean(X)
m_Y = mean(Y)
m_S = mean(S)
m_D = mean(D)
var_X = var(X)
var_Y = var(Y)
var_S = var(S)
var_D = var(D)
cov_XY = cov(X,Y)
SUM_C_XY = sum(sum(cov_XY))

Output:
m_X = 0.9998,  m_Y = -5.0003,  m_S = -4.0005,  m_D = 6.0002
var_X = 0.3334,  var_Y = 1.3337,  var_S = 3.0009,  var_D = 0.3334
cov_XY = [ 0.3334  0.6669 ;  0.6669  1.3337 ]
SUM_C_XY = 3.0009

$$\sigma_{X-Y}^2 = \sigma_X^2 + \sigma_Y^2 - 2C_{XY}$$
Joint Central Moments :
$$\sigma_{X-Y}^2 = \sigma_X^2 + \sigma_Y^2 - 2C_{XY}$$
Joint Central Moments :
$$\sigma_X^2 = \sigma_{X_1+X_2}^2 = \sigma_{X_1}^2 + \sigma_{X_2}^2 + 2C_{X_1X_2} \quad\cdots\cdots (3.127)$$
$$\sigma_{X_1+X_2+X_3}^2 = \sigma_{X_1}^2 + \sigma_{X_2}^2 + \sigma_{X_3}^2 + 2C_{X_1X_2} + 2C_{X_1X_3} + 2C_{X_2X_3}$$
Joint Central Moments :
In general,
$$\sigma_{X_1+\cdots+X_n}^2 = \sum_{i=1}^{n} \sigma_{X_i}^2 + 2\sum_{i<j} C_{X_iX_j}$$
COVARIANCE MATRIX
Joint Central Moments :
RECALL: $C_{X_1X_1} = E\big[(X_1 - \bar{X}_1)(X_1 - \bar{X}_1)\big] = E\big[(X_1 - \bar{X}_1)^2\big] = \sigma_{X_1}^2$ : the covariance of a RV with itself is the variance of that RV.
o In probability theory & statistics, a Covariance Matrix (aka auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix) is a square matrix giving the covariance between each pair of elements of a given random vector.
o Any covariance matrix is symmetric and positive semi-definite, and its main diagonal contains variances (i.e., the covariance of each element with itself).
o Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions.
Joint Central Moments :
clear all; close all; clc;
% Variance of Sum of Three (03) RVs.
N = 1e5;  % No. of values of r.v.
X = 2 * rand(1,N);
Y = X.^2;
Z = 2*Y + 3;
S = X + Y + Z;
%
EX = mean(X), EY = mean(Y), EZ = mean(Z)
C_XY = mean(X.*Y) - mean(X)*mean(Y)
C_XZ = mean(X.*Z) - mean(X)*mean(Z)
C_YZ = mean(Y.*Z) - mean(Y)*mean(Z)
%
A = [X;Y;Z].';
C_XYZ = cov(A)  % Covariance Matrix
rho = corrcoef(A)
%
varX = var(X), varY = var(Y), varZ = var(Z)
varS = var(S)
varS_an = var(X) + var(Y) + var(Z) + 2*C_XY + 2*C_XZ + 2*C_YZ
SUM_C_XYZ = sum(sum(C_XYZ))
std(X), std(Y), std(Z), std(X)*std(Y)*std(Z)

Output:
EX = 1.0018,  EY = 1.3374,  EZ = 5.6747
C_XY = 0.6688,  C_XZ = 1.3376,  C_YZ = 2.8578
C_XYZ =
    0.3338    0.6688    1.3376
    0.6688    1.4289    2.8578
    1.3376    2.8578    5.7156
rho =
    1.0000    0.9684    0.9684
    0.9684    1.0000    1.0000
    0.9684    1.0000    1.0000
varX = 0.3338,  varY = 1.4289,  varZ = 5.7156
varS = 17.1832,  varS_an = 17.1831,  SUM_C_XYZ = 17.1832
ans = 0.5772,  ans = 1.1924,  ans = 2.3849,  ans = 1.6415
§3.3.4 INDEPENDENCE AND CORRELATION
Independence and Correlation :
$$\overline{aX_1 + b} = a\,\bar{X}_1 + b$$
$$aX_1 + b - \overline{(aX_1 + b)} = aX_1 + b - a\,\bar{X}_1 - b = aX_1 - a\,\bar{X}_1 = a\,(X_1 - \bar{X}_1)$$
Independence and Correlation :
$$C_{X_1X_2} = \mu_{11} = E[X_1 X_2] - \bar{X}_2\,\bar{X}_1 = E[X_1 X_2] - E[X_1]\,E[X_2] = m_{11} - m_{X_1} m_{X_2} = R_{X_1X_2} - m_{X_1} m_{X_2}$$
$$\overline{X^3} = \int_{-1}^{1} x^3\, f_X(x)\, dx = \frac{1}{2}\int_{-1}^{1} x^3\, dx = \frac{1}{8}\, x^4 \Big|_{-1}^{1} = 0$$
Independence and Correlation :
$$\overline{X^3} = \int_{0}^{2} x^3\, f_X(x)\, dx = \frac{1}{2}\int_{0}^{2} x^3\, dx = \frac{1}{8}\, x^4 \Big|_{0}^{2} = \frac{16}{8} = 2$$
Independence and Correlation :
$$C_{XY} = \mu_{11} = E[XY] - \bar{X}\,\bar{Y} = E[X^3] - \bar{X}\,\bar{Y} = 2 - (1)(4/3) = 0.667$$
§3.4 TRANSFORMATION OF RANDOM VARIABLES
DISCLAIMER
These PowerPoint slides are NOT a SUBSTITUTE for reading the TEXT BOOK(S).
You are ALWAYS DIRECTED to CAREFULLY READ the relevant book chapter and SOLVE ALL examples and end problems.
REFERENCES :
[1] [Dolecek-2013]
[2] [Kay-2005]