CH-3d (Cov, Corr, & Indep)

The document discusses multidimensional random variables, focusing on joint central moments, covariance, and correlation. It explains the relationships between covariance, independence, and uncorrelatedness, emphasizing that independence implies lack of correlation, but not vice versa. Additionally, it introduces the Pythagorean theorem of statistics, which relates the variances of independent random variables.

Multidimensional Random Variables

"The first step is to establish that something is possible; then probability will occur." (Elon Musk)
CONTENTS:

 JOINT CENTRAL MOMENTS
 INDEPENDENCE AND CORRELATION
 TRANSFORMATION OF RANDOM VARIABLES

§3.3.3 JOINT CENTRAL MOMENTS
Joint Central Moments :

CO-VARIANCE
Covariance – Second Joint Central Moment : $n = 1$; $k = 1$

Recall : Properties of Expected Value :
Covariance – Second Joint Central Moments :

$$
\begin{aligned}
C_{X_1 X_2} = \mu_{11} &= E\{(X_1-\bar{X}_1)(X_2-\bar{X}_2)\} \\
&= E\{X_1 X_2 - \bar{X}_1 X_2 - X_1\bar{X}_2 + \bar{X}_1\bar{X}_2\} \\
&= E\{X_1 X_2\} - \bar{X}_1 E\{X_2\} - \bar{X}_2 E\{X_1\} + \bar{X}_1\bar{X}_2 \\
&= E\{X_1 X_2\} - E\{X_1\}E\{X_2\} \\
&= m_{11} - m_{X_1} m_{X_2} = R_{X_1 X_2} - m_{X_1} m_{X_2}
\end{aligned}
$$

 Covariance and Correlation are related together!
Covariance – Second Joint Central Moments :

$$
C_{X_1 X_2} = \mu_{11} = \overline{X_1 X_2} - \bar{X}_1\bar{X}_2 = m_{11} - m_{X_1} m_{X_2} = R_{X_1 X_2} - m_{X_1} m_{X_2} \quad \cdots\cdots (3.123)
$$

 Covariance, Correlation, and the first moments are related together!

RECALL: $\sigma_X^2 = m_2 - m_1^2 = \overline{X^2} - \bar{X}^2$

 Variance and the first two moments are related together!
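A minimal MATLAB sketch (not from the text; the RVs chosen here are illustrative) that checks $C_{XY} = E\{XY\} - E\{X\}E\{Y\} = R_{XY} - m_X m_Y$ numerically:

% Numerical check of C_XY = E{XY} - E{X}E{Y}   (illustrative sketch)
N    = 1e6;                       % No. of samples
X    = randn(1,N);                % any pair of RVs will do
Y    = 0.5*X + randn(1,N);        % Y partly depends on X
R_XY = mean(X.*Y);                % correlation  E{XY}
C_XY = R_XY - mean(X)*mean(Y)     % covariance via (3.123)
C_ml = cov(X,Y);
C_ml(1,2)                         % MATLAB's own estimate agrees (up to sampling error)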
Covariance – Second Joint Central Moments :

$$
C_{X_1 X_1} = E\{(X_1-\bar{X}_1)(X_1-\bar{X}_1)\} = E\{(X_1-\bar{X}_1)^2\} = \sigma_{X_1}^2
$$

 Covariance of a RV with itself is the Variance of that RV.
Covariance – Second Joint Central Moments :

The covariance of independent RVs is equal to zero.
Joint Central Moments :

Let us now relate covariance with uncorrelated and orthogonal RVs.

Inserting (3.111) into (3.123):

$$
C_{X_1 X_2} = \mu_{11} = \overline{X_1 X_2} - \bar{X}_1\bar{X}_2 = E\{X_1 X_2\} - E\{X_1\}E\{X_2\} = E\{X_1\}E\{X_2\} - E\{X_1\}E\{X_2\} = 0
$$

 If RVs are uncorrelated, Covariance ≡ 0.
Covariance – Second Joint Central Moments :

 $C_{X_1 X_2} = \mu_{11} = \overline{X_1 X_2} - \bar{X}_1\bar{X}_2 = R_{X_1 X_2} - m_{X_1} m_{X_2}$
 Covariance and Correlation are related together!

For independent RVs, Covariance ≡ 0.
For uncorrelated RVs, Covariance ≡ 0.
For orthogonal RVs, $R_{X_1 X_2} = 0$, so
$C_{X_1 X_2} = 0 - m_{X_1} m_{X_2} = -E\{X_1\}E\{X_2\}$
Joint Central Moments :

The covariance $C_{X_1 X_2}$ equals zero if the RVs are:
o Either independent, or
o Dependent but uncorrelated.
Covariance – Second Joint Central Moments :

Obviously, if $X_1$ and $X_2$ are independent, then $\mathrm{COV}(X_1, X_2) = 0$ and $X_1$ and $X_2$ will be uncorrelated. In other words, independence implies lack of correlation.
It should be noted, however, that lack of correlation does not generally imply independence. That is, the covariance might be zero, but the random variables may still be statistically dependent.
Covariance – Second Joint Central Moments :

 Two RVs are said to be uncorrelated if their covariance is zero. If two variables are uncorrelated, there is no linear relationship between them.
 Covariance is a measure of the joint variability of two RVs.
 For zero-mean RVs, correlation and covariance are the same.
 Covariance can be positive, negative, or zero.
 The sign of the covariance shows the tendency of the linear relationship between the variables.
 The magnitude of the covariance is not easy to interpret because it is not normalized and hence depends on the magnitudes of the variables.
 The normalized version of the covariance, the correlation coefficient, however, shows by its magnitude the strength of the linear relation. A numerical illustration is sketched below.
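A minimal MATLAB sketch (illustrative, not from the text) showing a positive and a negative covariance, and the correlation coefficient as the normalized covariance:

% Sign of the covariance and the normalized correlation coefficient (sketch)
N  = 1e6;
X  = randn(1,N);
Yp =  2*X + randn(1,N);              % tends to increase with X  -> positive covariance
Yn = -2*X + randn(1,N);              % tends to decrease with X  -> negative covariance
Cp = cov(X,Yp);  Cn = cov(X,Yn);
Cp(1,2), Cn(1,2)                     % approx +2 and -2; magnitudes depend on the scales of X and Y
rho_p = Cp(1,2) / (std(X)*std(Yp))   % normalized: approx +0.89
rho_n = Cn(1,2) / (std(X)*std(Yn))   % normalized: approx -0.89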
Uncorrelatedness Vs Independence :

RECALL: If the density is symmetric around the mean, then all odd central moments $\mu_1, \mu_3, \mu_5, \mu_7, \cdots$ are equal to zero.
Uncorrelatedness Vs Independence :

If the RVs are independent, they are also uncorrelated.
However, if the RVs are dependent, they can be either correlated or uncorrelated.
 Independence is a STRONGER condition than uncorrelatedness.
EX. NO. 5.23 [5]
Ex. No. 5.23 [5] – Dependent but Uncorrelated RVs :

$$
f_\Theta(\theta) = \frac{1}{\pi} \quad \text{for } \theta \in [0, \pi]
$$

Moreover,
$$
E\{XY\} = E\{\cos\Theta \cdot \sin\Theta\} = 0.5\,E\{2\cos\Theta \cdot \sin\Theta\} = \frac{1}{2}E\{\sin 2\Theta\} = \frac{1}{2}E\{g(X)\}
$$

Hence, Eq. (2.273) applies:
 To compute $E\{X\}$, $f_\Theta(\theta)$ shall suffice.
 To compute $E\{Y\}$, $f_\Theta(\theta)$ shall suffice.
Ex. No. 5.23 [5] – Dependent but Uncorrelated RVs :

 $C_{XY} = 0$  RVs $X$ and $Y$ are UNCORRELATED.
 However, they are NOT independent, since $X^2 + Y^2 = 1$.
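A minimal MATLAB sketch (illustrative, assuming $X = \cos\Theta$ and $Y = \sin\Theta$ with $\Theta$ uniform on $[0, \pi]$, as in the example) confirming that X and Y are uncorrelated yet dependent:

% Ex. 5.23 check: X = cos(Theta), Y = sin(Theta), Theta ~ U[0, pi]
N     = 1e6;
Theta = pi * rand(1,N);
X     = cos(Theta);
Y     = sin(Theta);
C_XY  = mean(X.*Y) - mean(X)*mean(Y)   % approx 0  -> uncorrelated
check = max(abs(X.^2 + Y.^2 - 1))      % approx 0  -> deterministically linked, hence dependent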
VARIANCE OF SUM OF RVS

Joint Central Moments :
Joint Central Moments :

(3.128), which holds for independent (uncorrelated) RVs, is read as: "The variance of X plus (or minus) Y is equal to the variance of X plus the variance of Y."
PYTHAGOREAN THEOREM OF STATISTICS
RECALL – Pythagorean Theorem of Geometry :

The Pythagorean theorem is a geometric theorem that describes the relationship between the three sides of a right triangle.

Statement:
The sum of the squares of the two shorter sides of a RIGHT TRIANGLE is equal to the square of the longest side.
The theorem can be written as an equation relating the lengths of the sides $a$, $b$ and the hypotenuse $c$, sometimes called the Pythagorean equation:
$$a^2 + b^2 = c^2$$
Pythagorean Theorem of Statistics :

This theorem states that, for INDEPENDENT RVs, the square of the standard deviation of their sum is the sum of the squares of their standard deviations.
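In equation form (a restatement of the theorem above in the notation of (3.127), for the case $C_{XY} = 0$):

$$
\sigma_{X+Y}^2 = \sigma_X^2 + \sigma_Y^2, \qquad \text{i.e.,}\qquad \sigma_{X+Y} = \sqrt{\sigma_X^2 + \sigma_Y^2} \quad \text{(independent } X, Y\text{)}
$$

This mirrors the Pythagorean equation $c^2 = a^2 + b^2$, with the standard deviations playing the roles of the sides.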
Joint Central Moments :

MATLAB code:

clear all; close all; clc;
% Variance of Sum and Difference
%=================================
N = 1e6;             % No. of values of r.v.
X = 2 * rand(1,N);
Y = 2 * rand(1,N);
% X and Y are Independent RVs
S = X + Y;
D = X - Y;
%
m_X = mean(X)
m_Y = mean(Y)
m_S = mean(S)
m_D = mean(D)
var_X = var(X)
var_Y = var(Y)
var_S = var(S)
var_D = var(D)
cov_XY = cov(X,Y)
SUM_C_XY = sum(sum(cov_XY))

Output:

m_X = 1.0006,  m_Y = 0.9992,  m_S = 1.9997,  m_D = 0.0014
var_X = 0.3330,  var_Y = 0.3332,  var_S = 0.6652,  var_D = 0.6673
cov_XY =
    0.3330   -0.0005
   -0.0005    0.3332
SUM_C_XY = 0.6652
Joint Central Moments :

MATLAB code:

clear all; close all; clc;
% Variance of Sum and Difference
%=================================
N = 1e6;             % No. of values of r.v.
X = 2 * rand(1,N);
Y = 2 * X - 7;
% X and Y are Dependent RVs
S = X + Y;
D = X - Y;
%
m_X = mean(X)
m_Y = mean(Y)
m_S = mean(S)
m_D = mean(D)
var_X = var(X)
var_Y = var(Y)
var_S = var(S)
var_D = var(D)
cov_XY = cov(X,Y)
SUM_C_XY = sum(sum(cov_XY))

Output:

m_X = 0.9998,  m_Y = -5.0003,  m_S = -4.0005,  m_D = 6.0002
var_X = 0.3334,  var_Y = 1.3337,  var_S = 3.0009,  var_D = 0.3334
cov_XY =
    0.3334    0.6669
    0.6669    1.3337
SUM_C_XY = 3.0009

Note: $\sigma_{X-Y}^2 = \sigma_X^2 + \sigma_Y^2 - 2C_{XY}$
Joint Central Moments :

$$
\sigma_{X-Y}^2 = \sigma_X^2 + \sigma_Y^2 - 2C_{XY}
$$
Joint Central Moments :

$$
\sigma_X^2 = \sigma_{X_1+X_2}^2 = \sigma_{X_1}^2 + \sigma_{X_2}^2 + 2C_{X_1 X_2} \quad \cdots\cdots (3.127)
$$

$$
\sigma_{X_1+X_2+X_3}^2 = \sigma_{X_1}^2 + \sigma_{X_2}^2 + \sigma_{X_3}^2 + 2C_{X_1 X_2} + 2C_{X_1 X_3} + 2C_{X_2 X_3}
$$
Joint Central Moments :

In general, for $n$ RVs:
$$
\sigma_{X_1+X_2+\cdots+X_n}^2 = \sum_{i=1}^{n} \sigma_{X_i}^2 + 2\sum_{i<j} C_{X_i X_j}
$$
COVARIANCE MATRIX
Joint Central Moments :

RECALL: $C_{X_1 X_1} = E\{(X_1-\bar{X}_1)(X_1-\bar{X}_1)\} = E\{(X_1-\bar{X}_1)^2\} = \sigma_{X_1}^2$
 Covariance of a RV with itself is the Variance of that RV.

o In probability theory & statistics, a Covariance Matrix (aka auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix) is a square matrix giving the covariance between each pair of elements of a given random vector.
o Any covariance matrix is symmetric and positive semi-definite, and its main diagonal contains variances (i.e., the covariance of each element with itself).
o Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions.
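In equation form (a standard definition, stated here for completeness rather than taken from the slide), for a random vector $\mathbf{X} = [X_1, \ldots, X_n]^T$ with mean vector $\mathbf{m} = E\{\mathbf{X}\}$:

$$
\mathbf{C}_{\mathbf{X}} = E\{(\mathbf{X}-\mathbf{m})(\mathbf{X}-\mathbf{m})^T\}, \qquad [\mathbf{C}_{\mathbf{X}}]_{ij} = C_{X_i X_j}, \qquad [\mathbf{C}_{\mathbf{X}}]_{ii} = \sigma_{X_i}^2
$$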
Joint Central Moments :

MATLAB code:

clear all; close all; clc;
% Variance of Sum of Three (03) RVs.
N = 1e5;             % No. of values of r.v.
X = 2 * rand(1,N);
Y = X.^2;
Z = 2*Y + 3;
S = X + Y + Z;
%
EX = mean(X), EY = mean(Y), EZ = mean(Z)
C_XY = mean(X.*Y) - mean(X)*mean(Y)
C_XZ = mean(X.*Z) - mean(X)*mean(Z)
C_YZ = mean(Y.*Z) - mean(Y)*mean(Z)
%
A = [X;Y;Z].';
C_XYZ = cov(A)       % Covariance Matrix
rho = corrcoef(A)
%
varX = var(X), varY = var(Y), varZ = var(Z)
varS = var(S)
varS_an = var(X) + var(Y) + var(Z) + 2*C_XY + 2*C_XZ + 2*C_YZ
SUM_C_XYZ = sum(sum(C_XYZ))
std(X), std(Y), std(Z), std(X)*std(Y)*std(Z)

Output:

EX = 1.0018,  EY = 1.3374,  EZ = 5.6747
C_XY = 0.6688,  C_XZ = 1.3376,  C_YZ = 2.8578
C_XYZ =
    0.3338    0.6688    1.3376
    0.6688    1.4289    2.8578
    1.3376    2.8578    5.7156
rho =
    1.0000    0.9684    0.9684
    0.9684    1.0000    1.0000
    0.9684    1.0000    1.0000
varX = 0.3338,  varY = 1.4289,  varZ = 5.7156
varS = 17.1832,  varS_an = 17.1831,  SUM_C_XYZ = 17.1832
std(X) = 0.5772,  std(Y) = 1.1924,  std(Z) = 2.3849,  std(X)*std(Y)*std(Z) = 1.6415
§3.3.4 INDEPENDENCE AND CORRELATION

Independence and Correlation :
Independence and Correlation :

The correlation coefficient is simply the Normalized Version of the covariance.
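In equation form (a standard definition consistent with the statement above; $\rho_{XY}$ is the symbol used on the following slides):

$$
\rho_{XY} = \frac{C_{XY}}{\sigma_X\,\sigma_Y}, \qquad -1 \le \rho_{XY} \le 1
$$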
Independence and Correlation :

$$
\overline{aX_1 + b} = a\bar{X}_1 + b
$$
$$
\Rightarrow\; aX_1 + b - \overline{(aX_1 + b)} = aX_1 + b - a\bar{X}_1 - b = aX_1 - a\bar{X}_1 = a(X_1 - \bar{X}_1)
$$
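The slide presumably continues toward the correlation coefficient of a linear transformation; a short completion of the argument (my own working, consistent with the definitions above):

$$
C_{X_1,\,aX_1+b} = E\{(X_1-\bar{X}_1)\cdot a(X_1-\bar{X}_1)\} = a\,\sigma_{X_1}^2,
\qquad
\rho_{X_1,\,aX_1+b} = \frac{a\,\sigma_{X_1}^2}{\sigma_{X_1}\cdot |a|\,\sigma_{X_1}} = \frac{a}{|a|} = \pm 1
$$

So a perfect linear relationship gives $\rho = +1$ for $a > 0$ and $\rho = -1$ for $a < 0$.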
Independence and Correlation :

 $\rho_{XY} = 0 \;\Leftrightarrow\; C_{XY} = 0 \;\Leftrightarrow\; X$ and $Y$ are uncorrelated.
 $\rho_{XY} = \pm 1 \;\Rightarrow\; X$ and $Y$ are perfectly (linearly) correlated.
EX. 3.3.4
Independence and Correlation :

RECALL: $C_{X_1 X_2} = \mu_{11} = E\{X_1 X_2\} - \bar{X}_1\bar{X}_2 = E\{X_1 X_2\} - E\{X_1\}E\{X_2\} = m_{11} - m_{X_1} m_{X_2} = R_{X_1 X_2} - m_{X_1} m_{X_2}$

$$
\overline{X^3} = \int_{-1}^{1} x^3\, f_X(x)\, dx = \frac{1}{2}\int_{-1}^{1} x^3\, dx = \frac{1}{8}\, x^4 \Big|_{-1}^{1} = 0
$$
Independence and Correlation :

$$
\overline{X^3} = \int_{0}^{2} x^3\, f_X(x)\, dx = \frac{1}{2}\int_{0}^{2} x^3\, dx = \frac{1}{8}\, x^4 \Big|_{0}^{2} = \frac{16}{8} = 2
$$
Independence and Correlation :

$$
C_{XY} = \mu_{11} = E\{XY\} - \bar{X}\bar{Y} = E\{X^3\} - \bar{X}\bar{Y} = 2 - (1)(4/3) = 0.667
$$
Independence and Correlation :

(a) X and Y are dependent and uncorrelated.
(b) X and Y are dependent and correlated.
 This is because X and Y lack a linear relationship over [-1,1]. However, X and Y are almost linearly related over [0,2].
 Correlation or covariance measures linear dependence only.
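A minimal MATLAB sketch (illustrative, assuming $Y = X^2$ with $X$ uniform on $[-1,1]$ in case (a) and on $[0,2]$ in case (b), as in the example) confirming both conclusions:

% Ex. 3.3.4 check: Y = X^2 is dependent on X in both cases
N  = 1e6;
Xa = -1 + 2*rand(1,N);   Ya = Xa.^2;     % case (a): X ~ U[-1,1]
Xb =      2*rand(1,N);   Yb = Xb.^2;     % case (b): X ~ U[0,2]
C_a   = mean(Xa.*Ya) - mean(Xa)*mean(Ya) % approx 0     -> uncorrelated
C_b   = mean(Xb.*Yb) - mean(Xb)*mean(Yb) % approx 0.667 -> correlated
rho_b = C_b / (std(Xb)*std(Yb))          % approx 0.97  -> almost linear over [0,2]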
§3.4 TRANSFORMATION OF RANDOM VARIABLES
DISCLAIMER

These PowerPoint slides are NOT a SUBSTITUTE for reading the TEXT BOOK(S).
You are ALWAYS DIRECTED to CAREFULLY READ the relevant book chapter and SOLVE ALL examples and end problems.
REFERENCES :

[1] [Dolecek-2013]
[2] [Kay-2005]
[3] [Sugiyama-2016] Introduction to Statistical Machine Learning (Annotated)
[4] [Manolakis-2011] Applied DSP
[5] [Roden-1972] Introduction to Communication Theory