Tutorial 8 & 9
Tests of Covariance Matrices
Correlation Analysis part 1
April 4, 2023

#1 Test of Equal Covariance
Solution.
Using the information in Example 6.10, we have n1 = 271, n2 = 138, n3 = 107 (so n = n1 + n2 + n3 = 516, k = 3 groups, and p = 4 cost variables), and |S1| = 2.783 × 10−8, |S2| = 89.539 × 10−8, |S3| = 14.579 × 10−8, and |Spooled| = 17.398 × 10−8. Taking the natural logarithms of the determinants gives ln|S1| = −17.397, ln|S2| = −13.926, ln|S3| = −15.741 and ln|Spooled| = −15.564. We calculate
$$-2\log\lambda^{*} = (n-k)\log|S_{\mathrm{pooled}}| - \sum_{i=1}^{k}(n_i - 1)\log|S_i|.$$
Then consider
$$M = -2q\log\lambda^{*} \;\sim\; \chi^{2}(\nu),$$
where
$$q = 1 - \frac{2p^{2}+3p-1}{6(p+1)(k-1)}\left(\sum_{i=1}^{k}\frac{1}{n_i-1} - \frac{1}{n-k}\right).$$
Plugging in all the numbers, we obtain M = 285.5. Referring M to a χ² table with ν = 4(4 + 1)(3 − 1)/2 = 20 degrees of freedom, we have M = 285.5 ≫ 31.41 = χ²₀.₀₅(20), so it is clear that H0 is rejected at any reasonable level of significance. We conclude that the covariance matrices of the cost variables associated with the three populations of nursing homes are not the same.
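As a sanity check (not part of the original solution), the following Python sketch recomputes −2 log λ*, q, and M from the sample sizes and the rounded log-determinants quoted above; it gives M ≈ 285.4, the small discrepancy from 285.5 being due to rounding the log-determinants to three decimals.

```python
# Numerical check of the Box's M calculation above, using the rounded
# log-determinants from Example 6.10.
import numpy as np

p, k = 4, 3                                    # number of variables, number of groups
n_i = np.array([271, 138, 107])                # group sample sizes
n = n_i.sum()                                  # total sample size, n = 516

ln_det_Si = np.array([-17.397, -13.926, -15.741])   # ln|S_1|, ln|S_2|, ln|S_3|
ln_det_Sp = -15.564                                  # ln|S_pooled|

# -2 log(lambda*) = (n - k) ln|S_pooled| - sum_i (n_i - 1) ln|S_i|
neg2loglam = (n - k) * ln_det_Sp - np.sum((n_i - 1) * ln_det_Si)

# Small-sample correction factor q.
q = 1 - (2 * p**2 + 3 * p - 1) / (6 * (p + 1) * (k - 1)) * (
    np.sum(1.0 / (n_i - 1)) - 1.0 / (n - k)
)

M = q * neg2loglam
nu = p * (p + 1) * (k - 1) // 2                # degrees of freedom = 20
print(f"-2 log lambda* = {neg2loglam:.1f}, q = {q:.4f}, M = {M:.1f}, df = {nu}")
```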
#2 Six variables were measured in a data set of 60 observations. The summary statistics
are given by
(a) Perform an equicorrelation test for the correlation matrix R, using α = 0.05.
(b) Test for Equal Variances and Total Independence (Sphericity Test)
Solution.
(a) When H0 : ρij = ρ is true for all i ̸= j, the correlation matrix should possess the
following pattern:
$$\boldsymbol{\rho} = \begin{pmatrix} 1 & \rho & \cdots & \rho \\ \rho & 1 & \cdots & \rho \\ \vdots & \vdots & \ddots & \vdots \\ \rho & \rho & \cdots & 1 \end{pmatrix}$$
Note that equal variance of measurements is not assumed here. A heuristic test of Lawley
(1963) rejects H0 if
$$Q = \frac{n-1}{\hat{\lambda}^{2}}\left[\sum_{i<j}(r_{ij}-\bar{r})^{2} - \hat{\mu}\sum_{k=1}^{p}(\bar{r}_{k}-\bar{r})^{2}\right] > \chi^{2}_{\alpha}(f)$$
where
$$\bar{r} = \frac{2}{p(p-1)}\sum_{i<j} r_{ij}, \qquad \bar{r}_{k} = \frac{1}{p-1}\sum_{\substack{i=1 \\ i\neq k}}^{p} r_{ik},$$
$$\hat{\lambda} = 1-\bar{r}, \qquad \hat{\mu} = \frac{(p-1)^{2}\,(1-\hat{\lambda}^{2})}{p-(p-2)\hat{\lambda}^{2}},$$
and f = (p + 1)(p − 2)/2, with dim(Ω) = p(p + 1)/2 and dim(ω) = p + 1. Here, r̄ can be
interpreted as the pooled estimate of ρ in H0 where r̄k can be regarded as the association of
xk and the other variables in x.
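Since the summary statistics for this problem are not reproduced here, the sketch below implements Lawley's statistic generically in Python; `R` stands for the 6 × 6 sample correlation matrix and `n = 60` for the sample size, and the function name `lawley_equicorrelation_test` is just an illustrative choice.

```python
# A generic sketch of Lawley's (1963) equicorrelation test, following the
# formulas above.  `R` is the p x p sample correlation matrix, `n` the sample size.
import numpy as np
from scipy.stats import chi2

def lawley_equicorrelation_test(R, n, alpha=0.05):
    """Return Lawley's Q statistic, its degrees of freedom f, and chi2_alpha(f)."""
    p = R.shape[0]
    iu = np.triu_indices(p, k=1)                  # indices of the p(p-1)/2 off-diagonal pairs
    r_bar = R[iu].mean()                          # pooled estimate of rho
    r_k = (R.sum(axis=1) - 1.0) / (p - 1)         # average correlation of x_k with the rest
    lam = 1.0 - r_bar
    mu = (p - 1) ** 2 * (1 - lam**2) / (p - (p - 2) * lam**2)
    Q = (n - 1) / lam**2 * (
        np.sum((R[iu] - r_bar) ** 2) - mu * np.sum((r_k - r_bar) ** 2)
    )
    f = (p + 1) * (p - 2) // 2
    return Q, f, chi2.ppf(1 - alpha, f)

# Usage (with the problem's R and n = 60): reject H0 if Q exceeds the critical value.
```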
(b)
Under H0 : Σ = σ 2 I where σ 2 is unknown, the covariance matrix Σ should possess the
following pattern:
$$\Sigma = \begin{pmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma^2 \end{pmatrix}$$
i.e., the measurements all have the same variance and are mutually uncorrelated (or, under multinormality, independent).
The large-sample LRT rejects H0 if
$$-2\log\lambda = np\log\left(\frac{a}{g}\right) > \chi^{2}_{\alpha}(\nu),$$
where ν = p(p + 1)/2 − 1, with dim(Ω) = p(p + 1)/2 and dim(ω) = 1. Here a and g are respectively the arithmetic mean and geometric mean of the eigenvalues of the sample covariance matrix S,
$$\frac{a}{g} = \frac{\operatorname{tr}(W)/p}{|W|^{1/p}} = \frac{\operatorname{tr}(S)/p}{|S|^{1/p}},$$
where W = (n − 1)S.
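As an illustration, here is a minimal Python sketch of this sphericity LRT, assuming a p × p sample covariance matrix `S` computed from `n` observations (the function name is hypothetical).

```python
# A minimal sketch of the large-sample sphericity LRT described above.
import numpy as np
from scipy.stats import chi2

def sphericity_lrt(S, n, alpha=0.05):
    """Return -2 log(lambda), its degrees of freedom nu, and chi2_alpha(nu)."""
    p = S.shape[0]
    eig = np.linalg.eigvalsh(S)              # eigenvalues of S (positive if S is p.d.)
    a = eig.mean()                           # arithmetic mean = tr(S)/p
    g = np.exp(np.mean(np.log(eig)))         # geometric mean = |S|^(1/p)
    stat = n * p * np.log(a / g)             # -2 log(lambda) = np log(a/g)
    nu = p * (p + 1) // 2 - 1
    return stat, nu, chi2.ppf(1 - alpha, nu)

# Reject H0: Sigma = sigma^2 I when stat exceeds the chi-square critical value.
```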
then the correlation matrix is given by
$$\rho = D^{-1}\Sigma D^{-1}$$
and the squared multiple correlation coefficient of y on x is
$$\rho_{y\cdot x}^{2} = \max_{\alpha}\ \frac{(\alpha'\Sigma_{xy})^{2}}{(\alpha'\Sigma_{xx}\alpha)\,\sigma_{y}^{2}} \qquad \text{s.t. } \alpha'\Sigma_{xx}\alpha = 1,$$
where D_x is a diagonal matrix whose elements are the square roots of the diagonal elements of Σxx.
The maximum is attained at $\alpha \propto \Sigma_{xx}^{-1}\Sigma_{xy}$ (by the Cauchy–Schwarz inequality), so $\rho_{y\cdot x}^{2} = \Sigma_{yx}\Sigma_{xx}^{-1}\Sigma_{xy}/\sigma_{y}^{2}$. The estimator of ρy·x is therefore given by
$$R_{y\cdot x} = \sqrt{s_{y}^{-2}\,S_{yx}S_{xx}^{-1}S_{xy}} = \sqrt{R_{yx}R_{xx}^{-1}R_{xy}} = \left(\begin{bmatrix}0.87 & -0.44\end{bmatrix}\begin{bmatrix}1 & -0.75 \\ -0.75 & 1\end{bmatrix}^{-1}\begin{bmatrix}0.87 \\ -0.44\end{bmatrix}\right)^{1/2} = 0.9274.$$
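The value 0.9274 can be checked with a short Python sketch using the correlation values appearing in the computation above (variable names are illustrative).

```python
# Check of the multiple correlation coefficient R_{y.x} computed above.
import numpy as np

R_yx = np.array([0.87, -0.44])               # correlations of y with the two predictors
R_xx = np.array([[1.00, -0.75],
                 [-0.75, 1.00]])             # correlation matrix of the predictors

R_y_dot_x = np.sqrt(R_yx @ np.linalg.inv(R_xx) @ R_yx)
print(f"R_y.x = {R_y_dot_x:.4f}")            # prints R_y.x = 0.9274
```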
(c)
Assuming multinormality of the random vector (y, x) and using properties of the multivariate normal distribution, the conditional covariance matrix is
$$\Sigma_{yy|x} = \Sigma_{yy} - \Sigma_{yx}\Sigma_{xx}^{-1}\Sigma_{xy},$$
which does not depend on x. Hence, the matrix of partial correlation coefficients is a normalization of the matrix Σyy|x (i.e. of the form ρ = D−1ΣD−1):
$$\rho_{y|x} = \Big(\mathrm{sqrt}\big(\mathrm{diag}(\Sigma_{yy|x})\big)\Big)^{-1}\,\Sigma_{yy|x}\,\Big(\mathrm{sqrt}\big(\mathrm{diag}(\Sigma_{yy|x})\big)\Big)^{-1},$$
where sqrt(diag(A)) is the diagonal matrix consisting of the square roots of the diagonal elements of A.
The estimator of Σyy|x is $S_{yy|x} = S_{yy} - S_{yx}S_{xx}^{-1}S_{xy}$; hence the estimated matrix of partial correlations is
$$R_{y|x} = \Big(\mathrm{sqrt}\big(\mathrm{diag}(S_{yy|x})\big)\Big)^{-1}\,S_{yy|x}\,\Big(\mathrm{sqrt}\big(\mathrm{diag}(S_{yy|x})\big)\Big)^{-1}.$$
Then
$$R_{x_1 x_2 | x_3} = \begin{bmatrix} 1.0000 & 0.9091 \\ 0.9091 & 1.0000 \end{bmatrix} \qquad \text{and} \qquad R_{x_1 x_3 | x_2} = \begin{bmatrix} 1.0000 & 0.6516 \\ 0.6516 & 1.0000 \end{bmatrix}.$$
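The two partial correlation matrices can likewise be verified numerically; the sketch below assumes marginal correlations r12 = 0.87, r13 = −0.44 and r23 = −0.75 (consistent with the multiple-correlation computation earlier; the labeling of the variables is an assumption).

```python
# Check of the partial correlations r_{12|3} = 0.9091 and r_{13|2} = 0.6516.
import numpy as np

R = np.array([[ 1.00,  0.87, -0.44],
              [ 0.87,  1.00, -0.75],
              [-0.44, -0.75,  1.00]])        # assumed marginal correlation matrix of (x1, x2, x3)

def partial_corr(R, i, j, given):
    """Partial correlation of variables i and j given the variables in `given`."""
    idx = [i, j]
    S_yy = R[np.ix_(idx, idx)]
    S_yx = R[np.ix_(idx, given)]
    S_xx = R[np.ix_(given, given)]
    S_cond = S_yy - S_yx @ np.linalg.inv(S_xx) @ S_yx.T   # S_{yy|x}
    d = np.sqrt(np.diag(S_cond))
    return (S_cond / np.outer(d, d))[0, 1]

print(f"r_12|3 = {partial_corr(R, 0, 1, [2]):.4f}")       # 0.9091
print(f"r_13|2 = {partial_corr(R, 0, 2, [1]):.4f}")       # 0.6516
```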
(2) Based on the results derived in (c), X1 is actually positively partially correlated with both X2 and X3, even though the marginal correlation coefficient between X1 and X3 is negative. The underlying reason is that X3 is negatively correlated with X2 and, at the same time, the magnitude of the partial correlation between X1 and X2 is larger than that between X1 and X3. Thus, even when X3 increases, the corresponding increase in X1 can be counteracted by the decrease induced by the accompanying variation in X2.
***********************************END*********************************