Sta 242 Bivariate Analysis
Recall:
Let X be a random variable with probability function f(x). The moment generating function (m.g.f.) of a random variable X is given by
M_X(t) = E(e^{tX}) = Σ_x e^{tx} f(x) in the discrete case, or ∫ e^{tx} f(x) dx in the continuous case,
provided the expectation exists.
As in the case of moment generating functions for the univariate case, one can define the
moment generating function for the bivariate case in order to compute the various product
moments.
Definition: Let X and Y be two random variables with joint density function f(x, y). Then the Joint Moment Generating Function of X and Y is defined by
M(s, t) = E(e^{sX + tY}) = ∫∫ e^{sx + ty} f(x, y) dx dy, with the double integral replaced by a double sum in the discrete case.
Note:
The above joint m.g.f. exists only if the double integral (or double summation) converges.
The joint m.g.f. can be used to obtain the various moments of X and Y. This is done by differentiating M(s, t) and evaluating the derivatives at s = t = 0:
E(X^m Y^n) = ∂^{m+n} M(s, t) / ∂s^m ∂t^n evaluated at s = t = 0.
Recall:
It is easy to see from this definition of the m.g.f. that M(0, 0) = 1,
E(X) = ∂M(s, t)/∂s |_{s=t=0} and E(Y) = ∂M(s, t)/∂t |_{s=t=0}.
To find E(X) we take the derivative of the joint m.g.f. with respect to s and set s = t = 0; higher-order and mixed product moments follow by repeated differentiation.
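As a quick illustration of these formulas (not part of the original notes), the following sketch uses sympy to recover moments from an assumed joint m.g.f., here that of two independent Exp(1) variables, M(s, t) = 1/((1 - s)(1 - t)) for s, t < 1:

```python
# Minimal sketch: product moments from a joint m.g.f. via sympy.
import sympy as sp

s, t = sp.symbols('s t')
M = 1 / ((1 - s) * (1 - t))   # assumed joint m.g.f of two independent Exp(1) r.v.s

EX  = sp.diff(M, s).subs({s: 0, t: 0})      # E(X)  = dM/ds at (0, 0)
EY  = sp.diff(M, t).subs({s: 0, t: 0})      # E(Y)  = dM/dt at (0, 0)
EXY = sp.diff(M, s, t).subs({s: 0, t: 0})   # E(XY) = d^2 M/(ds dt) at (0, 0)

print(EX, EY, EXY)   # 1 1 1
```

Since E(XY) = E(X)E(Y) here, the covariance is zero, as expected for independent variables.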
Marginal m.g.f.
The marginal m.g.f.s are recovered from the joint m.g.f. by setting the other argument to zero:
M_X(s) = M(s, 0) and M_Y(t) = M(0, t).
Recall that X and Y are independent if and only if M(s, t) = M_X(s) M_Y(t) for all s and t.
Example 8.1:
Let X and Y be two continuous random variables with joint probability density function
Determine:
a)
i.
ii.
iii.
Solution:
a)
b)
i.
Therefore,
ii.
Therefore,
iii.
But
Hence;
Thus;
and
The correlation of X and Y, ρ_XY, is;
and
However,
Exercise 8.1
Exercise 8.3
Suppose that the random variables X and Y have a joint pdf given by
In statistical methods, the results of statistical hypothesis testing and estimation involve functions of one or more random variables (r.v.s). As a result, statistical inference requires the distribution of these functions. Here we find the probability distributions of various functions of r.v.s such as sums, differences, products and quotients.
Suppose that X is a random variable with distribution function F(x) and let U = g(X) be a function of X. Then U is also a r.v. The problem here is to find the probability distribution of U. We shall study three methods of solving this problem. They include:
I) the distribution function (cdf) technique,
II) the transformation of variables technique, and
III) the moment generating function (m.g.f.) technique.
Let X be a discrete random variable with pdf f(x). Let Y = u(X) define a one-to-one transformation between the values of X and Y, so that the equation y = u(x) can be uniquely solved for x in terms of y, say x = ω(y). Then the pdf of Y is g(y) = f(ω(y)).
Solution:
Since the values of X are all positive, the transformation defines a one-to-one correspondence between the X and Y values.
Example 10.2
Let X be a discrete random variable with probability distribution

x:    -3     -2    -1     0    1     2    3
f(x): 4/21   1/6   1/14   1/7  1/14  1/6  4/21

Find the probability distribution of Y = 3X^2 + 1.
Solution:
The values taken by Y are y = 1, 4, 13 and 28, and summing f(x) over the x values mapped to each y gives

y:    1    4    13   28
f(y): 1/7  1/7  1/3  8/21
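As a numerical check (not in the original notes; the transformation y = 3x^2 + 1 is inferred from the y values in the table), the following sketch pushes the p.m.f. of X through the transformation:

```python
# Sum the probabilities of all x values that map to the same y = 3x^2 + 1.
from fractions import Fraction as F
from collections import defaultdict

f_x = {-3: F(4, 21), -2: F(1, 6), -1: F(1, 14),
        0: F(1, 7),   1: F(1, 14), 2: F(1, 6), 3: F(4, 21)}

f_y = defaultdict(F)               # F() with no argument is Fraction(0)
for x, p in f_x.items():
    f_y[3 * x**2 + 1] += p

print(dict(f_y))
# {28: Fraction(8, 21), 13: Fraction(1, 3), 4: Fraction(1, 7), 1: Fraction(1, 7)}
```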
I) Distribution function (cdf) technique
Suppose that the pdf of X is f(x) and let Y = u(X) be another r.v. The general procedure of the distribution function (cdf) technique is as follows: first find the cdf of Y, G(y) = P(Y ≤ y) = P(u(X) ≤ y), by integrating f(x) over the region where u(x) ≤ y; then differentiate to obtain the pdf, g(y) = G′(y).
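The statements of Examples 10.3 to 10.5 below did not survive, so here is a self-contained sketch of the technique under assumed inputs: X ~ Uniform(0, 1) and Y = X^2, for which F_Y(y) = P(X ≤ √y) = √y and f_Y(y) = 1/(2√y) on 0 < y < 1.

```python
# cdf technique: integrate the pdf of X up to the event {u(X) <= y},
# then differentiate the resulting cdf.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f_X = sp.Integer(1)                           # pdf of Uniform(0, 1)
F_Y = sp.integrate(f_X, (x, 0, sp.sqrt(y)))   # P(Y <= y) = P(X <= sqrt(y))
f_Y = sp.diff(F_Y, y)                         # differentiate the cdf
print(F_Y, f_Y)                               # sqrt(y) 1/(2*sqrt(y))
```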
Example 10.3
Solution:
Example 10.4
Solution:
Example 10.5
Solution:
II) Transformation of variables technique
Suppose that X is a continuous random variable with p.d.f. f(x). Let y = u(x) define a one-to-one transformation between the values of X and Y, with inverse x = ω(y). Then the pdf of Y is
g(y) = f(ω(y)) |ω′(y)|,
where the factor |ω′(y)| = |dx/dy| is the Jacobian of the transformation.
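A small symbolic sketch of this formula (my own example, not one from the notes): take X ~ Exp(1) and the one-to-one map y = u(x) = √x on x > 0, so that x = ω(y) = y^2.

```python
# Change-of-variable formula g(y) = f(w(y)) * |w'(y)|.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = sp.exp(-x)                  # pdf of X ~ Exp(1) on x > 0
w = y**2                        # inverse transformation x = w(y)
g = f.subs(x, w) * sp.Abs(sp.diff(w, y))
print(sp.simplify(g))           # 2*y*exp(-y**2)
```

The result g(y) = 2y e^{-y^2} integrates to 1 on (0, ∞), as a density must.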
Example 10.6
Solution:
Example 10.7
Solution:
Exercise 10.1:
Exercise 10.2:
Exercise 10.3:
Exercise 10.4:
Exercise 10.5:
Remark:
Thus;
Exercise 10.6:
Exercise 10.7:
Exercise 10.8:
Suppose that X1 and X2 are discrete random variables with joint pdf f(x1, x2). Let y1 = u1(x1, x2) and y2 = u2(x1, x2) define a one-to-one transformation between the points (x1, x2) and (y1, y2), with inverse x1 = ω1(y1, y2) and x2 = ω2(y1, y2). Then the joint pdf of Y1 and Y2 is
g(y1, y2) = f(ω1(y1, y2), ω2(y1, y2)).
From this joint pdf, we can obtain the marginal pdfs of Y1 and Y2 as shown below:
g1(y1) = Σ_{y2} g(y1, y2) and g2(y2) = Σ_{y1} g(y1, y2).
Example 10.8
Find the joint pdf of Y1 and Y2, where
Solution:
Example 10.9
Thus,
We therefore conclude that the sum of two independent random variables having Poisson distributions with parameters λ1 and λ2 has a Poisson distribution with parameter λ1 + λ2.
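A quick numerical check of this conclusion (a sketch, with arbitrary parameters λ1 = 2.0 and λ2 = 3.5):

```python
# The sum of independent Poisson(2.0) and Poisson(3.5) draws should behave
# like Poisson(5.5); a Poisson variable has mean = variance = its parameter.
import numpy as np

rng = np.random.default_rng(0)
s = rng.poisson(2.0, 100_000) + rng.poisson(3.5, 100_000)
print(s.mean(), s.var())   # both close to 5.5
```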
Exercise 10.8:
Let X1 and X2 have a joint p.d.f. given by
The marginal pdfs of Y1 and Y2 can be obtained from the joint pdf in the usual manner.
Example 10.10
Suppose X1 and X2 are independent random variables, each uniformly distributed over the interval [0, 1]. Determine the joint pdf of Y1 and Y2 and hence find the respective marginal densities of Y1 and Y2.
Solution
The joint pdf of X1 and X2 is f(x1, x2) = 1 for 0 ≤ x1 ≤ 1, 0 ≤ x2 ≤ 1, and 0 elsewhere.
but
Therefore
When
Where
Example 10.11
Find
i. Joint density of
ii. Marginal of Y1
Solution
If
Then
Thus
Using ,
when , then
Using ,
when , then
Task: Students to plot the graph of the original sample space and the graph of the transformed sample space on the plane.
Therefore,
ii. Marginal of
Hence,
Exercise 10.9:
iii. Marginal of (left as an exercise)
Exercise 10.10:
ii. Marginal of
iii. Marginal of
Exercise 10.11:
Let X1 and X2 be two independent standard normal random variables, that is, X1, X2 ~ N(0, 1).
ii. Marginal of
iii. Marginal of
Exercise 10.12:
Let X1 and X2 be two independent random variables having gamma distributions,
i. Marginal of
ii. Marginal of
Exercise 10.13:
The joint p.d.f. of X1 and X2 is given by
Find the;
a. Joint density of
b. Marginal of
c. Marginal of
Exercise 10.14
III) Moment generating function (m.g.f.) technique
The m.g.f. technique for determining the distribution of a function of one or more random variables is based on the fact that the m.g.f., when it exists, (1) is unique and (2) uniquely determines the probability distribution.
Theorem 10.1:
Theorem 10.2:
Let X be a random variable with p.d.f. f(x). If a and b are constants, then the m.g.f. of Y = aX + b is
M_Y(t) = E(e^{t(aX + b)}) = e^{bt} M_X(at).
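A short symbolic sketch of this theorem (my own illustration): for X ~ N(0, 1), whose m.g.f. is M_X(t) = e^{t^2/2}, taking a = 3 and b = 2 gives the m.g.f. of Y = 3X + 2, which is recognizably that of N(2, 9).

```python
# Theorem 10.2 applied to X ~ N(0, 1): M_Y(t) = e^{bt} M_X(at).
import sympy as sp

t = sp.symbols('t', real=True)
a, b = 3, 2                             # illustrative constants
M_X = lambda u: sp.exp(u**2 / 2)        # m.g.f of X ~ N(0, 1)
M_Y = sp.exp(b * t) * M_X(a * t)        # e^{bt} M_X(at)
print(sp.simplify(M_Y))                 # exp(2*t + 9*t**2/2), the N(2, 9) m.g.f
```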
Theorem 10.3:
Theorem 10.4:
Recall that;
Note:
Suppose that X is a continuous random variable with p.d.f. f(x) and let Y = u(X) be a function of X; then
M_Y(t) = E(e^{t u(X)}) = ∫ e^{t u(x)} f(x) dx.
This technique is also applicable when we are interested in finding the joint distribution of two random variables. Thus suppose that X and Y are jointly continuous random variables with joint pdf f(x, y); the joint m.g.f. of functions of X and Y is computed in the same way.
Remark:
Example 10.12
Suppose that X1, X2, …, Xn are independent Bernoulli random variables. Find the distribution of Y = X1 + X2 + ⋯ + Xn.
Solution:
where,
Thus,
Re-arranging gives;
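Filling in the computation sketched above (a sketch with an illustrative n = 4): each Bernoulli(p) variable has m.g.f. q + p e^t with q = 1 − p, so independence gives M_Y(t) = (q + p e^t)^n, the Binomial(n, p) m.g.f.

```python
# Expanding (q + p e^t)^n shows the binomial probabilities C(n,k) p^k q^(n-k)
# as the coefficients of e^{kt}.
import sympy as sp

t, p, q = sp.symbols('t p q')            # q stands for 1 - p
n = 4                                    # small illustrative n
M_sum = (q + p * sp.exp(t))**n           # product of n identical Bernoulli m.g.f.s
print(sp.expand(M_sum))
# p**4*exp(4*t) + 4*p**3*q*exp(3*t) + 6*p**2*q**2*exp(2*t) + 4*p*q**3*exp(t) + q**4
```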
Example 10.14
Let X1, X2, …, Xn be statistically independent random variables with normal distributions N(μi, σi²).
Solution:
Example 10.15
Let the independent random variables X1 and X2 have the same p.d.f. f(x) = 1/6 for x = 1, 2, …, 6 (two fair dice), and let Y = X1 + X2. Then
M_Y(t) = M_{X1}(t) M_{X2}(t) (since independent).
But M_{Xi}(t) = (e^t + e^{2t} + ⋯ + e^{6t})/6 for i = 1, 2,
and therefore M_Y(t) = (e^t + e^{2t} + ⋯ + e^{6t})²/36.
Thus, reading off the coefficient of e^{yt} as P(Y = y):

y:    2     3     4     5     6     7     8     9     10    11    12
f(y): 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
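The same table can be checked numerically (a sketch): squaring the die m.g.f. corresponds to convolving the uniform p.m.f. with itself.

```python
# Convolve the fair-die p.m.f. with itself to get P(Y = y) for y = 2..12.
import numpy as np

die = np.ones(6) / 6
pmf = np.convolve(die, die)               # probabilities of totals 2..12
for total, prob in zip(range(2, 13), pmf):
    print(total, int(round(prob * 36)))   # numerators over 36: 1,2,...,6,...,2,1
```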
Exercise 10.14:
i. Suppose that X1, X2, …, Xn are independently distributed random variables, each having
ii. Use the m.g.f. technique to find the probability distribution of the sum of independent random variables with parameters
Exercise 10.14:
If X1, X2, …, Xn are independent random variables having exponential distributions with the same parameter θ, find the p.d.f. of the random variable Y = X1 + X2 + ⋯ + Xn. Use the m.g.f. technique.
Exercise 10.15:
Let X1 and X2 be two independent standard normal variables, that is, X1, X2 ~ N(0, 1). Use the m.g.f. technique to find the joint distribution of
Exercise 10.16:
Let X1 and X2 be two independent standard normal variables. Use the m.g.f. technique to find the distribution of
Exercise 10.17:
Let the stochastically independent random variables X1, X2, …, Xn have the same p.d.f.
This is a special case of the Gamma distribution. Recall that the pdf of the Gamma distribution is given by
f(x) = (1 / (Γ(α) β^α)) x^{α − 1} e^{−x/β} for x > 0, and 0 elsewhere.
Remarks:
The Chi-square distribution also arises from the normal distribution by squaring: if Z ~ N(0, 1), then Z² has a Chi-square distribution with 1 degree of freedom.
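A quick simulation sketch of this remark:

```python
# Squares of standard normal draws should match a chi-square(1) distribution,
# whose mean is 1 and variance is 2.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
z2 = rng.standard_normal(100_000) ** 2
print(z2.mean(), z2.var())                         # close to 1 and 2
print(stats.kstest(z2, 'chi2', args=(1,)).pvalue)  # typically not small: good fit
```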
If
a)
b)
Let Z ~ N(0, 1) and let V ~ χ²(r), and assume that Z and V are independent. Then the joint pdf of Z and V, say g(z, v), is the product of the pdfs of Z and V; thus
g(z, v) = f(z) h(v)
on the set {(z, v) : −∞ < z < ∞, 0 < v < ∞}.
Therefore, changing variables to t = z / √(v/r) and u = v,
the joint pdf of T and U is given by
h(t) is the t-distribution. Thus if Z ~ N(0, 1) and V ~ χ²(r), and if Z and V are independent, then T = Z / √(V/r) has a Student's t-distribution, which is completely defined by the parameter r, called the degrees of freedom of the distribution; that is, T ~ t(r).
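A simulation sketch of this construction (r = 10 chosen arbitrarily), comparing the simulated ratio against scipy's t distribution:

```python
# T = Z / sqrt(V/r) with Z ~ N(0, 1) and V ~ chi-square(r) independent.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
r = 10
z = rng.standard_normal(100_000)
v = rng.chisquare(r, 100_000)
t_draws = z / np.sqrt(v / r)
# T is exactly t(r), so the KS p-value should typically be well above 0.05.
print(stats.kstest(t_draws, 't', args=(r,)).pvalue)
```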
Remark:
The t-distribution is symmetrical about t = 0, and the areas under its curve are evaluated in the same manner as in the case of the normal distribution.
To obtain the mean and variance of the t-distribution, we have that T = Z / √(V/r), and since Z and V are independent,
E(T) = √r E(Z) E(V^{−1/2}) = 0 for r > 1, and
Var(T) = E(T²) = r E(Z²) E(1/V) = r / (r − 2) for r > 2.
Example 11.1
Solution:
Example 11.2
Let T have a t-distribution with 10 d.f. Determine c such that
Solution:
Corollary
Let X1, X2, …, Xn be independent random variables that are all normal with mean μ and standard deviation σ.
Consider two independent Chi-square random variables, U and V, having r1 and r2 d.f. respectively.
The F-distribution is completely described by the two parameters r1 and r2, which are known as its d.f.; that is, F ~ F(r1, r2).
If F has an F-distribution with r1 and r2 d.f., then 1/F has an F-distribution with r2 and r1 d.f. That is,
if F ~ F(r1, r2), then 1/F ~ F(r2, r1).
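A numerical sketch of the reciprocal property, using the quantile identity F_{1−α}(r1, r2) = 1 / F_α(r2, r1) with r1 = 5 and r2 = 10 chosen arbitrarily:

```python
# Compare an upper quantile of F(r1, r2) with the reciprocal of the
# corresponding lower quantile of F(r2, r1).
from scipy import stats

r1, r2 = 5, 10
upper = stats.f.ppf(0.95, r1, r2)   # F_{0.95}(5, 10)
lower = stats.f.ppf(0.05, r2, r1)   # F_{0.05}(10, 5)
print(upper, 1 / lower)             # equal, confirming the identity
```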
Task:
Then X and Y are said to have a bivariate normal distribution. We can show that
i) μ1 and μ2 are the means, and σ1² and σ2² are the variances, of X and Y respectively
ii) ρ is the correlation coefficient between X and Y
The moments of f(x,y) may be obtained by evaluating the appropriate derivatives of M(s,t) at
s=t=0. Thus
Similarly,
It follows that the mean vector and covariance matrix of the bivariate normal distribution are
given as
If ρ=0, that is if X and Y are uncorrelated, then the bivariate normal density function takes the
form
Therefore, if ρ=0, then X and Y are independently distributed as univariate normal variables. That is,
Marginal Densities
Similarly,
Therefore if X and Y have a bivariate normal distribution, then the marginal distributions of X
and Y are univariate normal distributions.
Conditional Densities
Theorem:
If (X1,X2) has a bivariate normal distribution, then the conditional distribution of X1 given that X2=x2 is normal with mean μ1 + ρ(σ1/σ2)(x2 − μ2) and variance σ1²(1 − ρ²).
Similarly, the conditional distribution of X2 given that X1=x1 is normal with mean μ2 + ρ(σ2/σ1)(x1 − μ1) and variance σ2²(1 − ρ²).
That is,
X1 | X2 = x2 ~ N(μ1 + ρ(σ1/σ2)(x2 − μ2), σ1²(1 − ρ²))
and
X2 | X1 = x1 ~ N(μ2 + ρ(σ2/σ1)(x1 − μ1), σ2²(1 − ρ²)).
Proof
Find:
i) P(3< X2<8)
ii) P(3< X2<8| X1=7)
iii) P(-3< X1<3| X2=-4)
Solutions
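The numerical values of this example did not survive, so the following sketch uses assumed parameters (μ1 = μ2 = 5, σ1 = σ2 = 2, ρ = 0.6) purely to illustrate how a conditional probability such as (ii) is computed from the theorem above:

```python
# P(3 < X2 < 8 | X1 = 7) using X2 | X1 = x1 ~ N(mu2 + rho*(s2/s1)*(x1 - mu1),
# s2^2*(1 - rho^2)).
import numpy as np
from scipy import stats

mu1, mu2, s1, s2, rho = 5.0, 5.0, 2.0, 2.0, 0.6   # assumed, not from the notes
cond_mean = mu2 + rho * (s2 / s1) * (7 - mu1)
cond_sd = s2 * np.sqrt(1 - rho**2)
print(stats.norm.cdf(8, cond_mean, cond_sd) - stats.norm.cdf(3, cond_mean, cond_sd))
```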
Consider a bivariate normal distribution with mean vector μ and covariance matrix Σ.
Let us write f(x, y) in the form
To identify μ1 and μ2 we differentiate Q with respect to x and y respectively and solve the equations ∂Q/∂x = 0 and ∂Q/∂y = 0.
In order to identify the covariance matrix Σ, we first note that Q can be written in the form
Note that
Example 12.1
1. Suppose that X and Y have a bivariate normal distribution with density function
Solution:
And by substitution
which gives
Now,
And therefore
Example 12.2
If W = 3X − 2Y, find
P(−2 < W < 19).
Solution:
Now,