STA 242: BIVARIATE ANALYSIS

JOINT MOMENT GENERATING FUNCTIONS

Recall:

Let X be a random variable with probability function f(x). The moment generating function (m.g.f) of the random variable X is given by

M_X(t) = E(e^(tX)).

As in the univariate case, one can define a moment generating function for the bivariate case in order to compute the various product moments.

Definition: Let X and Y be two random variables with joint density function f(x, y). Then the joint moment generating function of X and Y is defined by

M(s, t) = E(e^(sX + tY)).

If X and Y are continuous, then

M(s, t) = ∫∫ e^(sx + ty) f(x, y) dx dy,

and if X and Y are discrete, then

M(s, t) = Σ_x Σ_y e^(sx + ty) f(x, y).

Note:

• The above joint m.g.f will exist if and only if the double integral (summation) converges absolutely to a finite number for all values of s and t which lie in the interval -h < s, t < h, where h is a positive real number.

• The joint m.g.f can be used to obtain the various moments of X and Y. This is done by differentiating M(s, t) and evaluating the derivatives at s = 0 and t = 0.

Recall:
It is easy to see from the definition of the univariate m.g.f that

E(X) = M_X'(0) and E(X²) = M_X''(0).

Now, using the discrete case, we have that

M(s, t) = Σ_x Σ_y e^(sx + ty) f(x, y).

To find E(X), we find the derivative of the joint m.g.f with respect to s as follows:

∂M(s, t)/∂s = Σ_x Σ_y x e^(sx + ty) f(x, y), so that E(X) = ∂M(s, t)/∂s evaluated at s = t = 0.

The second derivative of the m.g.f with respect to s is

∂²M(s, t)/∂s² = Σ_x Σ_y x² e^(sx + ty) f(x, y).

It follows from the above that

E(X²) = ∂²M(s, t)/∂s² evaluated at s = t = 0.

Thus, the p-th moment of the random variable X is

E(X^p) = ∂^p M(s, t)/∂s^p evaluated at s = t = 0.

By extension, the q-th moment of the random variable Y is

E(Y^q) = ∂^q M(s, t)/∂t^q evaluated at s = t = 0.

Next we have

∂²M(s, t)/∂s∂t = Σ_x Σ_y xy e^(sx + ty) f(x, y).

Thus, the joint (product) moment of X and Y is given by

E(XY) = ∂²M(s, t)/∂s∂t evaluated at s = t = 0.
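As a quick check of this recipe, here is a minimal sympy sketch with an assumed joint pmf (it is not one of the notes' examples): the joint m.g.f is built as a finite sum and the moments are read off by differentiating at s = t = 0.

```python
# A small sympy check (assumed pmf, not from the notes): build the joint
# m.g.f of a discrete pair as a finite sum and read off moments.
import sympy as sp

s, t = sp.symbols('s t', real=True)

# Assumed joint pmf f(x, y) on {0, 1} x {0, 1}.
pmf = {(0, 0): sp.Rational(1, 4), (0, 1): sp.Rational(1, 4),
       (1, 0): sp.Rational(1, 4), (1, 1): sp.Rational(1, 4)}

# M(s, t) = sum_x sum_y e^(sx + ty) f(x, y)
M = sum(p * sp.exp(s*x + t*y) for (x, y), p in pmf.items())

EX  = sp.diff(M, s).subs({s: 0, t: 0})      # E(X)   = 1/2
EXY = sp.diff(M, s, t).subs({s: 0, t: 0})   # E(XY)  = 1/4
EX2 = sp.diff(M, s, 2).subs({s: 0, t: 0})   # E(X^2) = 1/2
print(EX, EXY, EX2)
```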

Marginal m.g.f.

Let the joint m.g.f of X and Y be M(s, t).

If we set t = 0, then M(s, 0) = E(e^(sX)) = M_X(s) is called the marginal m.g.f. of X,

and

if we set s = 0, then M(0, t) = E(e^(tY)) = M_Y(t) is called the marginal m.g.f. of Y.

If X and Y are independent, then M(s, t) = M_X(s) M_Y(t).

Recall that: Cov(X, Y) = E(XY) - E(X)E(Y) and the correlation coefficient is ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y).
Example 8.1:

Let X and Y be two continuous random variables with joint probability density function

Determine:

a) The joint m.g.f of X and Y, M(s, t).

b) Using the m.g.f obtained above, determine;

i. E(X)

ii. E(Y)

iii. Cov(X, Y) and hence the correlation coefficient of X and Y.

c) The marginal m.g.fs of X and Y.

d) Are X and Y independent?

Solution:

a)
b)

i.

Therefore,

ii.

Therefore,

iii.

But

Hence;
Thus;

and

The correlation coefficient of X and Y is;

c) The marginal m.g.fs of X and Y

Marginal m.g.f. of X (set t = 0)

and

Marginal m.g.f. of Y (set s = 0)

d) X and Y are independent iff M(s, t) = M_X(s) M_Y(t).

However,

Thus X and Y are NOT independent
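Since Example 8.1's actual density is not reproduced above, here is a hedged sympy sketch of the same workflow with an assumed density f(x, y) = e^(-x-y), x > 0, y > 0: compute the joint m.g.f, obtain the marginal m.g.f.s by setting t = 0 and s = 0, and test independence by checking whether the joint m.g.f factorises.

```python
# A minimal sketch under an assumed pdf f(x, y) = e^(-x-y) on x, y > 0
# (not the notes' example): joint m.g.f, marginal m.g.f.s, independence.
import sympy as sp

s, t, x, y = sp.symbols('s t x y', real=True)
f = sp.exp(-x - y)  # assumed joint pdf

M = sp.integrate(sp.exp(s*x + t*y) * f, (x, 0, sp.oo), (y, 0, sp.oo),
                 conds='none')            # 1/((1 - s)(1 - t)), s, t < 1
Mx = M.subs(t, 0)                          # marginal m.g.f of X
My = M.subs(s, 0)                          # marginal m.g.f of Y

# Independence holds iff the joint m.g.f factorises:
print(sp.simplify(M - Mx*My) == 0)         # True for this assumed pdf
```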

Exercise 8.1

The joint mgf of the random variables X and Y is given by

Find the covariance of X and Y


Exercise 8.2

The joint pdf of X and Y is given by


Determine the joint mgf of X and Y

Exercise 8.3

Suppose that the random variables X and Y have a joint pdf given by

Find the covariance of X and Y

DISTRIBUTION OF FUNCTIONS OF RANDOM VARIABLES

In statistical methods, the results of statistical hypothesis testing and estimation involve functions of one or more random variables. As a result, statistical inference requires the distributions of these functions. Here we find the probability distributions of various functions of random variables, such as sums, differences, products and quotients.

Suppose that X is a random variable with distribution function F(x) and let U = g(X) be a function of X. Then U is also a random variable. The problem here is to find the probability distribution of U. We shall study three methods of solving this problem. They include:

i. Distribution function(cdf) technique


ii. Transformation of variables technique
iii. Moment generating function (mgf) technique

CASE 1: FUNCTIONS OF ONE RANDOM VARIABLE (UNIVARIATE CASE)

a) DISCRETE RANDOM VARIABLES

Transformation of variables technique:

Let X be a discrete random variable with pdf f(x). Let Y = u(X) define a one-to-one transformation between the values of X and Y, so that the equation y = u(x) can be uniquely solved for x in terms of y, say, x = w(y).

Then the probability distribution of Y is given by

g(y) = f(w(y)), for y in the range of Y.

Example 10.1

Let X be a geometric r.v with pdf given as

Find the probability distribution of

Solution:

Since the values of X are all positive, the transformation defines a one-to-one correspondence between the values of X and Y.

Example 10.2

Suppose that X has a discrete distribution as shown below

x      -3     -2     -1     0      1      2      3
f(x)   4/21   1/6    1/14   1/7    1/14   1/6    4/21

Find the distribution of Y = 3X² + 1.

Solution:

The transformation is not one-to-one: x and -x map to the same value of y. The possible values of Y are y = 1, 4, 13, 28, with, for example, P(Y = 4) = P(X = -1) + P(X = 1) = 1/14 + 1/14 = 1/7.

The distribution of Y is given below

y      1     4     13    28
f(y)   1/7   1/7   1/3   8/21
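A short Python sketch (not in the original notes) of the computation above: push the pmf of X through y = 3x² + 1, summing the probabilities of all x-values that map to the same y.

```python
# Push a discrete pmf through a (not one-to-one) transformation.
from fractions import Fraction as F
from collections import defaultdict

f_x = {-3: F(4, 21), -2: F(1, 6), -1: F(1, 14), 0: F(1, 7),
        1: F(1, 14),  2: F(1, 6),  3: F(4, 21)}

f_y = defaultdict(F)
for x, p in f_x.items():
    f_y[3*x**2 + 1] += p      # x and -x land on the same y

print(dict(f_y))  # {28: 8/21, 13: 1/3, 4: 1/7, 1: 1/7}
```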

b) CONTINUOUS RANDOM VARIABLES


I) Distribution Function (c.d.f) Technique

Suppose that the pdf of X is f(x) and let Y = u(X) be another random variable. The general procedure for the distribution function (cdf) technique is as follows:

• Obtain the cdf of Y, G(y) = P(Y ≤ y).

• Differentiate G(y) with respect to y in order to obtain g(y).

• Determine the values of y in the range of Y for which g(y) > 0.

The pdf of Y will be given by

g(y) = dG(y)/dy.
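A hedged illustration of these steps with an assumed setup (it is not one of the worked examples that follow): X uniform on (0, 1) and Y = X².

```python
# c.d.f technique, assumed example: X ~ U(0, 1), Y = X^2.
import sympy as sp

x, y = sp.symbols('x y', positive=True)

f = sp.Integer(1)                        # pdf of X on (0, 1)
# Step 1: G(y) = P(Y <= y) = P(X <= sqrt(y)) for 0 < y < 1
G = sp.integrate(f, (x, 0, sp.sqrt(y)))  # = sqrt(y)
# Step 2: differentiate G with respect to y to obtain g(y)
g = sp.diff(G, y)
print(g)                                 # 1/(2*sqrt(y)), for 0 < y < 1
```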

Example 10.3

Suppose that the pdf of X is

Solution:
Example 10.4

Suppose that X has the pdf

Solution:

Example 10.5

The random variable X has pdf

Solution:
II) Transformation of Variables Technique

Suppose that X is a continuous random variable with p.d.f f(x). Let Y = u(X) define a one-to-one transformation between the values of X and Y, with inverse x = w(y). Then

G(y) = P(Y ≤ y) = P(u(X) ≤ y) = P(X ≤ w(y)) = F(w(y)), where F is the distribution function of X.

Then the p.d.f of Y is given by

g(y) = f(w(y)) w'(y)

if u is a strictly increasing function.

If however u is a strictly decreasing function, then we have

G(y) = P(X ≥ w(y)) = 1 - F(w(y))

and

g(y) = -f(w(y)) w'(y).

But in this case w'(y) < 0, and therefore, in either case,

g(y) = f(w(y)) |J|, where J = dx/dy = w'(y) and is called the Jacobian of the transformation.

Example 10.6

Let X be a continuous random variable with p.d.f

Find the probability distribution of

Solution:

Take note of the new limits!

Example 10.7

Let X be a continuous random variable with p.d.f

Find the probability distribution of

Solution:

Take note of the new limits!


In general:

• If X is a continuous random variable with p.d.f f(x) and Y = u(X) defines a one-to-one transformation between the values of X and Y, with inverse x = w(y), then the p.d.f of Y is given by:

g(y) = f(w(y)) |J|, where J = dx/dy is the Jacobian of the transformation.

• Take note of the new limits! (A sketch of this recipe follows.)
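Here is a minimal sympy sketch of the formula, under an assumed monotone transformation (not one from these notes): X uniform on (0, 1) and Y = -2 ln X, which is strictly decreasing, so g(y) = f(w(y)) |w'(y)|.

```python
# One-to-one transformation, assumed example: X ~ U(0, 1), Y = -2 ln X.
import sympy as sp

y = sp.symbols('y', positive=True)

w = sp.exp(-y/2)          # inverse transformation x = w(y) = e^(-y/2)
f_w = sp.Integer(1)       # f(w(y)) = 1, since X is uniform on (0, 1)
g = f_w * sp.Abs(sp.diff(w, y))
print(sp.simplify(g))     # exp(-y/2)/2: Y is exponential with mean 2
```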

Exercise 10.1:

The p.d.f of a continuous random variable X is given by:

Find the p.d.f of , i.e.

Exercise 10.2:

The p.d.f of a continuous random variable X is given by:

Find the p.d.f of , i.e.

Exercise 10.3:

If the p.d.f of a continuous random variable X is given by:


Find the p.d.f of , i.e.

Exercise 10.4:

If the p.d.f of a continuous random variable X is given by:

Find the p.d.f of the random variable , i.e.

Exercise 10.5:

The p.d.f of a continuous random variable X is given by:

where k is a constant. Find the p.d.f of , i.e.

Remark:

If the transformation Y = u(X) is not one-to-one, i.e. u is not a monotone function of X, we cannot apply the above method directly.

For example, suppose that X is a continuous random variable with p.d.f f(x), -∞ < x < ∞.

Consider the transformation Y = X². Each y > 0 has two inverse images, x = √y and x = -√y.

Thus;

G(y) = P(Y ≤ y) = P(-√y ≤ X ≤ √y) = F(√y) - F(-√y), y > 0.

Take note of the new limits!

Hence,

g(y) = dG(y)/dy = [f(√y) + f(-√y)] / (2√y), y > 0.

Exercise 10.6:

Let X be a continuous random variable with p.d.f given by:

Find the p.d.f of , i.e.

Exercise 10.7:

Let X be a continuous random variable with p.d.f given by:

Find the p.d.f of , i.e.

Exercise 10.8:

Let X be a continuous random variable with p.d.f given by:

Let . Find the p.d.f of using:


i. The CDF technique
ii. The transformation of variables technique.

Next, we consider functions of two random variables

CASE 2: FUNCTIONS OF TWO RANDOM VARIABLES (BIVARIATE CASE)

a) DISCRETE RANDOM VARIABLES

I) Transformation of variables technique

Suppose that X1 and X2 are discrete random variables with joint pdf f(x1, x2). Let Y1 = u1(X1, X2) and Y2 = u2(X1, X2) define a one-to-one transformation between the points (x1, x2) and (y1, y2), such that the equations y1 = u1(x1, x2) and y2 = u2(x1, x2) can be uniquely solved for x1 and x2, say x1 = w1(y1, y2) and x2 = w2(y1, y2). Then the joint pdf of Y1 and Y2 is given by

g(y1, y2) = f(w1(y1, y2), w2(y1, y2)).

From this joint pdf, we can obtain the marginal pdfs of Y1 and Y2 by summing over the other variable:

g1(y1) = Σ_{y2} g(y1, y2) and g2(y2) = Σ_{y1} g(y1, y2).

Example 10.8

Let X1 and X2 have a joint p.d.f specified by;

Find the joint pdf of Y1 and Y2, where

Solution:

The possible values of (Y1, Y2) are

Example 10.9

Let X1 and X2 be two independent random variables having Poisson distributions with parameters λ1 and λ2 respectively. Find the distribution of the random variable Y1 = X1 + X2 and the marginal p.d.f of Y1.

Solution:

Since X1 and X2 are two independent random variables, then

f(x1, x2) = f(x1) f(x2) = e^(-(λ1+λ2)) λ1^(x1) λ2^(x2) / (x1! x2!), x1, x2 = 0, 1, 2, ...

We need to define a second random variable, say Y2 = X2.

The inverse functions are given by:
x1 = y1 - y2 and x2 = y2.
The joint pdf of Y1 and Y2 is given by

g(y1, y2) = e^(-(λ1+λ2)) λ1^(y1-y2) λ2^(y2) / ((y1 - y2)! y2!), y2 = 0, 1, ..., y1.

Thus,

g1(y1) = Σ_{y2=0}^{y1} g(y1, y2) = [e^(-(λ1+λ2)) / y1!] Σ_{y2=0}^{y1} (y1 choose y2) λ1^(y1-y2) λ2^(y2).

Using the binomial expansion we have;

g1(y1) = e^(-(λ1+λ2)) (λ1 + λ2)^(y1) / y1!, y1 = 0, 1, 2, ...

We therefore conclude that the sum of two independent random variables having Poisson distributions with parameters λ1 and λ2 has a Poisson distribution with parameter λ1 + λ2.
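A quick numerical check of this conclusion (parameters assumed for illustration): the convolution of two Poisson pmfs matches a single Poisson pmf with the summed parameter.

```python
# Convolution check of the Poisson-sum result, with assumed parameters.
from math import exp, factorial

def pois(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2 = 2.0, 3.0      # assumed values
y = 4
# P(Y1 = y) by direct convolution over y2 = 0, ..., y
conv = sum(pois(y - y2, lam1) * pois(y2, lam2) for y2 in range(y + 1))
print(conv, pois(y, lam1 + lam2))   # both ~ 0.17547
```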

Exercise 10.8:
Let X1 and X2 have a joint p.d.f given by

Find the joint p.d.f of Y1 and Y2, hence the marginal p.d.f of Y1.


b) CONTINUOUS RANDOM VARIABLES

I) Distribution Function (cdf) Technique

II) Transformation of variables technique

Let X1 and X2 be jointly distributed random variables with pdf f(x1, x2) such that f(x1, x2) > 0 on some region of the x1x2-plane.

Let Y1 = u1(X1, X2) and Y2 = u2(X1, X2) define a one-to-one transformation between the points (x1, x2) and (y1, y2). The equations y1 = u1(x1, x2) and y2 = u2(x1, x2) may be uniquely solved for x1 and x2 in terms of y1 and y2, say x1 = w1(y1, y2) and x2 = w2(y1, y2). Then the joint pdf of Y1 and Y2 is given by

g(y1, y2) = f(w1(y1, y2), w2(y1, y2)) |J|,

where J is called the Jacobian of the transformation and is the determinant, i.e.

J = | ∂x1/∂y1  ∂x1/∂y2 |
    | ∂x2/∂y1  ∂x2/∂y2 |

The marginal pdfs of Y1 and Y2 can be obtained from the joint pdf in the usual manner.
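A short sympy sketch of the Jacobian computation, for an assumed transformation (not one of the worked examples): y1 = x1 + x2, y2 = x1 - x2, so x1 = (y1 + y2)/2 and x2 = (y1 - y2)/2.

```python
# Jacobian of an assumed bivariate transformation via sympy.
import sympy as sp

y1, y2 = sp.symbols('y1 y2', real=True)
x1 = (y1 + y2) / 2          # inverse functions w1, w2
x2 = (y1 - y2) / 2

J = sp.Matrix([[sp.diff(x1, y1), sp.diff(x1, y2)],
               [sp.diff(x2, y1), sp.diff(x2, y2)]]).det()
print(J, sp.Abs(J))          # -1/2, so |J| = 1/2
```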

Example 10.10

Suppose X1 and X2 are independent random variables each uniformly distributed over the
interval [0, 1]. Determine the joint pdf of and hence find the
respective marginal densities of Y1 and Y2.

Solution
The joint pdf of X1 and X2 is

but

Therefore

When

(Students to plot the graph of y against x)

Since the transformation is one-to-one, the partial derivatives are continuous and the Jacobian does not vanish, the joint pdf of Y1 and Y2 is given by
The marginal pdf of Y1 is given by

g1(y1) = ∫ g(y1, y2) dy2, where the limits of integration are determined by the transformed region,

which on substitution becomes

Similarly, the marginal pdf of Y2 is given by

g2(y2) = ∫ g(y1, y2) dy1,

which on substitution becomes

Example 10.11

If the joint density of X1 and X2 is given by

Find:
i. Joint density of Y1 and Y2
ii. Marginal of Y1
Solution

If

Then

Thus

The transformation is one-to-one, mapping the region in the x1x2-plane into the corresponding region in the y1y2-plane.

Next, consider the new intervals for the variables:

Using ,

when , then , and

when , then

Using ,

when , then , and

when , then

Task: Students to plot the graph of x2 against x1 and the graph of the transformed sample space in the y1y2-plane.

Therefore,
ii. Marginal of Y1

, where the interval is

Hence,

Exercise 10.9:
iii. Marginal of Y2 (left as an exercise)

Exercise 10.10:

Let X1 and X2 be two independent standard normal random variables, that is, X1, X2 ~ N(0, 1).

Let . Find the:

i. Joint pdf of Y1 and Y2

ii. Marginal of Y1

iii. Marginal of Y2
Exercise 10.11:
Let X1 and X2 be two independent standard normal random variables, that is, X1, X2 ~ N(0, 1).

Let . Find the:

i. Joint pdf of Y1 and Y2

ii. Marginal of Y1

iii. Marginal of Y2

Exercise 10.12:
Let X1 and X2 be two independent random variables having gamma distributions,

X1 ~ Gamma(α1, β) and X2 ~ Gamma(α2, β). Find the distribution of Y1 and Y2.

Hence, find the:

i. Marginal of Y1

ii. Marginal of Y2

Exercise 10.13:
The joint p.d.f of X1 and X2 is given by,

Find the;

a. Joint density of Y1 and Y2

b. Marginal of Y1

c. Marginal of Y2
Exercise 10.14

The joint pdf of X1 and X2 is given by

III) Moment generating function (m.g.f) technique

• The m.g.f technique can be used to determine the probability distribution/density of a function of random variables, in particular when the function is a linear combination of n independent random variables.

• The m.g.f technique for determining the distribution of a function of one or more random variables is based on the fact that the m.g.f, when it exists, (1) is unique and (2) uniquely determines the probability distribution.

Important theorems on moment generating functions:

Let X be a random variable with p.d.f f(x). Then, the m.g.f of X is

M_X(t) = E(e^(tX)).

Theorem 10.1:

Let X be a random variable with p.d.f f(x). If a is a constant, then the m.g.f of Y = aX is

M_Y(t) = M_X(at).

Theorem 10.2:

Let X be a random variable with p.d.f f(x). If a and b are constants, then the m.g.f of Y = aX + b is

M_Y(t) = e^(bt) M_X(at).
Theorem 10.3:

If X1, X2, ..., Xn are independent random variables, and Y = X1 + X2 + ... + Xn, then

M_Y(t) = M_X1(t) M_X2(t) ... M_Xn(t).

Theorem 10.4:

Let X1, X2, ..., Xn be mutually stochastically independent random variables having normal distributions N(μi, σi²), i = 1, ..., n. The random variable

Y = k1X1 + k2X2 + ... + knXn, where k1, ..., kn are real constants, is normally distributed with mean Σ kiμi and variance Σ ki²σi², that is

Y ~ N(Σ kiμi, Σ ki²σi²).

The m.g.f of the random variable Y is determined as follows:

Recall that if X ~ N(μ, σ²), then M_X(t) = e^(μt + σ²t²/2).

It follows from the above that

M_Y(t) = Π M_Xi(ki t) = Π e^(μi ki t + σi² ki² t²/2) = e^(t Σ kiμi + t² Σ ki²σi² / 2),

which is the m.g.f of a normal distribution with mean Σ kiμi and variance Σ ki²σi².

Note:
• Suppose that X is a continuous random variable with p.d.f f(x) and let Y = u(X) be a new random variable. The m.g.f of Y is given by

M_Y(t) = E(e^(tu(X))) = ∫ e^(tu(x)) f(x) dx.

• This technique is also applicable when we are interested in finding the joint distribution of two random variables. Thus, suppose that X and Y are jointly continuous random variables with p.d.f f(x, y). Let U = u1(X, Y) and V = u2(X, Y) be two new random variables. Thus;

M(s, t) = E(e^(sU + tV)) = ∫∫ e^(s u1(x,y) + t u2(x,y)) f(x, y) dx dy.

Remark:

• The integral signs change to summations when X and Y are discrete.

Example 10.12
Suppose that X1, X2, ..., Xn are independent Bernoulli random variables with parameter p. Find the distribution of Y = X1 + X2 + ... + Xn.

Solution:

M_Y(t) = M_X1(t) M_X2(t) ... M_Xn(t), where,

M_Xi(t) = E(e^(tXi)) = (1 - p) + pe^t.

Thus,

M_Y(t) = (1 - p + pe^t)^n,

which is the m.g.f of a Binomial distribution.

Hence, Y has a Binomial distribution with parameters n and p.

Example 10.13
Suppose X ~ N(0, 1). Find the distribution of Y = X² using the m.g.f technique.
Solution:

M_Y(t) = E(e^(tX²)) = ∫ e^(tx²) (1/√(2π)) e^(-x²/2) dx.

Re-arranging gives;

M_Y(t) = (1 - 2t)^(-1/2) ∫ √((1 - 2t)/(2π)) e^(-(1-2t)x²/2) dx, t < 1/2.

Since the integral equals 1, we have;

M_Y(t) = (1 - 2t)^(-1/2).

Notice that (1 - 2t)^(-1/2) is the m.g.f of a Chi-square distribution with 1 degree of freedom; hence Y ~ χ²(1).

Example 10.14
Let X1 and X2 be statistically independent random variables with normal distributions N(μ1, σ1²) and N(μ2, σ2²). Define the random variable Y = X1 + X2. Find the p.d.f of Y using the m.g.f technique.

Solution:

Recall that, if X ~ N(μ, σ²), then its m.g.f is given by M_X(t) = e^(μt + σ²t²/2).

Hence, for Y = X1 + X2, we shall have:

M_Y(t) = M_X1(t) M_X2(t) = e^(μ1t + σ1²t²/2) e^(μ2t + σ2²t²/2) = e^((μ1+μ2)t + (σ1²+σ2²)t²/2),

which clearly shows that Y is normal with mean μ1 + μ2 and variance σ1² + σ2², that is;

Y ~ N(μ1 + μ2, σ1² + σ2²).
Example 10.15
Let the independent random variables X1 and X2 have the same p.d.f

f(x) = 1/6, x = 1, 2, 3, 4, 5, 6.

Find the p.d.f of Y = X1 + X2 using the m.g.f.

Solution:

M_Y(t) = M_X1(t) M_X2(t) = [M_X(t)]² (since X1 and X2 are independent).

But

M_X(t) = E(e^(tX)) = (1/6)(e^t + e^(2t) + e^(3t) + e^(4t) + e^(5t) + e^(6t)),

and thus,

M_Y(t) = (1/36)(e^t + e^(2t) + ... + e^(6t))².

Expand and collect like terms together: the coefficient of e^(yt) is P(Y = y).

Therefore, the p.d.f of Y, i.e. g(y), is zero except at y = 2, 3, ..., 12, i.e.

y      2     3     4     5     6     7     8     9     10    11    12
g(y)   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
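The expansion step can be automated: writing z = e^t, the m.g.f of one die is (z + z² + ... + z⁶)/6, and squaring it and collecting powers of z gives P(Y = y) as the coefficient of z^y. A short sympy sketch:

```python
# Expand [M_X(t)]^2 with z standing for e^t and read off P(Y = y).
import sympy as sp

z = sp.symbols('z')                      # stands for e^t

M_die = sum(z**k for k in range(1, 7)) / 6
M_Y = sp.expand(M_die**2)                # m.g.f of Y = X1 + X2

for y in range(2, 13):
    print(y, M_Y.coeff(z, y))            # 1/36, 2/36, ..., 6/36, ..., 1/36
```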

Exercise 10.14:
i. Suppose that X1, X2, ..., Xn are independently distributed random variables, each having a Poisson distribution with parameter λ. Show that Y = X1 + X2 + ... + Xn has a Poisson distribution

with parameter nλ. (Use the m.g.f technique.)

ii. Use the m.g.f technique to find the probability distribution of the sum of n independent

random variables having Poisson distributions with the respective

parameters λ1, λ2, ..., λn.

Exercise 10.15:
Let X1, X2, ..., Xn be independent random variables having exponential distributions with the same parameter θ. Find the p.d.f of the random variable Y = X1 + X2 + ... + Xn. Use the m.g.f
technique.

Exercise 10.16:
Let X1 and X2 be two independent standard normal variables, that is, Xi ~ N(0, 1). Use the m.g.f
technique to find the joint distribution of

Exercise 10.17:
Let X1 and X2 be two independent standard normal variables. Use the m.g.f technique to find the

distribution of

Exercise 10.18:
Let the stochastically independent random variables X1, X2, ..., Xn have the same p.d.f

Find the distribution of

11.0 SPECIAL PROBABILITY DISTRIBUTIONS

These distributions arise from

i) Applying special conditions to some probability distributions and


ii) Transformation of variables.
11.1 The Chi-Square (χ²) Distribution

This is a special case of the Gamma distribution. Recall the pdf of the Gamma distribution, which is given by

f(x) = 1/(Γ(α)β^α) x^(α-1) e^(-x/β), x > 0.

When α = r/2 and β = 2, where r is a positive integer, the pdf becomes

f(x) = 1/(Γ(r/2) 2^(r/2)) x^(r/2 - 1) e^(-x/2), x > 0.

This is the pdf of a Chi-Square distribution with r degrees of freedom. That is, X ~ χ²(r).

Expected value & Variance

E(X) = r and Var(X) = 2r.

Moment Generating Function

M_X(t) = (1 - 2t)^(-r/2), t < 1/2.

We can use the mgf to find E(X) and Var(X) as follows:

M'(t) = r(1 - 2t)^(-r/2 - 1), so E(X) = M'(0) = r;
M''(t) = r(r + 2)(1 - 2t)^(-r/2 - 2), so E(X²) = M''(0) = r(r + 2), and Var(X) = r(r + 2) - r² = 2r.

Remarks:

The Chi-square distribution also arises from transforming a normal random variable by standardising and squaring it.

Recall that if X ~ N(μ, σ²), then

Z² = ((X - μ)/σ)² follows a Chi-square distribution with 1 d.f.

If X1, X2, ..., Xn are n independent normal variates with means μi and variances σi², and if

Zi = (Xi - μi)/σi, i = 1, ..., n,

following from the above, we have that

a) Zi² ~ χ²(1) for each i;

b) Σ Zi² ~ χ²(n).
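A quick simulation check of remark b), not part of the original notes: the sum of r squared independent standard normals behaves like a chi-square with r d.f.

```python
# Simulation check: Z1^2 + ... + Zr^2 ~ chi-square(r).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
r, n = 5, 200_000

x = rng.standard_normal((n, r))
y = (x**2).sum(axis=1)                # each row: Z1^2 + ... + Zr^2

print(y.mean(), y.var())              # ~ r = 5 and ~ 2r = 10
print(stats.kstest(y, 'chi2', args=(r,)).pvalue)  # should not be small
```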

11.2 The t-Distribution

Let Z ~ N(0, 1), let V ~ χ²(r), and assume that Z and V are independent. Then the joint pdf of Z and V, say g(z, v), is the product of the pdfs of Z and V; thus

g(z, v) = (1/√(2π)) e^(-z²/2) · 1/(Γ(r/2) 2^(r/2)) v^(r/2 - 1) e^(-v/2), -∞ < z < ∞, v > 0.

Define a random variable

T = Z / √(V/r).

We shall use the change of variable technique to obtain the pdf of T.

t = z/√(v/r) and u = v thus define a one-to-one mapping of the set

{(z, v): -∞ < z < ∞, 0 < v < ∞}

onto the set

{(t, u): -∞ < t < ∞, 0 < u < ∞}.

Therefore z = t√(u/r), v = u and |J| = √(u/r).
The joint pdf of T and U is given by

g(t, u) = g(t√(u/r), u) √(u/r).

The marginal pdf of T is given by

h(t) = ∫_0^∞ g(t, u) du.

Therefore;

h(t) = Γ((r+1)/2) / (√(πr) Γ(r/2)) · (1 + t²/r)^(-(r+1)/2), -∞ < t < ∞.

h(t) is the t-distribution. Thus, if Z ~ N(0, 1) and V ~ χ²(r) and if Z and V are independent, then T = Z/√(V/r) has
a Student's t-distribution, which is completely defined by the parameter r, called the degrees of freedom of
the distribution, that is, T ~ t(r).

Remark:

The t-distribution is symmetric about t = 0 and the areas under its curve are evaluated in the same manner as in the case of the normal distribution.

Mean and Variance of the t-distribution

To obtain the mean and variance of the t-distribution, note that T = Z√(r/V), and since Z and V are independent, we have

E(T) = E(Z) E(√(r/V)) = 0, for r > 1,

and

Var(T) = E(T²) = E(Z²) E(r/V) = r/(r - 2), for r > 2.
Example 11.1

A random variable T has a t-distribution with 12 d.f. Compute

Solution:

Using the t-tables with r=12 we have;

Example 11.2
Let T have a t-distribution with 10 d.f. Determine c such that

Solution:

Using the t-distribution with 10 d.f we have
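As a side note (not part of the original solutions), such table look-ups can be reproduced in scipy; the numeric values below follow standard t-tables and are meant only as an illustration.

```python
# t probabilities and quantiles in scipy, mirroring table look-ups.
from scipy import stats

# Example 11.1 style: probabilities for a t-variable with 12 d.f.
print(stats.t.cdf(1.782, df=12))     # P(T <= 1.782) ~ 0.95

# Example 11.2 style: find c with P(-c < T < c) = 0.95 for 10 d.f.
c = stats.t.ppf(0.975, df=10)
print(c)                             # ~ 2.228
```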

Corollary

Let X1, X2, ..., Xn be independent random variables that are all normal with mean μ and standard deviation σ, and let X̄ and S² denote the sample mean and sample variance.

Then the random variable T = (X̄ - μ)/(S/√n) has a t-distribution with v = n - 1 d.f.

11.3 The F-Distribution

Consider two independent Chi-square random variables U and V having r1 and r2 d.f

respectively, that is, U ~ χ²(r1) and V ~ χ²(r2). The joint pdf of U and V is

g(u, v) = 1/(Γ(r1/2) Γ(r2/2) 2^((r1+r2)/2)) u^(r1/2 - 1) v^(r2/2 - 1) e^(-(u+v)/2), u > 0, v > 0.

We define a new random variable F = (U/r1)/(V/r2), and we want to find the pdf of F. Setting z = v, the equations f = (u/r1)/(v/r2) and z = v define a one-to-one transformation that maps the set

{(u, v): u > 0, v > 0}

onto the set

{(f, z): f > 0, z > 0},

with inverse u = (r1/r2) f z and v = z, so that |J| = (r1/r2) z.

Therefore, the joint pdf g(f, z) is given by

g(f, z) = g((r1/r2) f z, z) · (r1/r2) z.

The marginal pdf g(f) is given by

g(f) = ∫_0^∞ g(f, z) dz.

If we change the variable of integration appropriately, the integral reduces to a Gamma integral. Therefore the marginal pdf becomes

g(f) = Γ((r1+r2)/2) (r1/r2)^(r1/2) f^(r1/2 - 1) / ( Γ(r1/2) Γ(r2/2) (1 + r1 f/r2)^((r1+r2)/2) ), f > 0.

The F-distribution is completely described by two parameters r1 and r2, which are known as its degrees of freedom; that is, F ~ F(r1, r2).

Lower-tail probabilities are obtained from upper-tail tables using the formula below:

F_(1-α)(r1, r2) = 1 / F_α(r2, r1).

Remark:

If F has an F-distribution with r1 and r2 d.f, then 1/F has an F-distribution with r2 and r1 d.f. That is,
if F ~ F(r1, r2), then 1/F ~ F(r2, r1).
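A scipy check of this remark (degrees of freedom assumed for illustration): lower-tail quantiles of F(r1, r2) are reciprocals of upper-tail quantiles of F(r2, r1).

```python
# Reciprocal property of F quantiles, with assumed d.f.
from scipy import stats

r1, r2, alpha = 5, 10, 0.05
lower = stats.f.ppf(alpha, r1, r2)         # lower-tail quantile of F(r1, r2)
upper = stats.f.ppf(1 - alpha, r2, r1)     # upper-tail quantile, d.f swapped

print(lower, 1/upper)                       # the two values agree
```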

Mean and Variance of the F-Distribution

Task: Show that E(F) = r2/(r2 - 2) for r2 > 2, and Var(F) = 2r2²(r1 + r2 - 2) / (r1(r2 - 2)²(r2 - 4)) for r2 > 4.

Bivariate Normal Distribution

Let X and Y have a joint pdf given by

f(x, y) = 1/(2πσ1σ2√(1 - ρ²)) exp{ -1/(2(1 - ρ²)) [ ((x - μ1)/σ1)² - 2ρ((x - μ1)/σ1)((y - μ2)/σ2) + ((y - μ2)/σ2)² ] },

for -∞ < x < ∞ and -∞ < y < ∞. Then X and Y are said to have a bivariate normal distribution. We can show that

a) f(x, y) is a pdf, where,

i) -∞ < μ1, μ2 < ∞, σ1 > 0, σ2 > 0 and -1 < ρ < 1;
ii) ρ is the correlation coefficient between X and Y.

To show that f(x, y) is a pdf, we need to show that

∫∫ f(x, y) dx dy = 1.

To simplify the integral, we substitute

u = (x - μ1)/σ1 and v = (y - μ2)/σ2.

Therefore

∫∫ f(x, y) dx dy = 1/(2π√(1 - ρ²)) ∫∫ exp{ -(u² - 2ρuv + v²)/(2(1 - ρ²)) } du dv.

On completing the square on u in the exponent, we have

u² - 2ρuv + v² = (u - ρv)² + (1 - ρ²)v².

The above integral then becomes the product of two univariate normal integrals, each equal to 1; hence f(x, y) is a pdf.

Moment Generating Functions

The mgf of a bivariate normal distribution can be determined as follows:

M(s, t) = E(e^(sX + tY)) = ∫∫ e^(sx + ty) f(x, y) dx dy.

With the substitutions u = (x - μ1)/σ1 and v = (y - μ2)/σ2, the mgf is expressed as

M(s, t) = e^(μ1 s + μ2 t) · 1/(2π√(1 - ρ²)) ∫∫ exp{ σ1 s u + σ2 t v - (u² - 2ρuv + v²)/(2(1 - ρ²)) } du dv.

On completing the square first on u and then on v, the combined exponent becomes the exponent of a bivariate normal density plus the constant term

½(σ1²s² + 2ρσ1σ2 st + σ2²t²).

Upon substituting in the expression for the mgf, the mgf becomes

M(s, t) = exp{ μ1 s + μ2 t + ½(σ1²s² + 2ρσ1σ2 st + σ2²t²) },

since the remaining integral equals 1 (it is a pdf).

The moments of f(x, y) may be obtained by evaluating the appropriate derivatives of M(s, t) at s = t = 0. Thus

E(X) = μ1 and E(X²) = μ1² + σ1².

Similarly,

E(Y) = μ2, E(Y²) = μ2² + σ2², and E(XY) = μ1μ2 + ρσ1σ2.

Hence the covariance of X and Y is given by

Cov(X, Y) = E(XY) - E(X)E(Y) = ρσ1σ2,

and the coefficient of correlation is given by

Cov(X, Y)/(σ1σ2) = ρ.

It follows that the mean vector and covariance matrix of the bivariate normal distribution are given as

μ = (μ1, μ2)'  and  Σ = [ σ1²     ρσ1σ2 ]
                        [ ρσ1σ2   σ2²   ]
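A sympy verification of these moment calculations, using the mgf form derived above: differentiating M(s, t) at s = t = 0 recovers the means and the covariance.

```python
# Moments of the bivariate normal from its m.g.f.
import sympy as sp

s, t = sp.symbols('s t', real=True)
m1, m2, s1, s2, rho = sp.symbols('mu1 mu2 sigma1 sigma2 rho', real=True)

M = sp.exp(m1*s + m2*t
           + sp.Rational(1, 2)*(s1**2*s**2 + 2*rho*s1*s2*s*t + s2**2*t**2))

EX  = sp.diff(M, s).subs({s: 0, t: 0})        # mu1
EY  = sp.diff(M, t).subs({s: 0, t: 0})        # mu2
EXY = sp.diff(M, s, t).subs({s: 0, t: 0})     # mu1*mu2 + rho*sigma1*sigma2
cov = sp.simplify(EXY - EX*EY)
print(EX, EY, cov)                            # mu1, mu2, rho*sigma1*sigma2
```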

If ρ = 0, that is, if X and Y are uncorrelated, then the bivariate normal density function takes the form

f(x, y) = [1/(σ1√(2π)) e^(-(x-μ1)²/(2σ1²))] · [1/(σ2√(2π)) e^(-(y-μ2)²/(2σ2²))] = f1(x) f2(y).

Therefore, if ρ = 0, then X and Y are independently distributed as univariate normal variables. That is,

X ~ N(μ1, σ1²) and Y ~ N(μ2, σ2²).

It also follows that if ρ = 0, then

M(s, t) = M_X(s) M_Y(t).

Thus X and Y are stochastically independent when ρ = 0.

Marginal Densities

By definition, the marginal density of X is

f1(x) = ∫ f(x, y) dy = 1/(σ1√(2π)) e^(-(x-μ1)²/(2σ1²)), i.e. X ~ N(μ1, σ1²).

Similarly,

f2(y) = ∫ f(x, y) dx = 1/(σ2√(2π)) e^(-(y-μ2)²/(2σ2²)), i.e. Y ~ N(μ2, σ2²).

Therefore, if X and Y have a bivariate normal distribution, then the marginal distributions of X and Y are univariate normal distributions.

Conditional Densities

Theorem:

If (X1, X2) has a bivariate normal distribution, then the conditional distribution of X1 given that X2 = x2 is normal with mean μ1 + ρ(σ1/σ2)(x2 - μ2) and variance σ1²(1 - ρ²).

Similarly, the conditional distribution of X2 given that X1 = x1 is normal with mean μ2 + ρ(σ2/σ1)(x1 - μ1) and variance σ2²(1 - ρ²).

That is,

X1 | X2 = x2 ~ N(μ1 + ρ(σ1/σ2)(x2 - μ2), σ1²(1 - ρ²)),

and

X2 | X1 = x1 ~ N(μ2 + ρ(σ2/σ1)(x1 - μ1), σ2²(1 - ρ²)).

Proof

The conditional density of X1 given X2 is defined as

f(x1 | x2) = f(x1, x2) / f2(x2),

which, on substituting the joint and marginal densities and completing the square, is the univariate normal density with mean μ1 + ρ(σ1/σ2)(x2 - μ2) and variance σ1²(1 - ρ²).

Examples

Let X1 and X2 have a bivariate normal distribution with parameters μ1, μ2, σ1², σ2² and ρ.

Find:

i) P(3 < X2 < 8)
ii) P(3 < X2 < 8 | X1 = 7)
iii) P(-3 < X1 < 3 | X2 = -4)

Solutions
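Since the example's actual parameter values are not reproduced above, here is a hedged Python sketch of the approach for part ii), with assumed parameters: apply the theorem's conditional mean and variance, then use the normal cdf.

```python
# Conditional bivariate normal probability, with assumed parameters.
from math import sqrt
from scipy import stats

mu1, mu2 = 3.0, 1.0                 # assumed parameters
sig1, sig2, rho = 4.0, 5.0, 0.6     # assumed standard deviations and rho

# Conditional distribution of X2 given X1 = 7:
m = mu2 + rho * (sig2 / sig1) * (7 - mu1)   # conditional mean
v = sig2**2 * (1 - rho**2)                  # conditional variance

p = stats.norm.cdf(8, m, sqrt(v)) - stats.norm.cdf(3, m, sqrt(v))
print(p)            # P(3 < X2 < 8 | X1 = 7) under these assumed values
```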

Identifying the Mean Vector and Covariance Matrix

Consider a bivariate normal distribution with mean vector μ and covariance matrix Σ. Let us write f(x, y) in the form

f(x, y) = k · e^(-Q(x, y)/2),

where k is a normalising constant. To identify μ1 and μ2, we differentiate Q with respect to x and y respectively and solve the equations

∂Q/∂x = 0 and ∂Q/∂y = 0,

since Q is minimised at (μ1, μ2). In order to identify the covariance matrix Σ, we first note that Q can be written in the form

Q = ax² + 2bxy + cy² + dx + ey + f,

where a, b, c, d, e and f are constants. To identify the values of the parameters σ1², σ2² and ρ, we

consider only the quadratic part,

ax² + 2bxy + cy²,

and compare it with the corresponding part in Q, thus

ax² + 2bxy + cy² = 1/(1 - ρ²) [ x²/σ1² - 2ρxy/(σ1σ2) + y²/σ2² ].

Expressing σ1², σ2² and ρ in terms of a, b and c, we obtain the entries of Σ. Note that

Σ^(-1) = [ a  b ]
         [ b  c ]

Thus, to identify Σ, we simply need to invert the matrix

[ a  b ]
[ b  c ]

where a, b and c are the coefficients in the quadratic part of Q.
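A numpy sketch of the inversion step, with assumed quadratic coefficients a, b, c (any positive-definite choice will do): Σ is the inverse of [[a, b], [b, c]].

```python
# Recover Sigma from the quadratic part of Q, with assumed coefficients.
import numpy as np

a, b, c = 0.5, -0.3, 0.4                 # assumed quadratic coefficients
Sigma = np.linalg.inv(np.array([[a, b], [b, c]]))

print(Sigma)                              # covariance matrix
rho = Sigma[0, 1] / np.sqrt(Sigma[0, 0] * Sigma[1, 1])
print(rho)                                # implied correlation coefficient
```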

Example 12.1

Suppose that X and Y have a bivariate normal distribution with density function

Identify the mean vector μ and the covariance matrix Σ.

Solution:

Solving the system of simultaneous equations by the elimination method, we have

And by substitution

The quadratic part of Q is

which gives

Now,

And therefore
Example 12.2

Let X and Y have a bivariate normal distribution with parameters μ1, μ2, σ1², σ2² and ρ. If W = 3X - 2Y, find

P(-2 < W < 19).

Solution:

Now,
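Since the example's parameter values are not reproduced above, here is a hedged Python sketch of the calculation with assumed parameters: W = 3X - 2Y is normal with mean 3μ1 - 2μ2 and variance 9σ1² + 4σ2² - 12ρσ1σ2.

```python
# Probability for a linear combination of a bivariate normal pair,
# with assumed parameters.
from math import sqrt
from scipy import stats

mu1, mu2 = 3.0, 2.0                 # assumed parameters
sig1, sig2, rho = 2.0, 1.0, 0.5     # assumed standard deviations and rho

mean_w = 3*mu1 - 2*mu2
var_w = 9*sig1**2 + 4*sig2**2 - 12*rho*sig1*sig2   # Var(3X - 2Y)

p = (stats.norm.cdf(19, mean_w, sqrt(var_w))
     - stats.norm.cdf(-2, mean_w, sqrt(var_w)))
print(p)                            # P(-2 < W < 19) under these values
```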
