
STA 3301/SMA 3341.

PROBABILITY & STATISTICS COURSE OVERVIEW

Introduction:
A good number of problems in Statistics involve several measurements and/or attributes that may be related to each other.
For example, one may be interested in several characteristics of an individual, such as height, weight, blood pressure and cholesterol level. Each of these characteristics can be thought of as a random variable, and the interest is in their dependence, which leads to studying their joint distribution.
The relationship among several random variables is characterised by their joint probability distribution, referred to as a multivariate (or multivariable) probability distribution.
This course introduces fundamental concepts of multivariate probability distributions.

Expected learning outcomes


By the end of this course, the learner should be able to:

(i) Perform calculations involving multivariate probability distributions

(ii) Determine means and variances for linear combinations of random vectors

(iii) Derive marginal and conditional probability distributions from multivariate proba-
bility distributions

(iv) Derive the distribution of a transformation of jointly distributed random variables

(v) Demonstrate understanding of properties of Multivariate Normal Distribution

(vi) Apply Multinomial distribution

Course outline
To enable the learner to achieve these objectives, the course is organised into the following lectures:

• Random vectors: joint probability of a random vector, mean vector, variance-covariance matrix, correlation matrix; linear combinations of random vectors

• Generating functions: joint probability and moment generating functions, characteristic functions

• Deriving the distributions of several random variables using the distribution function technique, the change of variable technique and the generating function technique

• Sampling distributions of the sample mean and sample variance

• Multinomial distribution

• Multivariate Normal distribution

Reference text books

(1) Probability and Statistics by Morris H. DeGroot

(2) Introduction to Mathematical Statistics by Robert Hogg and Allen Craig

(3) Probability & Statistics for Engineers & Scientists by Walpole, Myers, Myers and Ye

Review of univariate and bivariate distributions
Definitions:

1. A random experiment is an experiment or observation where we know the set of all possible outcomes but cannot predict a particular outcome with certainty.

2. A sample space (S) is the set of all possible outcomes of a random experiment.

3. Probability is a measure of the likelihood of an event occurring. This measure, which quantifies uncertainty, lies between zero and one.

4. A random variable is a function that assigns a numerical value to each possible outcome in a sample space S:

    X : S → R

A random variable can be either discrete (taking specific, countable values) or continuous (taking any value in a continuous range).

5. A probability distribution is the list of all possible values that a random variable can take on, together with their corresponding probabilities.

6. Probability distributions associated with discrete random variables are referred to as probability distribution functions, while those associated with continuous random variables are referred to as probability density functions.
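As an illustration of definitions 4-6, a random variable on a small sample space can be sketched in Python; the two-coin-toss experiment here is an illustrative choice, not from the notes.

```python
from collections import Counter

# Sample space of a random experiment: tossing two fair coins
S = [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]

def X(outcome):
    """Random variable X : S -> R, counting the number of heads."""
    return sum(1 for side in outcome if side == 'H')

# The probability distribution of X, assuming equally likely outcomes
counts = Counter(X(s) for s in S)
pmf = {x: c / len(S) for x, c in counts.items()}
print(pmf)  # {2: 0.25, 1: 0.5, 0: 0.25}
```

Note that X maps non-numerical outcomes (pairs of coin faces) to numbers, which is exactly what the definition X : S → R requires.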

Univariate distributions
Let Y be a discrete random variable with probability distribution function Pr(Y = y); then:

(i) 0 ≤ Pr(Y = y) ≤ 1

(ii) Σ_y Pr(Y = y) = 1

Let Y be a continuous random variable with probability density function f(y); then:

(i) f(y) ≥ 0 (a density, unlike a probability, may exceed 1)

(ii) ∫_y f(y) dy = 1, the integral taken over the range of Y

(iii) for any two possible values a and b of Y, with a ≤ b,

    Pr(a ≤ Y ≤ b) = ∫_a^b f(y) dy

The mean of a random variable:

    µ = E[Y] = Σ_y y Pr(Y = y)    for a discrete rv
             = ∫_y y f(y) dy      for a continuous rv

and the variance:

    σ² = E[Y²] − µ²

where

    E[Y²] = Σ_y y² Pr(Y = y)    for a discrete rv
          = ∫_y y² f(y) dy      for a continuous rv
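A minimal Python sketch of the discrete-case formulas above, using a fair die as an illustrative pmf (not from the notes):

```python
# Pr(Y = y) for a fair six-sided die
pmf = {y: 1 / 6 for y in range(1, 7)}

mu  = sum(y * p for y, p in pmf.items())       # µ = E[Y]
ey2 = sum(y**2 * p for y, p in pmf.items())    # E[Y²]
var = ey2 - mu**2                              # σ² = E[Y²] − µ²

print(mu, var)  # ≈ 3.5 and ≈ 2.9167 (= 35/12)
```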

Properties of expectation
Let Y be a random variable with mean µ and variance σ².
Define a new random variable Z = aY + b; then

    E[Z] = aE[Y] + b = aµ + b
    Var[Z] = a²Var[Y] = a²σ²
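These two properties can be checked numerically on any pmf; the small pmf and the constants a, b below are made up for illustration.

```python
pmf = {0: 0.2, 1: 0.5, 2: 0.3}   # an arbitrary illustrative pmf
a, b = 3, -1

def mean(p):
    return sum(y * q for y, q in p.items())

def var(p):
    return sum(y**2 * q for y, q in p.items()) - mean(p)**2

# Z = aY + b: the values shift and scale, the probabilities do not change
pmf_z = {a * y + b: q for y, q in pmf.items()}

assert abs(mean(pmf_z) - (a * mean(pmf) + b)) < 1e-12   # E[Z] = aµ + b
assert abs(var(pmf_z) - a**2 * var(pmf)) < 1e-12        # Var[Z] = a²σ²
```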

Bivariate Distributions
Two random variables X and Y are said to be jointly distributed if both of them are defined
on the same sample space S; i.e. if outcomes used to define them occur simultaneously in
a random experiment.
(X, Y ) : S → R2
The probability distribution that defines the probabilities of jointly distributed random variables is called a joint probability distribution or bivariate distribution. The two jointly distributed random variables are either both discrete or both continuous.

(a) Let X and Y be two jointly distributed discrete random variables with joint probability distribution function (j.p.d.f.) f(x, y) = Pr(X = x, Y = y); then:

    (i) 0 ≤ f(x, y) ≤ 1

    (ii) Σ_y Σ_x f(x, y) = 1

(b) Let X and Y be two jointly distributed continuous random variables with joint probability density function (j.p.d.f.) f(x, y); then:

    (i) f(x, y) ≥ 0

    (ii) ∫_y ∫_x f(x, y) dx dy = 1

    (iii) for any two possible values a and b of Y and two possible values c and d of X,

        Pr(a ≤ Y ≤ b, c ≤ X ≤ d) = ∫_a^b ∫_c^d f(x, y) dx dy
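For the discrete case, condition (ii) and the rectangle probability in (iii) amount to sums over a table of probabilities; the joint pmf below is made up for illustration.

```python
# f(x, y) = Pr(X = x, Y = y), stored as a dict keyed by (x, y)
f = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# (ii): all the joint probabilities sum to one
assert abs(sum(f.values()) - 1.0) < 1e-12

# Discrete analogue of (iii): sum f(x, y) over the rectangle a ≤ y ≤ b, c ≤ x ≤ d
def pr_rect(f, a, b, c, d):
    return sum(p for (x, y), p in f.items() if a <= y <= b and c <= x <= d)

print(pr_rect(f, 0, 1, 1, 1))  # Pr(0 ≤ Y ≤ 1, X = 1) = 0.3 + 0.4 ≈ 0.7
```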

Marginal distributions
For discrete distributions, the marginal distributions of X and Y are given by

    h(x) = Σ_y f(x, y)
    g(y) = Σ_x f(x, y)

For continuous distributions, the marginal distributions of X and Y are given by

    h(x) = ∫_y f(x, y) dy
    g(y) = ∫_x f(x, y) dx
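In the discrete case the marginals are simply the row and column sums of the joint table; a sketch, with a joint pmf made up for illustration:

```python
from collections import defaultdict

# Illustrative joint pmf f(x, y) = Pr(X = x, Y = y)
f = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

h = defaultdict(float)  # h(x) = Σ_y f(x, y), marginal of X
g = defaultdict(float)  # g(y) = Σ_x f(x, y), marginal of Y
for (x, y), p in f.items():
    h[x] += p
    g[y] += p

# h ≈ {0: 0.3, 1: 0.7}; g ≈ {0: 0.4, 1: 0.6}; each marginal sums to one
```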

Conditional distributions
The conditional distribution of X given Y = y is given by

    h(x|Y = y) = f(x, y) / g(y)

Similarly, the conditional distribution of Y given X = x is given by

    g(y|X = x) = f(x, y) / h(x)

Using the conditional distribution we can obtain the conditional mean and variance:

    E[Y|X = x] = Σ_y y g(y|X = x)       for a discrete rv
               = ∫_y y g(y|X = x) dy    for a continuous rv

and

    Var[Y|X = x] = Σ_y y² g(y|X = x) − [E[Y|X = x]]²       for a discrete rv
                 = ∫_y y² g(y|X = x) dy − [E[Y|X = x]]²    for a continuous rv
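Conditioning in the discrete case is just renormalising one slice of the joint table by the corresponding marginal; a sketch on a made-up joint pmf:

```python
# Illustrative joint pmf f(x, y) = Pr(X = x, Y = y)
f = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def h(x):  # marginal of X
    return sum(p for (xx, _), p in f.items() if xx == x)

def cond_y_given_x(x):  # g(y | X = x) = f(x, y) / h(x)
    hx = h(x)
    return {y: p / hx for (xx, y), p in f.items() if xx == x}

g1 = cond_y_given_x(1)                               # {0: 3/7, 1: 4/7}
ey = sum(y * p for y, p in g1.items())               # E[Y | X = 1] = 4/7
vy = sum(y**2 * p for y, p in g1.items()) - ey**2    # Var[Y | X = 1] = 12/49
```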

Covariance and correlation
The covariance between two random variables X and Y is given by

    Cov(X, Y) = E[XY] − E[X]E[Y]

where

    E[XY] = Σ_y Σ_x x y f(x, y)        for discrete rvs
          = ∫_y ∫_x x y f(x, y) dx dy  for continuous rvs

and the correlation between the two variables is given by

    Corr(X, Y) = Cov(X, Y) / √(Var(X) Var(Y))
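The same table-sum approach gives covariance and correlation in the discrete case; the joint pmf values are made up for illustration.

```python
import math

# Illustrative joint pmf f(x, y) = Pr(X = x, Y = y)
f = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

ex  = sum(x * p for (x, y), p in f.items())       # E[X]
ey  = sum(y * p for (x, y), p in f.items())       # E[Y]
exy = sum(x * y * p for (x, y), p in f.items())   # E[XY]
cov = exy - ex * ey                               # Cov(X, Y) = E[XY] − E[X]E[Y]

vx = sum(x**2 * p for (x, y), p in f.items()) - ex**2
vy = sum(y**2 * p for (x, y), p in f.items()) - ey**2
corr = cov / math.sqrt(vx * vy)                   # always lies in [−1, 1]

# here cov ≈ −0.02, so X and Y are slightly negatively correlated
```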
