Probability Stats II Course Overview
Introduction:
Many problems in Statistics involve several measurements or attributes that may be related to each other. For example, one may be interested in several characteristics of an individual, such as height, weight, blood pressure, and cholesterol level. Each of these characteristics can be thought of as a random variable, and the interest is in their dependence, which leads to the study of their joint distribution.
The relationship among several random variables is characterised by their joint probability distribution, referred to as a multivariate (or multivariable) probability distribution.
This course introduces fundamental concepts of multivariate probability distributions. Its objectives include enabling the learner to:
(ii) Determine means and variances for linear combinations of random vectors
(iii) Derive marginal and conditional probability distributions from multivariate probability distributions
Course outline
To enable the learner to achieve the objectives, the course is organised into the following lectures:
• Multinomial Distribution
Review of univariate and bivariate distributions
Definitions:
2. A sample space (S) is the set of all possible outcomes of a random experiment.
5. A probability distribution is the list of all possible values that a random variable can take on, together with their corresponding probabilities.
Univariate distributions
Let Y be a discrete random variable with probability distribution function Pr(Y = y); then:
(i) $0 \le \Pr(Y = y) \le 1$
(ii) $\sum_{y} \Pr(Y = y) = 1$
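As a quick check, the sketch below verifies both properties for a small, hypothetical pmf (the probabilities are illustrative, not taken from the notes):

```python
# A minimal sketch: verify properties (i) and (ii) for a small,
# hypothetical pmf of a discrete random variable Y.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}  # illustrative probabilities only

assert all(0 <= p <= 1 for p in pmf.values())  # property (i)
assert abs(sum(pmf.values()) - 1.0) < 1e-12    # property (ii)
```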
Properties of expectation
Let Y be a random variable with mean $\mu$ and variance $\sigma^2$. Define a new random variable $Z = aY + b$; then
$$E[Z] = aE[Y] + b = a\mu + b$$
$$\operatorname{Var}[Z] = a^2\,\operatorname{Var}[Y] = a^2\sigma^2$$
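A minimal sketch of these properties, assuming an illustrative pmf for Y and arbitrary constants a = 2, b = 3 (none of these values come from the notes):

```python
# Compute E[Z] and Var[Z] for Z = aY + b directly from the definition of
# expectation, and compare with the formulas a*mu + b and a^2 * sigma^2.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}  # illustrative pmf of Y
a, b = 2.0, 3.0                 # illustrative constants

mu = sum(y * p for y, p in pmf.items())               # E[Y]
var = sum((y - mu) ** 2 * p for y, p in pmf.items())  # Var[Y]

ez = sum((a * y + b) * p for y, p in pmf.items())             # E[Z] by definition
vz = sum((a * y + b - ez) ** 2 * p for y, p in pmf.items())   # Var[Z] by definition

assert abs(ez - (a * mu + b)) < 1e-12   # E[Z] = a*mu + b
assert abs(vz - a ** 2 * var) < 1e-12   # Var[Z] = a^2 * sigma^2
```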
Bivariate Distributions
Two random variables X and Y are said to be jointly distributed if both of them are defined on the same sample space S, i.e. if the outcomes used to define them occur simultaneously in a random experiment:
$$(X, Y) : S \to \mathbb{R}^2$$
The probability distribution that defines the probabilities of jointly distributed random variables is called a joint probability distribution or bivariate distribution. The two jointly distributed random variables are either both discrete or both continuous.
(a) Let X and Y be two jointly distributed discrete random variables with joint probability distribution function (j.p.d.f.) $f(x, y) = \Pr(X = x, Y = y)$.
(b) Let X and Y be two jointly distributed continuous random variables with joint probability density function (j.p.d.f.) $f(x, y)$.
Marginal distributions
For discrete distributions, the marginal distributions of X and Y are given by
$$h(x) = \sum_{y} f(x, y)$$
$$g(y) = \sum_{x} f(x, y)$$
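The sketch below computes both marginals from a small, hypothetical joint pmf f(x, y) (the probabilities are illustrative only):

```python
# Marginal pmfs h(x) and g(y) from a joint pmf f(x, y),
# stored as a dict keyed by (x, y) pairs.
from collections import defaultdict

f = {(0, 0): 0.1, (0, 1): 0.2,   # illustrative joint probabilities,
     (1, 0): 0.3, (1, 1): 0.4}   # not taken from the notes

h = defaultdict(float)  # marginal of X: h(x) = sum over y of f(x, y)
g = defaultdict(float)  # marginal of Y: g(y) = sum over x of f(x, y)
for (x, y), p in f.items():
    h[x] += p
    g[y] += p

print(dict(h))  # h(0) = 0.3, h(1) = 0.7 (up to float rounding)
print(dict(g))  # g(0) = 0.4, g(1) = 0.6 (up to float rounding)
```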
Conditional distributions
The conditional distribution of X given Y is given by
$$h(x \mid Y = y) = \frac{f(x, y)}{g(y)}$$
Similarly, the conditional distribution of Y given X is given by
$$g(y \mid X = x) = \frac{f(x, y)}{h(x)}$$
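A minimal sketch of the conditional pmf $h(x \mid Y = y)$, reusing the same style of hypothetical joint pmf as above (the helper names are introduced only for this illustration):

```python
# Conditional pmf h(x | Y = y) = f(x, y) / g(y) for a discrete joint pmf.
f = {(0, 0): 0.1, (0, 1): 0.2,
     (1, 0): 0.3, (1, 1): 0.4}   # illustrative joint probabilities

def g(y):
    """Marginal pmf of Y: g(y) = sum over x of f(x, y)."""
    return sum(p for (x_, y_), p in f.items() if y_ == y)

def h_given(x, y):
    """Conditional pmf h(x | Y = y) = f(x, y) / g(y)."""
    return f[(x, y)] / g(y)

# Conditional probabilities of each x when Y = 1; they sum to 1.
print([h_given(x, 1) for x in (0, 1)])  # [0.2/0.6, 0.4/0.6] = [1/3, 2/3]
```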
Using the conditional distribution we can obtain the conditional mean and variance:
$$E[Y \mid X = x] = \sum_{y} y\, g(y \mid X = x) \quad \text{for a discrete r.v.}$$
$$E[Y \mid X = x] = \int_{y} y\, g(y \mid X = x)\, dy \quad \text{for a continuous r.v.}$$
and
$$\operatorname{Var}[Y \mid X = x] = \sum_{y} y^2\, g(y \mid X = x) - \big[E[Y \mid X = x]\big]^2 \quad \text{for a discrete r.v.}$$
$$\operatorname{Var}[Y \mid X = x] = \int_{y} y^2\, g(y \mid X = x)\, dy - \big[E[Y \mid X = x]\big]^2 \quad \text{for a continuous r.v.}$$
where
$$E[XY] = \sum_{y}\sum_{x} xy\, f(x, y) \quad \text{for discrete r.v.s}$$
$$E[XY] = \int_{y}\int_{x} xy\, f(x, y)\, dx\, dy \quad \text{for continuous r.v.s}$$
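A minimal sketch of the discrete-case formulas, again using a hypothetical joint pmf; `g_given`, `cond_mean`, and `cond_var` are hypothetical helper names introduced only for this illustration:

```python
# Conditional mean E[Y | X = x], conditional variance Var[Y | X = x],
# and E[XY] for a discrete joint pmf f(x, y).
f = {(0, 0): 0.1, (0, 1): 0.2,
     (1, 0): 0.3, (1, 1): 0.4}   # illustrative joint probabilities

def h(x):
    """Marginal pmf of X: h(x) = sum over y of f(x, y)."""
    return sum(p for (x_, _), p in f.items() if x_ == x)

def g_given(y, x):
    """Conditional pmf g(y | X = x) = f(x, y) / h(x)."""
    return f[(x, y)] / h(x)

def cond_mean(x):
    """E[Y | X = x] = sum over y of y * g(y | X = x)."""
    return sum(y * g_given(y, x) for (x_, y) in f if x_ == x)

def cond_var(x):
    """Var[Y | X = x] = sum of y^2 * g(y | X = x) minus E[Y | X = x]^2."""
    m = cond_mean(x)
    return sum(y ** 2 * g_given(y, x) for (x_, y) in f if x_ == x) - m ** 2

exy = sum(x * y * p for (x, y), p in f.items())  # E[XY] over all (x, y) pairs

print(cond_mean(1), cond_var(1), exy)  # 4/7, 12/49, 0.4 (up to float rounding)
```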