Probability Mammadli Ilgar
Subject: Probability Theory
Topic: Two-dimensional random quantities, their conditional distribution and density functions
Student: Mammadli Ilgar
Teacher: Azizova Rugiya
Table of contents
01 Introduction
02 Types of random variables
04 Probability density functions (PDFs)
05 Conditional Probability Distributions
p_X(x) = P(X = x),
where p_X(x) is the probability that the realization of the random variable X will be equal to x. The probabilities associated with all (hypothetical) values must be non-negative and sum up to 1:
Σ_x p_X(x) = 1 and p_X(x) ≥ 0.
Thinking of probability as mass helps to avoid mistakes, since physical mass is conserved, just as the total probability over all hypothetical outcomes is.
Example: Suppose a random variable X can take only three values (1, 2 and 3), each with equal probability. Its probability mass function is
p_X(x) = 1/3 for x = 1, 2, 3, and p_X(x) = 0 otherwise.
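This three-point PMF can be sketched in a few lines of Python (an illustration added here, not part of the original slides):

```python
from fractions import Fraction

# PMF of X: each of the values 1, 2, 3 has equal probability 1/3.
pmf = {x: Fraction(1, 3) for x in (1, 2, 3)}

# Probabilities are non-negative and sum to 1, as every PMF must.
assert all(p >= 0 for p in pmf.values())
assert sum(pmf.values()) == 1

print(pmf[2])  # p_X(2) = 1/3
```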
Here, we call p_X(x) = Σ_y p_{X,Y}(x, y) the marginal PMF of X. Similarly, we can find the marginal PMF of Y as p_Y(y) = Σ_x p_{X,Y}(x, y).
Independence of discrete random variables
Definition: Two random variables X and Y are said to be independent if and only if
p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x, y,
where p_{X,Y}(x, y) is their joint probability mass function and p_X(x) and p_Y(y) are their marginal probability mass functions.
Conditional Probability Distributions
We use the same concept as for events to define conditional probabilities for random variables.
If X and Y are discrete random variables with joint pmf given by p(x, y), then the conditional probability mass function of X, given that Y = y, is denoted p_{X|Y}(x|y) and given by
p_{X|Y}(x|y) = p(x, y) / p_Y(y).
Note that if p_Y(y) = 0, then for that value of Y the conditional pmf of X does not exist.
Similarly, the conditional probability mass function of Y, given that X = x, is denoted p_{Y|X}(y|x) and given by
p_{Y|X}(y|x) = p(x, y) / p_X(x).
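The division by the marginal can be carried out mechanically on a table. A minimal sketch (the joint PMF below is made up for illustration; X and Y here are deliberately dependent):

```python
from fractions import Fraction

# Hypothetical joint PMF p(x, y); note (2, 2) has probability 0.
joint = {
    (1, 1): Fraction(1, 4), (1, 2): Fraction(1, 4),
    (2, 1): Fraction(1, 2),
}

def conditional_x_given_y(y):
    """p_{X|Y}(x|y) = p(x, y) / p_Y(y); undefined when p_Y(y) = 0."""
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)
    if p_y == 0:
        raise ValueError("conditional PMF does not exist: p_Y(y) = 0")
    xs = {x for (x, _) in joint}
    return {x: joint.get((x, y), Fraction(0)) / p_y for x in sorted(xs)}

# p_Y(1) = 1/4 + 1/2 = 3/4, so p_{X|Y}(1|1) = 1/3 and p_{X|Y}(2|1) = 2/3.
print(conditional_x_given_y(1))
```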
Properties of Conditional PMF's
Conditional pmf's are valid pmf's. In other words, the conditional pmf for X, given Y = y, for a fixed y, is a valid pmf satisfying the following:
p_{X|Y}(x|y) ≥ 0 for all x, and Σ_x p_{X|Y}(x|y) = 1.
Similarly, for a fixed x, we also have the following for the conditional pmf of Y, given X = x:
p_{Y|X}(y|x) ≥ 0 for all y, and Σ_y p_{Y|X}(y|x) = 1.
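These two properties can be verified numerically for every fixed y. A sketch, assuming an invented joint PMF:

```python
from fractions import Fraction

# Hypothetical joint PMF (made up for this sketch); entries sum to 1.
joint = {
    (1, 1): Fraction(1, 6), (1, 2): Fraction(1, 3),
    (2, 1): Fraction(1, 6), (2, 2): Fraction(1, 3),
}
xs = sorted({x for (x, _) in joint})
ys = sorted({y for (_, y) in joint})

for y in ys:
    p_y = sum(joint.get((x, y), Fraction(0)) for x in xs)
    cond = {x: joint.get((x, y), Fraction(0)) / p_y for x in xs}
    # For each fixed y, the conditional PMF of X is non-negative and sums to 1.
    assert all(p >= 0 for p in cond.values())
    assert sum(cond.values()) == 1
print("conditional PMFs are valid PMFs")
```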
In general, the conditional distribution of X given Y does not equal the conditional distribution of Y given X, i.e.
p_{X|Y}(x|y) ≠ p_{Y|X}(y|x).
If X and Y are independent, discrete random variables, then the following are true:
p_{X|Y}(x|y) = p_X(x) and p_{Y|X}(y|x) = p_Y(y).
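A quick numerical check of this fact, using a joint table constructed (for this sketch) as a product of made-up marginals:

```python
from fractions import Fraction

# Hypothetical independent pair: p(x, y) = p_X(x) * p_Y(y)
# with p_X = (1/3, 2/3) and p_Y = (1/2, 1/2) over {0, 1}.
p_x = {0: Fraction(1, 3), 1: Fraction(2, 3)}
p_y = {0: Fraction(1, 2), 1: Fraction(1, 2)}
joint = {(x, y): p_x[x] * p_y[y] for x in p_x for y in p_y}

# Conditioning on Y = y leaves the distribution of X unchanged.
for y in p_y:
    marginal = sum(joint[(x, y)] for x in p_x)         # p_Y(y)
    cond = {x: joint[(x, y)] / marginal for x in p_x}  # p_{X|Y}(.|y)
    assert cond == p_x
print("p_{X|Y}(x|y) = p_X(x) for every y")
```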
We now turn to the continuous setting. Note that definitions and results in the discrete setting
transfer to the continuous setting by simply replacing sums with integrals and pmf's with pdf's.
The following definition gives the formulas for conditional distributions and expectations of
continuous random variables.
If X and Y are continuous random variables with joint pdf given by f(x, y), then the conditional probability density function (pdf) of X, given that Y = y, is denoted f_{X|Y}(x|y) and given by
f_{X|Y}(x|y) = f(x, y) / f_Y(y), provided f_Y(y) > 0.
In general, the conditional distribution of X given Y does not equal the conditional distribution of Y given X, i.e.,
f_{X|Y}(x|y) ≠ f_{Y|X}(y|x).
If X and Y are independent, continuous random variables, then the following are true:
f_{X|Y}(x|y) = f_X(x) and f_{Y|X}(y|x) = f_Y(y).
In other words, if X and Y are independent, then knowing the value of one random variable does not affect the distribution of the other.
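The continuous version can be checked numerically. A sketch, assuming (for illustration only) that X and Y are independent standard normals, so f(x, y) = φ(x)·φ(y) and the conditional pdf of X given Y = y should equal φ(x) for any y:

```python
import math

def phi(t):
    """Standard normal pdf."""
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def joint(x, y):
    """Joint pdf of two independent standard normals (our assumption)."""
    return phi(x) * phi(y)

def marginal_y(y, lo=-10.0, hi=10.0, n=20000):
    """f_Y(y) via trapezoidal integration of f(x, y) over x."""
    h = (hi - lo) / n
    total = 0.5 * (joint(lo, y) + joint(hi, y))
    total += sum(joint(lo + i * h, y) for i in range(1, n))
    return total * h

x, y = 0.7, -1.2
cond = joint(x, y) / marginal_y(y)   # f_{X|Y}(x|y)
print(abs(cond - phi(x)) < 1e-6)     # knowing Y does not change the pdf of X
```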
Thanks!