
Faculty: SABAH

Subject: Probability Theory

Topic: Two-dimensional random quantities, their conditional distribution and density functions

Student: Mammadli Ilgar

Teacher: Azizova Rugiya
Table of contents

01 Introduction
02 Types of random variables
03 Probability mass functions (PMFs)
04 Probability density functions (PDFs)
05 Conditional Probability Distributions
What are two-dimensional random variables?
A two-dimensional random variable is a pair of random variables that are jointly distributed. This means that their values are related to each other in some way. For example, the height and weight of a person together form a two-dimensional random variable, since both are related to the person's overall size.
Two-dimensional random variables are often denoted by (X, Y), where
X and Y are the two individual random variables. The probability
distribution of a two-dimensional random variable is described by its
joint probability mass function (PMF) or joint probability density
function (PDF), depending on whether the variables are discrete or
continuous.
Definition: Let S be a sample space associated with a random experiment E. Let X and Y be two random variables defined on S. Then the pair (X, Y) is called a two-dimensional random variable. The value of (X, Y) at a point s is given by the ordered pair of real numbers (X(s), Y(s)) = (x, y), where X(s) = x and Y(s) = y.
Types of random variables

1. Continuous two-dimensional random variables: If (X, Y) can assume all values in a specified region R of the XY-plane, then (X, Y) is called a two-dimensional continuous random variable.

2. Discrete two-dimensional random variables: If the possible values of (X, Y) are finite or countably infinite, then (X, Y) is called a two-dimensional discrete random variable. When (X, Y) is a two-dimensional discrete random variable, its possible values may be represented as (x_i, y_j), i = 1, 2, 3, ..., n, j = 1, 2, 3, ..., m.
Example 1: Consider the experiment of tossing a coin twice. The sample space is S = {HH, HT, TH, TT}. Let X denote the number of heads obtained in the first toss and Y denote the number of heads in the second toss. Then (X, Y) is a two-dimensional random variable, or bivariate random variable. The range space of (X, Y) is {(1,1), (1,0), (0,1), (0,0)}, which is finite, and so (X, Y) is a two-dimensional discrete random variable.
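
To make the example concrete, here is a minimal Python sketch (the variable names are our own choosing, not from the source) that enumerates the sample space and recovers the range space of (X, Y):

from itertools import product

# Sample space of two coin tosses: [('H','H'), ('H','T'), ('T','H'), ('T','T')]
sample_space = list(product("HT", repeat=2))

# Map each outcome s to (X(s), Y(s)): heads on first toss, heads on second toss.
pairs = {s: (int(s[0] == "H"), int(s[1] == "H")) for s in sample_space}

# Range space of (X, Y): {(1, 1), (1, 0), (0, 1), (0, 0)} -- finite, so
# (X, Y) is a two-dimensional discrete random variable.
print(set(pairs.values()))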
What are probability mass functions (PMFs)?
The probability mass function (PMF) characterizes the distribution of a discrete random variable. It associates to any given number the probability that the random variable will be equal to that number.
Definition: In formal terms, the probability mass function of a discrete random variable X is the function

p_X(x) = P(X = x),

where P(X = x) is the probability that the realization of the random variable X will be equal to x. The probabilities associated with all (hypothetical) values must be non-negative and sum up to 1:

Σ_x p_X(x) = 1 and p_X(x) ≥ 0.
Thinking of probability as mass helps to avoid mistakes, since physical mass is conserved, just as the total probability over all hypothetical outcomes is.
Example: Suppose a random variable X can take only three values (1, 2 and 3), each with equal probability. Its probability mass function is

p_X(x) = 1/3 if x ∈ {1, 2, 3}, and p_X(x) = 0 otherwise.

So, for example,

p_X(2) = P(X = 2) = 1/3,

that is, the probability that X will be equal to 2 is 1/3. Or

p_X(3/5) = P(X = 3/5) = 0,

that is, the probability that X will be equal to 3/5 is equal to 0.
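
As a quick illustration, here is a short Python sketch of this three-point PMF (the function name pmf_x is our own):

def pmf_x(x):
    """Probability mass function p_X(x) for the three-point example."""
    return 1 / 3 if x in (1, 2, 3) else 0.0

print(pmf_x(2))    # 1/3: P(X = 2)
print(pmf_x(3/5))  # 0.0: 3/5 is not a possible value of X

# The masses are non-negative and sum to 1:
assert abs(sum(pmf_x(x) for x in (1, 2, 3)) - 1) < 1e-12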


Joint probability mass function
Remember that for a discrete random variable X, we define the PMF as p_X(x) = P(X = x). Now, if we have two random variables X and Y and we would like to study them jointly, we define the joint probability mass function as follows:
The joint probability mass function of two discrete random variables X and Y is defined as

p_XY(x, y) = P(X = x, Y = y).

Note that, as usual, the comma means "and," so we can write

p_XY(x, y) = P(X = x, Y = y) = P((X = x) and (Y = y)).

We can define the joint range for X and Y as

R_XY = {(x, y) | p_XY(x, y) > 0}.
Marginal probability mass function
The joint PMF contains all the information regarding the distributions of X and Y. This means that, for example, we can obtain the PMF of X from its joint PMF with Y. Indeed, we can write

p_X(x) = Σ_y p_XY(x, y).

Here, we call p_X the marginal PMF of X. Similarly, we can find the marginal PMF of Y as

p_Y(y) = Σ_x p_XY(x, y).
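
A minimal Python sketch of this marginalization, reusing the coin-toss joint PMF from Example 1 (the dictionary representation is our own choice):

from collections import defaultdict

# Joint PMF of the fair two-coin example: all four pairs equally likely.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

# p_X(x) = sum over y of p_XY(x, y); p_Y(y) analogously.
p_x, p_y = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    p_x[x] += p
    p_y[y] += p

print(dict(p_x))  # {0: 0.5, 1: 0.5}
print(dict(p_y))  # {0: 0.5, 1: 0.5}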
Independence of discrete random variables
Definition: Two random variables X and Y are said to be independent if and only if

P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)

for any couple of events {X ∈ A} and {Y ∈ B}, where A and B are subsets of the real line.
In other words, two random variables are independent if and only if the events related to those random variables are independent events.
The independence between two random variables is also called statistical independence.
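
For finite joint PMFs, independence can be checked directly against the factorization p_XY(x, y) = p_X(x) p_Y(y). Below is a small Python sketch (the helper is_independent is our own, not from the source):

def is_independent(joint, tol=1e-12):
    """Return True if the joint PMF factors into its marginals."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    p_x = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
    p_y = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}
    return all(
        abs(joint.get((x, y), 0.0) - p_x[x] * p_y[y]) < tol
        for x in xs for y in ys
    )

# The fair two-coin joint PMF factors, so X and Y are independent:
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(is_independent(joint))  # True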
What are probability density functions (PDFs)?
The probability density function (PDF) defines the probability of a continuous random variable lying within a specific range of values. In other words, the probability density function gives the likelihood of the values of a continuous random variable. Sometimes it is also called a probability distribution function or just a probability function. In other sources, however, these terms are used more broadly and can refer to the cumulative distribution function or even the probability mass function (PMF). Strictly speaking, the PDF (probability density function) is defined for continuous random variables, whereas the PMF (probability mass function) is defined for discrete random variables.
Probability Density Function Formula
In the case of a continuous random variable, the probability that X takes any single given value x is always 0, so computing P(X = x) does not work. Instead, we must calculate the probability of X lying in an interval (a, b). That is, we compute P(a < X < b), and we can calculate this using the PDF. The probability density function formula is given as

P(a < X < b) = ∫ from a to b of f(x) dx.

This works because, when X is continuous, we can ignore the endpoints of intervals while finding probabilities of continuous random variables. That means, for any constants a and b,
P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b).
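
As a numerical illustration, the Python sketch below computes P(a < X < b) for an assumed example density (the exponential pdf f(x) = e^(−x) for x ≥ 0, our own choice) and checks it against the closed form:

import math
from scipy.integrate import quad

# Assumed example density: exponential, f(x) = exp(-x) for x >= 0.
f = lambda x: math.exp(-x)

a, b = 0.5, 2.0
prob, _ = quad(f, a, b)             # P(a < X < b) = integral of f over (a, b)
print(prob)                         # ~0.4712
print(math.exp(-a) - math.exp(-b))  # closed form, for comparison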
If continuous random variables X and Y are defined on the same sample space S, then their joint probability density function (joint pdf) is a piecewise continuous function, denoted f(x, y), that satisfies the following:

f(x, y) ≥ 0 for all (x, y), and ∫∫ f(x, y) dx dy = 1 over the whole xy-plane.

Suppose that continuous random variables X and Y have joint density function f(x, y). The marginal pdf's of X and Y are respectively given by the following:

f_X(x) = ∫ f(x, y) dy and f_Y(y) = ∫ f(x, y) dx,

integrating over the whole range of the other variable.
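
A small Python sketch of marginalization by numerical integration, for an assumed joint density f(x, y) = x + y on the unit square (this example density is our own, not from the source):

from scipy.integrate import quad

f = lambda x, y: x + y  # assumed joint pdf on 0 <= x, y <= 1

f_x = lambda x: quad(lambda y: f(x, y), 0, 1)[0]  # f_X(x): integrate y out
f_y = lambda y: quad(lambda x: f(x, y), 0, 1)[0]  # f_Y(y): integrate x out

print(f_x(0.3))  # x + 1/2 -> 0.8
print(f_y(0.7))  # y + 1/2 -> 1.2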
Independence of continuous random variables
Two random variables X and Y, forming a continuous random vector, are independent if and only if

f(x, y) = f_X(x) f_Y(y),

where f(x, y) is their joint probability density function and f_X(x) and f_Y(y) are their marginal probability density functions.
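
As a sketch, this factorization can be checked numerically for a density where independence is known to hold: the uniform density on the unit square (our own example, not from the source):

from scipy.integrate import quad

# Uniform density on the unit square; here X and Y are independent.
f = lambda x, y: 1.0 if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

f_x = lambda x: quad(lambda y: f(x, y), 0, 1)[0]  # marginal f_X(x)
f_y = lambda y: quad(lambda x: f(x, y), 0, 1)[0]  # marginal f_Y(y)

x0, y0 = 0.3, 0.7
print(f(x0, y0), f_x(x0) * f_y(y0))  # both 1.0: f(x, y) = f_X(x) f_Y(y)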
Conditional Probability Distributions
In probability theory, a conditional probability distribution describes the probability of a specific event occurring given that another event has already happened. This concept is crucial for analyzing the relationship between two random variables and understanding their dependence or independence.
Types of Conditional Probability Distributions:
• Discrete Conditional Distribution: For discrete random variables, it
specifies the probability of each possible value of Y for a given value of
X.
• Continuous Conditional Distribution: For continuous random variables, it
describes the probability density of Y for a given value of X.
Conditional Distributions of Discrete Random Variables
Recall the definition of conditional probability for events: the conditional probability of A given B is equal to

P(A|B) = P(A ∩ B) / P(B).

We use this same concept for events to define conditional probabilities for random variables.
If X and Y are discrete random variables with joint pmf given by p(x, y), then the conditional probability mass function of X, given that Y = y, is denoted p_X|Y(x|y) and given by

p_X|Y(x|y) = p(x, y) / p_Y(y).

Note that if p_Y(y) = 0, then for that value of Y the conditional pmf of X does not exist.
Similarly, the conditional probability mass function of Y, given that X = x, is denoted p_Y|X(y|x) and given by

p_Y|X(y|x) = p(x, y) / p_X(x).
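
A minimal Python sketch of this formula on a finite joint PMF (the example numbers are our own and deliberately chosen so that X and Y are not independent):

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def conditional_pmf_x_given_y(joint, y):
    """Return p_X|Y(. | y) = p(x, y) / p_Y(y) as a dict over x."""
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)
    if p_y == 0:
        raise ValueError("conditional pmf undefined when p_Y(y) = 0")
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

print(conditional_pmf_x_given_y(joint, 0))  # {0: 0.8, 1: 0.2}
# Each conditional pmf is a valid pmf: non-negative and sums to 1.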
Properties of Conditional PMF's
Conditional pmf's are valid pmf's. In other words, the conditional pmf for X, given Y = y, for a fixed y, is a valid pmf satisfying the following:

p_X|Y(x|y) ≥ 0 and Σ_x p_X|Y(x|y) = 1.

Similarly, for a fixed x, we also have the following for the conditional pmf of Y, given X = x:

p_Y|X(y|x) ≥ 0 and Σ_y p_Y|X(y|x) = 1.

In general, the conditional distribution of X given Y does not equal the conditional distribution of Y given X, i.e.

p_X|Y(x|y) ≠ p_Y|X(y|x).

If X and Y are independent, discrete random variables, then the following are true:

p_X|Y(x|y) = p_X(x) and p_Y|X(y|x) = p_Y(y).

In other words, if X and Y are independent, then knowing the value of one random variable does not affect the probability of the other one.
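
A short sketch verifying this last property on the fair two-coin example, where X and Y are independent:

joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
p_x = {0: 0.5, 1: 0.5}  # marginal PMF of X

for y in (0, 1):
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)
    cond = {x: p / p_y for (x, yy), p in joint.items() if yy == y}
    assert cond == p_x  # conditional pmf equals the marginal pmf

print("independent: conditioning on Y leaves the pmf of X unchanged")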
Conditional Distributions of Continuous Random Variables
We now turn to the continuous setting. Note that definitions and results in the discrete setting transfer to the continuous setting by simply replacing sums with integrals and pmf's with pdf's. The following definition gives the formulas for conditional distributions and expectations of continuous random variables.
If X and Y are continuous random variables with joint pdf given by f(x, y), then the conditional probability density function (pdf) of X, given that Y = y, is denoted f_X|Y(x|y) and given by

f_X|Y(x|y) = f(x, y) / f_Y(y).

The conditional expected value of X, given Y = y, is

E[X | Y = y] = ∫ x f_X|Y(x|y) dx,

and the conditional variance of X, given Y = y, is

Var(X | Y = y) = E[X² | Y = y] − (E[X | Y = y])².

Similarly, we can define the conditional pdf, expected value, and variance of Y, given X = x, by swapping the roles of X and Y in the above.
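
A Python sketch of these formulas, computing E[X | Y = y] and Var(X | Y = y) by numerical integration for the assumed joint density f(x, y) = x + y on the unit square (the same illustrative density used earlier, not from the source):

from scipy.integrate import quad

f = lambda x, y: x + y  # assumed joint pdf on 0 <= x, y <= 1

def cond_moments(y):
    """Conditional mean and variance of X given Y = y."""
    f_y = quad(lambda x: f(x, y), 0, 1)[0]           # marginal f_Y(y)
    pdf = lambda x: f(x, y) / f_y                    # f_X|Y(x | y)
    mean = quad(lambda x: x * pdf(x), 0, 1)[0]       # E[X | Y = y]
    ex2 = quad(lambda x: x * x * pdf(x), 0, 1)[0]    # E[X^2 | Y = y]
    return mean, ex2 - mean ** 2                     # (mean, variance)

print(cond_moments(0.5))  # conditional mean ~0.5833 when y = 0.5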
Properties of Conditional PDF's
Conditional pdf's are valid pdf's. In other words, the conditional pdf for X, given Y = y, for a fixed y, is a valid pdf satisfying the following:

f_X|Y(x|y) ≥ 0 and ∫ f_X|Y(x|y) dx = 1.

In general, the conditional distribution of X given Y does not equal the conditional distribution of Y given X, i.e.,

f_X|Y(x|y) ≠ f_Y|X(y|x).

If X and Y are independent, continuous random variables, then the following are true:

f_X|Y(x|y) = f_X(x) and f_Y|X(y|x) = f_Y(y).

In other words, if X and Y are independent, then knowing the value of one random variable does not affect the probability of the other one.
Thanks!

CREDITS: This presentation template was created by Slidesgo, and includes icons by Flaticon, and infographics & images by Freepik.
