Conditional Probability Density Function (Conditional PDF) describes the probability distribution of a random variable given that another variable is known to have a specific value. In other words, it provides the likelihood of outcomes for one variable, conditional on the value of another.
Mathematically, for two continuous random variables X and Y, the conditional PDF of X given that Y = y is denoted as:
f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}
Where:
- f_{X|Y}(x|y) is the conditional probability density function of X given Y = y.
- f_{X,Y}(x,y) is the joint probability density function of X and Y.
- f_Y(y) is the marginal probability density function of Y, which is the probability distribution of Y alone.
Here,
- Marginal PDF: f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) dx, which represents the probability distribution of Y regardless of X.
- Conditional PDF: f_{X|Y}(x|y) tells us how X is distributed when we know that Y = y.
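These definitions can be made concrete with a short numerical sketch. It assumes a hypothetical joint PDF f_{X,Y}(x, y) = x + y on the unit square (chosen for illustration; it integrates to 1, so it is a valid joint density) and recovers the marginal and conditional PDFs by integration:

```python
from scipy.integrate import quad

# Hypothetical joint PDF on the unit square: f_{X,Y}(x, y) = x + y
def joint_pdf(x, y):
    if 0 < x < 1 and 0 < y < 1:
        return x + y
    return 0.0

def marginal_y(y):
    # f_Y(y) = integral over x of f_{X,Y}(x, y)
    value, _ = quad(lambda x: joint_pdf(x, y), 0, 1)
    return value

def conditional_pdf(x, y):
    # f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y)
    return joint_pdf(x, y) / marginal_y(y)

print(marginal_y(0.5))             # f_Y(0.5) = 0.5 + 1/2 = 1.0
print(conditional_pdf(0.25, 0.5))  # (0.25 + 0.5) / 1.0 = 0.75
```

Here f_Y(y) = y + 1/2 analytically, so the numerical values agree with the closed form.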
How to Calculate Conditional PDF?
To calculate the Conditional Probability Density Function (Conditional PDF), we use the relationship between the joint PDF and the marginal PDF, following these steps:
- Step 1: Find the joint PDF fX,Y(x,y). This represents the likelihood of both X and Y occurring simultaneously.
- Step 2: Find the marginal PDF fY(y) by integrating the joint PDF over x: f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dx
- Step 3: Calculate the conditional PDF using the formula: f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}
This gives the probability distribution of X given the value of Y=y.
Let’s assume that X and Y have the following joint PDF:
f_{X,Y}(x, y) = 4xy \quad \text{for } 0 < x < 1 \text{ and } 0 < y < 1
(The constant 4 makes the density integrate to 1 over the unit square, so this is a valid joint PDF.)
Step 1: Find the marginal PDF of Y:
f_Y(y) = \int_0^1 4xy \, dx = 4y \int_0^1 x \, dx = 4y \left[\frac{x^2}{2}\right]_0^1 = 2y
Step 2: Calculate the conditional PDF of X given Y = y:
f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{4xy}{2y} = 2x \quad \text{for} \quad 0 < x < 1
Thus, the conditional PDF of X given Y = y is:
f_{X|Y}(x|y) = 2x, \quad 0 < x < 1.
This is how you calculate the conditional PDF.
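The two steps can also be reproduced symbolically. The sketch below assumes the joint PDF f_{X,Y}(x, y) = 4xy on the unit square (normalized so it integrates to 1; note that any multiplicative constant cancels in the ratio, so the conditional PDF comes out as 2x either way):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
joint = 4 * x * y  # joint PDF on 0 < x < 1, 0 < y < 1 (integrates to 1)

# Step 1: marginal PDF of Y, f_Y(y) = integral of f_{X,Y}(x, y) over x
marginal = sp.integrate(joint, (x, 0, 1))

# Step 2: conditional PDF of X given Y = y
conditional = sp.simplify(joint / marginal)

print(marginal)     # 2*y
print(conditional)  # 2*x
```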
Properties of Conditional PDF
Conditional Probability Density Function (Conditional PDF) has several important properties, which are useful in understanding how conditional distributions behave in probability theory and statistics. Here are the key properties:
Non-Negativity
The conditional PDF must always be non-negative:
f_{X|Y}(x|y) \geq 0 \quad \text{for all} \quad x, y.
This follows from the fact that probability density functions cannot be negative.
Normalization
The conditional PDF must integrate to 1 with respect to x, given a specific value of y. In other words:
\int_{-\infty}^{\infty} f_{X|Y}(x|y) \, dx = 1 \quad \text{for each fixed } y
This ensures that the conditional probability of X given Y = y is a valid probability distribution.
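For the conditional PDF 2x on 0 < x < 1 from the worked example, this normalization can be checked numerically:

```python
from scipy.integrate import quad

# Conditional PDF f_{X|Y}(x|y) = 2x on 0 < x < 1 (from the worked example)
total, _ = quad(lambda x: 2 * x, 0, 1)
print(total)  # 1.0, confirming a valid conditional distribution
```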
Conditional Expectation
The conditional expectation of X given Y = y can be computed as:
\mathbb{E}[X | Y = y] = \int_{-\infty}^{\infty} x f_{X|Y}(x|y) \, dx
This is the expected value of X when Y is known to be y.
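Applying this to the conditional PDF 2x from the worked example gives E[X | Y = y] = ∫₀¹ x · 2x dx = 2/3, which a quick symbolic check confirms:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
conditional_pdf = 2 * x  # f_{X|Y}(x|y) from the worked example

# E[X | Y = y] = integral of x * f_{X|Y}(x|y) over 0 < x < 1
expectation = sp.integrate(x * conditional_pdf, (x, 0, 1))
print(expectation)  # 2/3
```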
Conditional Independence
Two random variables X and Y are conditionally independent given a third random variable Z if:
f_{X,Y|Z}(x, y | z) = f_{X|Z}(x|z) f_{Y|Z}(y|z)
In other words, knowing Z makes X and Y independent. This property is fundamental in areas like graphical models and Bayesian networks.
Marginalization of Conditional PDF
To obtain the marginal PDF of X, you can integrate out the conditional PDF over the values of Y:
f_X(x) = \int_{-\infty}^{\infty} f_{X|Y}(x|y) f_Y(y) \, dy
This shows how the marginal PDF of X can be recovered from the conditional PDF and the marginal PDF of Y.
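This identity can be checked numerically with the quantities from the worked example, assuming f_{X|Y}(x|y) = 2x and the marginal f_Y(y) = 2y on the unit interval; integrating out y should recover f_X(x) = 2x:

```python
from scipy.integrate import quad

# From the worked example: f_{X|Y}(x|y) = 2x, with marginal f_Y(y) = 2y
def conditional_pdf(x, y):
    return 2 * x

def marginal_y(y):
    return 2 * y

def marginal_x(x):
    # f_X(x) = integral over y of f_{X|Y}(x|y) * f_Y(y)
    value, _ = quad(lambda y: conditional_pdf(x, y) * marginal_y(y), 0, 1)
    return value

print(marginal_x(0.5))  # 2 * 0.5 = 1.0
```

Because X and Y are independent in this example, the conditional PDF does not depend on y and the integral simply multiplies 2x by the total mass of f_Y, which is 1.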
Conditional CDF
The conditional cumulative distribution function (CDF) of X given Y = y is related to the conditional PDF by:
F_{X|Y}(x|y) = \int_{-\infty}^{x} f_{X|Y}(t|y) \, dt.
This gives the probability that X is less than or equal to x, given that Y = y.
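For the worked example, integrating the conditional PDF 2t from 0 to x gives the conditional CDF x², as the sketch below verifies:

```python
import sympy as sp

x, t = sp.symbols('x t', positive=True)
conditional_pdf = 2 * t  # f_{X|Y}(t|y) from the worked example

# F_{X|Y}(x|y) = integral of f_{X|Y}(t|y) for t from 0 to x
conditional_cdf = sp.integrate(conditional_pdf, (t, 0, x))
print(conditional_cdf)  # x**2
```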