(Probability 2023) Chapter 4
Example 4.2. John Slow is driving from Boston to the New York area, a distance of
180 miles at a constant speed, whose value is uniformly distributed between 30 and
60 miles per hour. What is the PDF of the duration of the trip?
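As an illustrative check (not part of the slides; the seed and sample size are arbitrary choices), the CDF method gives F_T(t) = P(V ≥ 180/t) = (60 − 180/t)/30 for 3 ≤ t ≤ 6, hence f_T(t) = 6/t² on [3, 6]. A short Monte Carlo sketch agrees:

```python
import random

# Speed V ~ Uniform(30, 60); trip duration T = 180 / V, so T ranges over [3, 6].
# Analytic CDF from the two-step method: F_T(t) = (60 - 180/t) / 30 on [3, 6].
random.seed(0)
samples = [180 / random.uniform(30, 60) for _ in range(200_000)]

def empirical_cdf(t):
    return sum(s <= t for s in samples) / len(samples)

def analytic_cdf(t):
    return (60 - 180 / t) / 30  # valid for 3 <= t <= 6

for t in (3.5, 4.5, 5.5):
    assert abs(empirical_cdf(t) - analytic_cdf(t)) < 0.01
```

The empirical and analytic CDFs match to within Monte Carlo error at every checked point.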
Example 4.3. Let Y = g(X), where X is a random variable with known PDF.
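For reference, the general two-step (CDF) calculation that these examples apply:

```latex
% Two-step calculation of the PDF of Y = g(X):
F_Y(y) = \mathbf{P}\bigl(g(X) \le y\bigr), \qquad
f_Y(y) = \frac{dF_Y}{dy}(y).
```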
The Linear Case
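The standard linear-case formula, stated here for reference: if Y = aX + b with a ≠ 0, then

```latex
f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y - b}{a}\right).
```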
Example 4.4. A Linear Function of an Exponential Random Variable.
Suppose that X is an exponential random variable with PDF f_X(x) = λe^{−λx} for x ≥ 0
(and 0 otherwise), and let Y = aX + b. Derive the PDF of Y.
The Monotonic Case
• The calculation and the formula for the linear case can be generalized to the
case where Y = g(X) and g is a strictly monotonic function.
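For strictly monotonic and differentiable g with inverse h = g⁻¹, the standard generalization is:

```latex
f_Y(y) = f_X\bigl(h(y)\bigr)\left|\frac{dh}{dy}(y)\right|,
\qquad h = g^{-1}.
```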
Example 4.2. (continued) John Slow is driving from Boston to the New York area, a
distance of 180 miles at a constant speed, whose value is uniformly distributed
between 30 and 60 miles per hour. What is the PDF of the duration of the trip?
Example 4.6. Let Y = X², where X is a continuous uniform random variable
on the interval [0, 1]. Derive the PDF of Y.
Functions of Two Random Variables
Example 4.7. Two archers shoot at a target. The distance of each shot from the
center of the target is uniformly distributed from 0 to 1, independent of the other
shot. What is the PDF of the distance of the losing shot from the center?
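An illustrative sketch (seed and sample size are arbitrary): the losing shot is the one farther from the center, so Z = max(X, Y) with X, Y independent Uniform(0, 1), and F_Z(z) = P(X ≤ z)P(Y ≤ z) = z², giving f_Z(z) = 2z on [0, 1].

```python
import random

# Z = max(X, Y) for independent X, Y ~ Uniform(0, 1).
# Analytic CDF: F_Z(z) = z^2 on [0, 1].
random.seed(1)
samples = [max(random.random(), random.random()) for _ in range(200_000)]

def empirical_cdf(z):
    return sum(s <= z for s in samples) / len(samples)

for z in (0.25, 0.5, 0.75):
    assert abs(empirical_cdf(z) - z**2) < 0.01
```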
Example 4.8. Let X and Y be independent random variables that are uniformly
distributed on the interval [0,1]. What is the PDF of the random variable Z = Y/X?
Example 4.9. Romeo and Juliet have a date at a given time, and each, independently,
will be late by an amount of time that is exponentially distributed with parameter λ.
What is the PDF of the difference between their times of arrival? Let us denote by
X and Y the amounts by which Romeo and Juliet are late, respectively. Find the PDF of
Z = X − Y.
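A sketch of the standard answer (the two-sided exponential, or Laplace, PDF): Z is symmetric about 0, and for z ≥ 0 the memorylessness of the exponential gives P(X − Y > z) = ½e^{−λz}, so

```latex
f_Z(z) = \frac{\lambda}{2}\, e^{-\lambda |z|}, \qquad -\infty < z < \infty.
```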
Sums of Independent Random Variables: Convolution
Consider Z = X + Y, where X and Y are independent.
• For discrete X and Y, the PMF of Z is given by the (discrete) convolution formula
p_Z(z) = Σ_x p_X(x) p_Y(z − x).
• For continuous X and Y, the PDF of Z is given by the convolution integral
f_Z(z) = ∫ f_X(x) f_Y(z − x) dx.
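A minimal sketch of the discrete convolution formula, using two fair dice as a toy example (the PMFs here are my choice, not from the slides):

```python
from fractions import Fraction

def convolve(p_x, p_y):
    """Convolve two PMFs given as dicts value -> probability:
    p_Z(z) = sum_x p_X(x) p_Y(z - x)."""
    p_z = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] = p_z.get(x + y, Fraction(0)) + px * py
    return p_z

die = {k: Fraction(1, 6) for k in range(1, 7)}
p_z = convolve(die, die)          # PMF of the sum of two fair dice
assert p_z[7] == Fraction(1, 6)   # the most likely sum
assert p_z[2] == Fraction(1, 36)
assert sum(p_z.values()) == 1     # a valid PMF
```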
Example 4.10. The random variables X and Y are independent and uniformly
distributed in the interval [0,1]. Derive the PDF of Z = X + Y.
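Evaluating the convolution integral for two independent uniforms on [0, 1] gives the triangular PDF:

```latex
f_Z(z) =
\begin{cases}
z, & 0 \le z \le 1,\\
2 - z, & 1 \le z \le 2,\\
0, & \text{otherwise.}
\end{cases}
```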
Example 4.11. The Sum of Two Independent Normal Random Variables is Normal.
Let X and Y be independent normal random variables with means μ_X, μ_Y and
variances σ_X², σ_Y², respectively, and let Z = X + Y. Derive the PDF of Z.
Example 4.12. Difference of Two Independent Random Variables, Z = X − Y.
4.2. Covariance and Correlation
• Covariance of two random variables X and Y:
cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y].
Properties: for any random variables X, Y, and Z, and any scalars a and b, we have
• cov(X, X) = var(X),
• cov(X, aY + b) = a cov(X, Y),
• cov(X, Y + Z) = cov(X, Y) + cov(X, Z).
Example 4.13. The pair of random variables (X, Y) takes the values (1,0), (0,1), (−1,0),
and (0,−1), each with probability ¼. Calculate the covariance of X and Y, and determine
whether X and Y are uncorrelated or independent.
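An illustrative direct computation over the four equally likely outcomes (not part of the slides):

```python
# The four equally likely values of (X, Y).
points = [(1, 0), (0, 1), (-1, 0), (0, -1)]
p = 1 / 4

ex = sum(x * p for x, _ in points)          # E[X] = 0
ey = sum(y * p for _, y in points)          # E[Y] = 0
exy = sum(x * y * p for x, y in points)     # E[XY] = 0 (one coordinate is always 0)
cov = exy - ex * ey
assert cov == 0                              # X and Y are uncorrelated

# But X and Y are NOT independent: P(X = 1, Y = 1) = 0, while
# P(X = 1) * P(Y = 1) = (1/4) * (1/4) > 0.
p_x1 = sum(p for x, _ in points if x == 1)
p_y1 = sum(p for _, y in points if y == 1)
p_joint = sum(p for x, y in points if x == 1 and y == 1)
assert p_joint != p_x1 * p_y1
```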
• Definition of the correlation coefficient of random variables X and Y that have
nonzero variances:
ρ(X, Y) = cov(X, Y) / √(var(X) var(Y)).
• Properties
• −1 ≤ ρ(X, Y) ≤ 1.
• If ρ(X, Y) > 0, then X − E[X] and Y − E[Y] "tend" to have the same sign.
• If ρ(X, Y) < 0, then X − E[X] and Y − E[Y] "tend" to have the opposite sign.
• ρ(X, Y) = 1 (or ρ(X, Y) = −1) if and only if there exists a positive (or negative,
respectively) constant c such that Y − E[Y] = c(X − E[X]).
Example 4.14. Consider n independent tosses of a coin with probability of a head
equal to p. Let X and Y be the numbers of heads and of tails, respectively, and let
us look at the correlation coefficient of X and Y.
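Since every toss is either a head or a tail, X + Y = n; a sketch of the standard conclusion:

```latex
Y = n - X \;\Rightarrow\; Y - \mathbf{E}[Y] = -\bigl(X - \mathbf{E}[X]\bigr)
\;\Rightarrow\; \rho(X, Y) = -1 .
```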
Variance of Sum of Random Variables
If X_1, X_2, …, X_n are random variables with finite variance, we have
var(X_1 + X_2) = var(X_1) + var(X_2) + 2 cov(X_1, X_2).
More generally,
var(Σ_{i=1}^n X_i) = Σ_{i=1}^n var(X_i) + Σ_{{(i, j) | i ≠ j}} cov(X_i, X_j).
Example 4.15. Consider the hat problem discussed in Section 2.5, where n people
throw their hats in a box and then pick a hat at random. Let us find the variance
of X, the number of people who pick their own hat. We have X = X_1 + X_2 + ⋯ + X_n,
where X_i is the random variable that takes the value 1 if the ith person selects
his/her own hat, and takes the value 0 otherwise. Noting that X_i is Bernoulli with
parameter p = P(X_i = 1) = 1/n, we have E[X_i] = 1/n and var(X_i) = (1/n)(1 − 1/n).
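As an illustration (not part of the slides): for small n we can enumerate all permutations and compute the variance of the number of fixed points exactly. The known answer is var(X) = 1 for every n ≥ 2.

```python
from itertools import permutations

def fixed_point_variance(n):
    """Exact variance of the number of fixed points of a uniformly random
    permutation of n elements, by full enumeration."""
    counts = [sum(i == pi for i, pi in enumerate(perm))
              for perm in permutations(range(n))]
    mean = sum(counts) / len(counts)
    return sum((c - mean) ** 2 for c in counts) / len(counts)

for n in (2, 3, 4, 5, 6):
    assert abs(fixed_point_variance(n) - 1.0) < 1e-12
```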
4.3. Conditional Expectation and Variance Revisited
• Total expectation theorem (law of iterated expectations): E[X] = E[E[X | Y]].
Example 4.16. We are given a biased coin and we are told that because of
manufacturing defects, the probability of heads, denoted by Y, is itself random,
with a known distribution over the interval [0, 1]. We toss the coin a fixed number
n of times, and we let X be the number of heads obtained. Calculate E[X].
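Conditioned on Y = p, the number of heads is binomial with mean np, so the total expectation theorem gives:

```latex
\mathbf{E}[X] = \mathbf{E}\bigl[\mathbf{E}[X \mid Y]\bigr]
= \mathbf{E}[nY] = n\,\mathbf{E}[Y].
```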
Example 4.17. We start with a stick of length ℓ. We break it at a point which
is chosen randomly and uniformly over its length, and keep the piece that contains
the left end of the stick. We then repeat the same process on the piece that we
were left with. What is the expected length of the piece that we are left with after
breaking twice?
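Writing X for the length kept after the first break and Y for the length kept after the second, each break keeps a uniform fraction of the current piece, so:

```latex
\mathbf{E}[Y \mid X = x] = \frac{x}{2}, \qquad
\mathbf{E}[Y] = \mathbf{E}\bigl[\mathbf{E}[Y \mid X]\bigr]
= \frac{\mathbf{E}[X]}{2} = \frac{\ell/2}{2} = \frac{\ell}{4}.
```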
The Conditional Expectation as an Estimator
• Estimator of X given Y: X̂ = E[X | Y].
• The estimation error X̃ = X̂ − X satisfies E[X̃] = 0 and E[X̃ | Y] = 0.
• The error X̃ is uncorrelated with the estimate X̂: cov(X̂, X̃) = 0, and hence
var(X) = var(X̂) + var(X̃).
The Conditional Variance
• The conditional variance of X given Y: var(X | Y) = E[(X − E[X | Y])² | Y].
• Law of total variance: var(X) = E[var(X | Y)] + var(E[X | Y]).
Example 4.16. (continued) We are given a biased coin whose probability of heads,
Y, is uniformly distributed over the interval [0, 1]. With X being the number of heads
obtained in n tosses, we have E[X | Y] = nY and var(X | Y) = nY(1 − Y). Calculate var(X).
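Using E[X | Y] = nY and var(X | Y) = nY(1 − Y) (binomial given the bias), and the facts E[Y] = 1/2, E[Y²] = 1/3, var(Y) = 1/12 for Y uniform on [0, 1], the law of total variance gives:

```latex
\mathrm{var}(X)
= \mathbf{E}\bigl[\mathrm{var}(X \mid Y)\bigr] + \mathrm{var}\bigl(\mathbf{E}[X \mid Y]\bigr)
= \mathbf{E}\bigl[nY(1 - Y)\bigr] + \mathrm{var}(nY)
= \frac{n}{6} + \frac{n^2}{12}.
```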
4.4. Transforms
• The transform (moment generating function) associated with a random variable X
is a function M_X(s) of a scalar parameter s, defined by M_X(s) = E[e^{sX}].
From Transforms to Moments
• Differentiating under the expectation gives (d/ds) M(s) |_{s=0} = E[X] and, more
generally, (dⁿ/dsⁿ) M(s) |_{s=0} = E[Xⁿ].
Example. For an exponential random variable X with the transform M(s) = λ/(λ − s)
(valid for s < λ), calculate E[X] and E[X²].
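Differentiating M(s) = λ/(λ − s) and setting s = 0:

```latex
M'(s) = \frac{\lambda}{(\lambda - s)^2}
\;\Rightarrow\; \mathbf{E}[X] = M'(0) = \frac{1}{\lambda},
\qquad
M''(s) = \frac{2\lambda}{(\lambda - s)^3}
\;\Rightarrow\; \mathbf{E}[X^2] = M''(0) = \frac{2}{\lambda^2}.
```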
Other Properties
• M(0) = 1.
• If X takes only nonnegative integer values, then lim_{s→−∞} M(s) = P(X = 0).
Inversion of Transforms
Example 4.28. We are told the transform M(s) associated with a random variable X.
Calculate the PMF of X.
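The inversion idea for a discrete random variable: the transform is a weighted sum of exponentials, so matching terms recovers the PMF:

```latex
M(s) = \sum_x p_X(x)\, e^{sx}
\quad\Longrightarrow\quad
\text{the coefficient of } e^{sx} \text{ is } p_X(x).
```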
Example 4.29. The Transform Associated with a Geometric Random Variable. We
are told that the transform associated with a random variable X is of the form
M(s) = p e^s / (1 − (1 − p) e^s),
for some constant p in (0, 1].
Example 4.30. The Transform Associated with a Mixture of Two Distributions.
The neighborhood bank has three tellers, two of them fast, one slow.
The time to assist a customer is exponentially distributed with parameter λ_f at
the fast tellers, and λ_s at the slow teller. Jane enters the bank and chooses a
teller at random, each one with probability 1/3. Find the PDF of the time it takes
to assist Jane and the associated transform.
• Generally, let X_1, …, X_n be continuous random variables with PDFs f_{X_1}, …, f_{X_n}.
The value of a random variable Y is generated as follows: an index i is
chosen with a corresponding probability p_i, and Y is taken to be equal to the
value of X_i. Then
f_Y(y) = p_1 f_{X_1}(y) + ⋯ + p_n f_{X_n}(y),
and, taking transforms of both sides,
M_Y(s) = p_1 M_{X_1}(s) + ⋯ + p_n M_{X_n}(s).
Example 4.33. The Sum of Independent Normal Random Variables is
Normal. Let X and Y be independent normal random variables with means μ_X, μ_Y,
and variances σ_X², σ_Y², respectively, and let Z = X + Y.
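A transform-based sketch, using the standard normal transform M(s) = exp(σ²s²/2 + μs):

```latex
M_Z(s) = M_X(s)\, M_Y(s)
= \exp\!\left(\frac{(\sigma_X^2 + \sigma_Y^2)\, s^2}{2} + (\mu_X + \mu_Y)\, s\right),
```

which is the transform of a normal random variable with mean μ_X + μ_Y and variance σ_X² + σ_Y².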
Transforms Associated with Joint Distributions
• Consider n random variables X_1, …, X_n related to the same experiment.
Let s_1, …, s_n be scalar free parameters. The associated multivariate transform
is a function of these n parameters, defined by
M_{X_1,…,X_n}(s_1, …, s_n) = E[e^{s_1 X_1 + ⋯ + s_n X_n}].
Sum of a Random Number of Independent Random Variables
• Let N be a nonnegative integer random variable, and let X_1, X_2, … be identically
distributed random variables that are independent of each other and of N. Consider
Y = X_1 + ⋯ + X_N.
• Expectation: E[Y] = E[N] E[X].
• Variance: var(Y) = E[N] var(X) + (E[X])² var(N).
• Transform: M_Y(s) = M_N(log M_X(s)); equivalently, M_Y(s) is found by starting
with M_N(s) and replacing each occurrence of e^s with M_X(s).
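A minimal sketch checking the expectation and variance formulas by exact enumeration; the two small PMFs below are hypothetical toy choices, not from the slides:

```python
from itertools import product

p_n = {1: 0.5, 2: 0.5}   # hypothetical PMF of N (number of terms)
p_x = {1: 0.5, 3: 0.5}   # hypothetical PMF of each X_i

# Enumerate the exact PMF of Y = X_1 + ... + X_N.
p_y = {}
for n, pn in p_n.items():
    for xs in product(p_x, repeat=n):
        prob = pn
        for x in xs:
            prob *= p_x[x]
        y = sum(xs)
        p_y[y] = p_y.get(y, 0.0) + prob

def mean(p): return sum(v * q for v, q in p.items())
def var(p):
    m = mean(p)
    return sum((v - m) ** 2 * q for v, q in p.items())

e_n, v_n = mean(p_n), var(p_n)
e_x, v_x = mean(p_x), var(p_x)
assert abs(mean(p_y) - e_n * e_x) < 1e-12                    # E[Y] = E[N] E[X]
assert abs(var(p_y) - (e_n * v_x + e_x ** 2 * v_n)) < 1e-12  # total-variance formula
```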