
Probability and Statistics

Chapter 4. Further Topics on Random Variables


Yoonseon Oh
[email protected]
4.1. Derived Distributions
Example 4.1. Let X be uniform on [0,1]. Let Y = √X. Derive the PDF of Y.

Example 4.2. John Slow is driving from Boston to the New York area, a distance of
180 miles at a constant speed, whose value is uniformly distributed between 30 and
60 miles per hour. What is the PDF of the duration of the trip?
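A quick Monte Carlo sketch of this example (not part of the original slides; the seed and sample size are arbitrary). With V uniform on [30, 60] and T = 180/V, the change of variables gives f_T(t) = 6/t² on [3, 6], so E[T] = 6 ln 2 ≈ 4.159 hours:

```python
import math
import random

random.seed(0)

# Speed V is uniform on [30, 60] mph; trip time T = 180 / V hours.
# Derived PDF: f_T(t) = 6 / t**2 for 3 <= t <= 6,
# hence E[T] = integral of t * (6 / t**2) dt from 3 to 6 = 6 * ln 2.
N = 200_000
samples = [180.0 / random.uniform(30.0, 60.0) for _ in range(N)]

mc_mean = sum(samples) / N
print(round(mc_mean, 3))          # close to 6 * ln 2
print(round(6 * math.log(2), 3))  # 4.159
```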

Example 4.3. Let Y = g(X) = X², where X is a random variable with known PDF. Derive the PDF of Y.
The Linear Case

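The linear-case result, stated for reference in the text's notation:

```latex
% Linear case: Y = aX + b with a \neq 0
f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y - b}{a}\right)
```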
Example 4.4. A Linear Function of an Exponential Random Variable.
Suppose that X is an exponential random variable with PDF

f_X(x) = λe^(-λx) for x ≥ 0, and f_X(x) = 0 otherwise,

where λ is a positive parameter. Let Y = aX + b. Derive the PDF of Y.

Example 4.5. A Linear Function of a Normal Random Variable is Normal.

Suppose that X is a normal random variable with mean μ and variance σ²,
and let Y = aX + b, where a and b are scalars, with a ≠ 0. Derive the PDF of Y.

The Monotonic Case
• The calculation and the formula for the linear case can be generalized to the
case where g is a monotonic function.

• Let X be a continuous random variable and I be a certain interval.

• f_X(x) = 0 for x outside I.
• Consider the random variable Y = g(X) and assume that g is strictly
monotonic over the interval I and is differentiable.

• Definition of strictly monotonic:

(a) Monotonically increasing: g(x) < g(x′) whenever x < x′, for x, x′ in I.

(b) Monotonically decreasing: g(x) > g(x′) whenever x < x′, for x, x′ in I.

• A strictly monotonic function can be "inverted": y = g(x) if and only if x = g⁻¹(y).
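The general monotonic-case formula, with h = g⁻¹ denoting the inverse function:

```latex
% Monotonic case: Y = g(X), g strictly monotonic and differentiable on I,
% with inverse h = g^{-1}
f_Y(y) = f_X\bigl(h(y)\bigr)\,\left|\frac{dh}{dy}(y)\right|
```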

Example 4.2. (continued) John Slow is driving from Boston to the New York area, a
distance of 180 miles at a constant speed, whose value is uniformly distributed
between 30 and 60 miles per hour. What is the PDF of the duration of the trip?

Example 4.6. Let Y = g(X), where X is a continuous uniform random variable
on a given interval. Derive the PDF of Y.

Functions of Two Random Variables
Example 4.7. Two archers shoot at a target. The distance of each shot from the
center of the target is uniformly distributed from 0 to 1, independent of the other
shot. What is the PDF of the distance of the losing shot from the center?
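A simulation sketch of this example (seed and sample size are arbitrary). The losing shot is the larger of the two distances; since F_Z(z) = z² and f_Z(z) = 2z on [0, 1], we expect E[Z] = 2/3:

```python
import random

random.seed(1)

# Each distance is uniform on [0, 1]; the losing (farther) shot is the max.
# F_Z(z) = P(max <= z) = z**2, so f_Z(z) = 2z on [0, 1] and E[Z] = 2/3.
N = 200_000
losing = [max(random.random(), random.random()) for _ in range(N)]

mc_mean = sum(losing) / N
print(round(mc_mean, 3))  # close to 2/3
```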

Example 4.8. Let X and Y be independent random variables that are uniformly
distributed on the interval [0,1]. What is the PDF of the random variable Z = Y/X?

Example 4.9. Romeo and Juliet have a date at a given time, and each, independently,
will be late by an amount of time that is exponentially distributed with parameter λ.
What is the PDF of the difference between their times of arrival? Let us denote by X
and Y the amounts by which Romeo and Juliet are late, respectively. Find the PDF of
Z = X - Y.

Sums of Independent Random Variables - Convolution
Consider Z = X + Y, where X and Y are independent.
• For discrete X and Y, the convolution formula is
p_Z(z) = Σ_x p_X(x) p_Y(z - x).

• For continuous X and Y, the convolution formula is
f_Z(z) = ∫ f_X(x) f_Y(z - x) dx (integrating over all x).
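The discrete convolution formula can be exercised on a small example; the two-dice PMF below is an illustration, not from the slides:

```python
from fractions import Fraction

def convolve(p_x, p_y):
    """PMF of Z = X + Y for independent discrete X, Y:
    p_Z(z) = sum over x of p_X(x) * p_Y(z - x)."""
    p_z = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] = p_z.get(x + y, Fraction(0)) + px * py
    return p_z

# Two independent fair dice; the sum has the familiar triangular PMF.
die = {k: Fraction(1, 6) for k in range(1, 7)}
p_sum = convolve(die, die)
print(p_sum[7])             # 1/6
print(sum(p_sum.values()))  # 1
```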

Example 4.10. The random variables X and Y are independent and uniformly
distributed in the interval [0,1]. Derive the PDF of Z = X + Y.

Example 4.11. The Sum of Two Independent Normal Random Variables is Normal.
Let X and Y be independent normal random variables with means μ_x, μ_y and
variances σ_x², σ_y², respectively, and let Z = X + Y. Derive the PDF of Z.

Example 4.12. Difference of Two Independent Random Variables: derive the PDF of Z = X - Y.

4.2. Covariance and Correlation
• Covariance of two random variables X and Y:
cov(X, Y) = E[(X - E[X])(Y - E[Y])].

• X and Y are uncorrelated when cov(X, Y) = 0.

• An alternative formula for the covariance: cov(X, Y) = E[XY] - E[X]E[Y].

4.2. Covariance and Correlation
Properties: for any random variables X, Y, and Z, and any scalars a and b, we have
• cov(X, X) = var(X),
• cov(X, aY + b) = a cov(X, Y),
• cov(X, Y + Z) = cov(X, Y) + cov(X, Z).

• If X and Y are independent, cov(X, Y) = 0 (they are uncorrelated).

Example 4.13. The pair of random variables (X, Y) takes the values (1,0), (0,1), (-1,0),
and (0,-1), each with probability ¼. Calculate the covariance and determine whether
X and Y are uncorrelated or independent.
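This example can be worked out exactly in a few lines (the enumeration below just mirrors the four outcomes in the problem statement):

```python
# The four equally likely outcomes of (X, Y) from Example 4.13.
points = [(1, 0), (0, 1), (-1, 0), (0, -1)]
p = 0.25

E_X  = sum(p * x for x, y in points)
E_Y  = sum(p * y for x, y in points)
E_XY = sum(p * x * y for x, y in points)  # x*y = 0 at every outcome
cov = E_XY - E_X * E_Y
print(cov)  # 0.0 -> X and Y are uncorrelated

# But X and Y are NOT independent: P(X=1, Y=1) = 0,
# while P(X=1) * P(Y=1) = 0.25 * 0.25 = 0.0625.
p_x1 = sum(p for x, y in points if x == 1)
p_y1 = sum(p for x, y in points if y == 1)
p_joint = sum(p for x, y in points if x == 1 and y == 1)
print(p_joint, p_x1 * p_y1)  # 0 0.0625
```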

4.2. Covariance and Correlation
• Definition of the correlation coefficient of random variables X and Y that have
nonzero variances:
ρ(X, Y) = cov(X, Y) / (σ_X σ_Y), and -1 ≤ ρ(X, Y) ≤ 1.

• Properties
• If ρ > 0, then X - E[X] and Y - E[Y] "tend" to have the same sign.
• If ρ < 0, then X - E[X] and Y - E[Y] "tend" to have the opposite sign.
• ρ = 1 (or ρ = -1) if and only if there exists a positive (or negative, respectively)
constant c such that Y - E[Y] = c(X - E[X]).

Example 4.14. Consider n independent tosses of a coin with probability of a head
equal to p. Let X and Y be the numbers of heads and of tails, respectively, and let
us look at the correlation coefficient of X and Y.

Variance of Sum of Random Variables
If X1, X2, ..., Xn are random variables with finite variance, we have
var(X1 + X2) = var(X1) + var(X2) + 2 cov(X1, X2).

More generally,

var(Σ_i Xi) = Σ_i var(Xi) + Σ_{(i, j): i ≠ j} cov(Xi, Xj).

Example 4.15. Consider the hat problem discussed in Section 2.5, where n people
throw their hats in a box and then pick a hat at random. Let us find the variance
of X, the number of people who pick their own hat. We have X = X1 + X2 + ... + Xn,
where Xi is the random variable that takes the value 1 if the ith person selects
his/her own hat, and takes the value 0 otherwise. Noting that Xi is Bernoulli with
parameter 1/n.
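A simulation sketch of the hat problem (n, trial count, and seed are arbitrary choices). The known answers, E[X] = 1 and var(X) = 1 for n ≥ 2, serve as the check:

```python
import random

random.seed(2)

# n people pick hats via a uniformly random permutation;
# X counts the people who get their own hat (fixed points).
n, trials = 10, 100_000
counts = []
for _ in range(trials):
    hats = list(range(n))
    random.shuffle(hats)
    counts.append(sum(1 for i, h in enumerate(hats) if i == h))

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
print(round(mean, 2), round(var, 2))  # both close to 1
```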

4.3. Conditional Expectation and Variance Revisited
• Total expectation theorem: E[X] = Σ_y E[X | Y = y] p_Y(y).

• Reformulation of the total expectation theorem: E[X] = E[E[X | Y]].

• Law of iterated expectations: E[E[X | Y]] = E[X].
• Law of total variance: var(X) = E[var(X | Y)] + var(E[X | Y]).

• Introduce a random variable E[X | Y]: when Y = y is given, its value is
E[X | Y = y]; thus E[X | Y] is a function of Y.
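The law of total variance can be verified exactly on a small discrete example (the joint PMF below is chosen arbitrarily for illustration):

```python
# A small joint PMF for (X, Y), chosen arbitrarily for illustration.
pmf = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (2, 1): 0.4}

def E(f):
    return sum(p * f(x, y) for (x, y), p in pmf.items())

var_X = E(lambda x, y: x ** 2) - E(lambda x, y: x) ** 2

# Condition on Y: compute E[X | Y = y] and var(X | Y = y) for each y.
p_y = {}
for (x, y), p in pmf.items():
    p_y[y] = p_y.get(y, 0) + p

e_given, v_given = {}, {}
for y0 in p_y:
    cond = {x: p / p_y[y0] for (x, y), p in pmf.items() if y == y0}
    m = sum(px * x for x, px in cond.items())
    e_given[y0] = m
    v_given[y0] = sum(px * (x - m) ** 2 for x, px in cond.items())

# Law of total variance: var(X) = E[var(X|Y)] + var(E[X|Y]).
E_var = sum(p_y[y0] * v_given[y0] for y0 in p_y)
var_E = (sum(p_y[y0] * e_given[y0] ** 2 for y0 in p_y)
         - sum(p_y[y0] * e_given[y0] for y0 in p_y) ** 2)
print(round(var_X, 6), round(E_var + var_E, 6))  # 0.69 0.69
```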

Example 4.16. We are given a biased coin and we are told that, because of
manufacturing defects, the probability of heads, denoted by Y, is itself random,
with a known distribution over the interval [0, 1]. We toss the coin a fixed number
n of times, and we let X be the number of heads obtained. Calculate E[X].

Example 4.17. We start with a stick of length ℓ. We break it at a point which
is chosen randomly and uniformly over its length, and keep the piece that contains
the left end of the stick. We then repeat the same process on the piece that we
were left with. What is the expected length of the piece that we are left with after
breaking twice?
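A simulation sketch of this example (seed and sample size arbitrary; the stick length is taken as 1). Iterated expectations give E[X] = E[E[X | Y]] = E[Y/2] = ℓ/4:

```python
import random

random.seed(3)

# First break: Y uniform on [0, L]; second break: X uniform on [0, Y].
# E[X] = E[E[X | Y]] = E[Y / 2] = L / 4.
L = 1.0
N = 200_000
total = 0.0
for _ in range(N):
    y = random.uniform(0.0, L)
    x = random.uniform(0.0, y)
    total += x

mc_mean = total / N
print(round(mc_mean, 3))  # close to L / 4 = 0.25
```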

• Important property: for any function g, we have E[X g(Y) | Y] = g(Y) E[X | Y].

The Conditional Expectation as an Estimator
• Estimator of X given Y: X̂ = E[X | Y].

The Conditional Variance
var(X | Y) = E[(X - E[X | Y])² | Y].

Example 4.16. (continued) We are given a biased coin whose probability of heads,
Y, is uniformly distributed over the interval [0, 1]. With X being the number of heads
obtained in n tosses, we have E[X | Y] = nY and var(X | Y) = nY(1 - Y). Calculate var(X).

Calculate E[var(X | Y)] and var(E[X | Y]).

4.4. Transforms

• The transform (or the moment generating function) associated with a
random variable X is a function M_X(s) of a scalar parameter s, defined by
M_X(s) = E[e^{sX}].
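A numerical sketch of the definition, on an arbitrary discrete PMF (not from the slides). Note M(0) = E[e^0] = 1 always, and the derivative at 0 recovers the mean, previewing the next topic:

```python
import math

# Transform M(s) = E[e^{sX}] for a discrete X, evaluated numerically.
# The PMF below is an arbitrary illustration.
pmf = {-1: 0.25, 0: 0.25, 3: 0.5}

def M(s):
    return sum(p * math.exp(s * x) for x, p in pmf.items())

print(M(0.0))  # 1.0 -- always, since M(0) = E[1] = 1

# Central finite difference at 0 approximates M'(0) = E[X].
h = 1e-6
deriv = (M(h) - M(-h)) / (2 * h)
E_X = sum(p * x for x, p in pmf.items())  # -0.25 + 0 + 1.5 = 1.25
print(round(deriv, 4), E_X)  # both 1.25
```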

The transform M_X(s) is a function of the parameter s.

Example 4.25. The Transform Associated with a Linear Function of a Random
Variable. Let M_X(s) be the transform associated with a random variable X.
Consider a new random variable Y = aX + b. Calculate M_Y(s).

Example 4.26. The Transform Associated with a Normal Random Variable.

Let X be a normal random variable with mean μ and variance σ². Calculate M_X(s).

From Transforms to Moments
(d/ds) M_X(s) evaluated at s = 0 equals E[X]; more generally,
(dⁿ/dsⁿ) M_X(s) evaluated at s = 0 equals E[Xⁿ].

Calculate E[X] and E[X²].

Example. For an exponential random variable with the transform M(s) = λ/(λ - s)
(for s < λ), calculate E[X] and E[X²].

Other Properties

• If X takes only nonnegative integer values, then lim_{s→-∞} M(s) = P(X = 0).
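The exponential example above can be checked numerically against the moment formulas (λ = 2 is an arbitrary choice; the finite-difference step is a quick sketch, not a robust method):

```python
# Moments of an exponential(lam) from its transform M(s) = lam / (lam - s),
# via numerical derivatives at s = 0 (valid for s < lam).
lam = 2.0

def M(s):
    return lam / (lam - s)

h = 1e-4
d1 = (M(h) - M(-h)) / (2 * h)               # M'(0)  = E[X]    = 1/lam
d2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2   # M''(0) = E[X**2] = 2/lam**2
print(round(d1, 4))  # 0.5
print(round(d2, 4))  # 0.5
```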

Inversion of Transforms
The transform M_X(s) associated with a random variable X uniquely determines
the distribution of X (inversion property).

Example 4.28. We are told that the transform associated with a random variable X is
M(s) = (1/4)e^{-s} + 1/2 + (1/8)e^{4s} + (1/8)e^{5s}.
Calculate the PMF of X.

Example 4.29. The Transform Associated with a Geometric Random Variable. We
are told that the transform associated with a random variable X is of the form

M(s) = p e^s / (1 - (1 - p)e^s),

where p is a constant in the range 0 < p < 1.

Example 4.30. The Transform Associated with a Mixture of Two
Distributions. The neighborhood bank has three tellers, two of them fast, one slow.
The time to assist a customer is exponentially distributed with parameter λ = 6 at
the fast tellers, and λ = 4 at the slow teller. Jane enters the bank and chooses a
teller at random, each one with probability 1/3. Find the PDF of the time it takes
to assist Jane and the associated transform.

• Generally, let X1, ..., Xn be continuous random variables with PDFs f_{X1}, ..., f_{Xn}.
The value of a random variable Y is generated as follows: an index i is
chosen with a corresponding probability p_i, and Y is taken to be equal to the
value of X_i. Then f_Y(y) = p_1 f_{X1}(y) + ... + p_n f_{Xn}(y) and
M_Y(s) = p_1 M_{X1}(s) + ... + p_n M_{Xn}(s).

• Sums of independent random variables: addition of independent random variables
corresponds to multiplication of transforms.

• For independent random variables X and Y with Z = X + Y, we have
M_Z(s) = M_X(s) M_Y(s).
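A small numerical check of the multiplication rule (the Bernoulli parameters and the value of s are arbitrary illustrations):

```python
import math

# For independent X and Y, M_{X+Y}(s) = M_X(s) * M_Y(s).
# Check with X ~ Bernoulli(0.3), Y ~ Bernoulli(0.6), whose transforms
# are known in closed form: M(s) = 1 - p + p * e^s.
def M_bernoulli(p, s):
    return 1 - p + p * math.exp(s)

s = 0.7
lhs = M_bernoulli(0.3, s) * M_bernoulli(0.6, s)

# Transform of Z = X + Y computed directly from its PMF (convolution):
p_z = {0: 0.7 * 0.4, 1: 0.7 * 0.6 + 0.3 * 0.4, 2: 0.3 * 0.6}
rhs = sum(p * math.exp(s * z) for z, p in p_z.items())
print(abs(lhs - rhs) < 1e-12)  # True
```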


Example 4.31. The Transform Associated with the Binomial. Let X1, ..., Xn be
independent Bernoulli random variables with a common parameter p. The
random variable Y = X1 + ... + Xn is binomial with parameters n and p. Calculate
the transform of Y.

Example 4.32. The Sum of Independent Poisson Random Variables is

Poisson. Let X and Y be independent Poisson random variables with means λ
and μ, respectively, and let Z = X + Y. Calculate the transform of Z.
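This fact can be checked numerically via the convolution formula (the parameters and the truncation range below are arbitrary):

```python
import math

# Convolve Poisson(a) and Poisson(b) PMFs and compare with the
# Poisson(a + b) PMF -- the sum of independent Poissons is Poisson.
def pois(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

a, b = 1.5, 2.5
max_err = 0.0
for z in range(10):
    conv = sum(pois(a, x) * pois(b, z - x) for x in range(z + 1))
    max_err = max(max_err, abs(conv - pois(a + b, z)))
print(max_err < 1e-12)  # True
```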

Example 4.33. The Sum of Independent Normal Random Variables is
Normal. Let X and Y be independent normal random variables with means μ_x, μ_y,
and variances σ_x², σ_y², respectively, and let Z = X + Y.

Transforms Associated with Joint Distributions
• Consider n random variables X1, ..., Xn related to the same experiment.
Let s1, ..., sn be scalar free parameters. The associated multivariate transform
is a function of these n parameters:

M_{X1,...,Xn}(s1, ..., sn) = E[e^{s1 X1 + ... + sn Xn}].

• The inversion property of transforms holds.

Sum of a Random Number of Independent Random Variables
Let Y = X1 + ... + XN, where N is a random variable taking nonnegative integer
values and X1, X2, ... are identically distributed random variables, with
N, X1, X2, ... all independent.

• Expectation: E[Y] = E[N] E[X].
Sum of a Random Number of Independent Random Variables
• Variance: var(Y) = E[N] var(X) + (E[X])² var(N).
Sum of a Random Number of Independent Random Variables
• Transform: M_Y(s) = M_N(log M_X(s)), i.e., M_N(s) with e^s replaced by M_X(s).
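A simulation sketch of the expectation rule E[Y] = E[N] E[X] (the distributions of N and X below are arbitrary illustrations, not from the slides):

```python
import random

random.seed(5)

# Y = X_1 + ... + X_N with N uniform on {1, 2, 3, 4} and
# X_i ~ uniform on [0, 2], all independent.
# Then E[Y] = E[N] * E[X] = 2.5 * 1.0 = 2.5.
trials = 100_000
total = 0.0
for _ in range(trials):
    n = random.randint(1, 4)
    total += sum(random.uniform(0.0, 2.0) for _ in range(n))

mc_mean = total / trials
print(round(mc_mean, 2))  # close to 2.5
```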

