HW 2 Sol
1. Juror’s fallacy. Suppose that P (A|B) ≥ P (A) and P (A|C) ≥ P (A). Is it always true that
P (A|B, C) ≥ P (A) ? Prove or provide a counterexample.
Solution: The answer is no. There are many counterexamples that can be given. For
example, suppose a fair die is thrown and let X denote the number of dots. Let A be the
event that X = 3 or 6; let B be the event that X = 3 or 5; and let C be the event that X = 5
or 6. Then, we have
P (A) = 1/3, P (A|B) = P (A|C) = 1/2, but P (A|B, C) = 0.
Evidently, two pieces of individually positive evidence do not necessarily combine into stronger evidence.
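The counterexample can be checked by direct enumeration over the six equally likely outcomes. The Python sketch below is illustrative and not part of the original solution; all names are mine.

```python
from fractions import Fraction

outcomes = {1, 2, 3, 4, 5, 6}           # fair die: equally likely dots
A = {3, 6}
B = {3, 5}
C = {5, 6}

def prob(event, given=None):
    """P(event | given) under the uniform measure on the die."""
    cond = outcomes if given is None else given
    return Fraction(len(event & cond), len(cond))

print(prob(A), prob(A, B), prob(A, C), prob(A, B & C))
```

The last value confirms P(A|B, C) = 0 even though conditioning on B or C alone raises the probability of A.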
2. Polya’s urn. Suppose we have an urn containing one red ball and one blue ball. We draw a
ball at random from the urn. If it is red, we put the drawn ball plus another red ball into the
urn. If it is blue, we put the drawn ball plus another blue ball into the urn. We then repeat
this process. At the n-th stage, we draw a ball at random from the urn with n + 1 balls, note
its color, and put the drawn ball plus another ball of the same color into the urn.
(d) Let N denote the number of red balls in the first three draws. From part (c), we know
that P{N = 3} = 1/4 = P{N = 0}, where the latter identity follows by symmetry. Also
we have P{N = 2} = P{N = 1}. Thus, P{N = 2} must be 1/4.
Alternatively, by exchangeability every sequence of three draws with exactly two reds has probability (1/2)(2/3)(1/4) = 1/12, so P{N = 2} = 3 · (1/12) = 1/4.
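As a further check, the full distribution of N can be computed exactly by enumerating all eight color sequences with their Polya-urn probabilities (an illustrative sketch, not part of the original solution):

```python
from fractions import Fraction
from itertools import product

def seq_prob(colors):
    """Probability of a given red/blue draw sequence ('R'/'B')."""
    red, blue = 1, 1                    # initial urn: one red, one blue
    p = Fraction(1)
    for c in colors:
        total = red + blue
        if c == 'R':
            p *= Fraction(red, total)
            red += 1
        else:
            p *= Fraction(blue, total)
            blue += 1
    return p

# Distribution of N = number of reds in the first three draws.
dist = {k: Fraction(0) for k in range(4)}
for seq in product('RB', repeat=3):
    dist[seq.count('R')] += seq_prob(seq)

print(dist)   # every value equals 1/4
```

The enumeration confirms that N is uniform on {0, 1, 2, 3}.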
3. Probabilities from a cdf. Let X be a random variable with the cdf shown below.
[Figure: the cdf F(x) of X, a piecewise cdf taking the values 1/3 and 2/3, plotted for 0 ≤ x ≤ 4.]
Find the probabilities of the following events:
(a) {X = 2}.
(b) {X < 2}.
(c) {X = 2} ∪ {0.5 ≤ X ≤ 1.5}.
(d) {X = 2} ∪ {0.5 ≤ X ≤ 3}.
Solution:
(c) Since {X = 2} and {0.5 ≤ X ≤ 1.5} are disjoint events, the two probabilities add.
(d) We have
4. Gaussian probabilities. Let X ∼ N (1000, 400). Express the following in terms of the Q
function.
Solution: Since (X − µ)/σ ∼ N(0, 1), the cdf of X is F(x) = Φ((x − µ)/σ) = 1 − Q((x − µ)/σ).
(a) We have
P{0 < X < 1020} = Q((0 − 1000)/20) − Q((1020 − 1000)/20) = Q(−50) − Q(1).
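Numerically, Q(−50) is indistinguishable from 1, so the answer is approximately 1 − Q(1) ≈ 0.8413. A quick Python check using the identity Q(x) = erfc(x/√2)/2 (illustrative, not part of the original solution):

```python
import math

def Q(x):
    """Gaussian tail probability P{Z > x} for Z ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

p = Q(-50) - Q(1)
print(p)   # ~0.8413, since Q(-50) is numerically 1 and Q(1) ~ 0.1587
```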
(b) We have
5. Laplacian random variable. Let X be a random variable with pdf f_X(x) = (1/2)e^{−|x|}.
(a) Find the cdf of X.
(b) Find P{|X| ≤ 2 or X ≥ 0}.
(c) Find P{|X| + |X − 3| ≤ 3} .
(d) Find P{X ≥ 0 | X ≤ 1} .
Solution:
(a) We have
F_X(x) = ∫_{−∞}^{x} (1/2)e^{−|u|} du = (1/2)e^{x} if x < 0, and 1 − (1/2)e^{−x} if x ≥ 0.
Figure 1: cdf of X
(b) We have
P{|X| ≤ 2 or X ≥ 0} = P{X ≥ −2}
= 1 − P{X < −2}
= 1 − ∫_{−∞}^{−2} (1/2)e^{−|x|} dx
= 1 − (1/2)e^{−2}.
(c) We have
P{|X| + |X − 3| ≤ 3} = P{0 ≤ X ≤ 3}
= ∫_{0}^{3} (1/2)e^{−x} dx
= (1/2) − (1/2)e^{−3}.
(d) We have
P{X ≥ 0 | X ≤ 1} = P{0 ≤ X ≤ 1} / P{X ≤ 1} = (F_X(1) − F_X(0⁻)) / F_X(1) = ((1/2) − (1/2)e^{−1}) / (1 − (1/2)e^{−1}).
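The three closed-form answers can be spot-checked by numerically integrating the Laplacian pdf f_X(x) = (1/2)e^{−|x|}. The midpoint-rule sketch below is illustrative; the truncation point −20 is an assumption that makes the neglected tail smaller than 10⁻⁹.

```python
import math

def f(x):
    # Laplacian pdf
    return 0.5 * math.exp(-abs(x))

def integrate(a, b, n=100000):
    # simple midpoint rule
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

p_b = 1 - integrate(-20, -2)              # P{X >= -2}; tail below -20 negligible
p_c = integrate(0, 3)                     # P{0 <= X <= 3}
p_d = integrate(0, 1) / integrate(-20, 1) # P{X >= 0 | X <= 1}

print(p_b, p_c, p_d)
```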
6. Distance to the nearest star. Let the random variable N be the number of stars in a region
of space of volume V . Assume that N is a Poisson r.v. with pmf
p_N(n) = e^{−ρV}(ρV)^{n} / n!, for n = 0, 1, 2, . . . ,
where ρ is the “density” of stars in space. We choose an arbitrary point in space and define
the random variable X to be the distance from the chosen point to the nearest star. Find the
pdf of X (in terms of ρ).
Solution: The trick in this problem, as in many others, is to find a way to connect events
regarding X with events regarding N . In our case, for x ≥ 0:
FX (x) = P{X ≤ x}
= 1 − P{X > x}
= 1 − P{No stars within distance x}
= 1 − P{N = 0 in a sphere of radius x centered at the chosen point}
= 1 − e^{−ρ(4/3)πx³}.
Differentiating with respect to x yields the pdf:
f_X(x) = 4πρx² e^{−ρ(4/3)πx³}, for x ≥ 0.
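As a quick consistency check (illustrative; the density ρ = 0.1 is an arbitrary assumed value), a central-difference derivative of the cdf 1 − e^{−ρ(4/3)πx³} can be compared against the closed-form pdf:

```python
import math

rho = 0.1                               # assumed star density (arbitrary)

def F(x):
    # cdf derived above
    return 1 - math.exp(-rho * (4 / 3) * math.pi * x ** 3)

def f(x):
    # pdf obtained by differentiating F
    return 4 * math.pi * rho * x ** 2 * math.exp(-rho * (4 / 3) * math.pi * x ** 3)

x, h = 1.3, 1e-6
numeric = (F(x + h) - F(x - h)) / (2 * h)   # central-difference derivative
print(numeric, f(x))                        # the two agree
```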
7. Lognormal distribution. Let X ∼ N (0, σ 2 ). Find the pdf of Y = eX (known as the lognormal
pdf).
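No worked solution appears for this part in the excerpt above; a standard change-of-variables sketch is as follows.

```latex
% For y > 0:  F_Y(y) = P\{e^X \le y\} = P\{X \le \ln y\} = F_X(\ln y).
% Differentiating with respect to y:
f_Y(y) = f_X(\ln y)\,\frac{1}{y}
       = \frac{1}{y\,\sigma\sqrt{2\pi}}\,
         \exp\!\left(-\frac{(\ln y)^2}{2\sigma^2}\right),
         \qquad y > 0,
% with f_Y(y) = 0 for y \le 0.
```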
8. Random phase signal. Let Y (t) = sin(ωt + Θ) be a sinusoidal signal with random phase
Θ ∼ U [−π, π]. Find the pdf of the random variable Y (t) (assume here that both t and the
radial frequency ω are constant). Comment on the dependence of the pdf of Y (t) on time t.
Solution: We can easily see (by plotting y vs. θ) that for y ∈ (−1, 1)
P(Y ≤ y) = P(sin(ωt + Θ) ≤ y)
= P(sin(Θ) ≤ y)
= (2 sin^{−1}(y) + π) / (2π)
= sin^{−1}(y)/π + 1/2.
By differentiating with respect to y, we get
f_Y(y) = 1 / (π√(1 − y²)), for −1 < y < 1.
Note that f_Y(y) does not depend on the time t; that is, the pdf of Y(t) is time invariant (stationary). More on this later in the course.
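The time invariance can be checked by simulation: for any fixed t, the empirical cdf of sin(ωt + Θ) should match sin^{−1}(y)/π + 1/2. The Python sketch below is illustrative (ω, t, and the sample size are assumed values):

```python
import math
import random

random.seed(0)
w, n = 2.0, 200000                      # assumed frequency and sample size

def empirical_cdf(t, y):
    # fraction of samples with sin(w*t + Theta) <= y, Theta ~ U[-pi, pi]
    hits = sum(math.sin(w * t + random.uniform(-math.pi, math.pi)) <= y
               for _ in range(n))
    return hits / n

y = 0.5
target = math.asin(y) / math.pi + 0.5   # arcsine cdf at y = 0.5, i.e. 2/3
c0 = empirical_cdf(0.0, y)
c1 = empirical_cdf(7.3, y)
print(c0, c1, target)
```

Both empirical values agree with the arcsine cdf regardless of t.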
9. Quantizer. Let X ∼ Exp(λ), i.e., an exponential random variable with parameter λ and
Y = ⌊X⌋, i.e., Y = k for k ≤ X < k + 1, k = 0, 1, 2, . . .. Find the pmf of Y . Define the
quantization error Z = X − Y . Find the pdf of Z.
Solution: For k = 0, 1, 2, . . .,
p_Y(k) = P{Y = k}
= P{k ≤ X < k + 1}
= F_X(k + 1) − F_X(k)
= (1 − e^{−λ(k+1)}) − (1 − e^{−λk})
= e^{−λk} − e^{−λ(k+1)}
= e^{−λk}(1 − e^{−λ}).
Next, for 0 ≤ z < 1,
F_Z(z) = P(Z ≤ z)
= Σ_{k=0}^{∞} P(k ≤ X ≤ k + z)
= Σ_{k=0}^{∞} (e^{−λk} − e^{−λ(k+z)})
= (1 − e^{−λz}) / (1 − e^{−λ}).
By differentiating with respect to z, we get
f_Z(z) = λe^{−λz} / (1 − e^{−λ}), for 0 ≤ z < 1.
Refer to Figure 2 for a graphical explanation of the above.
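Both closed forms can be verified numerically by truncating the sums over k (an illustrative sketch; λ = 0.7 and z = 0.4 are assumed values):

```python
import math

lam, z = 0.7, 0.4                       # assumed parameter values

# pmf of Y: p_Y(k) = e^{-lam*k} * (1 - e^{-lam}); it should sum to 1
total = sum(math.exp(-lam * k) * (1 - math.exp(-lam)) for k in range(200))

# cdf of Z at z: direct truncated sum versus the closed form
direct = sum(math.exp(-lam * k) - math.exp(-lam * (k + z)) for k in range(200))
closed = (1 - math.exp(-lam * z)) / (1 - math.exp(-lam))

print(total, direct, closed)
```

Truncating at k = 200 leaves a tail of order e^{−140}, far below floating-point precision.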
10. Geometric with conditions. Let X be a geometric random variable with pmf
p_X(n) = p(1 − p)^{n−1}, for n = 1, 2, . . . .
Find the conditional pmf p_X(x|A) for each of the following events A:
[Figure 2: (a) the pdf f_X(x) divided into bands of width ∆z spanning [k + z, k + z + ∆z] for each k; (b) the pmf p_Y(k), where p_Y(k) equals the area of the corresponding region of f_X; (c) the pdf f_Z(z), which collects the band areas at offset z from every unit interval.]
(a) A = {X > m} where m is a positive integer.
(b) A = {X < m}.
(c) A = {X is an even number}.
Solution:
(a) We have
P(A) = Σ_{n=m+1}^{∞} p(1 − p)^{n−1}
= Σ_{n=0}^{∞} p(1 − p)^{n+m}
= p(1 − p)^{m} Σ_{n=0}^{∞} (1 − p)^{n}
= (1 − p)^{m}.
(b) We have
P(A) = Σ_{n=0}^{m−2} p(1 − p)^{n}
= p · (1 − (1 − p)^{m−1}) / (1 − (1 − p))
= 1 − (1 − p)^{m−1}.
(c) We have
P(A) = Σ_{n even} p(1 − p)^{n−1}
= Σ_{n′=0}^{∞} p(1 − p)((1 − p)²)^{n′}
= p(1 − p) / (1 − (1 − p)²)
= (1 − p)/(2 − p).
Plots are shown in Figure 3. The shape of the conditional pmf in part (a) shows that the geometric random variable is memoryless: given {X > m}, the conditional pmf is p_X(x|A) = p(1 − p)^{x−m−1} for x = m + 1, m + 2, . . ., a shifted copy of the original pmf.
Note that in all three parts the conditional pmf p_X(x|A) is defined for all x in the range of X. This is required.
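All three answers can be confirmed by summing a truncated geometric pmf directly (illustrative sketch; p = 0.25 and m = 5 match the values used in Figure 3):

```python
# truncated geometric pmf; p and m are the example values, N the cutoff
p, m, N = 0.25, 5, 500

pmf = {n: p * (1 - p) ** (n - 1) for n in range(1, N + 1)}

pa = sum(q for n, q in pmf.items() if n > m)        # P{X > m}
pb = sum(q for n, q in pmf.items() if n < m)        # P{X < m}
pc = sum(q for n, q in pmf.items() if n % 2 == 0)   # P{X even}

print(pa, pb, pc)
```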
[Figure 3, panels (a), (b), (c): the conditional pmf p_X(x|A) for each part, plotted for 0 ≤ x ≤ 12.]
Figure 3: Plots of the conditional pmfs using p = 1/4 and m = 5.