Problem #1

(a) If $W \sim \chi^2_2 = \text{Gamma}(1, 1/2)$, find the density $f_W$, the distribution function $F_W$, and the inverse $F_W^{-1}$ explicitly.
Solution:

If $W \sim \chi^2_2$, then $W \sim \text{Exponential}(1/2)$ with density
$$f_W(w) = \frac{1}{2} \exp\left(-\frac{w}{2}\right) \quad \text{for } w \ge 0.$$
So we can compute the distribution function as
$$F_W(w) = 1 - \exp\left(-\frac{w}{2}\right) \quad \text{for } w \ge 0.$$
Then, setting $\exp(-y/2) = 1 - w$ and solving for $y = F_W^{-1}(w)$, we get
$$F_W^{-1}(w) = -2 \log(1 - w).$$
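As a quick sanity check (my own illustration, not part of the original solution), the inverse transform $F_W^{-1}$ can be verified numerically with the standard library: sampling $-2\log(1-U)$ for uniform $U$ should reproduce the $\chi^2_2$ moments.

```python
import math
import random

# Sketch: sample W = F_W^{-1}(U) = -2*log(1 - U) with U ~ Uniform(0, 1) and
# check that the first two moments match chi^2_2 = Exponential(1/2),
# which has mean 2 and variance 4.
random.seed(0)
n = 200_000
samples = [-2.0 * math.log(1.0 - random.random()) for _ in range(n)]

mean = sum(samples) / n
var = sum((w - mean) ** 2 for w in samples) / n
print(round(mean, 2), round(var, 2))  # should be close to 2 and 4
```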
(b) Let $R = \sqrt{X^2 + Y^2}$ and $\Theta = \tan^{-1}(Y/X)$, where $(X, Y) \sim N(0, I)$. Show that $R^2 \sim \chi^2_2$ and $\Theta \sim \text{Uniform}(0, 2\pi)$.
Solution:

Since $(X, Y) \sim N(0, I)$, the joint density is given by
$$f_{X,Y}(x, y) = \frac{1}{2\pi} \exp\left(-\frac{x^2 + y^2}{2}\right) \quad \text{for } (x, y) \in \mathbb{R}^2.$$
Now set $x = r \cos(\theta)$ and $y = r \sin(\theta)$ with $r \in \mathbb{R}^+$ and $\theta \in (0, 2\pi)$. The inverse function is
$$\left( \sqrt{x^2 + y^2},\ \tan^{-1}\frac{y}{x} \right) = (r, \theta),$$
and the Jacobian of the transformation has absolute determinant $r$. Hence
$$f_{R,\Theta}(r, \theta) = \frac{1}{2\pi}\, r \exp\left(-\frac{r^2}{2}\right) \quad \text{on } (0, \infty) \times (0, 2\pi).$$
So the density factors: $\Theta \sim \text{Uniform}(0, 2\pi)$, independent of $R$, which has density $r \exp(-r^2/2)$ on $(0, \infty)$. The change of variables $w = r^2$ then gives $f_{R^2}(w) = \frac{1}{2} \exp(-w/2)$, i.e. $R^2 \sim \chi^2_2$.
(c) Solution:

If $U, V$ are independent Uniform(0, 1) random variables, we can use the inverse transformation to get
$$R^2 = -2 \log(U) \quad \text{and} \quad \Theta = 2\pi V,$$
so that $R^2 \sim \chi^2_2$ and $\Theta \sim \text{Uniform}(0, 2\pi)$ are independent.
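This is the Box–Muller construction. As a hedged illustration (sample sizes and tolerances are my own), a short simulation confirms that $X = R\cos\Theta$ and $Y = R\sin\Theta$ behave like independent standard normals:

```python
import math
import random

# Sketch: Box-Muller from part (c): R^2 = -2 log U, Theta = 2*pi*V.
# X = R cos(Theta) and Y = R sin(Theta) should be independent N(0, 1).
random.seed(0)
n = 200_000
xs, ys = [], []
for _ in range(n):
    u = 1.0 - random.random()        # in (0, 1], avoids log(0)
    v = random.random()
    r = math.sqrt(-2.0 * math.log(u))
    theta = 2.0 * math.pi * v
    xs.append(r * math.cos(theta))
    ys.append(r * math.sin(theta))

mx = sum(xs) / n
vx = sum((x - mx) ** 2 for x in xs) / n
cxy = sum(x * y for x, y in zip(xs, ys)) / n  # ~ Cov(X, Y), should be near 0
print(round(mx, 3), round(vx, 3), round(cxy, 3))
```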
Problem #2

Suppose that $X_1, \ldots, X_n$ are i.i.d. Exponential(1) random variables, and let
$$M_n = \min_{1 \le i \le n} X_i \quad \text{and} \quad T_n = \max_{1 \le i \le n} X_i.$$

(a) Show that $n M_n \to_d \text{Exponential}(1)$ and $T_n - \log n \to_d T$, where $P(T \le x) = \exp(-\exp(-x))$.
Solution:

Consider
$$P(n M_n > x) = P\left( X_1 > \frac{x}{n},\ X_2 > \frac{x}{n},\ \ldots,\ X_n > \frac{x}{n} \right) = P\left( X_1 > \frac{x}{n} \right) P\left( X_2 > \frac{x}{n} \right) \cdots P\left( X_n > \frac{x}{n} \right)$$
$$= \left[ P\left( X_1 > \frac{x}{n} \right) \right]^n = \left( \exp\left( -\frac{x}{n} \right) \right)^n = \exp(-x).$$
Therefore, $n M_n \to_d \text{Exponential}(1)$ as required.
For $T_n$, then,
$$P(T_n - \log n \le x) = P\left( \max_{1 \le j \le n} X_j \le x + \log n \right) = P(X_1 \le x + \log n) \cdots P(X_n \le x + \log n)$$
$$= \left[ P(X_1 \le x + \log n) \right]^n = \left( 1 - \exp(-x - \log n) \right)^n = \left( 1 - \frac{\exp(-x)}{n} \right)^n \to \exp(-\exp(-x)).$$
Hence,
$$P(T_n - \log n \le x) \to P(T \le x) = \exp(-\exp(-x)),$$
the standard Gumbel distribution.
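Both limits can be checked by simulation. The sketch below (my own illustration, with arbitrary sample sizes) compares the empirical means of $n M_n$ and $T_n - \log n$ with the limiting values: 1 for the Exponential(1) limit, and the Euler–Mascheroni constant $\gamma \approx 0.5772$ for the Gumbel mean.

```python
import math
import random

# Sketch: for n i.i.d. Exponential(1) variables, n*M_n should look
# Exponential(1) (mean 1) and T_n - log n should look Gumbel
# (mean = Euler-Mascheroni constant, about 0.5772).
random.seed(0)
n, reps = 500, 5_000
scaled_mins, centered_maxs = [], []
for _ in range(reps):
    xs = [random.expovariate(1.0) for _ in range(n)]
    scaled_mins.append(n * min(xs))
    centered_maxs.append(max(xs) - math.log(n))

mean_min = sum(scaled_mins) / reps
mean_max = sum(centered_maxs) / reps
print(round(mean_min, 2), round(mean_max, 2))
```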
(b) Suppose now that $X_1, \ldots, X_n$ are i.i.d. with distribution function $F$ satisfying $F(0) = 0$ and differentiable from the right at 0:
$$\lim_{x \downarrow 0} \frac{F(x) - F(0)}{x} = F'(0).$$
Show that $n M_n \to_d \text{Exponential}(F'(0))$.
Solution:

Consider
$$P(n M_n > x) = P\left( X_1 > \frac{x}{n},\ X_2 > \frac{x}{n},\ \ldots,\ X_n > \frac{x}{n} \right) = P\left( X_1 > \frac{x}{n} \right) \cdots P\left( X_n > \frac{x}{n} \right)$$
$$= \left[ P\left( X_1 > \frac{x}{n} \right) \right]^n = \left( 1 - F\left( \frac{x}{n} \right) \right)^n = \left( 1 - \frac{n F(x/n)}{n} \right)^n.$$
Because $F$ has a derivative at 0, we can use the formula:
$$n F(x/n) = x \cdot \frac{F(x/n) - F(0)}{x/n} \to x F'(0) \quad \text{as } n \to \infty.$$
Hence,
$$\left( 1 - \frac{n F(x/n)}{n} \right)^n \to \exp(-F'(0)\, x),$$
so $n M_n \to_d \text{Exponential}(F'(0))$, as required.
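As an illustration of part (b) (my own example, not from the source), take $F$ to be the Uniform$(0, 2)$ distribution function, so that $F'(0) = 1/2$ and $n M_n$ should be approximately Exponential$(1/2)$ with mean 2:

```python
import random

# Sketch: X_i ~ Uniform(0, 2) has F(x) = x/2 near 0, so F'(0) = 1/2 and
# n*M_n should be approximately Exponential(1/2), with mean 2.
random.seed(0)
n, reps = 500, 10_000
vals = [n * min(random.uniform(0.0, 2.0) for _ in range(n)) for _ in range(reps)]
mean = sum(vals) / reps
print(round(mean, 2))  # should be close to 2
```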
Problem #3

Suppose that $\operatorname{Var}(Y) < \infty$.

(a) Show that $\operatorname{Var}(Y) = E(\operatorname{Var}(Y \mid X)) + \operatorname{Var}(E(Y \mid X))$.

Solution:

We want to show that
$$E\left[ (Y - E(Y))^2 \right] = E\left\{ E\left[ (Y - E(Y \mid X))^2 \mid X \right] \right\} + E\left[ (E(Y \mid X) - E(Y))^2 \right].$$
By direct computation:
$$\operatorname{Var}(Y) = E\left[ (Y - E(Y))^2 \right] = E\left[ \left( Y - E(Y \mid X) + E(Y \mid X) - E(Y) \right)^2 \right]$$
$$= E\left[ (Y - E(Y \mid X))^2 \right] + 2 E\left[ (Y - E(Y \mid X))(E(Y \mid X) - E(Y)) \right] + E\left[ (E(Y \mid X) - E(Y))^2 \right].$$
Conditioning on $X$ shows that the middle term is 0. That is,
$$E\left[ (Y - E(Y \mid X))(E(Y \mid X) - E(Y)) \right] = E\left( E\left[ (Y - E(Y \mid X))(E(Y \mid X) - E(Y)) \mid X \right] \right)$$
$$= E\left( (E(Y \mid X) - E(Y))\, E\left[ Y - E(Y \mid X) \mid X \right] \right) = E\left( (E(Y \mid X) - E(Y)) \left( E(Y \mid X) - E(Y \mid X) \right) \right)$$
$$= E\left( (E(Y \mid X) - E(Y)) \cdot 0 \right) = 0.$$
Hence,
$$\operatorname{Var}(Y) = E(\operatorname{Var}(Y \mid X)) + \operatorname{Var}(E(Y \mid X)).$$
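Because the identity is exact, it can be verified to machine precision on any finite joint distribution. The sketch below (the probability table is an arbitrary choice of mine) computes both sides directly:

```python
# Sketch: verify Var(Y) = E(Var(Y|X)) + Var(E(Y|X)) exactly on a small
# discrete joint distribution (the table of probabilities is arbitrary).
joint = {  # (x, y): P(X = x, Y = y); probabilities sum to 1
    (0, 1): 0.1, (0, 2): 0.2, (0, 5): 0.1,
    (1, 1): 0.3, (1, 3): 0.2, (1, 5): 0.1,
}

ey = sum(p * y for (x, y), p in joint.items())
var_y = sum(p * (y - ey) ** 2 for (x, y), p in joint.items())

# marginal of X, then conditional mean and variance of Y given each x
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
ey_x = {x0: sum(p * y for (x, y), p in joint.items() if x == x0) / px[x0]
        for x0 in px}
vy_x = {x0: sum(p * (y - ey_x[x0]) ** 2
                for (x, y), p in joint.items() if x == x0) / px[x0]
        for x0 in px}

e_var = sum(px[x0] * vy_x[x0] for x0 in px)              # E(Var(Y|X))
var_e = sum(px[x0] * (ey_x[x0] - ey) ** 2 for x0 in px)  # Var(E(Y|X))
print(round(var_y, 10), round(e_var + var_e, 10))  # the two numbers agree
```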
(b) Interpret (a) geometrically.

Solution:

From the textbook definition, $Y - E(Y \mid X)$ is orthogonal to $E(Y \mid X) - E(Y)$ in the space $L^2(P)$. The identity derived in part (a) is then interpreted as a Pythagorean theorem equality: the squared length of $Y - E(Y)$ equals the sum of the squared lengths of its two orthogonal components.
(c) Use $E(Y) = E(E(Y \mid X))$ and part (a) to compute the mean and variance of the noncentral chi-square distribution.

Solution:

Suppose that $(Y \mid K) \sim \chi^2_{2K + n}$, where $K \sim \text{Poisson}(\lambda / 2)$. Then,
$$E(Y) = E(E(Y \mid K)) = E(2K + n) = n + 2\left( \frac{\lambda}{2} \right) = n + \lambda.$$
By part (a),
$$\operatorname{Var}(Y) = E(\operatorname{Var}(Y \mid K)) + \operatorname{Var}(E(Y \mid K)) = E(2(2K + n)) + \operatorname{Var}(2K + n)$$
$$= 2n + 4\left( \frac{\lambda}{2} \right) + 4\left( \frac{\lambda}{2} \right) = 2n + 4\lambda.$$
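Both moments can be checked by simulating the mixture directly. This sketch is my own (including the small stdlib-only Poisson sampler, which is an assumption, not part of the source):

```python
import math
import random

# Sketch: simulate Y via the mixture (Y | K) ~ chi^2_{n + 2K},
# K ~ Poisson(lambda/2), and compare the empirical moments with
# E(Y) = n + lambda and Var(Y) = 2n + 4*lambda from part (c).
random.seed(0)
n, lam, reps = 3, 5.0, 100_000

def sample_poisson(mu):
    # inverse-transform Poisson sampler (stdlib-only; fine for small mu)
    u, k = random.random(), 0
    p = math.exp(-mu)
    c = p
    while u > c:
        k += 1
        p *= mu / k
        c += p
    return k

ys = []
for _ in range(reps):
    df = n + 2 * sample_poisson(lam / 2.0)
    ys.append(sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df)))

mean = sum(ys) / reps
var = sum((y - mean) ** 2 for y in ys) / reps
print(round(mean, 2), round(var, 2))  # near n + lam = 8 and 2n + 4*lam = 26
```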
(d) Show that
$$\frac{\chi^2_n(\lambda) - (n + \lambda)}{\sqrt{2n + 4\lambda}} \to_d N(0, 1) \quad \text{as either } n \to \infty \text{ or } \lambda \to \infty.$$
Solution:

Suppose $Z_1, \ldots, Z_n$ are i.i.d. $N(0, 1)$, so that
$$\chi^2_n(\lambda) =_d (Z_1 + \sqrt{\lambda})^2 + Z_2^2 + \cdots + Z_n^2.$$
Then
$$\frac{\chi^2_n(\lambda) - (n + \lambda)}{\sqrt{2n + 4\lambda}} =_d \frac{(Z_1 + \sqrt{\lambda})^2 - (1 + \lambda) + (Z_2^2 - 1) + \cdots + (Z_n^2 - 1)}{\sqrt{2n + 4\lambda}}$$
$$= \frac{(Z_1^2 - 1) + \cdots + (Z_n^2 - 1)}{\sqrt{2n + 4\lambda}} + \frac{2\sqrt{\lambda}\, Z_1}{\sqrt{2n + 4\lambda}} = \frac{T_1 + \cdots + T_n}{\sqrt{2n + 4\lambda}} + \frac{2\sqrt{\lambda}\, Z_1}{\sqrt{2n + 4\lambda}},$$
where $T_i = Z_i^2 - 1$ has mean 0 and variance 2. Thus, for fixed $\lambda$,
$$\frac{2\sqrt{\lambda}\, Z_1}{\sqrt{2n + 4\lambda}} \to_p 0 \quad \text{and} \quad \frac{T_1 + \cdots + T_n}{\sqrt{2n + 4\lambda}} \to_d N(0, 1) \quad \text{as } n \to \infty$$
by the central limit theorem, since $\sqrt{2n}/\sqrt{2n + 4\lambda} \to 1$. Hence,
$$\frac{\chi^2_n(\lambda) - (n + \lambda)}{\sqrt{2n + 4\lambda}} \to_d N(0, 1) \quad \text{as } n \to \infty.$$
(For fixed $n$ and $\lambda \to \infty$ the roles reverse: $(T_1 + \cdots + T_n)/\sqrt{2n + 4\lambda} \to_p 0$ while $2\sqrt{\lambda}\, Z_1 / \sqrt{2n + 4\lambda} \to_d N(0, 1)$, giving the same limit.)
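A hedged numerical check of the $\lambda \to \infty$ direction (parameter values are arbitrary choices of mine): with $n$ fixed and $\lambda$ large, the standardized statistic should already have mean near 0 and variance near 1.

```python
import math
import random

# Sketch: check the normal approximation in the lambda -> infinity direction
# with n fixed: for large lambda the statistic is dominated by the
# 2*sqrt(lambda)*Z_1 / sqrt(2n + 4*lambda) term.
random.seed(0)
n, lam, reps = 3, 400.0, 50_000
stats = []
for _ in range(reps):
    z = [random.gauss(0.0, 1.0) for _ in range(n)]
    y = (z[0] + math.sqrt(lam)) ** 2 + sum(v * v for v in z[1:])
    stats.append((y - (n + lam)) / math.sqrt(2 * n + 4 * lam))

m = sum(stats) / reps
v = sum((s - m) ** 2 for s in stats) / reps
print(round(m, 2), round(v, 2))  # should be near 0 and 1
```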
Problem #4

Give an example of random variables such that $X_n \to_p 0$ but $E(X_n) \not\to 0$.

Solution:

Consider the random variable $X_n$ such that
$$X_n = \begin{cases} n^2 & \text{with probability } p_n = 1/n^2, \\ 0 & \text{otherwise.} \end{cases}$$
Clearly $X_n \to_p 0$, since $P(|X_n| > \epsilon) = 1/n^2 \to 0$ for any $\epsilon > 0$. Then,
$$E(X_n) = n^2 p_n = n^2 \left( \frac{1}{n^2} \right) = 1 \quad \text{for all } n.$$
( )