Quiz-1
2019-2020-II Semester
Model Solutions
Problem No. 1:
In a probability space (Ω, F, P), let A, B and C be events whose probabilities P(A), P(B), P(C), P(A ∩ B), P(A ∩ C), P(B ∩ C) and P(A ∩ B ∩ C) are given. Let D be the event that exactly one of the events A, B and C occurs. Derive the value of P(D).
Solution: Writing D as the disjoint union of A ∩ Bᶜ ∩ Cᶜ, Aᶜ ∩ B ∩ Cᶜ and Aᶜ ∩ Bᶜ ∩ C, and expanding each term by inclusion-exclusion,
P(D) = P(A) + P(B) + P(C) − 2[P(A ∩ B) + P(A ∩ C) + P(B ∩ C)] + 3P(A ∩ B ∩ C),
into which the given values are substituted.
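As a sanity check on this identity (an addition to the model solution, not part of it), it can be verified exactly on a small finite sample space; the events A, B, C below are arbitrary illustrative choices, since the quiz's numerical values are not recoverable here.

from fractions import Fraction

# Hypothetical events on the equally likely sample space {0, ..., 59};
# stand-ins for the quiz's events, whose stated probabilities are illegible.
omega = set(range(60))
A = set(range(0, 30))
B = set(range(20, 40))
C = set(range(25, 55))

def P(E):
    return Fraction(len(E), len(omega))

# D = exactly one of A, B, C occurs.
D = {w for w in omega if (w in A) + (w in B) + (w in C) == 1}

rhs = (P(A) + P(B) + P(C)
       - 2 * (P(A & B) + P(A & C) + P(B & C))
       + 3 * P(A & B & C))
assert P(D) == rhs
print(P(D))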
Problem No. 2:
Let X be a random variable with the distribution function
F(x) = 0, if x < 0; = 1/4, if 0 ≤ x < 1; = 1/3, if 1 ≤ x < 2; = 1, if x ≥ 2.
Derive the values of P(X ∈ {0, 1, 2}), P(1 < X < 2), P(1 ≤ X < 2) and P(1 < X ≤ 2). [3+2+2+2 = 9 Marks]
Solution: Since P(X = a) = F(a) − F(a−),
P(X ∈ {0, 1, 2}) = P(X = 0) + P(X = 1) + P(X = 2) = [F(0) − F(0−)] + [F(1) − F(1−)] + [F(2) − F(2−)] = 1/4 + (1/3 − 1/4) + (1 − 1/3) = 1/4 + 1/12 + 2/3 = 1;
P(1 < X < 2) = F(2−) − F(1) = 1/3 − 1/3 = 0;
P(1 ≤ X < 2) = F(2−) − F(1−) = 1/3 − 1/4 = 1/12;
P(1 < X ≤ 2) = F(2) − F(1) = 1 − 1/3 = 2/3.
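The same bookkeeping can be scripted; this short sketch (an addition, not part of the model solution) encodes the d.f. above with exact rational arithmetic and reads each probability off one-sided limits.

from fractions import Fraction as Fr

def F(x):
    # Distribution function from Problem 2.
    if x < 0:
        return Fr(0)
    if x < 1:
        return Fr(1, 4)
    if x < 2:
        return Fr(1, 3)
    return Fr(1)

def F_minus(x):
    # Left limit F(x-); a small step suffices since jumps sit at integers.
    return F(x - Fr(1, 1000))

pX = lambda a: F(a) - F_minus(a)          # P(X = a)
print(pX(0) + pX(1) + pX(2))              # P(X in {0,1,2}) = 1
print(F_minus(2) - F(1))                  # P(1 < X < 2)  = 0
print(F_minus(2) - F_minus(1))            # P(1 <= X < 2) = 1/12
print(F(2) - F(1))                        # P(1 < X <= 2) = 2/3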
Problem No. 3:
Let F : ℝ → ℝ be defined by
F(x) = 0, if x < 0; = (x + 4)/… , if 0 ≤ x < 2; = (x² + 4b)/… , if 2 ≤ x < 4; = (x + c)/… , if 4 ≤ x ≤ 6; = 1, if x > 6,
where b and c are real constants (the denominators of the middle pieces are not legible in the scan). Find the values of b and c so that F is the distribution function of some random variable.
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-V
Instructor: Neeraj Misra
1. For µ ∈ R and λ > 0, let Xµ,λ be a random variable having the p.d.f.
fµ,λ(x) = (1/λ) e^{−(x−µ)/λ}, if x ≥ µ, and = 0, otherwise.
(a) Find Cr(µ, λ) = E((Xµ,λ − µ)^r), r ∈ {1, 2, . . .}, and µ′r(µ, λ) = E((Xµ,λ)^r), r ∈ {1, 2};
(b) For p ∈ (0, 1), find the p-th quantile ξp ≡ ξp (µ, λ) of the distribution of Xµ,λ
(Fµ,λ (ξp ) = p, where Fµ,λ is the distribution function of Xµ,λ );
(c) Find the lower quartile q1 (µ, λ), the median m(µ, λ) and the upper quartile
q3 (µ, λ) of the distribution of Xµ,λ ;
(e) Find the standard deviation σ(µ, λ), the mean deviation about median MD(m(µ, λ)),
the inter-quartile range IQR(µ, λ), the quartile deviation (or semi-inter-quartile
range) QD(µ, λ), the coefficient of quartile deviation CQD(µ, λ) and the coefficient
of variation CV(µ, λ) of the distribution of Xµ,λ ;
(f) Find the coefficient of skewness β1 (µ, λ) and the Yule coefficient of skewness
β2 (µ, λ) of the distribution of Xµ,λ ;
(h) Based on the values of the measures of skewness and the kurtosis of the distribution of Xµ,λ, comment on the shape of fµ,λ.
2. Let X be a random variable with p.d.f. fX (x) that is symmetric about µ (∈ R),
i.e., fX (x + µ) = fX (µ − x), ∀ x ∈ (−∞, ∞).
(a) If q1, m and q3 are respectively the lower quartile, the median and the upper quartile of the distribution of X, then show that µ = m = (q1 + q3)/2;
(b) If E(X) is finite, then show that E(X) = µ = m = (q1 + q3)/2.
3. A fair die is rolled six times independently. Find the probability that on two occasions we get an upper face with 2 or 3 dots.
4. Let n (≥ 2) and r ∈ {1, 2, . . . , n − 1} be fixed integers and let p ∈ (0, 1) be a fixed real number. Using probabilistic arguments show that
∑_{j=r}^{n} C(n, j) p^j (1 − p)^{n−j} − ∑_{j=r}^{n−1} C(n−1, j) p^j (1 − p)^{n−1−j} = C(n−1, r−1) p^r (1 − p)^{n−r},
where C(n, j) denotes the binomial coefficient.
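A numerical check of the identity (added; the C(n−1, r−1) on the right-hand side is the reading forced by the left-hand side, which equals P(Bin(n, p) ≥ r) − P(Bin(n−1, p) ≥ r)):

from math import comb, isclose

def lhs(n, r, p):
    q = 1 - p
    a = sum(comb(n, j) * p**j * q**(n - j) for j in range(r, n + 1))
    b = sum(comb(n - 1, j) * p**j * q**(n - 1 - j) for j in range(r, n))
    return a - b

def rhs(n, r, p):
    return comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)

for n, r, p in [(2, 1, 0.3), (6, 2, 0.5), (10, 7, 0.25)]:
    assert isclose(lhs(n, r, p), rhs(n, r, p))
print("identity verified")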
5. Let X ∼ Bin(n, p), where n is a positive integer and p ∈ (0, 1). Find the mode of
the distribution of X.
6. Eighteen balls are placed at random in seven boxes that are labeled B1 , . . . , B7 .
Find the probability that boxes with labels B1 , B2 and B3 all together contain six
balls.
7. Let T be a discrete type random variable with support ST = {0, 1, 2, . . .}. Show that
T has the lack of memory property if, and only if, T ∼ Ge(p), for some p ∈ (0, 1).
8. A person repeatedly rolls a fair die independently until an upper face with two or three dots is observed twice. Find the probability that the person would require eight rolls to achieve this.
11. A mathematician carries at all times two match boxes, one in his left pocket and
one in his right pocket. To begin with each match box contains n matches. Each
time the mathematician needs a match he is equally likely to take it from either
pocket. Consider the moment when the mathematician for the first time discovers
that one of the match boxes is empty. Find the probability that at that moment
the other box contains exactly k matches, where k ∈ {0, 1, . . . , n}.
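A quick simulation (added here as a plausibility check, not part of the assignment) estimates these probabilities; they can be compared with the classical answer C(2n − k, n)(1/2)^{2n−k}.

import random
from math import comb

def simulate(n, trials=200_000):
    counts = [0] * (n + 1)
    for _ in range(trials):
        left = right = n
        while True:
            if random.random() < 0.5:     # reaches into the left pocket
                if left == 0:             # left box found empty
                    counts[right] += 1
                    break
                left -= 1
            else:
                if right == 0:            # right box found empty
                    counts[left] += 1
                    break
                right -= 1
    return [c / trials for c in counts]

random.seed(1)
n = 5
est = simulate(n)
exact = [comb(2 * n - k, n) * 0.5 ** (2 * n - k) for k in range(n + 1)]
for k in range(n + 1):
    print(k, round(est[k], 4), round(exact[k], 4))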
14. Urn Ui (i = 1, 2) contains Ni balls out of which ri are red and Ni − ri are black.
A sample of n (1 ≤ n ≤ N1 ) balls is chosen at random (without replacement) from
urn U1 and all the balls in the selected sample are transferred to urn U2 . After the
transfer two balls are drawn at random from the urn U2 . Find the probability that
both the balls drawn from urn U2 are red.
15. Let X ∼ Po(λ), where λ > 0. Find the mode of the distribution of X.
18. A person has to open a lock whose key is lost among a set of N keys. Assume that
out of these N keys only one can open the lock. To open the lock the person tries
keys one by one by choosing, at each attempt, one of the keys at random from the
unattempted keys. The unsuccessful keys are not considered for future attempts.
Let Y denote the number of attempts the person will have to make to open the lock.
Show that Y ∼ U ({1, 2, . . . , N }) and hence find the mean and the variance of the
r.v. Y .
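A short simulation (an added illustration) supports the claim that Y is uniform on {1, . . . , N}, with mean (N + 1)/2 and variance (N² − 1)/12.

import random
from collections import Counter

random.seed(0)
N, trials = 6, 100_000
counts = Counter()
for _ in range(trials):
    keys = list(range(N))       # key 0 is the one that opens the lock
    random.shuffle(keys)        # random order of attempts
    counts[keys.index(0) + 1] += 1

for y in range(1, N + 1):
    print(y, round(counts[y] / trials, 4))      # each close to 1/N
print("mean:", (N + 1) / 2, " variance:", (N * N - 1) / 12)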
19. Let a > 0 be a real constant. A point X is chosen at random on the interval (0, a)
(i.e., X ∼ U (0, a)).
(a) If Y denotes the area of an equilateral triangle having sides of length X, find the mean and variance of Y;
(b) If the point X divides the interval (0, a) into subintervals I1 = (0, X) and I2 = [X, a), find the probability that the larger of these two subintervals is at least double the size of the smaller subinterval.
20. Using a random observation U ∼ U (0, 1), describe a method to generate a random
observation X from the distribution having
(a) probability density function
f(x) = e^{−|x|}/2, −∞ < x < ∞;
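One standard method is inverse transformation of the d.f. F(x) = e^x/2 for x < 0 and F(x) = 1 − e^{−x}/2 for x ≥ 0; the sketch below (an added illustration, not the official answer) implements it and checks the first two moments.

import math
import random

def laplace_from_uniform(u):
    # Inverse of F(x) = exp(x)/2 (x < 0), 1 - exp(-x)/2 (x >= 0).
    if u < 0.5:
        return math.log(2 * u)
    return -math.log(2 * (1 - u))

random.seed(0)
xs = [laplace_from_uniform(random.random()) for _ in range(100_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(round(mean, 3), round(var, 3))   # theoretical mean 0, variance 2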
21. Let X ∼ U (0, θ), where θ is a positive integer. Let Y = X − [X], where [x] is the
largest integer ≤ x. Show that Y ∼ U (0, 1).
22. Let U ∼ U (0, 1) and let X be the root of the equation 3t2 − 2t3 − U = 0. Show that
X has p.d.f. f (x) = 6x(1 − x), if 0 ≤ x ≤ 1, = 0, otherwise.
23. (Relation between gamma and Poisson probabilities) For t > 0, θ > 0 and n ∈ {1, 2, . . .}, using integration by parts, show that
(1/(θ^n Γ(n))) ∫_t^∞ e^{−x/θ} x^{n−1} dx = ∑_{j=0}^{n−1} e^{−t/θ} (t/θ)^j / j!.
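The two sides can be compared numerically: after the substitution y = x/θ the left-hand side is the regularized upper incomplete gamma function Q(n, t/θ). This added sketch assumes SciPy is available.

import math
from scipy.special import gammaincc    # regularized upper incomplete gamma

t, theta, n = 2.5, 1.7, 4
lam = t / theta

# Left side: (1/(theta^n Gamma(n))) * integral over (t, inf) = Q(n, t/theta).
lhs = gammaincc(n, lam)
# Right side: P(Poisson(t/theta) <= n - 1).
rhs = sum(math.exp(-lam) * lam**j / math.factorial(j) for j in range(n))
print(lhs, rhs)                        # the two values agree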
24. Let Y be a random variable of continuous type with FY (0) = 0, where FY (·) is the
distribution function of Y . Show that Y has the lack of memory property if, and
only if, Y ∼ Exp(θ), for some θ > 0.
27. Let X = (X1, X2)′ ∼ N2(µ1, µ2, σ1², σ2², ρ) and, for constants a1, a2, a3 and a4 (ai ≠ 0, i = 1, 2, 3, 4, a1a4 ≠ a2a3), let Y = a1X1 + a2X2 and Z = a3X1 + a4X2.
(a) Find the joint p.d.f. of (Y, Z);
(b) Find the marginal p.d.f.s. of Y and Z.
28. (a) Let (X, Y)′ ∼ N2(5, 8, 16, 9, 0.6). Find P({5 < Y < 11}|{X = 2}), P({4 < X < 6}) and P({7 < Y < 9});
(b) Let (X, Y)′ ∼ N2(5, 10, 1, 25, ρ), where ρ > 0. If P({4 < Y < 16}|{X = 5}) = 0.954, determine ρ.
31. Let X = (X1, X2)′ have the joint p.d.f.
f(x, y) = (1/π) e^{−(x²+y²)/2}, if xy > 0, and = 0, otherwise.
Show that marginals of f (·, ·) are each N (0, 1) but f (·, ·) is not the p.d.f. of a
bivariate normal distribution.
32. Let fr (·, ·), −1 < r < 1, denote the pdf of N2 (0, 0, 1, 1, r) and, for a fixed ρ ∈ (−1, 1),
let the random variable (X, Y ) have the joint p.d.f.
gρ(x, y) = (1/2)[fρ(x, y) + f−ρ(x, y)].
Show that X and Y are normally distributed but the distribution of (X, Y ) is not
bivariate normal.
(b) Consider 7 independent casts of a pair of fair dice. Find the probability that we get a sum of 12 once and a sum of 8 twice.
35. (a) Let X ∼ F(n1, n2). Show that Y = n2/(n2 + n1X) ∼ Be(n2/2, n1/2);
(b) Let Z1 and Z2 be i.i.d. N(0, 1) random variables. Show that Z1/|Z2| ∼ t1 and Z1/Z2 ∼ t1 (Cauchy distribution);
(c) Let X1 and X2 be i.i.d. Exp(θ) random variables, where θ > 0. Show that Z = X1/X2 has an F-distribution;
(d) Let X1, X2 and X3 be independent random variables with Xi ∼ χ²_{ni}, i = 1, 2, 3. Show that (X1/n1)/(X2/n2) and ((X1 + X2)/(n1 + n2))/(X3/n3) are independent F-variables.
36. Let X1 , . . . , Xn be a random sample from Exp(θ) distribution, where θ > 0. Let
X1:n ≤ · · · ≤ Xn:n denote the order statistics of X1 , . . . , Xn . Define Z1 = nX1:n , Zi =
(n − i + 1)(Xi:n − Xi−1:n ), i = 2, . . . , n. Show that Z1 , . . . , Zn are independent and
identically distributed Exp(θ) random variables. Hence find the mean and variance
of Xr:n , r = 1, . . . , n. Also, for 1 ≤ r < s ≤ n, find Cov(Xr:n , Xs:n ).
[Handwritten model solutions to the Assignment-V problems above (the binomial and negative-binomial probability computations, the matchbox problem, the modes of the Bin(n, p) and Po(λ) distributions, and the bivariate-normal problems); the handwriting did not survive text extraction.]
ESO 209: Probability and Statistics
2019-2020-II Semester
Assignment No. 7
1. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (·|θ), where θ ∈ Θ is unknown, and let the estimand be g(θ). In each of the follow-
ing situations, find the M.M.E. and the M.L.E.. Also verify if they are consistent
estimators of g(θ).
(a) f (x|θ) = θ(1 − θ)x−1 , if x = 1, 2, . . ., and = 0, otherwise; Θ = (0, 1); g(θ) = θ.
(b) X1 ∼ Poisson(θ); Θ = (0, ∞); g(θ) = eθ .
(c) f(x|θ) = θ1, if x = 1, = (1 − θ1)/(θ2 − 1), if x = 2, 3, . . . , θ2, and = 0, otherwise; θ = (θ1, θ2);
Θ = {(z1, z2) : 0 < z1 < 1, z2 ∈ {2, 3, . . .}}; g(θ) = (θ1, θ2).
(d) f (x|θ) = K(θ)xθ (1 − x), if 0 ≤ x ≤ 1, and = 0, otherwise; Θ = (−1, ∞);
g(θ) = θ; here K(θ) is the normalizing factor.
(e) X1 ∼ Gamma(α, µ); θ = (α, µ); Θ = (0, ∞) × (0, ∞); g(θ) = (α, µ).
(f) f(x|θ) = (σ√(2π))^{−1} x^{−1} exp(−(ln x − µ)²/(2σ²)), if x > 0, and = 0, otherwise; θ = (µ, σ); Θ = (−∞, ∞) × (0, ∞); g(θ) = (µ, σ).
(g) X1 ∼ Exp(θ); Θ = (0, ∞); g(θ) = Pθ (X1 ≤ 1).
(h) X1 ∼ Poisson(θ); Θ = (0, ∞); g(θ) = Pθ (X1 + X2 + X3 = 0).
(i) X1 ∼ U(−θ/2, θ/2); Θ = (0, ∞); g(θ) = (1 + θ)^{−1}.
(j) X1 ∼ N(µ, σ²); θ = (µ, σ²); Θ = (−∞, ∞) × (0, ∞); g(θ) = µ²/σ².
(k) f(x|θ) = σ^{−1} exp(−(x − µ)/σ), if x > µ, and = 0, otherwise; θ = (µ, σ); Θ = (−∞, ∞) × (0, ∞); g(θ) = (µ, σ).
(l) X1 ∼ U (θ1 , θ2 ); θ = (θ1 , θ2 ); Θ = {(z1 , z2 ) : −∞ < z1 < z2 < ∞}; g(θ) = (θ1 , θ2 ).
2. Suppose a randomly selected sample of size five from the distribution having p.m.f.
given in Problem 1 (a) gives the following data: x1 = 2, x2 = 7, x3 = 6, x4 = 5
and x5 = 9. Based on this data compute the m.l.e. of Pθ (X1 ≥ 4).
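By the invariance property of M.L.E.s, θ̂ = 1/x̄ for the p.m.f. of Problem 1(a), and the m.l.e. of Pθ(X1 ≥ 4) = (1 − θ)³ is (1 − θ̂)³; the arithmetic, as an added sketch:

from fractions import Fraction

data = [2, 7, 6, 5, 9]
xbar = Fraction(sum(data), len(data))     # 29/5
theta_hat = 1 / xbar                      # m.l.e. of theta: 5/29
p_hat = (1 - theta_hat) ** 3              # m.l.e. of P(X1 >= 4)
print(theta_hat, p_hat, float(p_hat))     # 5/29, (24/29)^3 ~ 0.567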
3. The lifetimes of a brand of a component are assumed to be exponentially distributed
with mean (in hours) θ, where θ ∈ Θ = (0, ∞) is unknown. Ten of these components were independently put on test. The only data recorded were the number of
components that had failed in less than 100 hours versus the number that had not
failed. It was found that three had failed before 100 hours. What is the m.l.e. of θ?
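Since only the failed/not-failed counts were recorded, the likelihood is binomial in p = 1 − e^{−100/θ}, whose m.l.e. is 3/10; invariance then gives θ̂. An added sketch of the computation:

import math

n_tested, n_failed, t0 = 10, 3, 100.0
p_hat = n_failed / n_tested               # m.l.e. of p = 1 - exp(-t0/theta)
theta_hat = -t0 / math.log(1 - p_hat)     # invariance: solve p for theta
print(round(theta_hat, 2))                # about 280.37 hours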
4. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (·|θ), where θ ∈ Θ is an unknown parameter. In each of the following situations,
find the M.L.E. of θ and verify if it is a consistent estimator of θ.
(a) X1 ∼ N(θ, 1), Θ = [0, ∞). (b) X1 ∼ Bin(1, θ), Θ = [1/4, 3/4].
5. Let X1 , . . . , Xn be a random sample from a distribution having mean µ and finite
variance σ 2 . Show that X and S 2 are unbiased estimators of µ and σ 2 , respectively.
6. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (x|θ), where θ ∈ Θ is unknown, and let g(θ) be the estimand. In each of the
following situations, find the M.L.E., say δM (X), and the unbiased estimator based
on the M.L.E., say δU (X).
(a) X1 ∼ Exp(θ); Θ = (0, ∞); g(θ) = θr , for some known positive integer r.
(b) n ≥ 2, X1 ∼ N(µ, σ²); θ = (µ, σ²); Θ = (−∞, ∞) × (0, ∞); g(θ) = µ + σ.
(c) Same as (b) with g(θ) = µ/σ.
(d) X1 ∼ Poisson(θ); Θ = (0, ∞); g(θ) = eθ .
7. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (x|θ), where θ ∈ Θ is unknown, and let g(θ) be the estimand. In each of the
following situations, find the M.L.E., say δM (X), and the unbiased estimator based
on the M.L.E., say δU (X). Also compare the m.s.e.s of δM and δU .
(a) f(x|θ) = e^{−(x−θ)}, if x > θ, and = 0, otherwise; Θ = (−∞, ∞); g(θ) = θ.
(b) n ≥ 2, f(x|θ) = (1/σ) e^{−(x−µ)/σ}, if x > µ, and = 0, otherwise; θ = (µ, σ); Θ = (−∞, ∞) × (0, ∞); g(θ) = µ.
(c) Same as (b) with g(θ) = σ.
(d) X1 ∼ Exp(θ); Θ = (0, ∞); g(θ) = θ.
(e) X1 ∼ U (0, θ); Θ = (0, ∞); g(θ) = θr , for some known positive integer r.
(f) X1 ∼ N (θ, 1); Θ = (−∞, ∞); g(θ) = θ2 .
8. Let X1 , X2 be a random sample from a distribution having p.d.f. (or p.m.f.) f (·|θ),
where θ ∈ Θ is unknown, and let the estimand be g(θ). Show that given any unbiased
estimator, say δ(X), which is not permutation symmetric (i.e., Pθ (δ(X1 , X2 ) =
δ(X2 , X1 )) < 1, for some θ ∈ Θ), there exists a permutation symmetric and unbiased
estimator δU(X) which is better than δ(·). Can you extend this result to the case when we have a random sample consisting of n (≥ 2) observations?
9. Consider a single observation X from a distribution having p.m.f. f (x|θ) = θ, if
x = −1, = (1 − θ)2 θx , if x = 0, 1, 2, . . ., and = 0, otherwise, where θ ∈ Θ = (0, 1) is
an unknown parameter. Determine all unbiased estimators of θ.
10. Let X1, . . . , Xn (n ≥ 2) be a random sample from a distribution having p.d.f.
f(x|θ) = (1/σ) e^{−(x−µ)/σ}, if x > µ, and = 0, otherwise,
where θ = (µ, σ) is unknown. Among the estimators of µ which are based on the M.L.E. and belong to the class D = {δc(X) : δc(X) = X(1) − cT, c > 0}, find the estimator having the smallest m.s.e. at each parametric point.
[Handwritten model solutions to Assignment No. 7 (method-of-moments and maximum-likelihood estimators, unbiased estimators based on the M.L.E., and the m.s.e. comparisons); the handwriting did not survive text extraction.]
MSO201a: Probability and Statistics
2019-20-II Semester
Assignment No. 6
Instructor: Neeraj Misra
1. Let X1, X2, . . . be a sequence of r.v.s such that Xn, n = 1, 2, . . ., has the d.f.
Fn(x) = 0, if x < −n, = (x + n)/(2n), if −n ≤ x < n, and = 1, if x ≥ n.
Does Fn(·) converge to a d.f., as n → ∞?
4. (a) If Xn →P a and Xn →P b, then show that a = b. (Throughout, →P denotes convergence in probability and →d convergence in distribution.)
(b) Let a and r > 0 be real numbers. If E(|Xn − a|^r) → 0, as n → ∞, then show that Xn →P a.
5. (a) For r > 0 and t > 0, show that
E(|X|^r/(1 + |X|^r)) − t^r/(1 + t^r) ≤ P(|X| ≥ t) ≤ ((1 + t^r)/t^r) E(|X|^r/(1 + |X|^r));
(b) Show that Xn →P 0 ⇔ E(|Xn|^r/(1 + |Xn|^r)) → 0, for some r > 0.
6. (a) If {Xn}n≥1 are identically distributed and an → 0, then show that anXn →P 0.
(b) If Yn ≤ Xn ≤ Zn, n = 1, 2, . . ., Yn →P a and Zn →P a, then show that Xn →P a.
(c) If Xn →P c and an → a, then show that, as n → ∞, Xn + an →P c + a and anXn →P ac.
(d) Let Xn = min(|Yn|, a), n = 1, 2, . . ., where a is a positive constant. Show that Xn →P 0 ⇔ Yn →P 0.
7. Let X1 , X2 , . . . be a sequence of i.i.d. r.v.s with mean µ and finite variance. Show
that:
(a) (2/(n(n + 1))) ∑_{i=1}^{n} iXi →P µ;
(b) (6/(n(n + 1)(2n + 1))) ∑_{i=1}^{n} i²Xi →P µ.
9. (a) Let Xn ∼ Gamma(1/n, n), n = 1, 2, . . .. Show that Xn →P 1.
(b) Let Xn ∼ N(1/n, 1 − 1/n), n = 1, 2, . . .. Show that Xn →d Z ∼ N(0, 1).
10. (a) Let f(x) = 1/x², if 1 ≤ x < ∞, and = 0, elsewhere, be the p.d.f. of a r.v. X. Consider a random sample of size 72 from the distribution having p.d.f. f(·). Compute, approximately, the probability that more than 50 of the items of the random sample are less than 3.
(b) Let X1, X2, . . . be a random sample from the Poisson(3) distribution and let Y = ∑_{i=1}^{100} Xi. Find, approximately, P(100 ≤ Y ≤ 200).
(c) Let X ∼ Bin(25, 0.6). Find, approximately, P(10 ≤ X ≤ 16). What is the exact value of this probability?
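For part (c), the normal approximation with continuity correction (µ = np = 15, σ² = np(1 − p) = 6) can be compared against the exact binomial sum; an added sketch:

import math

def Phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, p = 25, 0.6
mu, sd = n * p, math.sqrt(n * p * (1 - p))

# Normal approximation with continuity correction.
approx = Phi((16.5 - mu) / sd) - Phi((9.5 - mu) / sd)
# Exact binomial probability.
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(10, 17))
print(round(approx, 4), round(exact, 4))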
11. (a) Show that lim_{n→∞} e^{−n} ∑_{k=0}^{n} n^k/k! = 1/2;
(b) Show that lim_{n→∞} 2^{−n} ∑_{k=0}^{rn} C(n, k) = 1/2, where rn is the largest integer ≤ n/2.
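Both limits are CLT statements in disguise (about a Poisson(n) and a Bin(n, 1/2) variable, respectively); a quick numeric check of part (a), added here:

import math

def poisson_cdf_at_n(n):
    # e^{-n} * sum_{k=0}^{n} n^k / k!, accumulated via log-terms so that
    # large n does not overflow.
    log_term = -n                       # log of e^{-n} n^0 / 0!
    total = math.exp(log_term)
    for k in range(1, n + 1):
        log_term += math.log(n) - math.log(k)
        total += math.exp(log_term)
    return total

for n in (10, 100, 1000, 10000):
    print(n, round(poisson_cdf_at_n(n), 6))   # tends to 1/2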
12. (a) If Tn = max(|X1|, . . . , |Xn|) →P 0, as n → ∞, then show that X̄n →P 0. Is the conclusion true if only Sn = max(X1, . . . , Xn) →P 0?
(b) If {Xn}n≥1 are i.i.d. U(0, 1) r.v.s and Zn = (∏_{i=1}^{n} Xi)^{1/n}, n = 1, 2, . . ., find a real α such that Zn →P α.
14. Let X1, X2, . . . be a sequence of i.i.d. r.v.s having the common Cauchy p.d.f. f(x) = (1/π) · 1/(1 + x²), −∞ < x < ∞.
(a) For any α ∈ (0, 1), show that Y = αX1 + (1 − α)X2 again has the Cauchy p.d.f. f(·).
(b) Note that X̄_{n+1} = (n/(n + 1)) X̄_n + (1/(n + 1)) X_{n+1} and hence, using induction, conclude that X̄_n has the same distribution as X1.
(c) Show that X̄_n does not converge in probability to any constant. (Note that E(X1) does not exist and hence the WLLN is not guaranteed.)
15. Let Xn ∼ Poisson(4n), n = 1, 2, . . ., and let Yn = Xn/n, n = 1, 2, . . ..
(a) Show that Yn →P 4; (b) Show that Yn² + √Yn →P 18;
(c) Show that (n²Yn² + nYn)/(nYn + n²) →P 16.
16. Let X̄n be the sample mean computed from a random sample of size n from a distribution with mean µ (−∞ < µ < ∞) and variance σ² (0 < σ < ∞). Let Zn = √n(X̄n − µ)/σ.
(a) If Yn →P 4, show that: 4Zn/Yn →d Z ∼ N(0, 1); 16Zn²/Yn² →d U ∼ χ²₁; and (4n + Yn)Zn/(nYn + Yn²) →d Z ∼ N(0, 1).
(b) If σ = 1 and µ > 0, show that √n(ln X̄n − ln µ) →d V ∼ N(0, 1/µ²);
(c) Show that n^δ(X̄n − µ)/σ →P 0, for any δ < 0.5.
(d) Find the asymptotic distributions of: (i) √n(X̄n² − µ²); (ii) √n(X̄n − µ)²; and (iii) n(X̄n − µ)².
17. Let X1, X2, . . . be i.i.d. r.v.s having the Exp(θ) (θ > 0) distribution and let X̄n = (∑_{i=1}^{n} Xi)/n, n = 1, 2, . . .. Show that √n(1/X̄n − 1/θ) →d N(0, 1/θ²), as n → ∞.
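A simulation sketch (added) illustrating this delta-method claim; it assumes Exp(θ) is parametrized by its mean θ, consistent with 1/X̄n estimating 1/θ.

import math
import random

random.seed(0)
theta, n, reps = 2.0, 400, 5000

vals = []
for _ in range(reps):
    xbar = sum(random.expovariate(1 / theta) for _ in range(n)) / n
    vals.append(math.sqrt(n) * (1 / xbar - 1 / theta))

m = sum(vals) / reps
v = sum((x - m) ** 2 for x in vals) / reps
print(round(m, 3), round(v, 3), "limiting variance:", 1 / theta**2)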
18. Let X1, X2, . . . be a sequence of i.i.d. U(0, 1) r.v.s. For the sequence of geometric means Gn = (∏_{i=1}^{n} Xi)^{1/n}, n = 1, 2, . . ., show that √n(Gn − 1/e) →d N(0, σ²), for some σ² > 0. Find σ².
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-IV
Instructor: Neeraj Misra
1. Let
F(x, y) = 1, if x + 2y ≥ 1, and = 0, if x + 2y < 1.
Does F(·, ·) define a d.f.?
2. Let
F(x, y) = 0, if x < 0 or x + y < 1 or y < 0, and = 1, otherwise.
Does F(·, ·) define a d.f.?
(a) Verify that F is a d.f.; (b) Determine whether X is of discrete or continuous type; (c) Find the marginal distribution functions of X1 and X2; (d) Find P(1/2 ≤ X1 ≤ 1, 1/4 < X2 < 1/2), P(X1 = 1) and P(X1 ≥ 3/2, X2 < 1/4); (e) Are X1 and X2 independent?
5. Let the r.v. X = (X1, X2)′ have the joint p.m.f.
fX(x1, x2) = c(x1 + 2x2), if x1 = 1, 2, x2 = 1, 2, and = 0, otherwise,
where c is a real constant. (a) Find the constant c; (b) Find marginal p.m.f.s of
X1 and X2; (c) Find conditional variance of X2 given X1 = x1, x1 = 1, 2; (d) Find P(X1 < 3X2/2), P(X1 = X2), P(X1 ≥ X2/2) and P(X1 + X2 ≤ 3); (e) Find ρ(X1, X2);
(f) Are X1 and X2 independent?
where c is a real constant. (a) Find the constant c; (b) Find marginal p.m.f.s of
X1 and X2 ; (c) Find conditional variance of X2 given X1 = 1; (d) Find P (X1 >
X2), P(X1 = X2), P(X1 < (3/2)X2) and P(X1 + X2 ≥ 3); (e) Find ρ(X1, X2); (f) Are
X1 and X2 independent?
where c : (0, 1) → R is a given function. (a) Determine c(x), 0 < x < 1; (b) Find
marginal p.d.f. of Y ; (c) Find the conditional variance of X given Y = y, y ∈ (0, 1);
(d) Find P(X < Y/2), P(X + Y ≥ 3/4) and P(X = 2Y); (e) Find ρ(X, Y); (f) Are X
and Y independent?
where c is a real constant. (a) Find the value of constant c; (b) Find the marginal p.d.f. of X2; (c) Find the conditional variance of X2 given (X1, X3) = (x, y), 0 < y < x < 1; (d) Find P(X2 < X1/2) and P(X3 = 2X2 > X1/2); (e) Find ρ(X1, X2); (f) Are X1, X2, X3 independent?
where A = {(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)}. (a) Are X1 , X2 , X3 independent?;
(b) Are X1 , X2 , X3 pairwise independent?; (c) Are X1 + X2 and X3 independent?
fX(x1, x2, x3) = (2π)^{−3/2} e^{−(x1²+x2²+x3²)/2} [1 + x1x2x3 e^{−(x1²+x2²+x3²)/2}], −∞ < xi < ∞, i = 1, 2, 3,
and = 0, elsewhere. (a) Are X + Y and Z independent?; (b) Find ρ = Corr(X + Y, Z).
13. Let X1, . . . , Xn be n r.v.s with E(Xi) = µi, Var(Xi) = σi² and ρij = Corr(Xi, Xj), i, j = 1, . . . , n, i ≠ j. For real numbers ai, bi, i = 1, . . . , n, define Y = ∑_{i=1}^{n} aiXi
14. Let X and Y be jointly distributed random variables with E(X) = E(Y) = 0, E(X²) = E(Y²) = 2 and Corr(X, Y) = 1/3. Find Corr(X/3 + 2Y/3, 2X/3 + Y/3).
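The answer follows from bilinearity of covariance with Var(X) = Var(Y) = 2 and Cov(X, Y) = (1/3)·2 = 2/3; an exact arithmetic sketch (added):

from fractions import Fraction as Fr

vX = vY = Fr(2)
cXY = Fr(1, 3) * 2                      # Corr(X,Y) * sqrt(Var X * Var Y)

def cov(a, b, c, d):
    # Cov(aX + bY, cX + dY)
    return a * c * vX + b * d * vY + (a * d + b * c) * cXY

a, b, c, d = Fr(1, 3), Fr(2, 3), Fr(2, 3), Fr(1, 3)
assert cov(a, b, a, b) == cov(c, d, c, d)       # the two variances are equal
corr = cov(a, b, c, d) / cov(a, b, a, b)
print(corr)                                     # 17/19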
16. Let (xi, yi) ∈ R², i = 1, . . . , n, be such that ∑_{i=1}^{n} xi = ∑_{i=1}^{n} yi = 0. Using a statistical argument show that
(∑_{i=1}^{n} xiyi)² ≤ (∑_{i=1}^{n} xi²)(∑_{i=1}^{n} yi²).
21. Let X1, . . . , Xn be a random sample of continuous random variables and let X1:n < X2:n < · · · < Xn:n be the corresponding order statistics. If the expectation of X1 is finite and the distribution of X1 is symmetric about µ ∈ (−∞, ∞), show that:
(a) Xr:n − µ =d µ − X_{n−r+1:n}, r = 1, . . . , n; (b) E(Xr:n + X_{n−r+1:n}) = 2µ; (c) E(X_{(n+1)/2:n}) = µ, if n is odd; (d) P(X_{(n+1)/2:n} > µ) = 0.5, if n is odd.
22. (a) Let X1, . . . , Xn denote a random sample, where P(X1 > 0) = 1. Show that
E[(X1 + X2 + · · · + Xk)/(X1 + X2 + · · · + Xn)] = k/n, k = 1, 2, . . . , n.
(b) Let X1 , . . . , Xn be a random sample and let E(X1 ) be finite. Find the conditional
expectation E(X1 |X1 + · · · + Xn = t), where t ∈ R is such that the conditional
expectation is defined.
(c) Let X1 , . . . , Xn be a random sample of random variables. Find P (X1 < X2 <
· · · < Xr ), r = 2, 3, . . . , n.
23. Let X1 and X2 be independent and identically distributed random variables with common p.m.f.
f(x) = θ(1 − θ)^{x−1}, if x = 1, 2, . . ., and = 0, otherwise,
where θ ∈ (0, 1). Let Y1 = min{X1 , X2 } and Y2 = max{X1 , X2 } − min{X1 , X2 }. (a)
Find the marginal p.m.f. of Y1 without finding the joint p.m.f. of Y = (Y1 , Y2 ); (b)
Find the marginal p.m.f. of Y2 without finding the joint p.m.f. of Y = (Y1 , Y2 ); (c)
Find the joint p.m.f. of Y = (Y1 , Y2 ); (d) Are Y1 and Y2 independent; (e) Using (c),
find the marginal p.m.f.s of Y1 and Y2 .
Let Y = |X1 | + X2 and Z = X2 . (a) Find the d.f. of Y and hence find its p.d.f.; (b)
Find the joint p.d.f. of (Y, Z) and hence find the marginal p.d.f.s of Y and Z; (c)
Are Y and Z independent?
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-III
Instructor: Neeraj Misra
(a) Find the distribution function of Y = X/(X + 1) and hence determine the
p.m.f. of Y ;
(b) Find the p.m.f. of Y = X/(X + 1) and hence determine the distribution
function of Y ;
(c) Find the mean and the variance of X.
(a) Find the distribution function of Y = X 2 and hence determine the p.d.f. of Y ;
(b) Find the probability density function of Y = X 2 and hence determine the
distribution function of Y ;
(c) Find the mean and the variance of X.
3. (a) Give an example of a discrete random variable X for which E(X) is finite but
E(X 2 ) is not finite;
(b) Give an example of a continuous random variable X for which E(X) is finite
but E(X 2 ) is not finite.
5. Let X be a random variable with p.d.f.
fX(x) = 1, if 0 < x < 1, and = 0, otherwise.
Find the p.d.f.s of the following random variables: (a) Y1 = √X; (b) Y2 = X²; (c) Y3 = 2X + 3; (d) Y4 = −ln X.
(a) Find the distribution function of Y and hence find its p.d.f.;
(b) Find the p.d.f. of Y directly (i.e., without finding the distribution function);
(c) Find the mean and the variance of Y .
(b) Let the random variable X have the p.d.f.
fX(x) = (1/√(2π)) e^{−x²/2}, −∞ < x < ∞,
and let
Y = −1, if X < 0; = 1/2, if X = 0; = 1, if X > 0.
10. (a) Let E(|X|β ) < ∞, for some β > 0. Then show that E(|X|α ) < ∞, ∀ α ∈ (0, β];
(b) Let X be a random variable with finite expectation. Show that limx→−∞ xFX (x) =
limx→∞ [x(1 − FX (x))] = 0, where FX is the distribution function of X;
(c) Let X be a random variable with limx→∞ [xα P (|X| > x)] = 0, for some α > 0.
Show that E(|X|β ) < ∞, ∀ β ∈ (0, α). What about E(|X|α )?
11. (a) Find the moments of the random variable that has the m.g.f. M (t) = (1 − t)−3 ,
t < 1;
(b) Let the random variable X have the m.g.f.
M(t) = (e^t − e^{−2t})/(3t), for t ≠ 0,
(a) Find the m.g.f. of Xp and hence find the mean and variance of Xp , p ∈ (0, 1);
(b) Let Yp = n − Xp, p ∈ (0, 1). Using the m.g.f. of Xp show that the p.m.f. of Yp is
fYp(y) = C(n, y) q^y (1 − q)^{n−y}, if y ∈ {0, 1, . . . , n}, and = 0, otherwise, where q = 1 − p.
13. (a) For any random variable X having the mean µ and finite second moment, show
that E((X − µ)2 ) ≤ E((X − c)2 ), ∀c ∈ R;
(b) Let X be a continuous random variable with distribution function FX that is
strictly increasing on its support. Let m be the median of (distribution of) X.
Show that E(|X − m|) ≤ E(|X − c|), ∀ c ∈ (−∞, ∞).
(b) Let α be a positive real number. Under the assumptions of (a), show that
E(X^α) = α ∫_0^∞ x^{α−1} P(X > x) dx;
(c) Let F (0) = G(0) = 0 and let F (t) ≥ G(t), ∀ t > 0, where F and G are
distribution functions of continuous random variables X and Y , respectively.
Show that E(X k ) ≤ E(Y k ), ∀ k > 0, provided the expectations exist.
15. (a) Let X be a random variable such that P (X ≤ 0) = 0 and let µ = E(X) be
finite. Show that P (X ≥ 2µ) ≤ 0.5;
(b) If X is a random variable such that E(X) = 3 and E(X 2 ) = 13, then determine
a lower bound for P (−2 < X < 8).
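For 15(b), Chebyshev's inequality applies with µ = E(X) = 3 and σ² = E(X²) − µ² = 4, since (−2, 8) = (µ − 5, µ + 5); the arithmetic as an added sketch:

from fractions import Fraction as Fr

EX, EX2 = Fr(3), Fr(13)
var = EX2 - EX ** 2              # sigma^2 = 4
k = Fr(5)                        # (-2, 8) = (mu - k, mu + k)
print(1 - var / k ** 2)          # P(-2 < X < 8) >= 21/25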
16. (a) An enquiry office receives, on an average, 25, 000 telephone calls a day. What
can you say about the probability that this office will receive at least 30, 000
telephone calls tomorrow?
(b) An enquiry office receives, on an average, 20, 000 telephone calls per day with
a variance of 2, 500 calls. What can be said about the probability that this
office will receive between 19, 900 and 20, 100 telephone calls tomorrow? What
can you say about the probability that this office will receive more than 20, 200
telephone calls tomorrow?
(a) Prove that P (X ≥ a) ≤ e−at M (t), 0 < t < h;
(b) Prove that P (X ≤ a) ≤ e−at M (t), −h < t < 0;
(c) Suppose that M(t) = (1/4)(1 − 3t)^{−1} + (3/4)(1 − 2t)^{−1}, t < 1/3. Find P(X > 1).
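For part (c), M is recognizable as the m.g.f. of a 1/4 : 3/4 mixture of exponential distributions with means 3 and 2 (this reading is an inference from the form of M, since (1 − θt)^{−1} is the Exp(mean θ) m.g.f.); a sketch of the resulting computation, added here:

import math

# P(X > 1) for the 1/4 : 3/4 mixture of Exp(mean 3) and Exp(mean 2).
p = 0.25 * math.exp(-1 / 3) + 0.75 * math.exp(-1 / 2)
print(round(p, 4))   # approximately 0.634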
18. Let µ ∈ R and σ > 0 be real constants and let Xµ,σ be a random variable having p.d.f.
fXµ,σ(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)}, −∞ < x < ∞.
(a) Show that fXµ,σ is a p.d.f.;
(b) Show that the probability distribution of Xµ,σ is symmetric about µ. Hence find E(Xµ,σ);
(c) Find the m.g.f. of Xµ,σ and hence find the mean and variance of Xµ,σ;
(d) Let Yµ,σ = aXµ,σ + b, where a ≠ 0 and b are real constants. Using the m.g.f. of Xµ,σ, show that the p.d.f. of Yµ,σ is
fYµ,σ(y) = (1/(|a|σ√(2π))) e^{−(y−(aµ+b))²/(2a²σ²)}, −∞ < y < ∞.
Show that the distribution of X is symmetric about a point µ. Find this point µ.
Also find E(X) and P (X > µ).
Let the random variable X have the p.d.f.
f(x) = (1/√(2π)) e^{−x²/2}, −∞ < x < ∞.
Show that X =d −X. Hence find E(X³) and P(X > 0).
21. (a) Let X be a random variable with E(X) = 1. Show that E(e^{−X}) ≥ 1/3;
(b) For pairs of positive real numbers (ai, bi), i = 1, . . . , n, and r ≥ 1, show that
(∑_{i=1}^{n} ai^r bi)(∑_{i=1}^{n} bi)^{r−1} ≥ (∑_{i=1}^{n} aibi)^r.
Hence show that, for any positive real number m,
(∑_{i=1}^{n} ai^{2m+1})(∑_{i=1}^{n} ai) ≥ (∑_{i=1}^{n} ai^{m+1})².
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-II
Instructor: Neeraj Misra
and (iii) F3(x) = 1/2 + tan^{−1}(x)/π, −∞ < x < ∞.
4. Let X be a random variable with distribution function F(·). In each of the following cases determine whether X is a discrete r.v. or a continuous r.v. Also find the p.d.f./p.m.f. of X:
(i) F(x) = 0, if x < −2; = 1/3, if −2 ≤ x < 0; = 1/2, if 0 ≤ x < 5; = 3/4, if 5 ≤ x < 6; = 1, if x ≥ 6;
(ii) F(x) = 0, if x < 0; = 1 − e^{−x}, if x ≥ 0.
where k ∈ R.
(i) Find the value of constant k;
(ii) Show that X is a discrete r.v. and find its support;
(iii) Find the p.m.f. of X.
where c ∈ R.
(i) Find the value of constant c;
(ii) For positive integers m and n such that m < n, using the p.m.f. evaluate
P (X < m + 1), P (X ≥ m), P (m ≤ X < n) and P (m < X ≤ n);
(iii) Find the conditional probabilities P (X > 1|1 ≤ X < 4) and P (1 < X < 6|X ≥
3).
(iv) Determine the distribution function of X.
where k ∈ R.
(i) Find the value of constant k;
(ii) Using the p.d.f., evaluate P(X < 0), P(X ≤ 0), P(0 < X ≤ 1/4), P(0 ≤ X < 1/4) and P(−1/8 ≤ X ≤ 1/4);
(iii) Find the conditional probabilities P(X > 1/4 | |X| > 2/5) and P(1/8 < X < 2/5 | 1/10 < X < 1/5);
(iv) Find the distribution function F of X;
(v) Find the lower quartile, the median and the upper quartile of F .
Statistical Inference
Test Set 1
1. Let X ~ P(λ) (Poisson). Find unbiased estimators of (i) λ³, (ii) e^{−λ} cos λ, (iii) sin λ. (iv) Show that there do not exist unbiased estimators of 1/λ and exp{−1/λ}.
2. Let X1, X2, . . . , Xn be a random sample from a N(µ, σ²) population. Find unbiased and consistent estimators of the signal-to-noise ratio µ/σ and the quantile µ + bσ, where b is any given real.
3. Let X 1 , X 2 ,..., X n be a random sample from a U (−θ , 2θ ) population. Find an unbiased and
consistent estimator of θ .
4. Let X1, X2 be a random sample from an exponential population with mean 1/λ. Let T1 = (X1 + X2)/2 and T2 = √(X1X2). Show that T1 is unbiased and T2 is biased. Further, prove that MSE(T2) ≤ Var(T1).
5. Let T1 and T2 be unbiased estimators of θ with respective variances σ1², σ2² and cov(T1, T2) = σ12 (assumed to be known). Consider T = αT1 + (1 − α)T2, 0 ≤ α ≤ 1. Show that T is unbiased and find the value of α for which Var(T) is minimized.
6. Let X1, X2, . . . , Xn be a random sample from an Exp(µ, σ) population. Find the method of moments estimators (MMEs) of µ and σ.
7. Let X1, X2, . . . , Xn be a random sample from a Pareto population with density
fX(x) = βα^β/x^{β+1}, x > α, α > 0, β > 2.
Find the method of moments estimators of α and β.
8. Let X 1 , X 2 ,..., X n be a random sample from a U (−θ , θ ) population. Find the MME of θ .
9. Let X 1 , X 2 ,..., X n be a random sample from a lognormal population with density
1 1
f X=
( x) exp − 2 (log e x − µ ) 2 , x > 0. Find the MMEs of µ and σ 2 .
σ x 2π 2σ
10. Let X 1 , X 2 ,..., X n be a random sample from a double exponential ( µ , σ ) population. Find
the MMEs of µ and σ .
Hints and Solutions
T ( x=) 0, if= x 2m + 1,
= 1,= if x 4m,
= −1, if x =4m + 2, m =0,1, 2,
(iii) For this we have to solve estimating equation. However, we use Euler’s identity
to solve it.
∞ λ x 1 ∞ λk k π π
k
k π π
k
πx
=U ( x) (= 2) x sin , x 0,1, 2,
4
In Parts (iv) and (v) , we can show in a similar way that estimating equations do not
have any solutions.
1 n 1 n
=
∑
2. = =
Let X
X
n i 1=
i , and S 2
∑
n −1 i 1
( X i − X )2 .
σ2 (n − 1) S 2
Then X ~ N µ , , and W= ~ χ n2−1 . It can be seen that
n σ 2
n n−2
2
E (W 1/2 ) = 2 and E (W −1/2 ) = 2 . Using these, we get unbiased estimators
n −1 n −1
2
2 2
n −1 n −1
1 n −1 2 2 2 1 respectively. As
of = σ and as T1 = S and T2
σ 2 n n −1 n − 2 S
2 2
µ
X and S 2 are independently distributed, U1 = X T2 is unbiased for . Further,
σ
U= 2 X + bT1 is unbiased for µ + bσ . As X and S 2 are consistent for µ and σ 2
µ
respectively, U1 and U 2 are also consistent for and µ + bσ respectively.
σ
3θ 2X
3. =
As µ1′ = ,T is unbiased for θ . T is also consistent for θ .
2 3
1
4. As E ( X i ) = , T1 is unbiased. Also X 1 and X 2 are independent. So
λ
2
1 π π
( ) ( ) 1
2
=
E (T2 ) E =
X1 X 2 E=
( X1 ) = . Var (T1 ) = 2 .
2 λ 4λ 2λ
2
(=
MS ET2)
1
E X1 X 2 − =
2
E ( X1 X 2 ) − E
λ λ
( )
X1 X 2 +
1
λ2
2 π
= 2 1 −
λ 4
σ 22 − σ 12
5. The minimizing choice of α is obtained as .
σ 12 + σ 22 − 2σ 12
1 x−µ
f ( x)= exp − , x > µ , σ > 0. µ1′ =µ + σ , µ2′ =(µ +σ ) +σ 2 .
2
6.
σ σ
So µ =µ1′ − µ2′ − µ ′2 , σ = µ2′ − µ ′2 . The method of moments estimators for
µ and σ are therefore given by
1 n 1 n
=n i 1=
µˆ MM =
ni1
X− ∑ ( X i − X )2 , an σˆ MM =
d ∑ ( X i − X )2 .
βα βα 2 µ1′ µ2′ µ2′
7.=µ1′ = , µ′ . So α = , β = 1+
β −1 2 β − 2 µ2′ − µ 1′2 µ2′ − µ1′2
The method of moments estimators for α and β are therefore given by
n n
X ∑X i
2
∑X i
2
αˆ MM = i =1
, βˆMM = 1 + n
i =1
.
∑(X
n n
∑(X ∑X − X) 2
i − X) + 2
i
2
i
=i 1 =i 1 i =1
θ2 3 n 2
8. Since µ1′ = 0 , we consider µ2′ =
3
. So θˆMM = ∑ Xi
n i =1
µ ′2 µ′
9.=µ1′ e=
µ +σ 2 /2
, µ2′ e=
2 µ + 2σ 2
. So µ log
= 1 , σ
2
log 22 and the method of
µ′ µ1′
2
1 n 2
X2
n ∑ Xi
=µˆ MM log
= , σˆ 2
log i =1 2 .
1 n MM X
∑ X i2
n i =1
1 x−µ
10. f (=
x)
2σ
exp −
σ , x ∈ , µ ∈ , σ > 0. µ=
′ µ , µ=
1
′ µ 2 + 2σ 2 .
2
1
=
So µ µ= ′
1, σ ( µ2′ − µ ′2 ) . The method of moments estimators for µ and σ are
2
therefore given by
1 n
=µˆ MM X=
, and σˆ MM ∑ ( X i − X )2 .
2n i =1
Statistical Inference
Test Set 2
f X ( x) = 3
exp − , x > 0 . Find the MLEs of parameters.
2π x 2µ 2 x
k
7. Let ( X 1 , X 2 ,..., X k ) have a multinomial distribution with parameters n = ∑ X i ,
i =1
k
p1 , , pk ; 0 ≤ p1 , , pk ≤ 1, ∑ p j =
1, where n is known. Find the MLEs of p1 , , pk .
j =1
8. Let one observation be taken on a discrete random variable X with pmf p ( x | θ ) , given
below, where Θ ={1, 2,3} Find the MLE of θ .
θ
1 2 3
1 1/2 1/4 1/4
x 2 3/5 1/5 1/5
3 1/3 1/2 1/6
4 1/6 1/6 2/3
9. Let X 1 , X 2 ,..., X n be a random sample from the truncated double exponential distribution
with the density
e −| x|
= f X ( x) , | x | < θ , θ > 0.
2(1 − e −θ )
Find the MLE of θ .
10. Let X 1 , X 2 ,..., X n be a random sample from the Weibull distribution with the density
β
f X ( x) α β x β −1e −α x , x > 0, α > 0, β > 0.
=
Find MLE of α when β is known.
Hints and Solutions
1
1. The likelihood function is L(θ=
, x) , − θ < x(1) ≤ x(2) ≤ ≤ x( n ) < 2θ , θ > 0. Clearly
(3θ ) n
it is maximized with respect to θ , when θ takes its infimum. Hence,
X
θˆML max − X (1) , ( n ) .
=
2
β nα nβ
(α , β , x )
2. The likelihood function is L= n
, x(1) > α , α > 0, β > 2. L is maximized
(∏ xi ) β +1
i =1
with respect to α when α takes its maximum. Hence αˆ ML = X (1) . Using this we can
β n {x(1) }nβ
as L′( β , x )
rewrite the likelihood function= n
, β > 2. The log likelihood is
(∏ xi ) β +1
i =1
n
′ x ) n log β + nβ log x(1) − ( β + 1) log ∏ xi . This can be easily maximized
log L ( β , =
i =1
−1
1 X
with respect to β and we get βˆML = ∑ log (i ) .
n X (1)
max ( − X (1) , X ( n ) ) =
3. Arguing as in Sol. 1, we get θˆML = max | X i | .
1≤i ≤ n
n
∑ (log X i −µˆ ML ) 2 .
5. The maximum likelihood estimators are given by
1 n 1 n 1 n
µˆ=
1 X ,
=
µ
ˆ =
2 Y , σˆ =
2
1 ∑ i
ni1 =
( X − X ) 2
, σˆ =
2
2 ∑ i
n i 1=
(Y − Y ) 2
, ρ
=
ˆ ∑ ( X i − X )(Yi − Y ) / (σˆ1σˆ 2 ).
ni1
n
10. αˆ ML =
∑ xiβ
Statistical Inference
Test Set 3
2. Let X 1 , X 2 ,..., X n be a random sample from a discrete population with mass function
1−θ 1 θ
P( X = −1) = , P( X = 0) = , P( X =1) = , 0 < θ < 1.
2 2 2
Find FRC lower bound for the variance of an unbiased estimator of θ . Show that the
1
variance of the unbiased estimator X + is more than or equal to this bound.
2
3. Let X 1 , X 2 ,..., X n be a random sample from a population with density function
f ( x) =θ (1 + x) − (1+θ ) , x > 0, θ > 0. Find FRC lower bound for the variance of an
unbiased estimator of 1/ θ . Hence derive a UMVUE for 1/ θ .
8. Let X 1 , X 2 ,..., X n be a random sample from a double exponential population with the
1 −| x −θ |
density f= ( x) e , x ∈ , θ ∈ . Find a minimal sufficient statistic.
2
9. Let X 1 , X 2 ,..., X n be a random sample from a discrete uniform population with pmf
1
p=
( x) = , x 1, , θ , where θ is a positive integer. Find a minimal sufficient statistic.
θ
10. Let X 1 , X 2 ,..., X n be a random sample from a geometric population with pmf
(1 − p ) x −1 p , x =
f ( x) = 1, 2, , 0 < p < 1 . Find a minimal sufficient statistic.
11. Let X have a N (0, σ 2 ) distribution. Show that X is not complete, but X 2 is complete.
12. Let X 1 , X 2 ,..., X n be a random sample from an exponential population with the density
n
x) λ e − λ x , x > 0, λ > 0. Show that Y = ∑ X i is complete.
f (=
i =1
13. Let X 1 , X 2 ,..., X n be a random sample from an exponential population with the density
) e µ − x , x > µ , µ ∈ . Show that Y = X (1) is complete.
f ( x=
θ 2
So FRC lower bound for the variance of an unbiased estimator of θ is .
n
1 θ2
Further T =
2n
∑ X i2 is unbiased for θ and Var (T ) =
n
. Hence T is UMVUE for θ .
1 1 1 + 4θ − 4θ 2
2. E ( X )= θ − . So T= X + is unbiased for θ . Var (T ) = .
2 2 4n
(θ − 1) −1 , x = −1
∂ ∂ log f
2
1
=log f ( x | θ ) = 0, x 0 So we get E =
∂θ θ −1 , ∂θ 2θ (1 − θ )
x = 1
2θ (1 − θ )
The FRC lower bound for the variance of an unbiased estimator of θ is .
n
2θ (1 − θ ) (1 − 2θ ) 2
It can be seen that Var (T ) − = ≥ 0.
n 4n
∂ 1
3. We have log f ( x | θ ) =− log(1 + x) . Further, E{log(1 + X )} = θ −1 , and
∂θ θ
∂ log f
2
1
E{log(1 + X )} = 2
2θ . So E
−2
= 2 , and FRC lower bound for the variance of
∂θ θ
θ2 1
an unbiased estimator of θ −1 is
n
.= Now T
n
∑ log(1 + X i ) is unbiased for θ −1 and
θ2
Var (T ) = .
n
4. Using Factorization Theorem, we get sufficient statistics in each case as below
n
n
(i) ∏ X i ,(ii) X (1) (iii) X (1) , ∏ X i
i =1 i =1
11. Since E ( X ) = 0 for all σ > 0 , but but P( X= 0)= 1 for all σ > 0 . Hence X is not
1
complete. Let W = X 2 . The pdf= e − w/2σ , w > 0 .
2
of W is fW ( w)
σ 2π w
∞
Eσ g (W ) 0 for all σ > 0 ⇒ ∫ g ( w) w−1/2 e − w/2σ dw =
Now= 0 for all σ > 0 . Uniqueness
2
( y ) n e n ( µ − x ) , x > µ , µ ∈ .
13. Note that the density of Y = X (1) is given by fY=
Eg (Y ) 0 for all µ ∈
=
∞
⇒ ∫ n g ( y )e n ( µ − y ) dy =
0 for all µ ∈
µ
∞
⇒ ∫ g ( y )e − ny dy =0 for all µ ∈
µ
14. Minimal sufficiency can be proved using Lehmann-Scheffe theorem. To see that
n
( X , S 2 ) is not complete, note that E S 2 0 for all θ > 0.
X 2 −=
n +1
n 2
However, P =
X 2 S= 0.
n +1
Statistical Inference
Test Set 4
P(=
X 1 0,=
S s)
= ∑ X i s)
P( X 1 0,=
P( X= 0 | S= s=
) = i =2
= P( S s=
1
) P( S s)
n
Using independence of X 1 and ∑X
i =2
i and the additive property of Poisson distribution, we
n −1
s
np − m − m np
E T = λ m , np > m and E λ −r
T r =
np np + r
7. is f ( y ) ne n ( µ − y ) , y > µ .
A complete and sufficient statistic is Y = X (1) . The density of Y=
1 2µ 2
We have E (Y ) =µ + and E (Y 2 ) =µ 2 + + 2 . Using these UMVUEs of µ and µ 2 are
n n n
1 2Y
Y− and Y 2 − .
n n
n
8. =
Let =
Y X (1) and Z ∑(X
i =1
i − Y ) . Then (Y , Z ) is complete and sufficient. Also, Y and Z are
σ 2Z
independently distributed with Y ~ Exp µ , and ~ χ 22n − 2 . Using these, UMVUEs of
n σ
Z Z
µ and σ are given by d1 = Y− and d 2 = respectively. So a UMVUE for
n(n − 1) n −1
quantile is d1 + bd 2 .
A UMVUE for R (t ) is h(Y=
, Z ) P ( X 1 > t | (Y , Z )) .
n−2
n −1 t − Y
=
It can be seen that h(Y , Z ) max 1 − , 0 .
n Z
T
9. Note that T = ∑ X i2 is a complete and sufficient statistic. Also W = ~ χ n2 . Let the loss
σ2
2
σ 2 −a σ −b
2
respectively. Clearly the two estimation problems are invariant under the scale group of
transformations, =
GS {g c : g c=
( x) cx, c > 0} on the space of X i s . Under the transformation
g c , note that σ 2 → c 2σ 2 , a → c 2 a, σ → cσ , b → cb. The form of a scale equivariant
estimator of σ 2 is d k (T ) = kT , where k is a positive constant. Minimizing the risk function
σ 2 E (T )
nσ 4 1 T
of d k with respect to k , we=
get k = = . Hence the best
E (T ) n(n + 2)σ
2 4
n+2 n+2
scale equivariant estimator of σ 2 . Similarly, the form of a scale equivariant estimator of σ
is U p (T ) = pT 1/2 , where p is a positive constant. Minimizing the risk function of U p with
n +1 2 n +1 n +1
σ E (T 1/2 ) 2 σ
=
respect to p , we get p = = 2 2 . So 2 T 1/2 is the best
E (T ) n n+2 n+2
nσ2 2 2
2 2 2
scale equivariant estimator of σ .
θ − a
2
n
) θ n exp −θ ∑ xi + 1 , xi > 0, θ > 0.
The joint pdf of X and θ is f *( x , θ=
i =1
n +1
The marginal density of X =
is then h( x ) , xi > 0.
(nx + 1) n +1
Hence the posterior density of θ given X = x is Gamma (n + 1, nx + 1) .
Note that
n +1 (n + 1)(n + 2) 1 nx + 1 1 (nx + 1)
2
E (θ | x ) =
= , E (θ 2 | x ) = , E x = , E 2 x .
nx + 1 (nx + 1) 2 θ n θ n(n − 1)
n +1
With respect to the loss function L1 , the Bayes estimator of θ is E (θ | X ) = .
nX + 1
1
E X
θ n −1
With respect to the loss function L2 , the Bayes estimator of θ is = .
1 nX + 1
E 2 X
θ
With respect to the loss function L3 , the Bayes estimator of θ is
(n + 1)(n + 2)
{E (θ | X )}
1/2
2
= .
(nX + 1)
1
12. The joint pdf of X = ( X 1 , X 2 ,..., X n ) is f ( x | θ )= , 0 < x(1) < < x( n ) < θ .
θn
αβ α
*( x , θ )
The joint pdf of X and θ is f= , θ > max{β , x( n ) }.
θ n +α +1
αβ α
=
The marginal density of X is then h( x ) n +α
, x( n ) > 0.
(n + α ) max{β , x( n ) }
Hence the posterior density of θ given X = x is
n +α
(n + α ) max{β , x( n ) }
=g *(θ | x ) , θ > max{β , x( n ) } .
θ n +α +1
With respect to the loss function L, the Bayes estimator of θ is
n +α
E (θ | X ) = max{β , X ( n ) } .
n + α −1
Statistical Inference
Test Set 5
1. In the following hypotheses testing problems identify the given hypotheses as simple or
composite.
(i) X ~ Exp (λ ), H 0 : λ ≤ 1
(ii) X ~ Bin(n, p ), n is known, H 0 : p = 1/ 3.
(iii) X ~ Gamma (r , λ ), r is known , H 0 : λ > 2 .
(iv) X 1 , , X n ~ N ( µ , σ 2 ), H=
0 :µ σ 2 2.
0,=
σ
4. Let X have Cauchy density=
fσ ( x ) , x ∈ , σ > 0. Find the most powerful test
π (σ + x 2 )
2
8. Based on a random sample of size n from Exp (λ ) population, derive UMP unbiased test of
size α for testing
= H 0 : λ 1 vs. H 0 : λ ≠ 1.
Based on a random sample of size n from double exponential population with density
9.
1 −| x|/σ
f=
X ( x) e , x ∈ , σ > 0 derive UMP unbiased test of size α for testing
2σ
H 0 : σ 1 vs. H 0 : σ ≠ 1.
=
10. =
For the set up in Q. 5, find the LRT for the testing H 0 : θ 2 vs. H 0 : θ ≠ 2 .
12. Let X 1 , X 2 ,..., X n be a random sample from an exponential population with the density
) e µ − x , x > µ , µ ∈ . Find the LRT of size α for testing H 0 : µ ≤ 1 vs H1 : µ > 1.
f ( x=
13. Find out the group of transformations under which the following testing problems are
invariant:
(i) X ~ eθ − x , θ > x, H 0 : θ ≥ 3 vs. H1 : θ < 3
(ii) X ~ Exp (1/ σ ), H 0 : σ ≤ 1 vs. H1 : σ > 1
14. Let X 1 , X 2 ,..., X n be a random sample from an inverse Gaussian distribution with density
λ λ ( x − µ )2
1/2
f X ( x) = 3
exp − , x > 0 . Find the confidence intervals for the
2π x 2µ 2 x
parameters.
1.
(i) composite (ii) simple (iii) composite (iv) simple
5
=2. α =Pλ 1 =
( X > 2) =1 − e −1 , β =Pλ 4 ( X ≤ 2) =13e −4
2
α P(| X | >=
3. = 1) e . Power = Pσ (| X | > 1)= e −1/σ > e −1 , as σ > 1 .
−1
f 2 ( x)
4. Using NP Lemma, the most powerful test is to reject H 0 when =
R( x) > k . Now
f 2 ( x)
2(1 + x 2 ) 6x 1
R( x) = . It can be seen that R′( x) = . So R ( x) has a minimum at x = 0
(4 + x )
2
(4 + x )
2 2
2
and supremum 2 as x → ±∞. The most powerful test can then be designed as below:
1
(i) If we take k ≤ , then the MP test will always reject H 0 and α = 1 .
2
(ii) If we take k ≥ 2, then the MP test will always accept H 0 and α = 0 .
1
(iii) If we take < k < 2, then the MP test will reject H 0 when R( x) > k . This is equivalent
2
(4k − 2) 4
to | x | > . Applying the size condition, we get k = .
(2 − k ) 4 + 3cos(1 − α )
5. Using NP Lemma, we find that the MP test will reject H 0 when x < 1 − 1 − α .
12. The test will reject H 0 when X (1) > 1 − ln(α )1/ n .
14. We can use the complete sufficient statistics and their distributions to formulate the
confidence intervals.
5. n ≥ 1701.5625
6. 1 − φ(0.913) = 0.1814
7. n ≥ 132.45
9. Φ(1.3093)
20
12. p = 44/49, n ≈ 100
9-
1
13. (a) P (Y ≥ 900) ≤ 9 (b) P (Z ≥ 2) = 1 − Φ(2) = 0.0228
1
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 6 (Limiting Probabilities)
1. Let {Xn } be a sequence of independent random variables defined by
1 1
P {Xn = 0} = 1 − , and P {Xn = 1} = , n = 1, 2, . . . .
n n
a.s
(a) Find the distribution of X such that Xn −→ X
p
(b) Find the distribution of X such that Xn −→ X
2. For each n ≥ 1 , let Xn be an uniformly distributed random variable over set 0, n1 , n2 , · · · , n−1
n , 1 . Prove
that Xn convergence to U [0, 1] in distribution.
3. Let (Ω, =, P ) = ([0, 1] , B (R) ∩ [0, 1] , U ([0, 1])). Let {Xn , n = 2, . . .} be a sequence of random variables with
d d
Xn = U 21 − n1 , 12 + n1 . Prove or disprove that Xn −→ X with X = 12 .
Pn
4. Let X1 , X2 , . . . be a sequence of i.i.d. random variables such that Xi ∼ N (0, 1). Define Sn = i=1 Xi , n =
20
1, 2, . . .. Then, as n → ∞, Snn converges in probability to X. Find X?
5. Consider polling of n voters and record the fraction Sn of those polled who are in favour of a particular
9-
candidate. If p is the fraction of the entire voter population that supports this candidate, then Sn =
X1 +X2 +...+Xn
n , where Xi are i.i.d. random variables with B(1, p). How many voters should be sampled so
2 01
that we wish our estimate Sn to be within 0.02 of p with probability at least 0.90?
6. Suppose that 30 electronic devices say D1 , D2 , . . . , D30 are used in the following manner. As soon as D1
fails, D2 becomes operative. When D2 fails, D3 becomes operative etc. Assume that the time to failure of
Di is an exponentially distributed random variable with parameter = 0.1(hour)−1 . Let T be the total time
of operation of the 30 devices. What is the probability that T exceeds 350 hours?
er
7. Let X ∼ Bin (n, p). Use the CLT to find n such that: P [X > n/2] ≤ 1 − α. Calculate the value of n when
t
n
X ni
8. Use CLT to show that limn→∞ e−n ' 0.5.
em
i=0
i!
9. A person puts few one rupee coins into a piggy-bank each day. The number of one rupee coins added on any
given day is equally likely to be 1, 2, 3, 4, 5 or 6, and is independent from day to day. Find an approximate
IS
probability that it takes at least 80 days to collect 300 rupees? Final answer can be in terms of Φ(z) where
Rz 2
Φ(z) = √12π −∞ e−t /2 dt.
10. Suppose that Xi , i = 1, 2, . . . , 30 are independent random variables each having a Poisson distribution with
parameter 0.01. Let S = X1 + X2 + . . . + X30 .
(a) Using central limit theorem evaluate P (S ≥ 3).
(b) Compare the answer in (a) with exact value of this probability.
7 2
P P (Xi = 1) = 9 = 1 − P (Xi = 0). Let Yi = Xi + Xi ,
11. Let X1 , X2 , . . . be iid random variables, each having pmf
30
i = 1, 2, . . .. Use central limit theorem to evaluate P i=1 Yi > 60 approximately. Final answer can be in
z 2
terms of Φ(z) where Φ(z) = √12π −∞ e−t /2 dt.
R
12. Consider the dinning hall of Aravali Hostel, IIT Delhi which serves dinner to their hostel students only. They
are seated at 12-seat tables. The mess secretary observes over a long period of time that 95 percent of the
time there are between six and nine full tables of students, and the remainder of the time the numbers are
equally likely to fall above or below this range. Assume that each student decides to come with a given
probability p, and that the decisions are independent. How many students are there? What is p?
1
13. Let X1 , X2 , . . . be a sequence of independent and identically distributed random
P100 variables with mean 1 and
variance 1600, and assume that these variables are non-negative. Let Y = k=1 Xk .
(a) What does Markov’s inequality tell you about the probability P (Y ≥ 900).
(b) Use the central limit theorem to approximate the probability P (Y ≥ 900). Final answer can be in terms
Rz 2
of Φ(z) where Φ(z) = √12π −∞ e−t /2 dt.
14. A person stands on the street and sells newspapers. Assume that each of the people passing by buys a
newspaper independently with probability 31 . Let X denote the number of people passing past the seller during
the time until he sells his first 100 copies of the newspaper. Using CLT, find P (X ≤ 300) approximately.
Rz t2
Final answer can be in terms of φ(z) where φ(z) = √12π −∞ e− 2 dt.
15. Let X1 , X2 , . . . be iid random variables, each having Bernoulli distribution with parameter 8/9.
(a) Find the distribution of Yi = Xi + Xi2 , i = 1, 2, . . ..
P
20
(b) Use central limit theorem to evaluate P i=1 Yi > 20 approximately. Final answer can be in terms
Rz 2
of Φ(z) where Φ(z) = √12π 0 e−t /2 dt.
20
16. Let X1 , X2 , . . . , Xn be n independent Poisson distributed random variables with means 1, 2, . . . , n respectively.
9-
Find an x in terms of t such that
2
!
Sn − n2
17. Using MGF, find the limit of Binomial distribution with parameters n and p as n → ∞ such that np = λ so
er
that p → 0.
t
es
em
IS
2
Department of Mathematics
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 5
Answer for selected Problems
20
4
5
8. 2 log(3/2)
9-
9. 31
σ2
10. (a) (b) σ 2
11. X2
3
n
1 1
14. E[X/X > y] = λ + y; E[X − y/X > y] = λ
15. X2 ∼ Binomial(n,
p2 ),
em
1−(1−q)n−1
np 1−(1−q)n ; p = q = 1/3
k
x 1
17. E(Y k /x) = k+1 , E(Y k ) = (k+1)
IS
y σ2 y
18. X/y ∼ N 1+σ 2 , 1+σ 2 , E(X/y) = 1+σ 2
19. 1/2
1 t t2
20. P (n) (t) = P (P (...(P (t))...)) where P (t) = 4 + 4 + 2. PZn (t) = [P (n) (t)]Zn . E(Z51 ) = 1250.
1
Pn Pn
25. µ = 2α i=1 i2 , σ 2 = α2 i=1 i2
1 2
E(W ) = eµ+ 2 σ
2 2
V ar(W ) = e2µ+σ (eσ − 1)
(ln w−µ)2
f (w) = √1
w 2πσ
e− 2σ 2 , w>0
20
9-
2 01
er
t
es
em
IS
2
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 5 (Moments)
1. Let X1 and X2 be independent exponential distributed random variables with parameters 5 and 4 respectively.
Define X(1) = min{X1 , X2 } and X(2) = max{X1 , X2 }.
(a) Find V ar(X(1) )? (b) Find the distribution of X(1) ? (c) Find E(X(2) )?
2. Let X and Y be two non-negative continuous random variables having respective CDFs FX and FY . Suppose
that for some constants a & b > 0, FX (x) = FY x−a
b . Determine E(X) in terms of E(Y ).
3. Let X be a random variable having an exponential distribution with parameter 12 . Let Z be a random
variable having a normal distribution with mean 0 and variance 1. Assume that, X and Z are independent
random variables. (a) Find the pdf of T = √ZX . (b) Compute E(T ) and V ar(T ).
2
4. Let X and Y be two identically distributed random variables with Var(X) and Var(Y ) exist. Prove or
disprove that Var X+Y
2 ≤ Var (X).
4
5. Let X and Y be i.i.d. random variables each having a N (0, 1). Calculate E[(X + Y ) /(X − Y )].
20
c(X1 − X2 )
6. Let X1 , . . . , X5 be a random sample from N (0, σ 2 ). Find a constant c such that Y = p has
9-
X32 + X42 + X52
a t-distribution. Also, find E(Y ).
01
7. Consider the metro train arrives at the station near your home every quarter hour starting at 5:00 AM. You
walk into the station every morning between 7:10 and 7:30 AM, with the time in this interval being a uniform
random variable, that is U ([7 : 10, 7 : 30]).
2
(a) Find the distribution of time you have to wait for the first train to arrive?
er
(b) Also, find its mean waiting time?
t
X
8. Let X and Y be iid random variables each having uniform distribution (2,3). Find E Y ?
es
1
9. Let X and Y be two random variables such that ρ(X, Y ) = 2, V ar(X) = 1 and V ar(Y ) = 4. Compute
V ar(X − 3Y ).
em
1
Pn
10. Let X1 , X2 , . . . , Xn be iid random variables with E(X1 ) = µ and V ar(X1 ) = σ 2 . Define X = n i=1 Xi , S 2 =
1
Pn 2
n−1 i=1 Xi − X . Find (a) V ar(X) (b) E[S 2 ].
IS
11. Pick the point (X, Y ) uniformly in the triangle {(x, y) | 0 ≤ x ≤ 1 and 0 ≤ y ≤ x}. Calculate E[(X −Y )2 /X].
( y
y − 1+x
(1+x)4 e , x, y ≥ 0
12. Find E(Y /x) where (X, Y ) is jointly distributed with joint pdf f (x, y) =
0, otherwise.
1
13. Let X have a beta distribution i.e., its pdf is fX (x) = β(a,b) xa−1 (1 − x)b−1 , 0 < x < 1 and Y given X = x
has binomial distribution with parameters (n, x). Find regression of X on Y. Is regression linear?
14. Let X ∼ EXP (λ). Find E[X/X > y] and E[X − y/X > y].
15. Consider n independent trials, where each trial results in outcome i with probability pi = 1/3, i = 1, 2, 3. Let
Xi denote the number of trials that result in outcome i amongst these n trials. Find the distribution of X2 .
Find the conditional expectation of X1 given X2 > 0. Also determine cov (X1 , X2 | X2 ≤ 1).
16. (a) Show that cov(X, Y ) = cov(X, E(Y | X)).
(b) Suppose that, for constants a and b, E(Y | X) = a + bX. Show that b = cov(X, Y )/V ar(X).
1
17. Let X be a random variable which is uniformly distributed over the interval (0, 1). Let Y be chosen from
1/x, 0 < y ≤ x
interval (0, X] according to the pdf f (y/x) = Find E(Y k /X) and E(Y k ) for any fixed
0, otherwise.
positive integer k.
18. Suppose that a signal X, standard normal distributed, is transmitted over a noisy channel so that the received
measurement is Y = X +W , where W follows normal distribution with mean 0 and variance σ 2 is independent
of X. Find fX/y (x/y) and E(X | Y = y).
19. Suppose X follows Exp(1). Given X = x, Y is a uniform distributed rv in the interval [0, x]. Find the value
of E(Y ).
20. Consider Bacteria reproduction by cell division. In any time t, a bacterium will either die (with probability
0.25), stay the same (with probability 0.25), or split into 2 parts (with probability 0.5). Assume bacteria act
independently and identically irrespective of the time. Write down the expression for the generating function
of the distribution of the size of the population at time t = n. Given that there are 1000 bacteria in the
population at time t = 50, what is the expected number of bacteria at time t = 51.
21. Let N be a positive integer random variable and X1 , X2 , . . . be a sequence of iid random variables. N is
20
independent of Xi ’s. Find the moment generating function (MGF) of SN = X1 + X2 + . . . + XN , the random
sum in terms of MGF of Xi0 s and N . Also show that:
(a) E[SN ] = E[N ]E[X] (b) V ar[SN ] = E[N ]V ar[X] + [E[X]]2 V ar[N ].
9-
22. If E[Y /X] = 1, show that V ar[XY ] ≥ V ar[X].
01
23. Suppose you participate in a chess tournament in which you play until you lose a game. Suppose you are a
very average player, each game is equally likely to be a win, a loss or a tie. You collect 2 points for each win,
2
1 point for each tie and 0 points for each loss. The outcome of each game is independent of the outcome of
every other game. Let Xi be the number of points you earn for game i and let Y equal the total number of
er
points earned in the tournament. Find the moment generating function MY (t) and hence compute E(Y ).
−y
e , 0<x<y<∞
24. Let (X, Y ) be two-dimensional random variable with joint pdf is given by f (x, y) =
t
0, otherwise
es
X1α X22α . . . Xnnα , α > 0 where α is any constant. Determine E(W ), V ar(W ) and the pdf of W .
26. Let (X, Y ) be a two-dimensional continuous type random variables. Assume that, E(X), E(Y ) and E(XY )
are exist. Suppose that, E(X | Y = y) does not depend on y. Find E(XY ).
27. For each fixed λ > 0, let X be a Poisson distributed random variable with parameter λ. Suppose λ itself is
a random variable following exponential distribution with parameter 1. Find the probability mass function
of X.
28. Let X and Y be two discrete random variables with
P (X = x1 ) = p1 , P (X = x2 ) = 1 − p1 , 0 < p1 < 1;
and
P (Y = y1 ) = p2 , P (Y = y2 ) = 1 − p2 , 0 < p2 < 1.
If the correlation coefficient between X and Y is zero, check whether X and Y are independent random
variables.
2
29. Suppose the length of a telephone conversation between two persons is a random variable X with cumulative
distribution function
0, −∞ < t < 0
P (X ≤ t) = ,
1 − e−0.04t , 0 ≤ t < ∞
where the time is measured in minutes.
(a) Given that the conversation has been going on for 20 minutes, compute the probability that it continues
for at least another 10 minutes.
(b) Show that, for any t > 0, E(X/X > t) = t + 25.
30. A real function g(x) is non-negative and satisfies the inequality g(x) ≥ b > 0 for all x ≥ a. Prove that for a
random variable X if E(g(X)) exists then P (X ≥ a) ≤ E(g(X))b .
λ
31. Let X have a Poisson distribution with mean λ ≥ 0, an integer. Show that P (0 < X < 2(λ + 1)) ≥ λ+1 .
32. Does the random variable X exist for which P [µ − 2σ ≤ X ≤ µ + 2σ] = 0.6? Justify your answer.
20
9-
2 01
t er
es
em
IS
3
Department of Mathematics
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 4
Answer for selected Problems
1. (a) px (1) = 0.2, px (3) = 0.5 px (4) = 0.3 py (1) = 0.4, py (2) = 0.6 (b) 0.5
1 9 1
2. (a) 36 (b) 36 (c) 2
4. k = 18 , 5
81 , 1
2y2 √
5. (a) k = 4 (b) fY1 ,Y2 (y1 , y2 ) = y1 , 0 < y1 < 1, 0 < y2 < y1 < 1
e−λ(1−p) (λ(1−p))n
6. Let Y : r.v. denoting no. of 1’s transmitted. P (Y = n) = n! , n = 0, 1, . . .
35 −15 1
7. (a) 45 (b) 20
8. Yes
20
λµ
9. (µ+ν)(λ+µ+ν)
10. 0.3214
9-
11. 0.27
01
, k = 2, 3, . . .
j = 1, 2, . . . , k − 1 PY /X (k/j) = q k−j−1 p, k = j + 1, j + 2, . . .
(b) X ∼ B(1/2, 15), Y ∼ B(1/3, 15), Y /(X = j) ∼ B(2/3, 15 − j), X/(Y = k) ∼ B(3/4, 15 − k)
er
es
z , 0<z<1
14. Z = X + Y, fZ (z) = 2 − z, 1 ≤ z ≤ 2
0, otherwise
em
λ λz
15. fz (z) = 2e , z<0
λ −λz
2 e , z ≥ 0.
IS
16. fV (v) = − ln(v), 0 < v < 1; fW (w) = 2√1w , 0 < w < 1; fV,W (v, w) = fV (v)fW (w)
r, 0 < θ < π/4, 0 < r < sec θ or π/4 < θ < π/2, 0 < r < cosecθ
17. fR,θ (r, θ) =
1 0, otherwise
2
2 sec θ, 0 < θ < π/4
fθ (θ) =
1 2
2 cosec θ, π/4 < θ < π/2
π
2 r, 0<r<1
fR (r) = √
r(cosec−1 (r) − sec−1 (r)), 1 < r < 2
1
18. 8
z(1 − e−z )
0≤z<1
20. P (Z ≤ z) = .
(1 − e−1 )(e−1 − e−z ) 1 ≤ z < ∞
1
5
21. 9
29. 0.7222
30. P (Z ≤ z) = (1 − e−z ), 0 ≤ z < ∞
31. N µ2 + ρ σσ12 (x − µ1 ), σ22 (1 − ρ2 )
20
3 3
32. (a) 2 × 5
36
(b) 125
9-
2 01
t er
es
em
IS
2
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 4 (Random Vector)
1. Let X and Y be independent random variables. The range of X is {1, 3, 4} and the range of Y is {1, 2}. Partial
information on the probability mass function is as follows: pX (3) = 0.50; pY (2) = 0.60; pX,Y (4, 2) = 0.18 .
(a) Determine pX , pY and pX,Y completely. (b) Determine P (|X − Y | ≥ 2).
2. Let (X, Y ) be a two-dimensional discrete type random variables with joint pmf p(x, y) = cxy for x = 1, 2, 3
and y = 1, 2, 3 and equals zero otherwise. (a) Find c. (b) Find P (1 ≤ X ≤ 2, Y ≤ 2). (c) P (Y = 3).
0, x < 0, y < 0 or x + y < 1
3. Show that F (x, y) = is not a distribution function.
1, otherwise
kx(x − y), 0 < x < 2, −x < y < x
4. Suppose that (X, Y ) has joint pdf fXY (x, y) = . Find k. Evaluate
0, otherwise
1 3
P (X < 1/Y = 2 ) and P (Y < 2 /X = 1).
Kx1 x2 , 0 < x1 < 1, 0 < x2 < 1
5. Let (X1 , X2 ) be a random vector with the joint pdf f (x1 , x2 ) =
0, otherwise
20
(a) Find the value of K? (b) Define Y1 = X12 and Y2 = X1 X2 . Find the joint pdf of (Y1 , Y2 )
9-
6. Consider a transmitter sends out either a 0 with probability p, or a 1 with probability (1 − p), independently
of earlier transmissions. Assume that the number of transmissions within a given time interval is Poisson
01
distributed with parameter λ. Find the distribution of number of 1’s transmitted in that same time interval?
7. Let X1 , X2 , . . . , X5 be iid random variables each having uniform distributions in the interval (0, 1).
2
(a) Find the probability that min(X1 , X2 , . . . , X5 ) lies between (1/4, 3/4).
er
(b) Find the probability that X1 is the minimum and X5 is the maximum among these random variables?
8. Let X1 , X2 and X3 be iid random variables each having the probability density function
t
es
−x
e , 0<x<∞
f (x) =
0, otherwise
em
X1 +X2 X1
Find the joint distribution of (Y1 , Y2 , Y3 ) where Y1 = X1 + X2 + X3 , Y2 = X1 +X2 +X3 and Y3 = X1 +X2 . Are
Y1 and Y2 independent random variables? Justify your answer.
IS
1
1, 0 < x, y < 1
14. Suppose that (X, Y ) has joint pdf f (x, y) = . Find the pdf of X + Y .
0, otherwise
15. Romeo and Juliet have a date at a given time, and each, independently, will be late by an amount of time,
denoted by X and Y respectively, that is exponentially distributed with parameter λ. Find the pdf of, X −Y ,
the difference between their times of arrival?
16. Let X, Y and Z be independent and identically distributed random variables each having a uniform distri-
bution over the interval [0, 1]. Find the joint density function of (V, W ) where V = XY and W = Z 2 .
17. The random variable X represents the amplitude of cosine wave; Y represents the amplitude of a sine wave.
Both are independent and uniformly distributed over the interval (0,1). Let R represent the amplitude of
their resultant, i.e., R2 = X 2 + Y 2 and θ represent the phase angle of the resultant, i.e., θ = tan−1 (Y/X).
Find the joint and marginal pdfs of θ and R.
18. Let X and Y be continuous random variables having joint distribution which is uniform over the square
which has corners at (2, 2), (−2, 2), (−2, −2) and (2, −2). Determine P (|Y | > |X| + 1).
19. Suppose that, we choose a point (X, Y ) uniformly at random in E where E = {(x, y) | |x| + |y| ≤ 1}. Find
20
the joint pdf of (X, Y ).
20. Let X and Y be independent rvs with X follows Exp(1) and Y follows U [0, 1]. Let Z = max{X, Y }. Find
9-
P (Z ≤ z).
21. Mr. Ram and Mr. Ramesh agree to meet between 5:00 PM and 6:00 PM, with the understanding that each
−u
e , 0<x<y<z<u<∞
22. Let (X, Y, Z, U ) be a 4-dim rv with joint pdf fX,Y,Z,U (x, y, z, u) = . Find
er
0, otherwise
the marginal pdf of X.
t
23. Let A, B and C be independent random variables each with uniform distributed on interval (0, 1). What is
es
Processes course. The time for Aditya to complete the problem is exponential distributed with mean 5
minutes. The time for Aayush to complete the problem is exponential distributed with mean 3 minutes.
IS
(a) What is the probability that Aditya finishes the problem before Aayush?
(b) Given that Aditya requires more than 1 minutes, what is the probability that he finishes the problem
before Aayush?
(c) What is the probability that one of them finishes the problem a minute or more before the other one?
25. Let A, B and C be independent random variables. Suppose that A and C are uniformly distributed on [0, 1].
Further, B is uniformly distributed on [0, 2]. What is the probability that Ax2 + Bx + C = 0 has real roots?
26. Prove that the correlation coefficient between any two random variables X and Y lies in the interval [−1, 1].
27. For each fixed λ > 0, let X be a Poisson distributed random variable with parameter λ. Suppose λ itself is
a random variable following a gamma distribution with pdf
1 n−1 −λ
f (λ) = Γ(n) λ e , λ>0
0, otherwise
where n is a fixed positive constant. Find the pmf of the random variable X.
2
28. The number of pages N in a fax transmission has geometric distribution with mean 4. The number of bits
k in a fax page also has geometric distribution with mean 105 bits independent of any other page and the
number of pages. Find the probability distribution of total number of bits in fax transmission.
29. It is known that a IIT bus will arrive at random at Nilgiri hostel bus stop sometime between 8:30 A.M. and
8:45 A.M. Rahul decides that he will go at random to this location between these two times and will wait at
most 5 minutes for the bus. If he misses it, he will take the cycle rickshaw. What is the probability that he
will take the cycle rickshaw?
30. Let (X, Y ) be a two-dimensional continuous type random variables with joint pdf
20
where σ1 , σ2 > 0, | ρ |< 1 and
9-
" 2 2 #
1 x − µ1 (x − µ1 )(y − µ2 ) y − µ2
Q(x, y) = − 2ρ +
(1 − ρ2 )
32. Let X0 be an integer-valued random variable, P (X0 = 0) = 1, that is independent of the i.i.d. sequence
σ2
Z1 , Z2 , . . . , where Zn can take values in the set {−1, 1} such that P (Zn = −1) = 25 , P (Zn = 1) = 35 . Let
er
Xn = Xn−1 + Zn , n = 1, 2, . . ..
t
3
Department of Mathematics
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 3
Answer for selected Problems
1
2n+1 , k=0
2
1. (i) P [| X |= k] = 2n+1 , k = 1, 2, . . . , n
0, otherwise
1
2n+1 , k=0
2
(ii) 2
P [X = k] = 2n+1 , k = 12 , 22 , . . . , n2
0, otherwise
1
2n+1 , k=1
2
(iii) 1
P [ |X|+1 = k] = 2n+1 , k = 12 , 13 , . . . , n1
0, otherwise
20
2. 0.51
3. Y is uniformly distributed random variable on the interval (a, b)
9-
−1 y−(a+µb)
2
4. (i)fY (y) = |b|√12πσ e 2 bσ , −∞<y <∞
(ii)fZ (z) =
(
√1 e
2πz
0,
−z
2 , z>0
otherwise
2 01
er
αey
5. fY (y) = (ey +1)α+1 , −∞<y <∞
t
6.
es
−1 √ √
λe√
2 y e
λ y
+ e−λ y , 0<y< 1
λ2
√
em
−1
7. fY (y) = λe√ −λ y 1
2 y e , λ2 <y<∞
0, otherwise
IS
e−y + 1 −1/y
8. (a) Y is a continuous type random variable. (b) fY (y) = y2 e , 0<y<1
.
0, otherwise
0, −∞ < y < 2
y
9. F (y) = 10 , 2≤y<4
1, 4≤y<∞
10. α = e−λ
q 4
1
11. fY (y) = 2
π | y | e− 2 y , −∞<y <∞
1
4 y, 0<y<1
√
1
12. fY (y) = 8 y, 1<y<9
√
0, otherwise
(
√1 , −1 < y < 1
13. fY (y) = π 1−y 2
0, otherwise
1
14. Z has mixed type distribution where pmf is given by
1
4, z = −1, 1
P [Z = z] =
0, otherwise
n
px q n−x
x
, x = 0, 1, . . . , r − 1
15. P [X = x] = Pr−1 n
pi q n−i
i
i=0
0, otherwise
(
1
exp(− 12 ( x−µ 2
σ ) ), α<x<β
20
√
16. fX (x) = 2πσ(φ( β−µ α−µ
σ )−φ( σ ))
0, otherwise
9-
17. exp(−3/8)
18. 1
2
24.
2 2
25. eσ t /2
IS
n − odd
n
0
E(X ] = n!
(n/2)!2n/2
σn n − even
2
26. P (−1.062 < X < 0.73) = 3
27.
28. Y = ln(X)
fX (x) = 2x1√π exp − 14 (ln x)2 , 0 < x < ∞
93
30. X = No. of games played, P (X = k) = pk (> 0), k = 4, 5, 6, 7 E(X) = 16
1
√ √
√ , − 3<y< 3
32. (a) fY (y) = 2 3 ; P (−2 < Y < 2) = 1.
0, otherwise
(b) 50
2
1
√ √
√ , − 3<y< 3
34. (a) fY (y) = 2 3 ; P (−2 < Y < 2) = 1.
0, otherwise
exp(t)−1
(b) 50 (c) t
37. 11.125
38. E[X 2 ] = λ + λ2 , V ar(X) = λ, E[X 3 ] = λ3 + 3λ2 + λ
20
9-
2 01
t er
es
em
IS
3
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 3 (Function of a Random Variable)
1. X has a uniform distribution over the set of integers {−n, −(n − 1), . . . , −1, 0, 1, . . . , (n − 1), n}. Find the
distribution of (i) |X| (ii) X 2 (iii) 1/1 + |X|.
2. Let P [X ≤ 0.49] = 0.75, where X is a continuous type RV with some CDF defined over (0,1). If Y = 1 − X,
Find k so that P [Y ≤ k] = 0.25.
3. Let X be uniformly distributed random variable on the interval (0, 1). Define Y = a + (b − a)X, a < b.
Find the distribution of Y .
2
4. If X has N (µ, σ 2 ), find the distribution of Y = a + bX, and Z = X−µ
σ .
αθ α
5. Let X be a random variable with pdf f (x) = (x+θ)α+1 , x > 0 where θ > 0 and α > 0. Find the distribution
of random variable Y = ln X
θ .
6. Suppose that X is a random variable having a geometric distribution with parameter p. Find the probability
20
mass function of X 2 and X + 3.
2
7. Let X be an random variable having an exponential distribution with parameter λ > 0. Let Y = X − λ1 .
9-
Find the pdf of Y .
−x X, X < 1
01
8. Suppose that X is a continuous random variable with pdf fX (x) = e for x > 0. Define Y =
10. Let X be the life length of an electron tube and suppose that X may be represented as a continuous random
es
variable which is exponentially distributed with parameter λ. Let pj = P (j ≤ X < j + 1). Show that pj is
of the form (1 − α)αj and determine α.
em
11. Consider a nonlinear amplifier whose input X and output Y are related by its transfer characteristic
1
X2, X>0
IS
Y = 1
−|X| 2 , X < 0
13. Let the phase X of a sine wave be uniformly distributed in the interval (− π2 , π2 ). Define Y = sin X. Find the
distribution of Y .
14. Let X be a random variable with uniform distribution in the interval (−π/2, π/2). Define
−1 X ≤ −π/4
Z= tan(X) −π/4 < X < π/4
1 X ≥ π/4.
1
16. Find pdf of a doubly truncated normal N (µ, σ 2 ) random variable, truncated to the left at X = α and to the
right at X = β, where α < β.
17. Assume that, the number of years a car will run is exponential distributed with mean 8. Suppose, Mr. Alok
buys a four years running used car today, what is the probability that it will still run after 3 years?
18. Let U be a uniform distributed rv on the interval [0, 1]. Find the probability that the quadratic equation
x2 + 4U x + 1 = 0 has two distinct real roots x1 and x2 ?
19. A club basketball team will play a 50-game season. Twenty four of these games are aginst class A teams and
26 are against class B teams. The outcomes of all the games are independent. The team will win each game
against a class A opponent with probability 0.4 and it will win each game against a class B opponent with
probability 0.6. Let XA and XB denote, respectively, the number of victories against class A and class B
teams. Let X denote its total victories in the season.
(a) What is the distributions of XA ?
(b) What is the relationship between XA , XB and X?
(c) What is the distribution of X?
20
20. Let X be a continuous type random variable with strictly increasing CDF FX .
9-
(a) What is the distributions of X?
(b) What is the distribution of the random variable Y = − ln(FX (X))?.
01
21. Let X be a continuous type random variable having the pdf f (x) =
2
0,
1
21
,
−∞ < x ≤ 0
0<x≤1
kx2 , 1 < x < ∞
(a) Find k?
er
1
(b) Find the pdf of Y = X?
t
22. In the claim office with one employee of a public service enterprise, it is known that the time (in minutes)
es
that the employee takes to take a claim from a client is a random variable which is exponential distribution
with mean 15 minutes. If you arrive at 12 noon to the claim office and in that moment there is no queue
em
but the employee is taking a claim from a client, what is the probability that you must wait for less than 5
minutes to talk to the employee? What is the mean spending time (including waiting time and claim time)
of you?
IS
(a) Find k?
(b) Determine E(X)
(c) Find the pdf of Y = X 2 ?
25. Let X be a random variable with N (0, σ 2 ). Find the moment generating function for the random variable
X. Deduce the moments of order n about zero for the random variable X from the above result.
2
26. The moment generating function (MGF) of a random variable X is given by MX (t) = 61 + 14 et + 13 e2t + 14 e3t .
If µ is the mean and σ 2 is the variance of X, what is the value of P (µ − σ < X < µ)?
[x]
q , x≥0
27. Let X be a rv such that P (X > x) = where 0 < q < 1 is a constant and [x] is integral part
1, x<0
of x. Determine the pmf/pdf of X as applicable to this case.
28. Let ln(X) be a normal distributed random variable with mean 0 and variance 2. Find the pdf of X?
29. Prove that, if X is a continuous type rv such that E(X r ) exists, then E(X s ) exists for all s < r.
30. Suppose that two teams are plying a series of games, each of which is independently won by team A with
probability 0.5 and by team B with probability 0.5. The winner of the series is the first team to win four
games. Find the expected number of games that are played.
31. Let Φ be the characteristic function of a random variable X. Prove that 1− | Φ(2u) |2 ≤ 4 1− | Φ(u) |2 .
32. (a) Let X be a uniformly distributed random variable on the interval [a, b] where −∞ < a < b < ∞. Find
the distribution of the random variable Y = X−µ
σ where µ = E(X) and σ 2 = V ar(X). Also, find
20
P (−2 < Y < 2).
(b) Suppose Shimla’s temperature is modeled as a random variable which follows normal distribution with
9-
mean 10 Celcius degrees and standard deviation 3 Celcius degrees. Find the mean if the temperature
of Shimla were expressed in Fahreneit degrees.
E
1
X +1
=
2 01
33. Let X be a random variable having a binomial distribution with parameters n and p. Prove that
1 − (1 − p)n+1
(n + 1)p
.
er
34. Let X be a continuous random variable with CDF FX (x). Define Y = FX (X).
t
35. Suppose that X is a continuous random variable having the following pdf:
ex
2 , x≤0
IS
f (x) = e−x
2 , x > 0.
3
Department of Mathematics
MTL 106 (Introduction to Probability and Stochastic Processes)
Tutorial Sheet No. 2
Answer for selected Problems
1. X(i) = i, i = 0, 1, 2
2. F = P (Ω)
3. a)No b)No c)Yes
4. (a) 0.002 (b) 0.7255
5
5. 9
20
5
x≥3
1 x≥3 1,
1 3 1
7. α = 10 ; β= 64 ; P (X < 3/X ≥ 2) =
9-
2
10.
n ' 4742
n
C1 p1 (1 − p)n−1 ]] ≥ 0.95
2 01
where p = 0.001
11. exp(−p)
er
12. (1 − 0.001)1200
t
2x
r22 −r12
, r1 ≤ x ≤ r2
15. fX (x) =
em
0, otherwise
17. 0.99997
18. (a) yes (b) no
IS
1
19. Mean= T1 1 − e− 4
1
Variance = T1 1 − e− 4
30. 1