
MSO 201a: Probability & Statistics

Quiz-I
2019-2020-II Semester

Time Allowed: 45 Minutes    Maximum Marks: 25

Model Solutions

Problem No. 1:

In a probability space (Ω, F, P), let A, B and C be events with given values of P(A), P(B), P(C), P(A ∩ B), P(A ∩ C), P(B ∩ C) and P(A ∩ B ∩ C) (the numerical values are illegible in this scan). Let D be the event that exactly one of the events among A, B and C occurs. Derive the value of P(D).

Solution:

P(D) = P(A) + P(B) + P(C) − 2[P(A ∩ B) + P(A ∩ C) + P(B ∩ C)] + 3 P(A ∩ B ∩ C).

(The handwritten numerical evaluation is not legible in this scan.)
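For reference, the identity used above follows from indicator functions:

1_D = 1_A + 1_B + 1_C − 2(1_{A∩B} + 1_{A∩C} + 1_{B∩C}) + 3·1_{A∩B∩C}.

An outcome lying in exactly one of the three events contributes 1 to the right-hand side; one lying in exactly two contributes 2 − 2 = 0; one lying in all three contributes 3 − 6 + 3 = 0. Taking expectations on both sides gives the formula for P(D).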
Problem No. 2:

Let X be a random variable with distribution function

F(x) = 0, if x < 0; = 1/4, if 0 ≤ x < 1; = 1/3, if 1 ≤ x < 2; = 1, if x ≥ 2.

Derive the values of P(X ∈ {0, 1, 2}), P(1 < X < 2), P(1 ≤ X < 2) and P(1 < X ≤ 2). [3 + 2 + 2 + 2 = 9 Marks]

Solution:

P(X ∈ {0, 1, 2}) = P(X = 0) + P(X = 1) + P(X = 2)
= [F(0) − F(0−)] + [F(1) − F(1−)] + [F(2) − F(2−)]
= 1/4 + (1/3 − 1/4) + (1 − 1/3) = 1.

P(1 < X < 2) = F(2−) − F(1) = 1/3 − 1/3 = 0.
P(1 ≤ X < 2) = F(2−) − F(1−) = 1/3 − 1/4 = 1/12.
P(1 < X ≤ 2) = F(2) − F(1) = 1 − 1/3 = 2/3.
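A quick numerical check of the jump computations above, using the reconstructed F (a sketch; the small-eps left limit is adequate here because F is piecewise constant):

# Distribution function from Problem No. 2 and the four requested probabilities.
def F(x):
    if x < 0:
        return 0.0
    if x < 1:
        return 1/4
    if x < 2:
        return 1/3
    return 1.0

def F_left(x, eps=1e-9):
    # left limit F(x-)
    return F(x - eps)

print(sum(F(k) - F_left(k) for k in (0, 1, 2)))  # P(X in {0,1,2}) = 1
print(F_left(2) - F(1))                          # P(1 < X < 2)  = 0
print(F_left(2) - F_left(1))                     # P(1 <= X < 2) = 1/12
print(F(2) - F(1))                               # P(1 < X <= 2) = 2/3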
Problem No. 3:

Let F : R → R be defined by

F(x) = 0, if x < 0; = (x + b)/… , if 0 ≤ x < 2; = (x² + 4b)/… , if 2 ≤ x < 4; = (x + c)/… , if 4 ≤ x < 6; = 1, if x ≥ 6,

where b and c are real constants (the denominators are illegible in this scan). Find the values of b and c so that F is a distribution function of some random variable.

(The handwritten solution is not legible in this scan.)
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-V
Instructor: Neeraj Misra

1. For µ ∈ R and λ > 0, let Xµ,λ be a random variable having the p.d.f.

fµ,λ(x) = (1/λ) e^{−(x−µ)/λ}, if x ≥ µ, and = 0, otherwise.

(a) Find Cr(µ, λ) = E((Xµ,λ − µ)^r), r ∈ {1, 2, . . .}, and µ′r(µ, λ) = E(Xµ,λ^r), r ∈ {1, 2};

(b) For p ∈ (0, 1), find the p-th quantile ξp ≡ ξp (µ, λ) of the distribution of Xµ,λ
(Fµ,λ (ξp ) = p, where Fµ,λ is the distribution function of Xµ,λ );

(c) Find the lower quartile q1 (µ, λ), the median m(µ, λ) and the upper quartile
q3 (µ, λ) of the distribution of Xµ,λ ;

(d) Find the mode m0(µ, λ) of the distribution of Xµ,λ;

(e) Find the standard deviation σ(µ, λ), the mean deviation about median MD(m(µ, λ)),
the inter-quartile range IQR(µ, λ), the quartile deviation (or semi-inter-quartile
range) QD(µ, λ), the coefficient of quartile deviation CQD(µ, λ) and the coefficient
of variation CV(µ, λ) of the distribution of Xµ,λ ;

(f) Find the coefficient of skewness β1 (µ, λ) and the Yule coefficient of skewness
β2 (µ, λ) of the distribution of Xµ,λ ;

(g) Find the excess kurtosis γ2 (µ, λ) of the distribution of Xµ,λ ;

(h) Based on the values of the measures of skewness and kurtosis of the distribution of
Xµ,λ, comment on the shape of fµ,λ.
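Several of these quantities are easy to sanity-check numerically, since fµ,λ is an exponential density shifted to start at µ; a minimal sketch (assuming scipy is available; the values µ = 2, λ = 3 are arbitrary):

from scipy import stats

mu, lam = 2.0, 3.0
X = stats.expon(loc=mu, scale=lam)     # p.d.f. (1/lam) e^{-(x-mu)/lam}, x >= mu

print(X.mean(), X.std())               # mean = mu + lam; sigma(mu, lam) = lam
q1, m, q3 = X.ppf([0.25, 0.5, 0.75])   # quantile: xi_p = mu - lam*ln(1 - p)
print(q1, m, q3)                       # mu + lam*ln(4/3), mu + lam*ln 2, mu + lam*ln 4
print(q3 - q1)                         # IQR(mu, lam) = lam*ln 3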

2. Let X be a random variable with p.d.f. fX (x) that is symmetric about µ (∈ R),
i.e., fX (x + µ) = fX (µ − x), ∀ x ∈ (−∞, ∞).

(a) If q1, m and q3 are respectively the lower quartile, the median and the upper
quartile of the distribution of X, then show that µ = m = (q1 + q3)/2;

(b) If E(X) is finite, then show that E(X) = µ = m = (q1 + q3)/2.

3. A fair die is rolled six times independently. Find the probability that on exactly two
occasions we get an upper face with 2 or 3 dots.
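Each roll shows 2 or 3 dots with probability 2/6 = 1/3, so the number of such rolls in six independent rolls is Bin(6, 1/3); a one-line numerical check (assuming scipy):

from scipy import stats

print(stats.binom.pmf(2, 6, 1/3))   # C(6,2) (1/3)^2 (2/3)^4 = 240/729 ≈ 0.3292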

4. Let n (≥ 2) and r ∈ {1, 2, . . . , n − 1} be fixed integers and let p ∈ (0, 1) be a fixed real
number. Using probabilistic arguments show that

Σ_{j=r}^{n} C(n, j) p^j (1 − p)^{n−j} − Σ_{j=r}^{n−1} C(n−1, j) p^j (1 − p)^{n−1−j} = C(n−1, r−1) p^r (1 − p)^{n−r}.
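The left side of the identity is P(Bin(n, p) ≥ r) − P(Bin(n−1, p) ≥ r), which makes a direct numerical check easy; a sketch (assuming scipy; the values of n, r and p below are arbitrary):

from math import comb
from scipy import stats

n, r, p = 9, 4, 0.37
lhs = stats.binom.sf(r - 1, n, p) - stats.binom.sf(r - 1, n - 1, p)  # sf(r-1) = P(X >= r)
rhs = comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)
print(lhs, rhs)   # agree up to floating-point error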

5. Let X ∼ Bin(n, p), where n is a positive integer and p ∈ (0, 1). Find the mode of
the distribution of X.

6. Eighteen balls are placed at random in seven boxes labeled B1, . . . , B7.
Find the probability that boxes B1, B2 and B3 together contain exactly six
balls.
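Since the balls are placed independently and each lands in B1 ∪ B2 ∪ B3 with probability 3/7, the count in those three boxes is Bin(18, 3/7); a one-line check under that reading (assuming scipy):

from scipy import stats

print(stats.binom.pmf(6, 18, 3/7))   # C(18,6) (3/7)^6 (4/7)^12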

7. Let T be a discrete type random variable with support ST = {0, 1, 2, . . .}. Show that
T has the lack of memory property if, and only if, T ∼ Ge(p), for some p ∈ (0, 1).

8. A person repeatedly rolls a fair die independently until an upper face with two or
three dots is observed twice. Find the probability that the person would require
eight rolls to achieve this.
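Requiring eight rolls means the second success (a face with 2 or 3 dots, success probability 1/3) occurs on roll 8, i.e., six failures precede it; a numerical check (assuming scipy, whose nbinom counts failures before the r-th success):

from scipy import stats

print(stats.nbinom.pmf(6, 2, 1/3))   # C(7,1) (1/3)^2 (2/3)^6 ≈ 0.0683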

9. (a) Consider a sequence of independent Bernoulli trials with probability of success


in each trial being p ∈ (0, 1). Let Z denote the number of trials required to get the
r-th success, where r is a given positive integer. Let X = Z − r. Find the marginal
probability distributions of X and Z. For r = 1, show that P (Z > m + n) = P (Z >
m)P (Z > n), m, n ∈ {0, 1, . . .} (or equivalently P (Z > m + n|Z > m) = P (Z >
n), m, n ∈ {0, 1, . . .}; this property is also known as the lack of memory property).
(b) Two teams (say Team A and Team B) play a series of games until one team
wins 5 games. If the probability of Team A (Team B) winning any game is 0.7 (0.3),
find the probability that the series will end in 8 games.
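For part (b), the series ends in exactly 8 games when one of the teams collects its 5th win in game 8 (so exactly 4 wins in the first 7 games); a quick check of that computation:

from math import comb

p = 0.7   # P(Team A wins a game)
prob = comb(7, 4) * p**5 * (1 - p)**3 + comb(7, 4) * (1 - p)**5 * p**3
print(prob)   # ≈ 0.1880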

10. (Relation between binomial and negative binomial probabilities) Let n be


a positive integer, r ∈ {1, 2, . . . , n} and let p ∈ (0, 1). Using probabilistic arguments
and also otherwise show that
Σ_{k=r}^{n} C(n, k) p^k (1 − p)^{n−k} = p^r Σ_{k=0}^{n−r} C(r+k−1, k) (1 − p)^k,


i.e., for r ∈ {1, 2, . . . , n}, P (Bin(n, p) ≥ r) = P (NB(r, p) ≤ n − r).
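The displayed relation P(Bin(n, p) ≥ r) = P(NB(r, p) ≤ n − r) is directly checkable; a sketch (assuming scipy; the parameters are arbitrary):

from scipy import stats

n, r, p = 12, 5, 0.43
print(stats.binom.sf(r - 1, n, p))     # P(Bin(n, p) >= r)
print(stats.nbinom.cdf(n - r, r, p))   # P(NB(r, p) <= n - r)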

11. A mathematician carries at all times two match boxes, one in his left pocket and
one in his right pocket. To begin with each match box contains n matches. Each

time the mathematician needs a match he is equally likely to take it from either
pocket. Consider the moment when the mathematician for the first time discovers
that one of the match boxes is empty. Find the probability that at that moment
the other box contains exactly k matches, where k ∈ {0, 1, . . . , n}.
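This is Banach's matchbox problem; the classical answer is C(2n − k, n) (1/2)^{2n−k}, and a small simulation corroborates it (a sketch; the values of n and k below are arbitrary):

import random
from math import comb

n, k = 6, 2
exact = comb(2*n - k, n) * 0.5**(2*n - k)

def other_box_count():
    # Draw from a random pocket until a box is discovered to be empty.
    left = right = n
    while True:
        if random.random() < 0.5:
            if left == 0:
                return right      # left box found empty
            left -= 1
        else:
            if right == 0:
                return left       # right box found empty
            right -= 1

trials = 200_000
sim = sum(other_box_count() == k for _ in range(trials)) / trials
print(exact, sim)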

12. Consider a population comprising N (≥ 2) units, out of which a (∈ {1, 2, . . . , N − 1})
are labeled S (success) and N − a are labeled F (failure). A sample of
size n (∈ {1, 2, . . . , N − 1}) is drawn from this population, drawing one unit at a
time and without replacing it back into the population (i.e., sampling is without
replacement). Let Ai (i = 1, 2, . . . , n) denote the event of obtaining a success
(S) in the i-th trial. Show that P(Ai) = a/N, i = 1, . . . , n (i.e., the probability of
obtaining S remains the same in each trial). Hint: Use induction, by first showing
that P(A1) = P(A2) = a/N.

13. (a) (Binomial approximation to hypergeometric distribution) Show that a
hypergeometric distribution can be approximated by a Bin(n, p) distribution
provided N is large (N → ∞), a is large (a → ∞) and a/N → p, as N → ∞;
(b) Out of 120 applicants for a job, only 80 are qualified for the job. Out of 120
applicants, five are selected at random for the interview. Find the probability that
at least two selected applicants will be qualified for the job. Using Problem 13 (a)
find an approximate value of this probability.
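For part (b), the exact model is hypergeometric (N = 120, a = 80, n = 5), and part (a) justifies the Bin(5, 2/3) approximation; a sketch (assuming scipy, whose hypergeom takes population size, number of successes, then draws):

from scipy import stats

N, a, n = 120, 80, 5
print(stats.hypergeom.sf(1, N, a, n))   # exact P(at least 2 qualified)
print(stats.binom.sf(1, n, a / N))      # binomial approximation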

14. Urn Ui (i = 1, 2) contains Ni balls out of which ri are red and Ni − ri are black.
A sample of n (1 ≤ n ≤ N1 ) balls is chosen at random (without replacement) from
urn U1 and all the balls in the selected sample are transferred to urn U2 . After the
transfer two balls are drawn at random from the urn U2 . Find the probability that
both the balls drawn from urn U2 are red.

15. Let X ∼ Po(λ), where λ > 0. Find the mode of the distribution of X.

16. (a) (Poisson approximation to negative binomial distribution) Show that

lim_{r→∞} C(r+k−1, k) p_r^r (1 − p_r)^k = e^{−λ} λ^k / k!, k = 0, 1, 2, . . .,

provided lim_{r→∞} p_r = 1 and lim_{r→∞} r(1 − p_r) = λ > 0.
(b) Consider a person who plays a series of 2500 games independently. If the probability
of the person winning any game is 0.002, find the probability that the person will
win at least two games.
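For part (b), the number of wins is Bin(2500, 0.002), which is approximately Poisson(5); a numerical comparison (assuming scipy):

from scipy import stats

print(stats.binom.sf(1, 2500, 0.002))   # exact P(at least 2 wins)
print(stats.poisson.sf(1, 5))           # Poisson(5) approximation ≈ 0.9596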

17. (a) If X ∼ Po(λ), find E((2 + X)−1 );


(b) If X ∼ Ge(p) and r is a positive integer, find E(min(X, r)).

18. A person has to open a lock whose key is lost among a set of N keys. Assume that
out of these N keys only one can open the lock. To open the lock the person tries
keys one by one by choosing, at each attempt, one of the keys at random from the
unattempted keys. The unsuccessful keys are not considered for future attempts.
Let Y denote the number of attempts the person will have to make to open the lock.
Show that Y ∼ U ({1, 2, . . . , N }) and hence find the mean and the variance of the
r.v. Y .

19. Let a > 0 be a real constant. A point X is chosen at random on the interval (0, a)
(i.e., X ∼ U(0, a)).
(a) If Y denotes the area of an equilateral triangle having sides of length X, find the
mean and variance of Y;
(b) If the point X divides the interval (0, a) into subintervals I1 = (0, X) and
I2 = [X, a), find the probability that the larger of these two subintervals is at least
double the size of the smaller subinterval.

20. Using a random observation U ∼ U (0, 1), describe a method to generate a random
observation X from the distribution having
(a) probability density function

f(x) = e^{−|x|}/2, −∞ < x < ∞;

(b) probability mass function


g(x) = C(n, x) θ^x (1 − θ)^{n−x}, if x ∈ {0, 1, . . . , n}, and = 0, otherwise,

where n ∈ N and θ ∈ (0, 1) are real constants.
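A minimal sketch of both generators via the inverse-transform method (part (a) inverts the double-exponential d.f. in closed form; part (b) walks the discrete d.f.; only one U per draw is used):

import math
import random

def double_exponential(u):
    # Invert F(x) = e^x / 2 for x < 0 and F(x) = 1 - e^{-x}/2 for x >= 0.
    if u < 0.5:
        return math.log(2 * u)
    return -math.log(2 * (1 - u))

def binomial(u, n, theta):
    # Return the smallest x with G(x) >= u, where G is the Bin(n, theta) d.f.
    cdf = 0.0
    pmf = (1 - theta) ** n            # g(0)
    for x in range(n + 1):
        cdf += pmf
        if u <= cdf:
            return x
        pmf *= (n - x) / (x + 1) * theta / (1 - theta)   # g(x+1) from g(x)
    return n

u = random.random()
print(double_exponential(u), binomial(u, 10, 0.3))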

21. Let X ∼ U (0, θ), where θ is a positive integer. Let Y = X − [X], where [x] is the
largest integer ≤ x. Show that Y ∼ U (0, 1).

22. Let U ∼ U (0, 1) and let X be the root of the equation 3t2 − 2t3 − U = 0. Show that
X has p.d.f. f (x) = 6x(1 − x), if 0 ≤ x ≤ 1, = 0, otherwise.

23. (Relation between gamma and Poisson probabilities) For t > 0, θ > 0 and
n ∈ {1, 2, . . .}, using integration by parts, show that

(1/(θ^n Γ(n))) ∫_t^∞ e^{−x/θ} x^{n−1} dx = Σ_{j=0}^{n−1} e^{−t/θ} (t/θ)^j / j!,

i.e., P (G(n, θ) > t) = P (Po( θt ) ≤ n − 1).
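The stated gamma-Poisson relation is easy to verify numerically; a sketch (assuming scipy; the parameters are arbitrary):

from scipy import stats

n, theta, t = 4, 2.5, 7.0
print(stats.gamma.sf(t, n, scale=theta))     # P(G(n, theta) > t)
print(stats.poisson.cdf(n - 1, t / theta))   # P(Po(t/theta) <= n - 1)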

24. Let Y be a random variable of continuous type with FY (0) = 0, where FY (·) is the
distribution function of Y . Show that Y has the lack of memory property if, and
only if, Y ∼ Exp(θ), for some θ > 0.

25. (Relation between binomial and beta probabilities) Let n be a positive


integer, k ∈ {0, 1, . . . , n} and let p ∈ (0, 1). If X ∼ Bin(n, p) and Y ∼ Be(k, n − k +
1), show that P ({X ≥ k}) = P ({Y ≤ p}).
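The same kind of numerical check works for this binomial-beta relation (assuming scipy; the parameters are arbitrary, with k ≥ 1):

from scipy import stats

n, k, p = 11, 4, 0.62
print(stats.binom.sf(k - 1, n, p))       # P(X >= k)
print(stats.beta.cdf(p, k, n - k + 1))   # P(Y <= p)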

26. Let X = (X1, X2)′ have the joint p.d.f.

fX(x1, x2) = φ(x1)φ(x2)[1 + α(2Φ(x1) − 1)(2Φ(x2) − 1)], −∞ < xi < ∞, i = 1, 2, |α| ≤ 1,

where φ and Φ denote the N(0, 1) p.d.f. and d.f., respectively.

(a) Verify that fX (x1 , x2 ) is a p.d.f.;


(b) Find the marginal p.d.f.s of X1 and X2 ;
(c) Is (X1 , X2 ) jointly normal?

27. Let X = (X1, X2)′ ∼ N2(µ1, µ2, σ1², σ2², ρ) and, for constants a1, a2, a3 and a4 (ai ≠ 0,
i = 1, 2, 3, 4, a1a4 ≠ a2a3), let Y = a1X1 + a2X2 and Z = a3X1 + a4X2.
(a) Find the joint p.d.f. of (Y, Z);
(b) Find the marginal p.d.f.s. of Y and Z.

28. (a) Let (X, Y)′ ∼ N2(5, 8, 16, 9, 0.6). Find P({5 < Y < 11}|{X = 2}), P({4 < X <
6}) and P({7 < Y < 9});
(b) Let (X, Y)′ ∼ N2(5, 10, 1, 25, ρ), where ρ > 0. If P({4 < Y < 16}|{X = 5}) =
0.954, determine ρ.

29. Let X and Y be independent and identically distributed N (0, σ 2 ) r.v.s.


(a) Find the joint p.d.f. of (U, V), where U = aX + bY and V = bX − aY (a ≠ 0, b ≠ 0);
(b) Show that U and V are independent;
(c) Show that (X + Y)/√2 and (X − Y)/√2 are independent N(0, σ²) random variables.

30. Let X = (X1, X2)′ ∼ N2(0, 0, 1, 1, ρ).

(a) Find the m.g.f. of Y = X1X2;
(b) Using (a), find E(X1² X2²);
(c) Using conditional expectation, find E(X1² X2²).

31. Let X = (X1, X2)′ have the joint p.d.f.

f(x, y) = (1/π) e^{−(x²+y²)/2}, if xy > 0, and = 0, otherwise.

Show that the marginals of f(·, ·) are each N(0, 1) but f(·, ·) is not the p.d.f. of a
bivariate normal distribution.

32. Let fr (·, ·), −1 < r < 1, denote the pdf of N2 (0, 0, 1, 1, r) and, for a fixed ρ ∈ (−1, 1),
let the random variable (X, Y ) have the joint p.d.f.

gρ(x, y) = (1/2)[fρ(x, y) + f−ρ(x, y)].

Show that X and Y are normally distributed but the distribution of (X, Y ) is not
bivariate normal.

33. Consider the random vector (X, Y ) as defined in Problem 32.


(a) Find Corr(X, Y );
(b) Are X and Y independent?

34. (a) Suppose that X = (X1, X2, X3, X4) ∼ Mult(30, θ1, θ2, θ3, θ4). Find the
conditional probability mass function of (X1, X2, X3, X4) given that X5 = 2, where
X5 = 30 − Σ_{i=1}^{4} Xi.

(b) Consider 7 independent casts of a pair of fair dice. Find the probability that
we get a sum of 12 once and a sum of 8 twice.
35. (a) Let X ∼ F_{n1,n2}. Show that Y = n2/(n2 + n1 X) ∼ Be(n2/2, n1/2);
(b) Let Z1 and Z2 be i.i.d. N(0, 1) random variables. Show that Z1/|Z2| ∼ t1 and
Z1/Z2 ∼ t1 (the Cauchy distribution);
(c) Let X1 and X2 be i.i.d. Exp(θ) random variables, where θ > 0. Show that
Z = X1/X2 has an F-distribution;
(d) Let X1, X2 and X3 be independent random variables with Xi ∼ χ²_{ni}, i = 1, 2, 3.
Show that (X1/n1)/(X2/n2) and ((X1 + X2)/(n1 + n2))/(X3/n3) are independent F-variables.

36. Let X1 , . . . , Xn be a random sample from Exp(θ) distribution, where θ > 0. Let
X1:n ≤ · · · ≤ Xn:n denote the order statistics of X1 , . . . , Xn . Define Z1 = nX1:n , Zi =
(n − i + 1)(Xi:n − Xi−1:n ), i = 2, . . . , n. Show that Z1 , . . . , Zn are independent and
identically distributed Exp(θ) random variables. Hence find the mean and variance
of Xr:n , r = 1, . . . , n. Also, for 1 ≤ r < s ≤ n, find Cov(Xr:n , Xs:n ).

[Several pages of handwritten worked solutions follow in the original scan; the text is not recoverable from this copy.]
ESO 209: Probability and Statistics
2019-2020-II Semester
Assignment No. 7
1. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (·|θ), where θ ∈ Θ is unknown, and let the estimand be g(θ). In each of the follow-
ing situations, find the M.M.E. and the M.L.E.. Also verify if they are consistent
estimators of g(θ).
(a) f (x|θ) = θ(1 − θ)x−1 , if x = 1, 2, . . ., and = 0, otherwise; Θ = (0, 1); g(θ) = θ.
(b) X1 ∼ Poisson(θ); Θ = (0, ∞); g(θ) = eθ .
(c) f(x|θ) = θ1, if x = 1, = (1 − θ1)/(θ2 − 1), if x = 2, 3, . . . , θ2, and = 0, otherwise; θ = (θ1, θ2);
Θ = {(z1, z2) : 0 < z1 < 1, z2 ∈ {2, 3, . . .}}; g(θ) = (θ1, θ2).
(d) f (x|θ) = K(θ)xθ (1 − x), if 0 ≤ x ≤ 1, and = 0, otherwise; Θ = (−1, ∞);
g(θ) = θ; here K(θ) is the normalizing factor.
(e) X1 ∼ Gamma(α, µ); θ = (α, µ); Θ = (0, ∞) × (0, ∞); g(θ) = (α, µ).

(f) f(x|θ) = (σ√(2π))^{−1} x^{−1} exp(−(ln x − µ)²/(2σ²)), if x > 0, and = 0, otherwise;
θ = (µ, σ); Θ = (−∞, ∞) × (0, ∞); g(θ) = (µ, σ).
(g) X1 ∼ Exp(θ); Θ = (0, ∞); g(θ) = Pθ (X1 ≤ 1).
(h) X1 ∼ Poisson(θ); Θ = (0, ∞); g(θ) = Pθ (X1 + X2 + X3 = 0).
(i) X1 ∼ U(−θ/2, θ/2); Θ = (0, ∞); g(θ) = (1 + θ)^{−1}.
(j) X1 ∼ N(µ, σ²); θ = (µ, σ²); Θ = (−∞, ∞) × (0, ∞); g(θ) = µ²/σ².
(k) f(x|θ) = σ^{−1} exp(−(x − µ)/σ), if x > µ, and = 0, otherwise; θ = (µ, σ); Θ =
(−∞, ∞) × (0, ∞); g(θ) = (µ, σ).
(l) X1 ∼ U (θ1 , θ2 ); θ = (θ1 , θ2 ); Θ = {(z1 , z2 ) : −∞ < z1 < z2 < ∞}; g(θ) = (θ1 , θ2 ).
2. Suppose a randomly selected sample of size five from the distribution having p.m.f.
given in Problem 1 (a) gives the following data: x1 = 2, x2 = 7, x3 = 6, x4 = 5
and x5 = 9. Based on this data compute the m.l.e. of Pθ (X1 ≥ 4).
3. The lifetimes of a brand of a component are assumed to be exponentially distributed
with mean (in hours) θ, where θ ∈ Θ = (0, ∞) is unknown. Ten of these components
were independently put on test. The only data recorded were the number of
components that had failed in less than 100 hours versus the number that had not
failed. It was found that three had failed before 100 hours. What is the m.l.e. of θ?
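For reference, a sketch of the standard argument: each component fails before 100 hours with probability p(θ) = 1 − e^{−100/θ}; the observed data are 3 failures in 10 Bernoulli trials, so the m.l.e. of p(θ) is 3/10, and by the invariance of m.l.e.s,

1 − e^{−100/θ̂} = 3/10  ⟹  θ̂ = −100/ln(7/10) ≈ 280.4 hours.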

4. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (·|θ), where θ ∈ Θ is an unknown parameter. In each of the following situations,
find the M.L.E. of θ and verify if it is a consistent estimator of θ.
(a) X1 ∼ N(θ, 1), Θ = [0, ∞). (b) X1 ∼ Bin(1, θ), Θ = [1/4, 3/4].
5. Let X1 , . . . , Xn be a random sample from a distribution having mean µ and finite
variance σ². Show that X̄ and S² are unbiased estimators of µ and σ², respectively.
6. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (x|θ), where θ ∈ Θ is unknown, and let g(θ) be the estimand. In each of the
following situations, find the M.L.E., say δM (X), and the unbiased estimator based
on the M.L.E., say δU (X).
(a) X1 ∼ Exp(θ); Θ = (0, ∞); g(θ) = θr , for some known positive integer r.
(b) n ≥ 2, X1 ∼ N(µ, σ²); θ = (µ, σ²); Θ = (−∞, ∞) × (0, ∞); g(θ) = µ + σ.
(c) Same as (b) with g(θ) = µ/σ.
(d) X1 ∼ Poisson(θ); Θ = (0, ∞); g(θ) = eθ .
7. Let X1 , . . . , Xn be a random sample from a distribution having p.d.f. (or p.m.f.)
f (x|θ), where θ ∈ Θ is unknown, and let g(θ) be the estimand. In each of the
following situations, find the M.L.E., say δM (X), and the unbiased estimator based
on the M.L.E., say δU (X). Also compare the m.s.e.s of δM and δU .
(a) f (x|θ) = e−(x−θ) , if x > θ, and = 0, otherwise; Θ = (−∞, ∞); g(θ) = θ.
(b) n ≥ 2, f(x|θ) = (1/σ) e^{−(x−µ)/σ}, if x > µ, and = 0, otherwise; θ = (µ, σ); Θ =
(−∞, ∞) × (0, ∞); g(θ) = µ.
(c) Same as (b) with g(θ) = σ.
(d) X1 ∼ Exp(θ); Θ = (0, ∞); g(θ) = θ.
(e) X1 ∼ U (0, θ); Θ = (0, ∞); g(θ) = θr , for some known positive integer r.
(f) X1 ∼ N (θ, 1); Θ = (−∞, ∞); g(θ) = θ2 .

8. Let X1 , X2 be a random sample from a distribution having p.d.f. (or p.m.f.) f (·|θ),
where θ ∈ Θ is unknown, and let the estimand be g(θ). Show that given any unbiased
estimator, say δ(X), which is not permutation symmetric (i.e., Pθ (δ(X1 , X2 ) =
δ(X2 , X1 )) < 1, for some θ ∈ Θ), there exists a permutation symmetric and unbiased
estimator δU (X) which is better than δ(·). Can you extend this result to the case
when we have a random sample consisting of n (≥ 2) observations.
9. Consider a single observation X from a distribution having p.m.f. f (x|θ) = θ, if
x = −1, = (1 − θ)2 θx , if x = 0, 1, 2, . . ., and = 0, otherwise, where θ ∈ Θ = (0, 1) is
an unknown parameter. Determine all unbiased estimators of θ.

10. Let X1 , . . . , Xn (n ≥ 2) be a random sample from a distribution having p.d.f.

f(x|θ) = (1/σ) e^{−(x−µ)/σ}, if x > µ, and = 0, otherwise,

where θ = (µ, σ) ∈ Θ = (−∞, ∞) × (0, ∞) is unknown. Let the estimand be
g(θ) = µ. Find an unbiased estimator of g(θ) which is based on the M.L.E.. Let
X(1) = min{X1 , . . . , Xn } and let T = Σ_{i=1}^{n}(Xi − X(1)). Among the estimators of
µ which are based on the M.L.E. and belong to the class D = {δc(X) : δc(X) =
X(1) − cT, c > 0}, find the estimator having the smallest m.s.e. at each parametric
point.

11. Let X1 , . . . , Xn be a random sample from the U(0, θ) distribution, where θ ∈ Θ = (0, ∞)
is an unknown parameter. Of the two estimators, the M.M.E. and the M.L.E., of
θ, which one would you prefer with respect to (a) the criterion of bias; (b) the
criterion of m.s.e.? Among the estimators of θ which are based on the M.L.E.
and belong to the class D = {δc(X) : δc(X) = cX(n), c > 0}, find the estimator
having the smallest m.s.e. at each parametric point.

12. Let X1 , . . . , Xn (n ≥ 2) be a random sample from U (θ − 0.5, θ + 0.5) distribution,


where θ ∈ Θ = (−∞, ∞) is an unknown parameter. Let the estimand be g(θ) = θ.
Among the estimators which are based on the M.L.E. and belong to the class D =
{δα (X) : δα (X) = α(X(n) − 0.5) + (1 − α)(X(1) + 0.5), 0 ≤ α ≤ 1}, find the estimator
having the smallest m.s.e., at each parametric point.

13. Let X1 , . . . , Xn be a random sample from the Exp(θ) distribution, where θ ∈ Θ =
(0, ∞) is an unknown parameter. Let the estimand be g(θ) = θ^r, for some fixed
positive integer r. Among the estimators which are based on the M.L.E. and belong
to the class D = {δc(X) = c X̄^r : c > 0}, find the estimator having the smallest m.s.e.
at each parametric point. Is this estimator consistent?

[Several pages of handwritten worked solutions to Assignment No. 7 follow in the original scan; the text is not recoverable from this copy.]
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment No. 6
Instructor: Neeraj Misra

1. Let X1 , X2 , . . . be a sequence of r.v.s such that Xn , n = 1, 2, . . ., has the d.f.
Fn(x) = 0, if x < −n, = (x + n)/(2n), if −n ≤ x < n, and = 1, if x ≥ n. Does Fn(·)
converge to a d.f., as n → ∞?

2. Let X1 , X2 , . . . be a sequence of i.i.d. r.v.s and let X1:n = min{X1 , . . . , Xn } and
Yn = nX1:n , n = 1, 2, . . .. Find the limiting distributions of X1:n and Yn (as n → ∞)
when (a) X1 ∼ U(0, θ), θ > 0; (b) X1 ∼ Exp(θ), θ > 0.

3. Let X1 , X2 , . . . be a sequence of independent r.v.s with P(Xn = x) = 1/2, if
x = −1/4^n or x = 1/4^n, and = 0, otherwise. Show that X̄n →P 0, as n → ∞.

4. (a) If Xn →P a and Xn →P b, then show that a = b.
(b) Let a and r > 0 be real numbers. If E(|Xn − a|^r) → 0, as n → ∞, then show
that Xn →P a.

5. (a) For r > 0 and t > 0, show that
E(|X|^r/(1 + |X|^r)) − t^r/(1 + t^r) ≤ P(|X| ≥ t) ≤ ((1 + t^r)/t^r) E(|X|^r/(1 + |X|^r)).
(b) Show that Xn →P 0 ⇔ E(|Xn|^r/(1 + |Xn|^r)) → 0, for some r > 0.

6. (a) If {Xn}n≥1 are identically distributed and an → 0, then show that anXn →P 0.
(b) If Yn ≤ Xn ≤ Zn , n = 1, 2, . . ., Yn →P a and Zn →P a, then show that Xn →P a.
(c) If Xn →P c and an → a, then show that, as n → ∞, Xn + an →P c + a and
anXn →P ac.
(d) Let Xn = min(|Yn|, a), n = 1, 2, . . ., where a is a positive constant. Show that
Xn →P 0 ⇔ Yn →P 0.

7. Let X1 , X2 , . . . be a sequence of i.i.d. r.v.s with mean µ and finite variance. Show
that:
(a) (2/(n(n + 1))) Σ_{i=1}^{n} i Xi →P µ;
(b) (6/(n(n + 1)(2n + 1))) Σ_{i=1}^{n} i² Xi →P µ.

8. Let Xn , n = 1, 2, . . ., have a negative binomial distribution with parameters n and
pn = 1 − θ/n, i.e., Xn has the p.m.f. P(Xn = x) = C(n + x − 1, x) pn^n (1 − pn)^x,
x = 0, 1, 2, . . .; n = 1, 2, . . .. Show that Xn →d X ∼ Poisson(θ).

9. (a) Let Xn ∼ Gamma(n, 1/n), n = 1, 2, . . .. Show that Xn →P 1.
(b) Let Xn ∼ N(1/n, 1 − 1/n), n = 1, 2, . . .. Show that Xn →d Z ∼ N(0, 1).

10. (a) Let f(x) = 1/x², if 1 ≤ x < ∞, and = 0, elsewhere, be the p.d.f. of a r.v. X.
Consider a random sample of size 72 from the distribution having p.d.f. f(·).
Compute, approximately, the probability that more than 50 of the items of the
random sample are less than 3.
(b) Let X1 , X2 , . . . be a random sample from the Poisson(3) distribution and let
Y = Σ_{i=1}^{100} Xi. Find, approximately, P(100 ≤ Y ≤ 200).
(c) Let X ∼ Bin(25, 0.6). Find, approximately, P(10 ≤ X ≤ 16). What is the exact
value of this probability?

11. (a) Show that lim_{n→∞} e^{−n} Σ_{k=0}^{n} n^k/k! = 1/2.
(b) Show that lim_{n→∞} 2^{−n} Σ_{k=0}^{rn} C(n, k) = 1/2, where rn is the largest integer ≤ n/2.

12. (a) If Tn = max(|X1|, . . . , |Xn|) →P 0, as n → ∞, then show that X̄n →P 0. Is the
conclusion true if only Sn = max(X1, . . . , Xn) →P 0?
(b) If {Xn}n≥1 are i.i.d. U(0, 1) r.v.s and Zn = (Π_{i=1}^{n} Xi)^{1/n}, n = 1, 2, . . ., find a
real α such that Zn →P α.

13. Let {En}n≥1 be a sequence of i.i.d. Exp(1) r.v.s.
(a) Show that Tn ≡ Σ_{i=1}^{n} Ei ∼ Gamma(n, 1), n = 1, 2, . . ..
(b) For any real number x, show that
lim_{n→∞} ∫_0^{n+x√n} e^{−t} t^{n−1}/Γ(n) dt = (1/√(2π)) ∫_{−∞}^{x} e^{−t²/2} dt.
(c) For large values of n, show that an approximation (called the Stirling approximation)
to the gamma function is: Γ(n) ≈ √(2π) e^{−n} n^{n−1/2}.

14. Let X1 , X2 , . . . be a sequence of i.i.d. r.v.s having the common Cauchy p.d.f.
f(x) = (1/π) · 1/(1 + x²), −∞ < x < ∞.
(a) For any α ∈ (0, 1), show that Y = αX1 + (1 − α)X2 again has the Cauchy p.d.f. f(·).
(b) Note that X̄_{n+1} = (n/(n + 1)) X̄_n + (1/(n + 1)) X_{n+1} and hence, using induction,
conclude that X̄n has the same distribution as X1.
(c) Show that X̄n does not converge in probability to any constant. (Note that
E(X1) does not exist and hence the WLLN is not guaranteed.)

15. Let Xn ∼ Poisson(4n), n = 1, 2, . . ., and let Yn = Xn/n, n = 1, 2, . . ..
(a) Show that Yn →P 4; (b) Show that Yn² + √Yn →P 18;
(c) Show that (n²Yn² + nYn)/(nYn + n²) →P 16.

16. Let X̄n be the sample mean computed from a random sample of size n from a
distribution with mean µ (−∞ < µ < ∞) and variance σ² (0 < σ < ∞). Let
Zn = √n(X̄n − µ)/σ.
(a) If Yn →P 4, show that: 4Zn/Yn →d Z ∼ N(0, 1); 16Zn²/Yn² →d U ∼ χ²₁; and
(4n + Yn)Zn/(nYn + Yn²) →d Z ∼ N(0, 1).
(b) If σ = 1 and µ > 0, show that: √n(ln X̄n − ln µ) →d V ∼ N(0, 1/µ²);
(c) Show that n^δ (X̄n − µ)/σ →P 0, for any δ < 0.5.
(d) Find the asymptotic distributions of: (i) √n(X̄n² − µ²); (ii) √n(X̄n − µ)² and
(iii) n(X̄n − µ)².

17. Let X1 , X2 , . . . be i.i.d. r.v.s having the Exp(θ) (θ > 0) distribution and let
X̄n = (Σ_{i=1}^{n} Xi)/n, n = 1, 2, . . .. Show that √n(1/X̄n − 1/θ) →d N(0, 1/θ²), as n → ∞.

18. Let X1 , X2 , . . . be a sequence of i.i.d. U(0, 1) r.v.s. For the sequence of geometric
means Gn = (Π_{i=1}^{n} Xi)^{1/n}, n = 1, 2, . . ., show that √n(Gn − 1/e) →d N(0, σ²), for some
σ² > 0. Find σ².

19. Let (X1 , Y1 ), (X2 , Y2 ), . . . be a sequence of independent bivariate random vectors
having the same joint p.d.f. Let E(X1) = µ, E(Y1) = ν, Var(X1) = σ², Var(Y1) = τ²
and Corr(X1 , Y1) = ρ. Let Qn = Σ_{i=1}^{n}(Xi − X̄n)(Yi − Ȳn)/(n − 1),
Sn² = Σ_{i=1}^{n}(Xi − X̄n)²/(n − 1), Tn² = Σ_{i=1}^{n}(Yi − Ȳn)²/(n − 1) and Rn = Qn/(Sn Tn).
(a) Show that Qn →P ρστ and Rn →P ρ.
(b) Let δ = E((X1 − µ)²(Y1 − ν)²)/(σ²τ²). Show that √n(Qn − ρστ) →d N(0, (δ − ρ²)σ²τ²).

(a) Find Cr (µ, λ) = E((X − µ)r ), r ∈ {1, 2, . . .} and µ0r (µ, λ) = E(Xµ,λ
r
), r ∈ {1, 2};

(b) For p ∈ (0, 1), find the p-th quantile ξp ≡ ξp (µ, λ) of the distribution of Xµ,λ
(Fµ,λ (ξp ) = p, where Fµ,λ is the distribution function of Xµ,λ );

(c) Find the lower quartile q1 (µ, λ), the median m(µ, λ) and the upper quartile
q3 (µ, λ) of the distribution of Xµ,λ ;

(d) Find the mode m0 (µ, λ) of the distribution of Xµ,σ ;

(e) Find the standard deviation σ(µ, λ), the mean deviation about median MD(m(µ, λ)),
the inter-quartile range IQR(µ, λ), the quartile deviation (or semi-inter-quartile
range) QD(µ, λ), the coefficient of quartile deviation CQD(µ, λ) and the coefficient
of variation CV(µ, λ) of the distribution of Xµ,λ ;

(f) Find the coefficient of skewness β1 (µ, λ) and the Yule coefficient of skewness
β2 (µ, λ) of the distribution of Xµ,λ ;

(g) Find the excess kurtosis γ2 (µ, λ) of the distribution of Xµ,λ ;

(h) Based on values of measures of skewness and the kurtosis of the distribution of
Xµ,λ , comment on the shape of fµ,σ .

2. Let X be a random variable with p.d.f. fX (x) that is symmetric about µ (∈ R),
i.e., fX (x + µ) = fX (µ − x), ∀ x ∈ (−∞, ∞).

(a) If q1 , m and q3 are respectively the lower quartile, the median and the upper
quartile of the distribution of X then show that µ = m = q1 +q2
3
;
q1 +q3
(b) If E(X) is finite then show that E(X) = µ = m = 2
.

3. A fair dice is rolled six times independently. Find the probability that on two
occasions we get an upper face with 2 or 3 dots.

1
4. Let n (≥ 2) and r ∈ {1, 2, , n − 1} be fixed integers and let p ∈ (0, 1) be a fixed real
number. Using probabilistic arguments show that

n   n−1    
X n j n−j
X n−1 j n−1−j n−1 r
p (1 − p) − p (1 − p) = p (1 − p)n−r .
j=r
j j=r
j r

5. Let X ∼ Bin(n, p), where n is a positive integer and p ∈ (0, 1). Find the mode of
the distribution of X.

6. Eighteen balls are placed at random in seven boxes that are labeled B1 , . . . , B7 .
Find the probability that boxes with labels B1 , B2 and B3 all together contain six
balls.

7. Let T be a discrete type random variable with support ST = {0, 1, 2, . . .}. Show that
T has the lack of memory property if, and only if, T ∼ Ge(p), for some p ∈ (0, 1).

8. A person repeatedly rolls a fair dice independently until an upper face with two or
three dots is observed twice. Find the probability that the person would require
eights rolls to achieve this.

9. (a) Consider a sequence of independent Bernoulli trials with probability of success


in each trial being p ∈ (0, 1). Let Z denote the number of trials required to get the
r-th success, where r is a given positive integer. Let X = Z − r. Find the marginal
probability distributions of X and Z. For r = 1, show that P (Z > m + n) = P (Z >
m)P (Z > n), m, n ∈ {0, 1, . . .} (or equivalently P (Z > m + n|Z > m) = P (Z >
n), m, n ∈ {0, 1, . . .}; this property is also known as the lack of memory property).
(b) Two teams (say Team A and Team B) play a series of games until one team
wins 5 games. If the probability of Team A (Team B) winning any game is 0.7 (0.3),
find the probability that the series will end in 8 games.

10. (Relation between binomial and negative binomial probabilities) Let n be


a positive integer, r ∈ {1, 2, . . . , n} and let p ∈ (0, 1). Using probabilistic arguments
and also otherwise show that
n   n−r  
X n k n−k r
X r+k−1
p (1 − p) =p . (1 − p)k ,
k=r
k k=0
k

i.e., for r ∈ {1, 2, . . . , n}, P (Bin(n, p) ≥ r) = P (NB(r, p) ≤ n − r).

11. A mathematician carries at all times two match boxes, one in his left pocket and
one in his right pocket. To begin with each match box contains n matches. Each

2
time the mathematician needs a match he is equally likely to take it from either
pocket. Consider the moment when the mathematician for the first time discovers
that one of the match boxes is empty. Find the probability that at that moment
the other box contains exactly k matches, where k ∈ {0, 1, . . . , n}.

12. Consider a population comprising of N (≥ 2) units out of which a (∈ {1, 2, . . . , N −


1}) are labeled as S (success) and N − a are labeled as F (failure). A sample of
size n (∈ {1, 2, . . . , N − 1}) is drawn from this population, drawing one unit at a
time and without replacing it back into the population (i.e., sampling is without
replacement). Let Ai (i = 1, 2, . . . , n) denote the probability of obtaining success
(S) in the i-th trial. Show that P (Ai ) = Na , i = 1, . . . , n (i.e., probability of
obtaining S remains the same in each trial). Hint: Use induction by first showing
that P (A1 ) = P (A2 ) = Na .

13. (a) (Binomial approximation to hypergeometric distribution) Show that a


hypergeometric distributed can be approximated by a Bin(n, p) distribution pro-
vided N is large (N → ∞), a is large (a → ∞) and a/N → p, as N → ∞;
(b) Out of 120 applicants for a job, only 80 are qualified for the job. Out of 120
applicants, five are selected at random for the interview. Find the probability that
at least two selected applicants will be qualified for the job. Using Problem 13 (a)
find an approximate value of this probability.

14. Urn Ui (i = 1, 2) contains Ni balls out of which ri are red and Ni − ri are black.
A sample of n (1 ≤ n ≤ N1 ) balls is chosen at random (without replacement) from
urn U1 and all the balls in the selected sample are transferred to urn U2 . After the
transfer two balls are drawn at random from the urn U2 . Find the probability that
both the balls drawn from urn U2 are red.

15. Let X ∼ Po(λ), where λ > 0. Find the mode of the distribution of X.

16. (a) (Poisson approximation to negative binomial distribution) Show that


−λ k
limr→∞ r+k−1 pr (1 − pr )k = e k!λ , k = 0, 1, 2, . . ., provided limr→∞ pr = 1 and
 r
k
limr→∞ (r(1 − pr )) = λ > 0.
(b) Consider a person who plays a series of 2500 games independently. If the proba-
bility of person winning any game is 0.002, find the probability that the person will
win at least two games.

17. (a) If X ∼ Po(λ), find E((2 + X)−1 );


(b) If X ∼ Ge(p) and r is a positive integer, find E(min(X, r)).

3
18. A person has to open a lock whose key is lost among a set of N keys. Assume that
out of these N keys only one can open the lock. To open the lock the person tries
keys one by one by choosing, at each attempt, one of the keys at random from the
unattempted keys. The unsuccessful keys are not considered for future attempts.
Let Y denote the number of attempts the person will have to make to open the lock.
Show that Y ∼ U ({1, 2, . . . , N }) and hence find the mean and the variance of the
r.v. Y .

19. Let a > 0 be a real constant. A point X is chosen at random on the interval (0, a)
(i.e., X ∼ U (0, a)).
(a) If Y denotes the area of equilateral triangle having sides of length X, find the
mean and variance of Y ;
(b) If the point X divides the interval (0, a) into subintervals I1 = (0, X) and
I2 = [X, a), find the probability that the larger of these two subintervals is at least
the double the size of the smaller subinterval.

20. Using a random observation U ∼ U (0, 1), describe a method to generate a random
observation X from the distribution having
(a) probability density function

e−|x|
f (x) = , −∞ < x < ∞;
2

(b) probability mass function


(
n
 x
x
θ (1 − θ)n−x , if x ∈ {0, 1, . . . , n}
g(x) = ,
0, otherwise

where n ∈ N and θ ∈ (0, 1) are real constants.

21. Let X ∼ U (0, θ), where θ is a positive integer. Let Y = X − [X], where [x] is the
largest integer ≤ x. Show that Y ∼ U (0, 1).

22. Let U ∼ U (0, 1) and let X be the root of the equation 3t2 − 2t3 − U = 0. Show that
X has p.d.f. f (x) = 6x(1 − x), if 0 ≤ x ≤ 1, = 0, otherwise.

23. (Relation between gamma and Poisson probabilities) For t > 0, θ > 0 and
n ∈ {1, 2, . . .}, using integration by parts, show that

∞ n−1 − t t j
e θ( )
Z
1 − xθ n−1
X
θ
n
e x dx = ,
θ Γ(n) t j=0
j!

i.e., P (G(n, θ) > t) = P (Po( θt ) ≤ n − 1).

4
24. Let Y be a random variable of continuous type with FY (0) = 0, where FY (·) is the
distribution function of Y . Show that Y has the lack of memory property if, and
only if, Y ∼ Exp(θ), for some θ > 0.

25. (Relation between binomial and beta probabilities) Let n be a positive


integer, k ∈ {0, 1, . . . , n} and let p ∈ (0, 1). If X ∼ Bin(n, p) and Y ∼ Be(k, n − k +
1), show that P ({X ≥ k}) = P ({Y ≤ p}).

26. Let X = (X1 , X2 )0 have the joint p.d.f.

fX (x1 , x2 ) = φ(x1 )φ(x2 )[1+α(2Φ(x1 )−1)(2Φ(x2 )−1)], −∞ < xi < ∞, i = 1, 2, |α| ≤ 1.

(a) Verify that fX (x1 , x2 ) is a p.d.f.;


(b) Find the marginal p.d.f.s of X1 and X2 ;
(c) Is (X1 , X2 ) jointly normal?

27. Let X = (X1 , X2 )0 ∼ N2 (µ1 , µ2 , σ12 , σ22 , ρ) and, for constants a1 , a2 , a3 and a4 (ai 6= 0,
i = 1, 2, 3, 4, a1 a4 6= a2 a3 ), let Y = a1 X1 + a2 X2 and Z = a3 X1 + a4 X2 .
(a) Find the joint p.d.f. of (Y, Z);
(b) Find the marginal p.d.f.s. of Y and Z.

28. (a) Let (X, Y )0 ∼ N2 (5, 8, 16, 9, 0.6). Find P ({5 < Y < 11}|{X = 2}), P ({4 < X <
6}) and P ({7 < Y < 9});
(b) Let (X, Y )0 ∼ N2 (5, 10, 1, 25, ρ), where ρ > 0. If P ({4 < Y < 16}|{X = 5}) =
0.954, determine ρ.

29. Let X and Y be independent and identically distributed N (0, σ 2 ) r.v.s.


(a) Find the joint p.d.f. of (U, V ), where U = aX + bY and V = bX − aY (a 6=
0, b 6= 0);
(b) Show that U and V are independent;
X+Y X−Y
(c) Show that √
2
and √
2
are independent N (0, σ 2 ) random variables.

30. Let X = (X1 , X2 )0 ∼ N2 (0, 0, 1, 1, ρ).


(a) Find the m.g.f. of Y = X1 X2 ;
(b) Using (a), find E(X12 X22 );
(c) Using conditional expectation, find E(X12 X22 ).

5
31. Let X = (X1 , X2 )0 have the joint p.d.f.
(
1 − 12 (x2 +y 2 )
π
e , if xy > 0
f (x, y) = .
0, otherwise

Show that marginals of f (·, ·) are each N (0, 1) but f (·, ·) is not the p.d.f. of a
bivariate normal distribution.

32. Let fr (·, ·), −1 < r < 1, denote the pdf of N2 (0, 0, 1, 1, r) and, for a fixed ρ ∈ (−1, 1),
let the random variable (X, Y ) have the joint p.d.f.

1
gρ (x, y) = [fρ (x, y) + f−ρ (x, y)].
2

Show that X and Y are normally distributed but the distribution of (X, Y ) is not
bivariate normal.

33. Consider the random vector (X, Y ) as defined in Problem 32.


(a) Find Corr(X, Y );
(b) Are X and Y independent?

34. (a) Suppose that X = (X1 , X2 , X3 , X4 ) ∼ Mult(30, θ1 , θ2 , θ3 , θ4 ). Find the con-


ditional probability mass function of (X1 , X2 , X3 , X4 ) given that X5 = 2, where
X5 = 30 − 4i=1 Xi .
P

(b) Consider 7 independent casts of a pair of fair dice. Find the probability that
once we get a sum of 12 and twice we get a sum of 8.
n2
35. (a) Let X ∼ Fn1 ,n2 . Show that Y = n2 +n 1X
∼ Be( n21 , n22 );
(b) Let Z1 and Z2 be i.i.d. N (0, 1) random variables. Show that |ZZ12 | ∼ t1 and
Z1
Z2
∼ t1 (Cauchy distribution);
(c) Let X1 and X2 be i.i.d. Exp(θ) random variables, where θ > 0. Show that
Z=X 1
X2
has a F -distribution;
(d) Let X1 , X2 and X3 be independent random variables with Xi ∼ χ2ni , i = 1, 2, 3.
X1 /n1
Show that X 2 /n2
and (X1 +XX23)/(n
/n3
1 +n2 )
are independent F -variables.

36. Let X1 , . . . , Xn be a random sample from Exp(θ) distribution, where θ > 0. Let
X1:n ≤ · · · ≤ Xn:n denote the order statistics of X1 , . . . , Xn . Define Z1 = nX1:n , Zi =
(n − i + 1)(Xi:n − Xi−1:n ), i = 2, . . . , n. Show that Z1 , . . . , Zn are independent and
identically distributed Exp(θ) random variables. Hence find the mean and variance
of Xr:n , r = 1, . . . , n. Also, for 1 ≤ r < s ≤ n, find Cov(Xr:n , Xs:n ).

6
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-IV
Instructor: Neeraj Misra

1. Let (
1, if x + 2y ≥ 1
F (x, y) = .
0, if x + 2y < 1
Does F (·, ·) define a d.f.?

2. Let (
0, if x < 0 or x + y < 1 or y < 0
F (x, y) = .
1, otherwise
Does F (·) define a d.f.?

3. Let X = (X1 , X2 ) be a bivariate random vector having the d.f.




 0, if x < 0 or y < 0
 1+xy
, if 0 ≤ x < 1, 0 ≤ y < 1


2


1+x
F (x, y) = 2
, if 0 ≤ x < 1, y ≥ 1 .
 1+y
, if x ≥ 1, 0 ≤ y < 1


2



1 if x ≥ 1, y ≥ 1

(a) Verify that F is a d.f.; (b) Determine whether X is of discrete or continuous; (c)
Find the marginal distribution functions of X1 and X2 ; (d) Find P ( 21 ≤ X1 ≤ 1, 14 <
X2 < 12 ), P (X1 = 1) and P (X1 ≥ 32 , X2 < 14 ); (e) Are X1 and X2 independent?

4. Let X = (X1 , X2 ) be a bivariate random vector having the d.f.




 0, if x < 0 or y < 1
y 2 −1

, if 0 ≤ x < 1, 1 ≤ y < 2


6


1
F (x, y) = 2
, if 0 ≤ x < 1, y ≥ 2 .
 2
y −1
, if x ≥ 1, 1 ≤ y < 2


3



1 if x ≥ 1, y ≥ 2

(a) Verify that F is a d.f.; (b) Determine whether X is a discrete or continuous


r.v.; (c) Find the marginal distribution functions of X1 and X2 ; (d) Find P ( 12 ≤
X1 ≤ 1, 45 < X2 < 32 ), P (X1 = 1) and P (X1 ≥ 32 , X2 < 54 ); (e) Are X1 and X2
independent?

1
5. Let the r.v. X = (X1 , X2 )0 have the joint p.m.f.
(
c(x1 + 2x2 ), if x1 = 1, 2, x2 = 1, 2
fX (x1 , x2 ) = ,
0, otherwise

where c is a real constant. (a) Find the constant c; (b) Find marginal p.m.f.s of
X1 and X2 ; (c) Find conditional variance of X2 given X1 = x1 , x1 = 1, 2; (d) Find
P (X1 < X32 ), P (X1 = X2 ), P (X1 ≥ X22 ) and P (X1 + X2 ≤ 3); (e) Find ρ(X1 , X2 );
(f) Are X1 and X2 independent?

6. Let the r.v. X = (X1 , X2 )0 have the joint p.m.f.


(
cx1 x2 , if x1 = 1, 2, x2 = 1, 2, x1 ≤ x2
fX (x1 , x2 ) = ,
0, otherwise

where c is a real constant. (a) Find the constant c; (b) Find marginal p.m.f.s of
X1 and X2 ; (c) Find conditional variance of X2 given X1 = 1; (d) Find P (X1 >
X2 ), P (X1 = X2 ), P (X1 < 32 X2 ) and P (X1 + X2 ≥ 3); (e) Find ρ(X1 , X2 ); (f) Are
X1 and X2 independent?

7. Let (X, Y ) be a random vector such that the p.d.f. of X is


(
4x(1 − x2 ), if 0 < x < 1
fX (x) = ,
0, otherwise

and, for fixed x ∈ (0, 1), the conditional p.d.f. of Y given X = x is


(
c(x)y, if x < y < 1
fY |X (y|x) = ,
0, otherwise

where c : (0, 1) → R is a given function. (a) Determine c(x), 0 < x < 1; (b) Find
marginal p.d.f. of Y ; (c) Find the conditional variance of X given Y = y, y ∈ (0, 1);
(d) Find P (X < Y2 ), P (X + Y ≥ 43 ) and P (X = 2Y ); (e) Find ρ(X, Y ); (f) Are X
and Y independent?

8. Let X = (X1 , X2 , X3 ) be a random vector with joint p.d.f.


(
c
x1 x2
, if 0 < x3 < x2 < x1 < 1
fX (x) = ,
0, otherwise

where c is a real constant. (a) Find the value of constant c; (b) Find marginal p.d.f.
of X2 ; (c) Find the conditional variance of X2 given (X1 , X3 ) = (x, y), 0 < y <

2
x < 1; (d) Find P (X2 < X21 ) and P (X3 = 2X2 > X1
2
); (e) Find ρ(X1 , X2 ); (f) Are
X1 , X2 , X3 independent?.

9. Let X = (X1 , X2 , X3 )0 be a random vector with p.m.f.


(
1
4
,if (x1 , x2 , x3 ) ∈ A
fX1 ,X2 ,X3 (x1 , x2 , x3 ) = ,
0, otherwise

where A = {(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)}. (a) Are X1 , X2 , X3 independent?;
(b) Are X1 , X2 , X3 pairwise independent?; (c) Are X1 + X2 and X3 independent?

10. Let X = (X1 , X2 , X3 )0 be a random vector with joint p.d.f.

1 1 2 2 2
 1 2 2 2

fX (x1 , x2 , x3 ) = 3 e− 2 (x1 +x2 +x3 ) 1 + x1 x2 x3 e− 2 (x1 +x2 +x3 ) , −∞ < xi < ∞,
(2π) 2

i = 1, 2, 3. (a) Are X1 , X2 , X3 independent?; (b) Are X1 , X2 , X3 pairwise indepen-


dent?; (c) Find the marginal p.d.f.s of (X1 , X2 ), (X1 , X3 ) and (X2 , X3 ).

11. Let (X, Y, Z) have the joint p.m.f. as follows:

(x, y, z) (1, 1, 0) (1, 2, 1) (1, 3, 0) (2, 1, 1) (2, 2, 0) (2, 3, 1)


2 4 3 1 1 4
fX,Y,Z (x, y, z) 15 15 15 15 15 15

and fX,Y (x, y) = 0, elsewhere. (a) Are X + Y and Z independent?; (b) Find ρ =
Corr(X+Y,Z).

12. Let X = (X1 , X2 , X3 )0 be a random vector with p.d.f.


(
2e−(x2 +2x3 ) , if 0 < x1 < 1, x2 > 0, x3 > 0
fX1 ,X2 ,X3 (x1 , x2 , x3 ) = .
0, otherwise

(a) Are X1 , X2 , X3 independent?; (b) Are X1 + X2 and X3 independent?; (c) Find


marginal p.d.f.s of X1 , X2 and X3 ; (d) Find conditional p.d.f. of X1 given X2 = 2.

13. Let X1 , . . . , Xn be n r.v.s with E(Xi ) = µi , Var(Xi ) = σi2 and ρij = Corr(Xi , Xj ),
i, j = 1, . . . , n, i 6= j. For real numbers ai , bi , i = 1, . . . , n, define Y = ni=1 ai Xi
P

and Z = ni=1 bi Xi . Find Cov(Y, Z).


P

14. Let X and Y be jointly distributed random variables with E(X) = E(Y ) =
0, E(X 2 ) = E(Y 2 ) = 2 and Corr(X, Y ) = 1/3. Find Corr( X3 + 2Y3 , 2X
3
+ Y3 ).

15. Let X1 , . . . , Xn be random variables and let p1 , . . . , pn be positive real numbers


Pn p Pn Pn p
with p i = 1. Prove that: (a) Var( p i X i ) ≤ p i Var(Xi ) ≤
pPn i=1 Pn i=1 i=1
i=1 Xi 1
Pn
i=1 pi Var(Xi ); (b) Var( n
) ≤ n i=1 Var(Xi ).

3
Pn Pn
16. Let (xi , yi ) ∈ R2 , i = 1, . . . , n be such that i=1 xi = i=1 yi = 0. Using a statistical
argument show that

n
!2 n
! n
!
X X X
xi y i ≤ x2i yi2 .
i=1 i=1 i=1

17. Let (X, Y ) have the joint p.m.f. as follows:

(x, y) (1, 1) (1, 2) (1, 3) (2, 1) (2, 2) (2, 3)


2 4 3 1 1 4
fX,Y (x, y) 15 15 15 15 15 15

and fX,Y (x, y) = 0, elsewhere. Find ρ = Corr(X,Y).


t2
1
e 1−2t2
18. Let the joint m.g.f. of (Y, Z) be MY,Z (t1 , t2 ) = , t2 < 12 . (a) Find Corr(Y, Z);
1−2t2
(b) Are Y and Z independent?; (c) Find m.g.f. of Y + Z.
t2 2
1 +t2 +t1 t2
19. Let the joint m.g.f. of (Y, Z) be MY,Z (t1 , t2 ) = e 2 , (t1 , t2 ) ∈ R2 . (a) Find
Corr(Y, Z); (b) Are Y and Z independent?; (c) Find m.g.f. of Y − Z.

20. Let X = (X1 , X2 ) have the joint p.m.f.


(
( 23 )x1 +x2 ( 13 )2−x1 −x2 , if (x1 , x2 ) = (0, 0), (0, 1), (1, 0), (1, 1)
fX (x1 , x2 ) = .
0, otherwise

(a) Find the joint p.m.f. of Y1 = X1 − X2 and Y2 = X1 + X2 ; (b) Find the


marginal p.m.f.s of Y1 and Y2 ; (c) Find Var(Y2 ) and Cov(Y1 , Y2 ); (d) Are Y1 and Y2
independent?

21. Let X1 , . . . , Xn be a random sample of continuous random variables and let X1:n <
X2:n · · · < Xn:n be the corresponding order statistics. If the expectation of X1 is
finite and the distribution of X1 is symmetric about µ ∈ (−∞, ∞), show that:
d
(a) Xr:n − µ = µ − Xn−r+1:n , r = 1, . . . , n; (b) E(Xr:n + Xn−r+1:n ) = 2µ; (c)
E(X n+1 :n ) = µ, if n is odd; (d) P (X n+1 :n > µ) = 0.5, if n is odd.
2 2

22. (a) Let X1 , . . . , Xn denote a random sample, where P (X1 > 0) = 1. Show that
 
X1 + X2 + · · · + Xk k
E = , k = 1, 2, . . . , n.
X1 + X 2 + · · · + X n n

(b) Let X1 , . . . , Xn be a random sample and let E(X1 ) be finite. Find the conditional
expectation E(X1 |X1 + · · · + Xn = t), where t ∈ R is such that the conditional
expectation is defined.

4
(c) Let X1 , . . . , Xn be a random sample of random variables. Find P (X1 < X2 <
· · · < Xr ), r = 2, 3, . . . , n.

23. Let X1 and X2 be independent and identically distributed random variables with
common p.m.f. (
θ(1 − θ)x−1 , if x = 1, 2, . . .
f (x) = ,
0, otherwise
where θ ∈ (0, 1). Let Y1 = min{X1 , X2 } and Y2 = max{X1 , X2 } − min{X1 , X2 }. (a)
Find the marginal p.m.f. of Y1 without finding the joint p.m.f. of Y = (Y1 , Y2 ); (b)
Find the marginal p.m.f. of Y2 without finding the joint p.m.f. of Y = (Y1 , Y2 ); (c)
Find the joint p.m.f. of Y = (Y1 , Y2 ); (d) Are Y1 and Y2 independent; (e) Using (c),
find the marginal p.m.f.s of Y1 and Y2 .

24. Let X = (X1 , X2 , X3 )0 be a random vector with p.m.f.



2
 9 , if (x1 , x2 , x3 ) = (1, 1, 0), (1, 0, 1), (0, 1, 1)

1
fX1 ,X2 ,X3 (x1 , x2 , x3 ) = 3
, if (x1 , x2 , x3 ) = (1, 1, 1) .

0, otherwise

Define Y1 = X1 + X2 and Y2 = X2 + X3 . (a) Find the marginal p.m.f. of Y1 without


finding the joint p.m.f. of Y = (Y1 , Y2 ); (b) Find the marginal p.m.f. of Y2 without
finding the joint p.m.f. of Y = (Y1 , Y2 ); (c) Find the joint p.m.f. of Y = (Y1 , Y2 );
(d) Are Y1 and Y2 independent; (e) Using (c), find the marginal p.m.f.s of Y1 and
Y2 .

25. Let X1 and X2 be independent random variables with p.d.f.s


( (
1, if 0 < x < 1 1, if 1 < x < 2
f1 (x) = and f2 (x) = ,
0, otherwise 0, otherwise

respectively. Let Y = X1 + X2 and Z = X1 − X2 . (a) Find the d.f. of Y and hence


find its p.d.f.; (b) Find the joint p.d.f. of (Y, Z) and hence find the marginal p.d.f.s
of Y and Z; (c) Are Y and Z independent?

26. Let X1 and X2 be i.i.d. random variables with common p.d.f.


(
1
2
,if − 1 < x < 1
f (x) = .
0, otherwise

Let Y = |X1 | + X2 and Z = X2 . (a) Find the d.f. of Y and hence find its p.d.f.; (b)
Find the joint p.d.f. of (Y, Z) and hence find the marginal p.d.f.s of Y and Z; (c)
Are Y and Z independent?

5
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-III
Instructor: Neeraj Misra

1. Let X be a random variable with p.m.f.


(
1 2 x
( ) ,
3 3
if x ∈ {0, 1, 2, . . .}
fX (x) = .
0, otherwise

(a) Find the distribution function of Y = X/(X + 1) and hence determine the
p.m.f. of Y ;
(b) Find the p.m.f. of Y = X/(X + 1) and hence determine the distribution
function of Y ;
(c) Find the mean and the variance of X.

2. Let the random variable X have the p.d.f.



1
 2 , if − 2 < x < −1

1
fX (x) = 6
, if 0 < x < 3 .

0, otherwise

(a) Find the distribution function of Y = X 2 and hence determine the p.d.f. of Y ;
(b) Find the probability density function of Y = X 2 and hence determine the
distribution function of Y ;
(c) Find the mean and the variance of X.

3. (a) Give an example of a discrete random variable X for which E(X) is finite but
E(X 2 ) is not finite;
(b) Give an example of a continuous random variable X for which E(X) is finite
but E(X 2 ) is not finite.

4. Let X be a random variable with


1 2
P (X = −2) = 21 , P (X = −1) = 21 , P (X = 0) = 17 ,
4 5
P (X = 1) = 21 , P (X = 2) = 21 , P (X = 3) = 72 .

Find the p.m.f. and distribution function of Y = X 2 .

1
5. Let X be a random variable with p.d.f.
(
1, if 0 < x < 1
fX (x) = .
0, otherwise

Find the p.d.f.s of the following random variables: (a) Y1 = X; (b) Y2 = X 2 ; (c)
Y3 = 2X + 3; (d) Y4 = − ln X.

6. Let the random variable X have the p.d.f.


(
6x(1 − x), if 0 < x < 1
fX (x) = ,
0, otherwise

and let Y = X 2 (3 − 2X).

(a) Find the distribution function of Y and hence find its p.d.f.;
(b) Find the p.d.f. of Y directly (i.e., without finding the distribution function);
(c) Find the mean and the variance of Y .

7. (a) From a box containing N identical tickets, numbered, 1, 2, . . . , N , n (≤ N )


tickets are drawn at random with replacement. Let X = largest number drawn.
Find E(X).
(b) Find the expected number of throws of a fair die required to obtain a 6.
√ √
8. Consider a target comprising of three concentric circles of radii 1/ 3, 1 and 3 feet.
Shots within the inner circle earn 4 points, within the next ring 3 points and within
the outer ring 2 points. Shots outside the target do not earn any point. Let X
denote the distance (in feet) of the hit from the centre and suppose that X has the
p.d.f. (
2
π(1+x2 )
, if x > 0
fX (x) = .
0, otherwise
Find the expected score in a single shot.

9. (a) Let X be a random variable with p.d.f.


(
1, if 0 < x < 1
fX (x) = ,
0, otherwise

and let Y = min(X, 1/2). Examine whether or not Y is a discrete or a contin-


uous random variable. (Note: Function of a continuous random variable may
neither be discrete nor continuous).

2
(b) Let the random variable X have the p.d.f.

1 x2
fX (x) = √ e− 2 , −∞ < x < ∞,

and let 
 −1, if X < 0

1
Y = 2
, if X = 0 .

1, if X > 0

Examine whether Y is discrete or continuous random variable. (Note: Func-


tion of a continuous random variable may be a discrete random variable.)

10. (a) Let E(|X|β ) < ∞, for some β > 0. Then show that E(|X|α ) < ∞, ∀ α ∈ (0, β];
(b) Let X be a random variable with finite expectation. Show that limx→−∞ xFX (x) =
limx→∞ [x(1 − FX (x))] = 0, where FX is the distribution function of X;
(c) Let X be a random variable with limx→∞ [xα P (|X| > x)] = 0, for some α > 0.
Show that E(|X|β ) < ∞, ∀ β ∈ (0, α). What about E(|X|α )?

11. (a) Find the moments of the random variable that has the m.g.f. M (t) = (1 − t)−3 ,
t < 1;
(b) Let the random variable X have the m.g.f.

e−t et e2t e3t


M (t) = + + + , t ∈ R.
8 4 8 2

Find the distribution function of X and find P (X 2 = 1).

(c) If the m.g.f. of a random variable X is

et − e−2t
M (t) = , for t 6= 0,
3t

find the p.d.f. of Y = X 2 .

12. Let p ∈ (0, 1) and let Xp be a random variable with p.m.f.


(
n

x
px q n−x , if x ∈ {0, 1, . . . , n}
fXp (x) = ,
0, otherwise

where n is a given positive integer and q = 1 − p.

(a) Find the m.g.f. of Xp and hence find the mean and variance of Xp , p ∈ (0, 1);

3
(b) Let Yp = n − Xp , p ∈ (0, 1). Using the m.g.f. of Xp show that the p.m.f. of Yp
is ( 
n y
y
q (1 − q)n−y , if y ∈ {0, 1, . . . , n}
fYp (y) = .
0, otherwise

13. (a) For any random variable X having the mean µ and finite second moment, show
that E((X − µ)2 ) ≤ E((X − c)2 ), ∀c ∈ R;
(b) Let X be a continuous random variable with distribution function FX that is
strictly increasing on its support. Let m be the median of (distribution of) X.
Show that E(|X − m|) ≤ E(|X − c|), ∀ c ∈ (−∞, ∞).

14. (a) Let X be a non-negative continuous random variable (i.e., P (X ≥ 0) = 1) and


Rx
let h be a real-valued function defined on (0, ∞). Define ψ(x) = 0 h(t)dt,
x ≥ 0, and suppose that h(x) ≥ 0, ∀ x ≥ 0. Show that
Z ∞
E(ψ(X)) = h(y)P (X > y)dy;
0

(b) Let α be a positive real number. Under the assumptions of (a), show that
Z ∞
α
E(X ) = α xα−1 P (X > x)dx;
0

(c) Let F (0) = G(0) = 0 and let F (t) ≥ G(t), ∀ t > 0, where F and G are
distribution functions of continuous random variables X and Y , respectively.
Show that E(X k ) ≤ E(Y k ), ∀ k > 0, provided the expectations exist.

15. (a) Let X be a random variable such that P (X ≤ 0) = 0 and let µ = E(X) be
finite. Show that P (X ≥ 2µ) ≤ 0.5;
(b) If X is a random variable such that E(X) = 3 and E(X 2 ) = 13, then determine
a lower bound for P (−2 < X < 8).

16. (a) An enquiry office receives, on an average, 25, 000 telephone calls a day. What
can you say about the probability that this office will receive at least 30, 000
telephone calls tomorrow?
(b) An enquiry office receives, on an average, 20, 000 telephone calls per day with
a variance of 2, 500 calls. What can be said about the probability that this
office will receive between 19, 900 and 20, 100 telephone calls tomorrow? What
can you say about the probability that this office will receive more than 20, 200
telephone calls tomorrow?

17. Let X be a random variable with m.g.f. M (t), −h < t < h.

4
(a) Prove that P (X ≥ a) ≤ e−at M (t), 0 < t < h;
(b) Prove that P (X ≤ a) ≤ e−at M (t), −h < t < 0;
−1 3 −1
(c) Suppose that M (t) = 41 1 − 3t + 4 1 − 2t , t < 13 . Find P (X > 1).

18. Let µ ∈ R and σ > 0 be real constants and let Xµ,σ be a random variable having
p.d.f.
1 (x−µ)2
fXµ,σ (x) = √ e− 2σ2 , −∞ < x < ∞.
σ 2π
(a) Show that fXµ,σ is a p.d.f.;
(b) Show that the probability distribution function of Xµ,σ is symmetric about µ.
Hence find E(Xµ,σ );
(c) Find the m.g.f. of Xµ,σ and hence find the mean and variance of Xµ,σ ;
(d) Let Yµ,σ = aXµ,σ + b, where a 6= 0 and b are real constants. Using the m.g.f.
of Xµ,σ , show that the p.d.f. of Yµ,σ is

1 (y−(aµ+b))2
fYµ,σ (y) = √ e− 2a2 σ2 , −∞ < y < ∞.
|a|σ 2π

19. Let X be a random variable with p.d.f.



 1
π
√ 1
, if 0 < x < 1,
x(1−x)
f (x) = .
 0, otherwise

Show that the distribution of X is symmetric about a point µ. Find this point µ.
Also find E(X) and P (X > µ).

20. Let X be a random variable with p.d.f.

1 x2
f (x) = √ e− 2 , −∞ < x < ∞.

d
Show that X = −X. Hence find E(X 3 ) and, P (X > 0).

21. (a) Let X be a random variable with E(X) = 1. Show that E(e−X ) ≥ 31 ;
(b) For pairs of positive real numbers (ai , bi ), i = 1, . . . , n and r ≥ 1, show that

n
! n
!r−1 n
!r
X X X
ari bi bi ≥ ai b i .
i=1 i=1 i=1

5
Hence show that, for any positive real number m,

n
! n
! n
!2
X X X
ai2m+1 ai ≥ am+1
i .
i=1 i=1 i=1

22. Let X be a random variable such that P (X > 0) = 1. Show that:

(a) E(X 2m+1 ) ≥ (E(X))2m+1 , m ∈ {1, 2, . . .};


(b) E(XeX ) + eE(X) ≥ E(X)eE(X) + E(eX ),

provided the involved expectations are finite.

6
MSO 201a: Probability and Statistics
2019-20-II Semester
Assignment-II
Instructor: Neeraj Misra

1. Let D be the set of discontinuity points of a distribution function F . For each


n ∈ {1, 2, . . .}, define Dn = {x ∈ R : F (x) − F (x−) ≥ n1 }. Show that each
Dn (n = 1, 2, . . .) is finite. Hence show that a distribution function can not have
uncountable number of discontinuities.

2. Do the following functions define distribution functions?



 0, if x < 0
 (
1
0, if x < 0
(i) F1 (x) = x, if 0 ≤ x ≤ 2
; (ii) F2 (x) = −x
;
 1 − e , if x ≥ 0
1, if x > 12

1 tan−1 (x)
and (iii) F3 (x) = 2
+ π
, −∞ < x < ∞.

3. Let X be a random variable with distribution function




 0, if x < 0

2
, if 0 ≤ x < 1


3


7−6c
F (x) = 6
, if 1 ≤ x < 2 ,
2

4c −9c+6
, if 2 ≤ x ≤ 3


4



1, if x > 3

where c is a real constant.


(i) Find the value of constant c;
(ii) Using the distribution function F , find P (1 < X < 2); P (2 ≤ X < 3); P (0 <
X ≤ 1); P (1 ≤ X ≤ 2); P (X ≥ 3); P (X = 25 ) and P (X = 2);
(iii) Find the conditional probabilities P (X = 1|1 ≤ X ≤ 2) and P (1 ≤ X < 2|X >
1).
(iv) Show that X is a discrete r.v.. Find the support and the p.m.f. of X.

4. Let X be a random variable with distribution function F (·). In each of the following
cases determine whether X is a discrete r.v. or a continuous r.v.. Also find the

1
p.d.f./p.m.f. of X:


 0, if x < −2

1
 3 , if −2≤x<0

 (

1
0, if x < 0
(i) F (x) = 2
, if 0≤x<5 ; (ii) F (x) = −x
.

3
1 − e , if x ≥ 0
, if 5≤x<6


4



1, if x≥6

5. Let the random variable X have the distribution function





 0, if x < 0
 x , if 0 ≤ x < 1

3
F (x) = 2
.


 3
, if 1 ≤ x < 2

 1, if x ≥ 2

(i) Show that X is neither a discrete r.v. nor a continuous r.v.;


(ii) Evaluate P (X = 1), P (X = 2), P (X = 1.5) and P (1 < X < 2);
(iii) Evaluate the conditional probability P (1 ≤ X < 2|1 ≤ X ≤ 2).

6. A random variable X has the distribution function





 0, if x < 2
2




 3
, if 2 ≤ x < 5
7−6k


6
, if 5 ≤ x < 9
F (x) = 3k 2 −6k+7 ,


 6
, if 9 ≤ x < 14
16k2 −16k+19




 16
, if 14 ≤ x ≤ 20

 1, if x > 20

where k ∈ R.
(i) Find the value of constant k;
(ii) Show that X is a discrete r.v. and find its support;
(iii) Find the p.m.f. of X.

7. A discrete random variable X has the p.m.f.


(
c
(2x−1)(2x+1)
, if x ∈ {1, 2, 3, . . .}
f (x) = ,
0, otherwise

where c ∈ R.
(i) Find the value of constant c;
(ii) For positive integers m and n such that m < n, using the p.m.f. evaluate

2
P (X < m + 1), P (X ≥ m), P (m ≤ X < n) and P (m < X ≤ n);
(iii) Find the conditional probabilities P (X > 1|1 ≤ X < 4) and P (1 < X < 6|X ≥
3).
(iv) Determine the distribution function of X.

8. Let X be a random variable with distribution function





 0, if x < 0
 x2 , if 0 ≤ x < 1

2
F (x) = x
.


 2
, if 1 ≤ x < 2

 1, if x ≥ 2

(i) Show that X is a continuous r.v.;


(ii) Using the distribution function, evaluate P (X = 1); P (X = 2); P (1 < X <
2); P (1 ≤ X < 2); P (1 < X ≤ 2); P (1 ≤ X ≤ 2) and P (X ≥ 1);
(iii) Find the p.d.f. of X;
(iv) Find the lower quartile, the median and the upper quartile of F .

9. Let X be an absolutely continuous type random variable with p.d.f.


(
k − |x|, if |x| < 21
f (x) = ,
0, otherwise

where k ∈ R.
(i) Find the value of constant k;
(ii) Using the p.d.f., evaluate P (X < 0), P (X ≤ 0), P (0 < X ≤ 14 ), P (0 ≤ X < 41 )
and P (− 18 ≤ X ≤ 14 );
(iii) Find the conditional probabilities P (X > 14 ||X| > 25 ) and P ( 18 < X < 52 | 10
1
<
1
X < 5 );
(iv) Find the distribution function F of X;
(v) Find the lower quartile, the median and the upper quartile of F .

3
Statistical Inference
Test Set 1

1. Let X ~ Ρ(λ ). Find unbiased estimators of (i ) λ 3 , (ii ) e − λ cos λ , (iii) sin λ . (iv) Show that
there does not exist unbiased estimators of 1/ λ , and exp{−1/ λ}.
2. Let X 1 , X 2 ,..., X n be a random sample from a N ( µ , σ 2 ) population. Find unbiased and
µ
consistent estimators of the signal to noise ration and quantile µ + bσ , where b is
σ
any given real.
3. Let X 1 , X 2 ,..., X n be a random sample from a U (−θ , 2θ ) population. Find an unbiased and
consistent estimator of θ .
4. Let X 1 , X 2 be a random sample from an exponential population with mean 1/ λ . Let
X1 + X 2
= T1 = , T2 X 1 X 2 . Show that T1 is unbiased and T2 is biased. Further, prove that
2
MSE (T2 ) ≤ Var (T1 ) .
5. Let T1 and T2 be unbiased estimators of θ with respective variances σ 12 and σ 22 and
cov(T1 , T2 ) = σ 12 (assumed to be known). Consider T = α T1 + (1 − α )T2 , 0 ≤ α ≤ 1. Show
that T is unbiased and find value of α for which Var (T ) is minimized.
6. Let X 1 , X 2 ,..., X n be a random sample from an Exp ( µ , σ ) population. Find the method of
moment estimators (MMEs) of µ and σ .
7. Let X 1 , X 2 ,..., X n be a random sample from a Pareto population with density
βα β
f X ( x=
) , x > α , α > 0, β > 2. Find the method of moments estimators of α , β .
x β +1
8. Let X 1 , X 2 ,..., X n be a random sample from a U (−θ , θ ) population. Find the MME of θ .
9. Let X 1 , X 2 ,..., X n be a random sample from a lognormal population with density
1  1 
f X=
( x) exp − 2 (log e x − µ ) 2  , x > 0. Find the MMEs of µ and σ 2 .
σ x 2π  2σ 
10. Let X 1 , X 2 ,..., X n be a random sample from a double exponential ( µ , σ ) population. Find
the MMEs of µ and σ .
Hints and Solutions

1. (i) E { X ( X − 1)( X − 2)} = λ3


(ii) For this we solve estimating equation. Let T ( X ) be unbiased for e − λ cos λ .
Then
ET ( X ) = e − λ cos λ for all λ >0 .

e−λ λ x
⇒ ∑ T ( x) e − λ cos λ for all λ >0
=
x =0 x!

λx λ2 λ4
⇒ ∑ T ( x) =−
1 −  for all λ >0
+
x =0 x! 2! 4!
As the two power series are identical on an open interval, equating coefficients of
powers of λ on both sides gives

T ( x=) 0, if= x 2m + 1,
= 1,= if x 4m,
= −1, if x =4m + 2, m =0,1, 2,
(iii) For this we have to solve estimating equation. However, we use Euler’s identity
to solve it.

Let U ( X ) be unbiased for sin λ . Then


∞ λ x 1 λ iλ −iλ
⇒ ∑ U ( x) =e (e − e ) for all λ > 0
x=0 x ! 2i
1 (1 + i )λ (1 − i )λ
= (e −e ) for all λ > 0
2i
1  ∞ λ k (1 + i )k ∞ λ k (1 − i )k 
=  ∑ − ∑  for all λ > 0.
= 2i  k 0= k! k! 
 k 0 
Applying De-Moivre’s Theorem on the two terms inside the parentheses, we get

∞ λ x 1 ∞ λk  k  π π
k
k   π  π  
k

=∑ U ( x) ∑ ( 2)  cos 4 + i sin 4  − ( 2)  cos  − 4  + i sin  − 4   


x ! 2i k =0 k ! 
x=0 
1 ∞ λk  k  kπ kπ  k   kπ   kπ   
= ∑ ( 2)  cos
2i k =0 k !   4
+ i sin
4
 − ( 2)  cos  −
   4
 + i sin  −

 
 4  

( 2) k λ k  kπ 
=∑ sin   for all λ > 0
k =0 k!  4 
Equating the coefficients of powers of λ on both sides gives

πx 
=U ( x) (= 2) x sin   , x 0,1, 2,
 4 
In Parts (iv) and (v) , we can show in a similar way that estimating equations do not
have any solutions.
1 n 1 n
=

2. = =
Let X
X
n i 1=
i , and S 2

n −1 i 1
( X i − X )2 .

 σ2  (n − 1) S 2
Then X ~ N  µ ,  , and W= ~ χ n2−1 . It can be seen that
 n  σ 2

n n−2
2
E (W 1/2 ) = 2 and E (W −1/2 ) = 2 . Using these, we get unbiased estimators
n −1 n −1
2
2 2
n −1 n −1
1 n −1 2 2 2 1 respectively. As
of = σ and as T1 = S and T2
σ 2 n n −1 n − 2 S
2 2
µ
X and S 2 are independently distributed, U1 = X T2 is unbiased for . Further,
σ
U= 2 X + bT1 is unbiased for µ + bσ . As X and S 2 are consistent for µ and σ 2
µ
respectively, U1 and U 2 are also consistent for and µ + bσ respectively.
σ

3θ 2X
3. =
As µ1′ = ,T is unbiased for θ . T is also consistent for θ .
2 3

1
4. As E ( X i ) = , T1 is unbiased. Also X 1 and X 2 are independent. So
λ
2
1 π  π
( ) ( ) 1
2
=
E (T2 ) E =
X1 X 2 E=
( X1 ) =  . Var (T1 ) = 2 .
2 λ  4λ 2λ
2

(=
MS ET2)


1
E  X1 X 2 − =
2
 E ( X1 X 2 ) − E
λ λ
( )
X1 X 2 +
1
λ2
2  π
= 2 1 − 
λ  4

σ 22 − σ 12
5. The minimizing choice of α is obtained as .
σ 12 + σ 22 − 2σ 12

1  x−µ 
f ( x)= exp  −  , x > µ , σ > 0. µ1′ =µ + σ , µ2′ =(µ +σ ) +σ 2 .
2
6.
σ  σ 
So µ =µ1′ − µ2′ − µ ′2 , σ = µ2′ − µ ′2 . The method of moments estimators for
µ and σ are therefore given by
1 n 1 n
=n i 1=
µˆ MM =
ni1
X− ∑ ( X i − X )2 , an σˆ MM =
d ∑ ( X i − X )2 .
βα βα 2 µ1′ µ2′ µ2′
7.=µ1′ = , µ′ . So α = , β = 1+
β −1 2 β − 2 µ2′ − µ 1′2 µ2′ − µ1′2
The method of moments estimators for α and β are therefore given by
n n
X ∑X i
2
∑X i
2

αˆ MM = i =1
, βˆMM = 1 + n
i =1
.
∑(X
n n

∑(X ∑X − X) 2
i − X) + 2
i
2
i
=i 1 =i 1 i =1

θ2 3 n 2
8. Since µ1′ = 0 , we consider µ2′ =
3
. So θˆMM = ∑ Xi
n i =1
 µ ′2   µ′ 
9.=µ1′ e=
µ +σ 2 /2
, µ2′ e=
2 µ + 2σ 2
. So µ log
=  1 , σ
2
log  22  and the method of
 µ′   µ1′ 
 2 

moments estimators for µ and σ 2 are therefore given by

  1 n 2

X2
  n ∑ Xi 
=µˆ MM log
=   , σˆ 2
log  i =1 2  .
 1 n  MM  X 
 ∑ X i2  



 n i =1 
1  x−µ 
10. f (=
x)

exp  −
σ  , x ∈ , µ ∈ , σ > 0. µ=
′ µ , µ=
1
′ µ 2 + 2σ 2 .
2
 
1
=
So µ µ= ′
1, σ ( µ2′ − µ ′2 ) . The method of moments estimators for µ and σ are
2
therefore given by
1 n
=µˆ MM X=
, and σˆ MM ∑ ( X i − X )2 .
2n i =1
Statistical Inference
Test Set 2

1. Let X 1 , X 2 ,..., X n be a random sample from a U (−θ , 2θ ) population. Find MLE of θ .


2. Let X 1 , X 2 ,..., X n be a random sample from a Pareto population with density
βα β
f X ( x=
) , x > α , α > 0, β > 2. Find the MLEs of α , β .
x β +1
3. Let X 1 , X 2 ,..., X n be a random sample from a U (−θ , θ ) population. Find the MLE of θ .
4. Let X 1 , X 2 ,..., X n be a random sample from a lognormal population with density
1  1 
f X=
( x) exp − 2 (log e x − µ ) 2  , x > 0. Find the MLEs of µ and σ 2 .
σ x 2π  2σ 
5. Let ( X 1 , Y1 ), ( X 2 , Y2 ),..., ( X n , Yn ) be a random sample from a bivariate normal population
with parameters µ1 , µ2 , σ 12 , σ 22 , ρ . Find the MLEs of parameters.
6. Let X 1 , X 2 ,..., X n be a random sample from an inverse Gaussian distribution with density
 λ   λ ( x − µ )2 
1/2

f X ( x) =  3 
exp −  , x > 0 . Find the MLEs of parameters.
 2π x   2µ 2 x 
k
7. Let ( X 1 , X 2 ,..., X k ) have a multinomial distribution with parameters n = ∑ X i ,
i =1
k
p1 ,  , pk ; 0 ≤ p1 ,  , pk ≤ 1, ∑ p j =
1, where n is known. Find the MLEs of p1 ,  , pk .
j =1

8. Let one observation be taken on a discrete random variable X with pmf p ( x | θ ) , given
below, where Θ ={1, 2,3} Find the MLE of θ .

θ
1 2 3
1 1/2 1/4 1/4
x 2 3/5 1/5 1/5
3 1/3 1/2 1/6
4 1/6 1/6 2/3

9. Let X 1 , X 2 ,..., X n be a random sample from the truncated double exponential distribution
with the density
e −| x|
= f X ( x) , | x | < θ , θ > 0.
2(1 − e −θ )
Find the MLE of θ .

10. Let X 1 , X 2 ,..., X n be a random sample from the Weibull distribution with the density
β
f X ( x) α β x β −1e −α x , x > 0, α > 0, β > 0.
=
Find MLE of α when β is known.
Hints and Solutions

1
1. The likelihood function is L(θ=
, x) , − θ < x(1) ≤ x(2) ≤  ≤ x( n ) < 2θ , θ > 0. Clearly
(3θ ) n
it is maximized with respect to θ , when θ takes its infimum. Hence,
 X 
θˆML max  − X (1) , ( n )  .
=
 2 
β nα nβ
(α , β , x )
2. The likelihood function is L= n
, x(1) > α , α > 0, β > 2. L is maximized
(∏ xi ) β +1

i =1

with respect to α when α takes its maximum. Hence αˆ ML = X (1) . Using this we can
β n {x(1) }nβ
as L′( β , x )
rewrite the likelihood function= n
, β > 2. The log likelihood is
(∏ xi ) β +1

i =1

 n 
′ x ) n log β + nβ log x(1) − ( β + 1) log  ∏ xi  . This can be easily maximized
log L ( β , =
 i =1 
−1
1 X 
with respect to β and we get βˆML =  ∑ log (i )  .
 n X (1) 
max ( − X (1) , X ( n ) ) =
3. Arguing as in Sol. 1, we get θˆML = max | X i | .
1≤i ≤ n

4. Directly maximizing the log-likelihood function with respect to µ and σ 2 , we get


1 1
= µˆ ML
n
∑ =
log X i , σˆ ML
2

n
∑ (log X i −µˆ ML ) 2 .
5. The maximum likelihood estimators are given by
1 n 1 n 1 n
µˆ=
1 X ,
=
µ
ˆ =
2 Y , σˆ =
2
1 ∑ i
ni1 =
( X − X ) 2
, σˆ =
2
2 ∑ i
n i 1=
(Y − Y ) 2
, ρ
=
ˆ ∑ ( X i − X )(Yi − Y ) / (σˆ1σˆ 2 ).
ni1

6. The maximum likelihood estimators are given by


−1
 1  n 1 1 
=µ ML X=
ˆ , λML   ∑
ˆ − 
 n  i =1 X i X  
7. The maximum likelihood estimators are given by
X1 Xk
= pˆ1 = , , pˆ k
n n
8.=θˆML 1,= if x 1, 2
= 2,= if x 3
= 3,= if x 4
max ( − X , X
9. θˆ =
ML (1) (n) )=
max | X i |.
1≤i ≤ n

n
10. αˆ ML =
∑ xiβ
Statistical Inference
Test Set 3

1. Let X 1 , X 2 ,..., X n be a random sample from a population with density function


x x2 
f ( x=
) exp −  , x > 0, θ > 0. Find FRC lower bound for the variance of an
θ  2θ 
unbiased estimator of θ . Hence derive a UMVUE for θ .

2. Let X 1 , X 2 ,..., X n be a random sample from a discrete population with mass function
1−θ 1 θ
P( X = −1) = , P( X = 0) = , P( X =1) = , 0 < θ < 1.
2 2 2
Find FRC lower bound for the variance of an unbiased estimator of θ . Show that the
1
variance of the unbiased estimator X + is more than or equal to this bound.
2
3. Let X 1 , X 2 ,..., X n be a random sample from a population with density function
f ( x) =θ (1 + x) − (1+θ ) , x > 0, θ > 0. Find FRC lower bound for the variance of an
unbiased estimator of 1/ θ . Hence derive a UMVUE for 1/ θ .

4. Let X 1 , X 2 ,..., X n be a random sample from a Pareto population with density


βα β
f X ( x=
) , x > α , α > 0, β > 2. Find a sufficient statistics when (i) α is known, (ii)
x β +1
when β is known and (iii) when both α , β are unknown.

5. Let X 1 , X 2 ,..., X n be a random sample from a Gamma ( p, λ ) population. Find a sufficient


statistics when (i) p is known, (ii) when λ is known and (iii) when both p, λ are
unknown.

6. Let X 1 , X 2 ,..., X n be a random sample from a Beta (λ , µ ) population. Find a sufficient


statistics when (i) µ is known, (ii) when λ is known and (iii) when both λ , µ are
unknown.

7. Let X 1 , X 2 ,..., X n be a random sample from a continuous population with density


function
θ
=
f ( x) , x > 0, θ > 0. Find a minimal sufficient statistic.
(1 + x)1+θ

8. Let X 1 , X 2 ,..., X n be a random sample from a double exponential population with the
1 −| x −θ |
density f= ( x) e , x ∈ , θ ∈ . Find a minimal sufficient statistic.
2
9. Let X 1 , X 2 ,..., X n be a random sample from a discrete uniform population with pmf
1
p=
( x) = , x 1,  , θ , where θ is a positive integer. Find a minimal sufficient statistic.
θ
10. Let X 1 , X 2 ,..., X n be a random sample from a geometric population with pmf
(1 − p ) x −1 p , x =
f ( x) = 1, 2,  , 0 < p < 1 . Find a minimal sufficient statistic.

11. Let X have a N (0, σ 2 ) distribution. Show that X is not complete, but X 2 is complete.

12. Let X 1 , X 2 ,..., X n be a random sample from an exponential population with the density
n
x) λ e − λ x , x > 0, λ > 0. Show that Y = ∑ X i is complete.
f (=
i =1

13. Let X 1 , X 2 ,..., X n be a random sample from an exponential population with the density
) e µ − x , x > µ , µ ∈ . Show that Y = X (1) is complete.
f ( x=

14. Let X 1 , X 2 ,..., X n be a random sample from a N (θ , θ 2 ) population. Show that ( X , S 2 )


is minimal sufficient but not complete.
Hints and Solutions
2
x
1. log f ( x | θ ) = log x − log θ − .

2
 ∂ lo f g  X2 1 E ( X 4 ) E ( X 2 ) 1 8θ 2 2θ 1
2
1
E  = E  2 −  = − + 2 = − 3+ 2 = 2.
 ∂θ   2θ θ  4θ θ θ 4θ θ θ θ
4 3 4

θ 2
So FRC lower bound for the variance of an unbiased estimator of θ is .
n
1 θ2
Further T =
2n
∑ X i2 is unbiased for θ and Var (T ) =
n
. Hence T is UMVUE for θ .

1 1 1 + 4θ − 4θ 2
2. E ( X )= θ − . So T= X + is unbiased for θ . Var (T ) = .
2 2 4n
(θ − 1) −1 , x = −1
∂  ∂ log f 
2
 1
=log f ( x | θ ) = 0, x 0 So we get E   =
∂θ θ −1 ,  ∂θ  2θ (1 − θ )
 x = 1
2θ (1 − θ )
The FRC lower bound for the variance of an unbiased estimator of θ is .
n
2θ (1 − θ ) (1 − 2θ ) 2
It can be seen that Var (T ) − = ≥ 0.
n 4n
∂ 1
3. We have log f ( x | θ ) =− log(1 + x) . Further, E{log(1 + X )} = θ −1 , and
∂θ θ
 ∂ log f 
2
1
E{log(1 + X )} = 2
2θ . So E 
−2
 = 2 , and FRC lower bound for the variance of
 ∂θ  θ
θ2 1
an unbiased estimator of θ −1 is
n
.= Now T
n
∑ log(1 + X i ) is unbiased for θ −1 and
θ2
Var (T ) = .
n
4. Using Factorization Theorem, we get sufficient statistics in each case as below
n
 n

(i) ∏ X i ,(ii) X (1) (iii)  X (1) , ∏ X i 
i =1  i =1 

5. Using Factorization Theorem, we get sufficient statistics in each case as below


n n
 n n

(i) ∑ X i ,(ii) ∏ X i (iii)  ∑ X i , ∏ X i 
i =1 i =1  i =1 i =1 

6. Using Factorization Theorem, we get sufficient statistics in each case as below


n n
 n n

(i) ∏ X i ,(ii) ∏ (1 − X i ) (iii)  ∏ X i , ∏ (1 − X i ) 
i =1 i =1 =  i 1 =i 1 
n
7. Using Lehmann-Scheffe Theorem, a minimal sufficient statistic is ∏ (1 + X ) .
i =1
i

8. Using Lehmann-Scheffe Theorem, a minimal sufficient statistic is the order statistics


( X (1) ,  , X ( n ) ) .
9. Using Lehmann-Scheffe Theorem, a minimal sufficient statistic is the largest order
statistics X ( n ) .
n
10. Using Lehmann-Scheffe Theorem, a minimal sufficient statistic is ∑X
i =1
i .

11. Since E ( X ) = 0 for all σ > 0 , but but P( X= 0)= 1 for all σ > 0 . Hence X is not
1
complete. Let W = X 2 . The pdf= e − w/2σ , w > 0 .
2
of W is fW ( w)
σ 2π w

Eσ g (W ) 0 for all σ > 0 ⇒ ∫ g ( w) w−1/2 e − w/2σ dw =
Now= 0 for all σ > 0 . Uniqueness
2

of the Laplace transform implies g ( w) = 0 a.e. Hence X 2 is complete.


n
12. Note that Y = ∑ X i has a Gamma (n, λ ) distribution. Now proceeding as in Problem 11,
i =1
it can be proved that Y is complete.

( y ) n e n ( µ − x ) , x > µ , µ ∈ .
13. Note that the density of Y = X (1) is given by fY=
Eg (Y ) 0 for all µ ∈ 
=

⇒ ∫ n g ( y )e n ( µ − y ) dy =
0 for all µ ∈ 
µ

⇒ ∫ g ( y )e − ny dy =0 for all µ ∈ 
µ

Using Lebesgue integration theory we conclude that g ( y ) = 0 a.e. Hence Y is complete.

14. Minimal sufficiency can be proved using Lehmann-Scheffe theorem. To see that
 n 
( X , S 2 ) is not complete, note that E S 2  0 for all θ > 0.
X 2 −=
 n +1 
 n 2
However, P  =
X 2 S= 0.
 n +1 
Statistical Inference
Test Set 4

1. Let X 1 , X 2 ,..., X n be a random sample from a N ( µ , σ 2 ) population. Find UMVUEs of the


µ
signal to noise ration and quantile µ + bσ , where b is any given real.
σ
2. Let X 1 , X 2 ,..., X n be a random sample from a N ( µ ,1) population. Find a UMVUE of cdf
Φ ( x − θ ) , where Φ denotes the cdf of a standard normal variable.
3. Let X ~ Bin (n, p ), where p is known. Find UMVUEs of p 2 and Var ( X ) .
4. Let X 1 , X 2 ,..., X n be a random sample from a Ρ(λ ) population. Find a UMVUE of
g (λ ) = P( X 1 ≤ 1) = (1 + λ ) e − λ .
5. Let X 1 , X 2 ,..., X n be a random sample from a Geo ( p ) distribution. Find a UMVUE of
P( X 1= 1)= p .
6. Let X 1 , X 2 ,..., X n be a random sample from a Gamma ( p, λ ) population, where p is known.
Find a UMVUE of λ m and λ − r , where m and r are positive integers.
7. Let X 1 , X 2 ,..., X n be a random sample from an exponential population with the density
) e µ − x , x > µ , µ ∈ . Find UMVUEs of µ and µ 2 .
f ( x=
8. Let X 1 , X 2 ,..., X n be a random sample from Exp ( µ , σ ) population. Find UMVUEs of a
quantile µ + bσ and reliability function =
R(t ) P( X 1 > t ) .
9. Let X 1 , X 2 ,..., X n be a random sample from a N (0, σ 2 ) population. Find the best scale
equivariant estimators of σ 2 and σ with respect to scale invariant loss functions.
10. Let X 1 , X 2 ,..., X n be a random sample from Exp ( µ , σ ) population. Find best affine
equivariant estimator of θ= µ + ησ with respect to an affine invariant loss function.
11. Let X 1 , X 2 ,..., X n be a random sample from an exponential population with density
) θ e −θ x , x > 0, θ > 0. Find Bayes estimator of θ with respect to the prior
f ( x | θ=
g (θ ) e −θ , θ > 0. The loss functions are L1 (θ , a=
= ) (θ − a ) 2 , L2 (θ , a=
) (θ − a ) 2 / θ 2 and
L3 (θ , a=
) (θ − a ) 2 / a .
12. Let X 1 , X 2 ,..., X n be a random sample from a U (0, θ ) population. Find Bayes estimator of θ
αβ α
g (θ )
with respect to the prior= , θ > β . The loss function is L(θ , a=
) (θ − a ) 2 .
θ α +1
Hints and Solutions

1. As ( X , S 2 ) is complete and sufficient, one can use Rao-Blackwell-Lehmann-Scheffe


Theorem to show that estimators U1 and U 2 as defined in Hints and Solutions in Test Set 1
µ
are UMVUEs of and µ + bσ respectively.
σ
2. Since X is complete and sufficient, using Rao-Blackwell-Lehmann-Scheffe Theorem, we
conclude that h( X ) is a UMVUE of Φ ( x − θ ) , where h( x ) = P ( X 1 ≤ x | X = x ) . The
 n −1 
conditional distribution of X 1 | X = x is N  x,  . It can be then shown that
 n 
 n 
h( x ) =
Φ  ( x − x )  .
 n −1 
X ( X − 1) X (n − x)
3. =
UMVUEs of p 2 and Var ( X ) are respectively given by T1 = and T2 .
n(n − 1) n −1
4. Let=
T ( X 1 ) 1,=
if X 1 0 or 1 .
= 0, otherwise.
n
Then T ( X 1 ) is unbiased for g (λ ) . Note that S = ∑ X i is complete and sufficient statistic.
i =1

So using Rao-Blackwell-Lehmann-Scheffe Theorem, h( S ) = E (T ( X 1 ) | S ) is a UMVUE of


g (λ ) . Now h( s ) =
P( X 1 =
0 | S =+
s) P( X 1 =
1| S =
s) .
n

P(=
X 1 0,=
S s)
= ∑ X i s)
P( X 1 0,=
P( X= 0 | S= s=
) = i =2
= P( S s=
1
) P( S s)
n
Using independence of X 1 and ∑X
i =2
i and the additive property of Poisson distribution, we

 n −1 
s

get the above expression as   . In a similar way, we get


 n 
s (n − 1) s −1 (n − 1) S ( S + n − 1)
P( X=1 1| S= s=
) . Thus h( S ) = .
ns nS
5. Let=T ( X 1 ) 1,= if X 1 0
= 0, otherwise.
n
As in Qn. 5, h( S ) is a UMVUE of p , where h=
( s ) P(= | S s ) and S = ∑ X i is a
X 1 0=
i =1
complete and sufficient statistic. The distribution of S is negative binomial (n, p ) .
n −1
Proceeding as in Qn. 5, we get h( S ) = .
S −1
n
6. A complete and sufficient statistic is T = ∑ X i . Also T ~ Gamma (np, λ ) . We have
i =1

 np − m − m   np 
E  T  = λ m , np > m and E  λ −r
T r  =
 np   np + r 

7. is f ( y ) ne n ( µ − y ) , y > µ .
A complete and sufficient statistic is Y = X (1) . The density of Y=
1 2µ 2
We have E (Y ) =µ + and E (Y 2 ) =µ 2 + + 2 . Using these UMVUEs of µ and µ 2 are
n n n
1 2Y
Y− and Y 2 − .
n n
n
8. =
Let =
Y X (1) and Z ∑(X
i =1
i − Y ) . Then (Y , Z ) is complete and sufficient. Also, Y and Z are

 σ 2Z
independently distributed with Y ~ Exp  µ ,  and ~ χ 22n − 2 . Using these, UMVUEs of
 n σ
Z Z
µ and σ are given by d1 = Y− and d 2 = respectively. So a UMVUE for
n(n − 1) n −1
quantile is d1 + bd 2 .
A UMVUE for R (t ) is h(Y=
, Z ) P ( X 1 > t | (Y , Z )) .

n−2
n −1   t − Y 
=
It can be seen that h(Y , Z )  max 1 − , 0  .
n   Z 
T
9. Note that T = ∑ X i2 is a complete and sufficient statistic. Also W = ~ χ n2 . Let the loss
σ2
2
σ 2 −a  σ −b 
2

functions for estimating σ = and σ be L1 (σ , a ) =


2 2
 and L2 (σ , b)  
 σ   σ 
2

respectively. Clearly the two estimation problems are invariant under the scale group of
transformations, =
GS {g c : g c=
( x) cx, c > 0} on the space of X i s . Under the transformation
g c , note that σ 2 → c 2σ 2 , a → c 2 a, σ → cσ , b → cb. The form of a scale equivariant
estimator of σ 2 is d k (T ) = kT , where k is a positive constant. Minimizing the risk function
σ 2 E (T )
nσ 4 1 T
of d k with respect to k , we=
get k = = . Hence the best
E (T ) n(n + 2)σ
2 4
n+2 n+2
scale equivariant estimator of σ 2 . Similarly, the form of a scale equivariant estimator of σ
is U p (T ) = pT 1/2 , where p is a positive constant. Minimizing the risk function of U p with
n +1 2 n +1 n +1
σ E (T 1/2 ) 2 σ
=
respect to p , we get p = = 2 2 . So 2 T 1/2 is the best
E (T ) n n+2 n+2
nσ2 2 2
2 2 2
scale equivariant estimator of σ .

θ − a 
2

10. We follow the notation of Qn 8. Let the loss function be L( µ , σ , a ) =   . The


 σ 
estimation problem is invariant under the affine group of transformations,
GA ={gb ,c : g b ,c ( x) =bx + c, b > 0, c ∈ } on the space of X i s . Under the transformation gb ,c ,
note that µ → bµ + c, σ → b σ , θ → bθ + c, a → ba + c, Y → bY + c, Z → bZ . The form of
an affine equivariant estimator of θ is d k (Y , Z )= Y + kZ , where k is a constant. Minimizing
E (θ − Y ) Z E (θ − Y ) EZ
the risk function of d k with respect to k , we= get kˆ =
E (Z 2 ) E (Z 2 )
 σ  1
 µ + ησ − µ −  (n − 1)σ (n − 1) η − 
= n  n
. So the best affine equivariant estimator of θ
n(n − 2)σ 2
n(n − 2)
is d k̂ .
 n

11. The joint pdf of X = ( X 1 , X 2 ,..., X n ) is f ( x | θ=
) θ n exp −θ

∑ x  , x
i =1
i i > 0, θ > 0.

  n 
) θ n exp −θ  ∑ xi + 1  , xi > 0, θ > 0.
The joint pdf of X and θ is f *( x , θ=
  i =1 
n +1
The marginal density of X =
is then h( x ) , xi > 0.
(nx + 1) n +1
Hence the posterior density of θ given X = x is Gamma (n + 1, nx + 1) .
Note that
n +1 (n + 1)(n + 2)  1  nx + 1  1  (nx + 1)
2
E (θ | x ) =
= , E (θ 2 | x ) = , E  x  = , E  2 x .
nx + 1 (nx + 1) 2 θ  n θ  n(n − 1)
n +1
With respect to the loss function L1 , the Bayes estimator of θ is E (θ | X ) = .
nX + 1
1 
E X 
θ  n −1
With respect to the loss function L2 , the Bayes estimator of θ is  = .
 1  nX + 1
E 2 X 
θ 
With respect to the loss function L3 , the Bayes estimator of θ is
(n + 1)(n + 2)
{E (θ | X )}
1/2
2
= .
(nX + 1)
1
12. The joint pdf of X = ( X 1 , X 2 ,..., X n ) is f ( x | θ )= , 0 < x(1) <  < x( n ) < θ .
θn
αβ α
*( x , θ )
The joint pdf of X and θ is f= , θ > max{β , x( n ) }.
θ n +α +1
αβ α
=
The marginal density of X is then h( x ) n +α
, x( n ) > 0.
(n + α )  max{β , x( n ) }
Hence the posterior density of θ given X = x is
n +α
(n + α )  max{β , x( n ) }
=g *(θ | x ) , θ > max{β , x( n ) } .
θ n +α +1
With respect to the loss function L, the Bayes estimator of θ is
n +α
E (θ | X ) = max{β , X ( n ) } .
n + α −1
Statistical Inference
Test Set 5

1. In the following hypotheses testing problems identify the given hypotheses as simple or
composite.
(i) X ~ Exp(λ), H0 : λ ≤ 1
(ii) X ~ Bin(n, p), n is known, H0 : p = 1/3.
(iii) X ~ Gamma(r, λ), r is known, H0 : λ > 2.
(iv) X1, ..., Xn ~ N(µ, σ²), H0 : µ = 0, σ² = 2.

2. Let X ~ P(λ). For testing H0 : λ = 1 vs. H1 : λ = 4 consider the test
φ(x) = 1, if x > 2; and 0, if x ≤ 2. Find the probabilities of type I and type II errors.
3. Let X have the double exponential density fX(x) = (1/(2σ)) e^(−|x|/σ), x ∈ R, σ > 0. For testing
H0 : σ = 1 vs. H1 : σ > 1 consider the test function φ(x) = 1, if |x| > 1; and 0, if |x| ≤ 1. Find
the size and the power of the test. Show that the power is always more than the size of the test.

4. Let X have the Cauchy density fσ(x) = σ/(π(σ² + x²)), x ∈ R, σ > 0. Find the most powerful test
of size α for testing H0 : σ = 1 vs. H1 : σ = 2.

5. Let X have density fθ(x) = (2/θ²)(θ − x), 0 < x < θ. Find the most powerful test of size α
for testing H0 : θ = 1 vs. H1 : θ = 2.
6. Let X1, ..., Xn be a random sample from a population with density
fθ(x) = (1/Γ(θ)) x^(θ−1) e^(−x), x > 0, θ > 0. Show that the family has monotone likelihood ratio in
∏ Xj. Hence derive the UMP test of size α for testing H0 : θ ≤ 3 vs. H1 : θ > 3.

7. Let X1, ..., Xn be a random sample from a population with beta density
fθ(x) = (Γ(θ + 4)/(2 Γ(θ + 1))) x^θ (1 − x)², 0 < x < 1, θ > 0. Show that the family has monotone likelihood
ratio in ∏ Xj. Hence derive the UMP test of size α for testing H0 : θ ≥ 2 vs. H1 : θ < 2.

8. Based on a random sample of size n from an Exp(λ) population, derive the UMP unbiased test of
size α for testing H0 : λ = 1 vs. H1 : λ ≠ 1.

9. Based on a random sample of size n from a double exponential population with density
fX(x) = (1/(2σ)) e^(−|x|/σ), x ∈ R, σ > 0, derive the UMP unbiased test of size α for testing
H0 : σ = 1 vs. H1 : σ ≠ 1.

10. For the set-up in Q. 5, find the LRT for testing H0 : θ = 2 vs. H1 : θ ≠ 2.

11. Derive the LRT for the testing problem in Q. 9.

12. Let X1, X2, ..., Xn be a random sample from an exponential population with the density
f(x) = e^(µ−x), x > µ, µ ∈ R. Find the LRT of size α for testing H0 : µ ≤ 1 vs. H1 : µ > 1.

13. Find the group of transformations under which the following testing problems are
invariant:
(i) X ~ e^(θ−x), x > θ, H0 : θ ≥ 3 vs. H1 : θ < 3
(ii) X ~ Exp(1/σ), H0 : σ ≤ 1 vs. H1 : σ > 1

14. Let X1, X2, ..., Xn be a random sample from an inverse Gaussian distribution with density
fX(x) = (λ/(2πx³))^(1/2) exp(−λ(x − µ)²/(2µ²x)), x > 0. Find the confidence intervals for the
parameters.

15. Let X1, X2, ..., Xn be a random sample from a Pareto population with
density fX(x) = βα^β/x^(β+1), x > α, α > 0, β > 2. Find the confidence intervals for α, β.
Hints and Solutions

1.
(i) composite (ii) simple (iii) composite (iv) simple
2. α = P_{λ=1}(X > 2) = 1 − (5/2)e^(−1), β = P_{λ=4}(X ≤ 2) = 13e^(−4)
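Both values can be confirmed directly (a minimal Python sketch using only the standard library):

from math import exp, factorial

def poisson_cdf(k, lam):
    return sum(exp(-lam) * lam**j / factorial(j) for j in range(k + 1))

alpha = 1 - poisson_cdf(2, 1)      # type I error:  P_{lambda=1}(X > 2)
beta = poisson_cdf(2, 4)           # type II error: P_{lambda=4}(X <= 2)
print(alpha, 1 - 2.5 * exp(-1))    # both ~ 0.0803
print(beta, 13 * exp(-4))          # both ~ 0.2381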
3. α = P(|X| > 1) = e^(−1). Power = Pσ(|X| > 1) = e^(−1/σ) > e^(−1), as σ > 1.

4. Using the NP Lemma, the most powerful test is to reject H0 when R(x) = f2(x)/f1(x) > k. Now
R(x) = 2(1 + x²)/(4 + x²). It can be seen that R′(x) = 12x/(4 + x²)². So R(x) has a minimum 1/2 at x = 0
and supremum 2 as x → ±∞. The most powerful test can then be designed as below:
(i) If we take k ≤ 1/2, then the MP test will always reject H0 and α = 1.
(ii) If we take k ≥ 2, then the MP test will always accept H0 and α = 0.
(iii) If we take 1/2 < k < 2, then the MP test will reject H0 when R(x) > k. This is equivalent
to |x| > √((4k − 2)/(2 − k)). Applying the size condition, we get k = 4/(5 + 3 cos(π(1 − α))).
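The final constant above is a reconstruction of a garbled expression in the source; it can be checked numerically, since for the standard Cauchy P(|X| > c) = 1 − (2/π) arctan(c):

from math import atan, cos, pi, sqrt

for alpha in (0.05, 0.10, 0.50):
    k = 4 / (5 + 3 * cos(pi * (1 - alpha)))
    c = sqrt((4 * k - 2) / (2 - k))        # rejection cutoff: reject H0 when |x| > c
    print(alpha, 1 - (2 / pi) * atan(c))   # recovered size, equals alpha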

5. Using the NP Lemma, we find that the MP test will reject H0 when x > 1 − √α.

6. The solution will follow from the use of MLR property.

7. The solution will follow from the use of MLR property.


8. The test will be based on ∑_{i=1}^n Xi, which has a Gamma(n, λ) distribution.

9. The test will be based on ∑_{i=1}^n |Xi|, which has a Gamma(n, 1/σ) distribution.

10. The test will reject H0 when |X − 1| > k.

11. Use property in Q. 9.

12. The test will reject H0 when X(1) > 1 − ln(α)/n.
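A short simulation sketch of the size of this test at the boundary µ = 1 (the values of n and α are illustrative):

import numpy as np

rng = np.random.default_rng(7)
n, alpha, reps = 6, 0.05, 300_000
x = 1.0 + rng.exponential(1.0, size=(reps, n))     # f(x) = e^(mu − x) with mu = 1
cutoff = 1.0 - np.log(alpha) / n
print(np.mean(x.min(axis=1) > cutoff), alpha)      # simulated size vs target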

13. (i) translation group (ii) scale group

14. We can use the complete sufficient statistics and their distributions to formulate the
confidence intervals.

15. Same method as in Q. 14.


Department of Mathematics
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 6
Answer for selected Problems

1. (a) No such X exists. (b)X = 0


4. X = 0

5. n ≥ 1701.5625
6. 1 − Φ(0.913) = 0.1814
7. n ≥ 132.45
9. Φ(1.3093)

10. (a) 1 − Φ(4.93) ≈ 0 (b) 0.003599


11. 1 − Φ(2.9277)

12. p = 44/49, n ≈ 100

13. (a) P(Y ≥ 900) ≤ 1/9 (b) P(Z ≥ 2) = 1 − Φ(2) = 0.0228

14. Φ(0) = 0.5

15. (a) P(Yi = 0) = 1/9, P(Yi = 2) = 8/9, (b) 1 − Φ(−5.53398)

16. x = √(n/(2(n + 1))) (2t − 1)

17. Poisson(λ)
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 6 (Limiting Probabilities)
1. Let {Xn} be a sequence of independent random variables defined by
   P{Xn = 0} = 1 − 1/n, and P{Xn = 1} = 1/n, n = 1, 2, . . . .
   (a) Find the distribution of X such that Xn → X a.s.
   (b) Find the distribution of X such that Xn → X in probability.
2. For each n ≥ 1, let Xn be a uniformly distributed random variable over the set {0, 1/n, 2/n, · · · , (n − 1)/n, 1}. Prove
   that Xn converges to U[0, 1] in distribution.
3. Let (Ω, F, P) = ([0, 1], B(R) ∩ [0, 1], U([0, 1])). Let {Xn, n = 2, . . .} be a sequence of random variables with
   Xn ~ U(1/2 − 1/n, 1/2 + 1/n). Prove or disprove that Xn → X in distribution, with X ≡ 1/2.

4. Let X1, X2, . . . be a sequence of i.i.d. random variables such that Xi ~ N(0, 1). Define Sn = ∑_{i=1}^n Xi, n =
   1, 2, . . .. Then, as n → ∞, Sn/n converges in probability to X. Find X?

5. Consider polling of n voters and record the fraction Sn of those polled who are in favour of a particular
   candidate. If p is the fraction of the entire voter population that supports this candidate, then Sn =
   (X1 + X2 + . . . + Xn)/n, where the Xi are i.i.d. random variables with B(1, p). How many voters should be sampled
   so that our estimate Sn is within 0.02 of p with probability at least 0.90?
6. Suppose that 30 electronic devices say D1 , D2 , . . . , D30 are used in the following manner. As soon as D1
fails, D2 becomes operative. When D2 fails, D3 becomes operative etc. Assume that the time to failure of
Di is an exponentially distributed random variable with parameter = 0.1(hour)−1 . Let T be the total time
of operation of the 30 devices. What is the probability that T exceeds 350 hours?

7. Let X ~ Bin(n, p). Use the CLT to find n such that P[X > n/2] ≤ 1 − α. Calculate the value of n when
   α = 0.90 and p = 0.45.



8. Use the CLT to show that lim_{n→∞} e^(−n) ∑_{i=0}^{n} nⁱ/i! = 0.5.
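The sum here is P(Sn ≤ n) for Sn ~ Poisson(n), i.e. a sum of n independent Poisson(1) variables, which the CLT drives to Φ(0) = 1/2; a quick numeric illustration:

from math import exp, lgamma, log

def p_leq_n(n):
    # e^(−n) * sum_{i=0}^{n} n^i/i!, accumulated in log space for stability
    return sum(exp(-n + i * log(n) - lgamma(i + 1)) for i in range(n + 1))

for n in (10, 100, 1000):
    print(n, p_leq_n(n))   # approaches 0.5 as n grows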

9. A person puts a few one rupee coins into a piggy-bank each day. The number of one rupee coins added on any
   given day is equally likely to be 1, 2, 3, 4, 5 or 6, and is independent from day to day. Find an approximate
   probability that it takes at least 80 days to collect 300 rupees? Final answer can be in terms of Φ(z) where
   Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^(−t²/2) dt.

10. Suppose that Xi , i = 1, 2, . . . , 30 are independent random variables each having a Poisson distribution with
parameter 0.01. Let S = X1 + X2 + . . . + X30 .
(a) Using central limit theorem evaluate P (S ≥ 3).
(b) Compare the answer in (a) with exact value of this probability.
11. Let X1, X2, . . . be iid random variables, each having pmf P(Xi = 1) = 7/9 = 1 − P(Xi = 0). Let Yi = Xi + Xi²,
    i = 1, 2, . . .. Use the central limit theorem to evaluate P(∑_{i=1}^{30} Yi > 60) approximately. Final answer can be in
    terms of Φ(z) where Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^(−t²/2) dt.

12. Consider the dining hall of Aravali Hostel, IIT Delhi which serves dinner to their hostel students only. They
are seated at 12-seat tables. The mess secretary observes over a long period of time that 95 percent of the
time there are between six and nine full tables of students, and the remainder of the time the numbers are
equally likely to fall above or below this range. Assume that each student decides to come with a given
probability p, and that the decisions are independent. How many students are there? What is p?

13. Let X1, X2, . . . be a sequence of independent and identically distributed random variables with mean 1 and
    variance 1600, and assume that these variables are non-negative. Let Y = ∑_{k=1}^{100} Xk.
    (a) What does Markov's inequality tell you about the probability P(Y ≥ 900)?
    (b) Use the central limit theorem to approximate the probability P(Y ≥ 900). Final answer can be in terms
    of Φ(z) where Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^(−t²/2) dt.

14. A person stands on the street and sells newspapers. Assume that each of the people passing by buys a
    newspaper independently with probability 1/3. Let X denote the number of people passing past the seller during
    the time until he sells his first 100 copies of the newspaper. Using the CLT, find P(X ≤ 300) approximately.
    Final answer can be in terms of Φ(z) where Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^(−t²/2) dt.

15. Let X1, X2, . . . be iid random variables, each having Bernoulli distribution with parameter 8/9.
    (a) Find the distribution of Yi = Xi + Xi², i = 1, 2, . . ..
    (b) Use the central limit theorem to evaluate P(∑_{i=1}^{20} Yi > 20) approximately. Final answer can be in terms
    of Φ(z) where Φ(z) = (1/√(2π)) ∫_{0}^{z} e^(−t²/2) dt.

16. Let X1, X2, . . . , Xn be n independent Poisson distributed random variables with means 1, 2, . . . , n respectively,
    and let Sn = X1 + X2 + . . . + Xn. Find an x in terms of t such that
    P((Sn − n²/2)/n ≤ t) ≈ Φ(x), for sufficiently large n,
    where Φ is the CDF of N(0, 1).
17. Using the MGF, find the limit of the Binomial distribution with parameters n and p as n → ∞ such that np = λ,
    so that p → 0.
Department of Mathematics
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 5
Answer for selected Problems

1. (a) 1/81 (b) Exp(9) (c) 61/180


2. E(X) = bE(Y ) + a

3. (a) f(t) = 2^(−3/2) (1 + t²/2)^(−3/2), t ∈ (−∞, ∞) (b) E(T) = 0, Var(T) = ∞


5. 12

6. c = √(3/2), E(T) = 0

7. (a) Let Y : r.v. denoting the waiting time of the passenger.
   fY(y) = 1/10, 0 < y < 5;  1/20, 5 < y < 15;  0, otherwise.
   (b) 25/4 minutes.

8. (5/2) log(3/2)

9. 31

10. (a) σ²/n (b) σ²

11. X²/3

12. E(Y/x) = 2(1 + x), x ≥ 0


13. Regression of X on Y is E(X/y) = (a + y)/(n + a + b), y = 0, 1, . . . , n. Yes.
14. E[X/X > y] = 1/λ + y; E[X − y/X > y] = 1/λ

15. X2 ~ Binomial(n, p2);  E(X1/X2 > 0) = np(1 − (1 − q)^(n−1))/(1 − (1 − q)ⁿ), with p = q = 1/3

17. E(Y^k/x) = x^k/(k + 1), E(Y^k) = 1/(k + 1)²

18. X/y ~ N(y/(1 + σ²), σ²/(1 + σ²)), E(X/y) = y/(1 + σ²)

19. 1/2
20. P^(n)(t) = P(P(. . . (P(t)) . . .)) where P(t) = 1/4 + t/4 + t²/2. PZn(t) = [P^(n)(t)]^(Z0). E(Z51) = 1250.
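A one-step simulation sketch of the last value: each of the 1000 bacteria at time 50 independently leaves 0, 1 or 2 descendants with probabilities 1/4, 1/4, 1/2, so E(Z51 | Z50 = 1000) = 1000 × 1.25 = 1250.

import numpy as np

rng = np.random.default_rng(20)
counts = rng.multinomial(1000, [0.25, 0.25, 0.5], size=100_000)   # (die, stay, split) outcomes
z51 = counts[:, 1] + 2 * counts[:, 2]                             # next-generation sizes
print(z51.mean())                                                 # close to 1250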

21. MSN (t) = MN (log(MX (t)))


23. (a) MY(t) = pMX(t)/(1 − (1 − p)MX(t)) where MX(t) = (1/3)(1 + e^t + e^(2t)) (b) E[Y] = 3

24. (a) f(y/x) = e^(−(y−x)), 0 < x < y < ∞;  0, otherwise
    (b) E(Y/X = x) = 1 + x, x > 0

25. µ = 2α ∑_{i=1}^n i², σ² = α² ∑_{i=1}^n i²
    E(W) = e^(µ + σ²/2)
    Var(W) = e^(2µ + σ²)(e^(σ²) − 1)
    f(w) = (1/(wσ√(2π))) e^(−(ln w − µ)²/(2σ²)), w > 0

26. E(XY ) = E(X|Y )E(Y )


27. P(X = x) = 1/2^(x+1), x = 0, 1, 2, . . .

28. Yes, X and Y are independent.


29. (a) e^(−0.4)
32. No, use Chebyshev’s inequality

MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 5 (Moments)
1. Let X1 and X2 be independent exponentially distributed random variables with parameters 5 and 4 respectively.
Define X(1) = min{X1 , X2 } and X(2) = max{X1 , X2 }.
(a) Find V ar(X(1) )? (b) Find the distribution of X(1) ? (c) Find E(X(2) )?

2. Let X and Y be two non-negative continuous random variables having respective CDFs FX and FY. Suppose
   that for some constants a & b > 0, FX(x) = FY((x − a)/b). Determine E(X) in terms of E(Y).
3. Let X be a random variable having an exponential distribution with parameter 1/2. Let Z be a random
   variable having a normal distribution with mean 0 and variance 1. Assume that X and Z are independent
   random variables. (a) Find the pdf of T = Z/√(X/2). (b) Compute E(T) and Var(T).

4. Let X and Y be two identically distributed random variables such that Var(X) and Var(Y) exist. Prove or
   disprove that Var((X + Y)/2) ≤ Var(X).
5. Let X and Y be i.i.d. random variables each having a N(0, 1) distribution. Calculate E[(X + Y)⁴/(X − Y)].

6. Let X1, . . . , X5 be a random sample from N(0, σ²). Find a constant c such that
   Y = c(X1 − X2)/√(X3² + X4² + X5²) has a t-distribution. Also, find E(Y).

7. Consider the metro train that arrives at the station near your home every quarter hour starting at 5:00 AM. You
   walk into the station every morning between 7:10 and 7:30 AM, with the time in this interval being a uniform
   random variable, that is U([7:10, 7:30]).
   (a) Find the distribution of the time you have to wait for the first train to arrive?
   (b) Also, find its mean waiting time?
8. Let X and Y be iid random variables each having the uniform distribution on (2, 3). Find E(X/Y)?
es

9. Let X and Y be two random variables such that ρ(X, Y) = 1/2, Var(X) = 1 and Var(Y) = 4. Compute
   Var(X − 3Y).
em

10. Let X1, X2, . . . , Xn be iid random variables with E(X1) = µ and Var(X1) = σ². Define X̄ = (1/n) ∑_{i=1}^n Xi and
    S² = (1/(n − 1)) ∑_{i=1}^n (Xi − X̄)². Find (a) Var(X̄) (b) E[S²].

11. Pick the point (X, Y ) uniformly in the triangle {(x, y) | 0 ≤ x ≤ 1 and 0 ≤ y ≤ x}. Calculate E[(X −Y )2 /X].
12. Find E(Y/x) where (X, Y) is jointly distributed with joint pdf
    f(x, y) = (y/(1 + x)⁴) e^(−y/(1+x)), x, y ≥ 0;  0, otherwise.
13. Let X have a beta distribution, i.e., its pdf is fX(x) = (1/β(a, b)) x^(a−1) (1 − x)^(b−1), 0 < x < 1, and let Y given
    X = x have a binomial distribution with parameters (n, x). Find the regression of X on Y. Is the regression linear?
14. Let X ∼ EXP (λ). Find E[X/X > y] and E[X − y/X > y].

15. Consider n independent trials, where each trial results in outcome i with probability pi = 1/3, i = 1, 2, 3. Let
Xi denote the number of trials that result in outcome i amongst these n trials. Find the distribution of X2 .
Find the conditional expectation of X1 given X2 > 0. Also determine cov (X1 , X2 | X2 ≤ 1).
16. (a) Show that cov(X, Y ) = cov(X, E(Y | X)).
(b) Suppose that, for constants a and b, E(Y | X) = a + bX. Show that b = cov(X, Y )/V ar(X).

17. Let X be a random variable which is uniformly distributed over the interval (0, 1). Let Y be chosen from the
    interval (0, X] according to the pdf f(y/x) = 1/x, 0 < y ≤ x;  0, otherwise. Find E(Y^k/X) and E(Y^k) for any fixed
    positive integer k.
positive integer k.
18. Suppose that a signal X, standard normal distributed, is transmitted over a noisy channel so that the received
    measurement is Y = X + W, where W, which follows a normal distribution with mean 0 and variance σ², is
    independent of X. Find fX/y(x/y) and E(X | Y = y).
19. Suppose X follows Exp(1). Given X = x, Y is a uniformly distributed rv on the interval [0, x]. Find the value
    of E(Y).
20. Consider Bacteria reproduction by cell division. In any time t, a bacterium will either die (with probability
0.25), stay the same (with probability 0.25), or split into 2 parts (with probability 0.5). Assume bacteria act
independently and identically irrespective of the time. Write down the expression for the generating function
of the distribution of the size of the population at time t = n. Given that there are 1000 bacteria in the
population at time t = 50, what is the expected number of bacteria at time t = 51.
21. Let N be a positive integer random variable and X1, X2, . . . be a sequence of iid random variables. N is
    independent of the Xi's. Find the moment generating function (MGF) of SN = X1 + X2 + . . . + XN, the random
    sum, in terms of the MGFs of the Xi's and N. Also show that:
    (a) E[SN] = E[N]E[X] (b) Var[SN] = E[N]Var[X] + [E[X]]²Var[N].

22. If E[Y/X] = 1, show that Var[XY] ≥ Var[X].

23. Suppose you participate in a chess tournament in which you play until you lose a game. Suppose you are a
    very average player: each game is equally likely to be a win, a loss or a tie. You collect 2 points for each win,
    1 point for each tie and 0 points for each loss. The outcome of each game is independent of the outcome of
    every other game. Let Xi be the number of points you earn for game i and let Y equal the total number of
    points earned in the tournament. Find the moment generating function MY(t) and hence compute E(Y).
24. Let (X, Y) be a two-dimensional random variable whose joint pdf is given by
    f(x, y) = e^(−y), 0 < x < y < ∞;  0, otherwise.

(a) Find the conditional distribution of Y given X = x.



(b) Find the regression of Y on X.


(c) Show that the variance of Y for given X = x does not involve x.
25. Let X1, X2, . . . , Xn be independent and let ln(Xi) have normal distribution N(2i, 1), i = 1, 2, . . . , n. Let
    W = X1^α X2^(2α) . . . Xn^(nα), α > 0 where α is any constant. Determine E(W), Var(W) and the pdf of W.
26. Let (X, Y ) be a two-dimensional continuous type random variables. Assume that, E(X), E(Y ) and E(XY )
are exist. Suppose that, E(X | Y = y) does not depend on y. Find E(XY ).
27. For each fixed λ > 0, let X be a Poisson distributed random variable with parameter λ. Suppose λ itself is
a random variable following exponential distribution with parameter 1. Find the probability mass function
of X.
28. Let X and Y be two discrete random variables with
P (X = x1 ) = p1 , P (X = x2 ) = 1 − p1 , 0 < p1 < 1;
and
P (Y = y1 ) = p2 , P (Y = y2 ) = 1 − p2 , 0 < p2 < 1.
If the correlation coefficient between X and Y is zero, check whether X and Y are independent random
variables.

29. Suppose the length of a telephone conversation between two persons is a random variable X with cumulative
    distribution function
    P(X ≤ t) = 0, −∞ < t < 0;  1 − e^(−0.04t), 0 ≤ t < ∞,
where the time is measured in minutes.
(a) Given that the conversation has been going on for 20 minutes, compute the probability that it continues
for at least another 10 minutes.
(b) Show that, for any t > 0, E(X/X > t) = t + 25.
30. A real function g(x) is non-negative and satisfies the inequality g(x) ≥ b > 0 for all x ≥ a. Prove that, for a
    random variable X, if E(g(X)) exists then P(X ≥ a) ≤ E(g(X))/b.
31. Let X have a Poisson distribution with mean λ ≥ 0, an integer. Show that P(0 < X < 2(λ + 1)) ≥ λ/(λ + 1).

32. Does the random variable X exist for which P [µ − 2σ ≤ X ≤ µ + 2σ] = 0.6? Justify your answer.

Department of Mathematics
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 4
Answer for selected Problems

1. (a) px (1) = 0.2, px (3) = 0.5 px (4) = 0.3 py (1) = 0.4, py (2) = 0.6 (b) 0.5
2. (a) 1/36 (b) 9/36 (c) 1/2

4. k = 1/8, 5/81, 1

5. (a) K = 4 (b) fY1,Y2(y1, y2) = 2y2/y1, 0 < y1 < 1, 0 < y2 < √y1 < 1

6. Let Y : r.v. denoting the no. of 1's transmitted. P(Y = n) = e^(−λ(1−p)) (λ(1 − p))ⁿ/n!, n = 0, 1, . . .

7. (a) (3⁵ − 1)/4⁵ (b) 1/20

8. Yes

9. λµ/((µ + ν)(λ + µ + ν))

10. 0.3214

11. 0.27

12. (a) PX(j) = q p^(j−1), j = 1, 2, . . .;  PY(k) = q^k (p/q)(1 − (p/q)^(k−1))/(1 − (p/q)), k = 2, 3, . . .;
    PX/Y(j/k) = (p/q)^j (1 − (p/q))/((p/q) − (p/q)^k), j = 1, 2, . . . , k − 1;  PY/X(k/j) = q^(k−j−1) p, k = j + 1, j + 2, . . .
    (b) X ~ B(1/2, 15), Y ~ B(1/3, 15), Y/(X = j) ~ B(2/3, 15 − j), X/(Y = k) ~ B(3/4, 15 − k)
er

13. k = 12, (1 − e−8 )(1 − e−3 )


14. Z = X + Y, fZ(z) = z, 0 < z < 1;  2 − z, 1 ≤ z ≤ 2;  0, otherwise
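A quick simulation sketch of this triangular density: cell probabilities of Z = X + Y compared with exact integrals of fZ.

import numpy as np

rng = np.random.default_rng(14)
z = rng.random(1_000_000) + rng.random(1_000_000)
hist, _ = np.histogram(z, bins=[0.25, 0.35, 1.45, 1.55])

print(hist[0] / 1e6, (0.35**2 - 0.25**2) / 2)             # integral of z on (0.25, 0.35) = 0.03
print(hist[2] / 1e6, 2 * 0.1 - (1.55**2 - 1.45**2) / 2)   # integral of 2 − z on (1.45, 1.55) = 0.05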

15. fZ(z) = (λ/2) e^(λz), z < 0;  (λ/2) e^(−λz), z ≥ 0.
IS

16. fV(v) = −ln(v), 0 < v < 1; fW(w) = 1/(2√w), 0 < w < 1; fV,W(v, w) = fV(v) fW(w)

17. fR,θ(r, θ) = r, for 0 < θ < π/4, 0 < r < sec θ, or π/4 < θ < π/2, 0 < r < cosec θ;  0, otherwise
    fθ(θ) = (1/2) sec²θ, 0 < θ < π/4;  (1/2) cosec²θ, π/4 < θ < π/2
    fR(r) = (π/2) r, 0 < r < 1;  r(cosec⁻¹(r) − sec⁻¹(r)), 1 < r < √2

18. 1/8

19. fX,Y(x, y) = 1/2, |x| + |y| ≤ 1.

20. P(Z ≤ z) = z(1 − e^(−z)), 0 ≤ z < 1;  1 − e^(−z), 1 ≤ z < ∞.

21. 5/9

22. fX(x) = e^(−x), x > 0.


23. 5/36 + (1/12) ln 4

24. (a) 3/8 (b) (3/8) e^(−1/3) (c) (3/8) e^(−1/3) + (5/8) e^(−1/5)

25. 5/9
27. P(X = x) = ^(x+n−1)Cx (1/2)^(x+n), x = 0, 1, 2, 3, . . .

28. Geometric(1/(4 × 10⁵))

29. 0.7222
30. P (Z ≤ z) = (1 − e−z ), 0 ≤ z < ∞
31. N(µ2 + ρ(σ2/σ1)(x − µ1), σ2²(1 − ρ²))

32. (a) 3 × (3/5)² × (2/5) = 54/125 (b) 36/125
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 4 (Random Vector)
1. Let X and Y be independent random variables. The range of X is {1, 3, 4} and the range of Y is {1, 2}. Partial
information on the probability mass function is as follows: pX (3) = 0.50; pY (2) = 0.60; pX,Y (4, 2) = 0.18 .
(a) Determine pX , pY and pX,Y completely. (b) Determine P (|X − Y | ≥ 2).

2. Let (X, Y ) be a two-dimensional discrete type random variables with joint pmf p(x, y) = cxy for x = 1, 2, 3
and y = 1, 2, 3 and equals zero otherwise. (a) Find c. (b) Find P (1 ≤ X ≤ 2, Y ≤ 2). (c) P (Y = 3).

3. Show that F(x, y) = 0, if x < 0, y < 0 or x + y < 1;  1, otherwise, is not a distribution function.

4. Suppose that (X, Y) has joint pdf fXY(x, y) = kx(x − y), 0 < x < 2, −x < y < x;  0, otherwise. Find k. Evaluate
   P(X < 1/Y = 1/2) and P(Y < 3/2/X = 1).

5. Let (X1, X2) be a random vector with the joint pdf f(x1, x2) = Kx1x2, 0 < x1 < 1, 0 < x2 < 1;  0, otherwise.
   (a) Find the value of K? (b) Define Y1 = X1² and Y2 = X1X2. Find the joint pdf of (Y1, Y2).
6. Consider a transmitter that sends out either a 0 with probability p, or a 1 with probability (1 − p), independently
   of earlier transmissions. Assume that the number of transmissions within a given time interval is Poisson
   distributed with parameter λ. Find the distribution of the number of 1's transmitted in that same time interval?
7. Let X1, X2, . . . , X5 be iid random variables each having the uniform distribution on the interval (0, 1).
(a) Find the probability that min(X1 , X2 , . . . , X5 ) lies between (1/4, 3/4).
(b) Find the probability that X1 is the minimum and X5 is the maximum among these random variables?

8. Let X1, X2 and X3 be iid random variables each having the probability density function
   f(x) = e^(−x), 0 < x < ∞;  0, otherwise.
   Find the joint distribution of (Y1, Y2, Y3) where Y1 = X1 + X2 + X3, Y2 = (X1 + X2)/(X1 + X2 + X3) and
   Y3 = X1/(X1 + X2). Are Y1 and Y2 independent random variables? Justify your answer.
9. Let X, Y, Z be independent exponentially distributed random variables with parameters λ, µ, ν respectively.
   Find P(X < Y < Z).
10. Amit and Supriya work independently on a problem in Tutorial Sheet 5 of Probability and Stochastic Pro-
cesses course. The time for Amit to complete the problem is exponential distributed with mean 5 minutes.
The time for Supriya to complete the problem is exponential distributed with mean 4 minutes. Given that
Amit requires more than 1 minutes, what is the probability that he finishes the problem before Supriya?
11. Let X and Y be independent random variables with X ~ B(2, 1/3) and Y ~ B(3, 1/2). Find P(X = Y).
12. Evaluate all possible marginal and conditional distributions if (X, Y ) has the following joint probability mass
function
(a) P(X = j, Y = k) = q^(k−j) p^j, j = 1, 2, . . . and k = j + 1, j + 2, . . ., q = 1 − p
(b) P(X = j, Y = k) = (15!/(j!k!(15 − j − k)!)) (1/2)^j (1/3)^k (1/6)^(15−j−k)
for all admissible non negative integral values of j and k.
13. Find k, if the joint probability density of (X1, X2) is fX1X2(x1, x2) = ke^(−3x1−4x2), x1 > 0, x2 > 0;  0, otherwise.
    Also find the probability that the value of X1 falls between 0 and 1 while X2 falls between 0 and 2.

14. Suppose that (X, Y) has joint pdf f(x, y) = 1, 0 < x, y < 1;  0, otherwise. Find the pdf of X + Y.
15. Romeo and Juliet have a date at a given time, and each, independently, will be late by an amount of time,
    denoted by X and Y respectively, that is exponentially distributed with parameter λ. Find the pdf of X − Y,
    the difference between their times of arrival?
16. Let X, Y and Z be independent and identically distributed random variables each having a uniform distri-
bution over the interval [0, 1]. Find the joint density function of (V, W ) where V = XY and W = Z 2 .
17. The random variable X represents the amplitude of cosine wave; Y represents the amplitude of a sine wave.
Both are independent and uniformly distributed over the interval (0,1). Let R represent the amplitude of
their resultant, i.e., R2 = X 2 + Y 2 and θ represent the phase angle of the resultant, i.e., θ = tan−1 (Y/X).
Find the joint and marginal pdfs of θ and R.
18. Let X and Y be continuous random variables having joint distribution which is uniform over the square
which has corners at (2, 2), (−2, 2), (−2, −2) and (2, −2). Determine P (|Y | > |X| + 1).
19. Suppose that we choose a point (X, Y) uniformly at random in E where E = {(x, y) | |x| + |y| ≤ 1}. Find
    the joint pdf of (X, Y).
20. Let X and Y be independent rvs where X follows Exp(1) and Y follows U[0, 1]. Let Z = max{X, Y}. Find
    P(Z ≤ z).
21. Mr. Ram and Mr. Ramesh agree to meet between 5:00 PM and 6:00 PM, with the understanding that each
    will wait no longer than 20 minutes for the other (and if the other does not show up they will leave). Suppose
    that each of them arrives at a time distributed uniformly at random in this time interval, independent of the
    other. What is the probability that they will meet?
22. Let (X, Y, Z, U) be a 4-dim rv with joint pdf fX,Y,Z,U(x, y, z, u) = e^(−u), 0 < x < y < z < u < ∞;  0, otherwise.
    Find the marginal pdf of X.
23. Let A, B and C be independent random variables, each uniformly distributed on the interval (0, 1). What is
    the probability that Ax² + Bx + C = 0 has real roots?


24. Aditya and Aayush work independently on a problem in Tutorial Sheet 4 of the Probability and Stochastic
    Processes course. The time for Aditya to complete the problem is exponentially distributed with mean 5
    minutes. The time for Aayush to complete the problem is exponentially distributed with mean 3 minutes.

(a) What is the probability that Aditya finishes the problem before Aayush?
(b) Given that Aditya requires more than 1 minutes, what is the probability that he finishes the problem
before Aayush?
(c) What is the probability that one of them finishes the problem a minute or more before the other one?

25. Let A, B and C be independent random variables. Suppose that A and C are uniformly distributed on [0, 1].
Further, B is uniformly distributed on [0, 2]. What is the probability that Ax2 + Bx + C = 0 has real roots?
26. Prove that the correlation coefficient between any two random variables X and Y lies in the interval [−1, 1].
27. For each fixed λ > 0, let X be a Poisson distributed random variable with parameter λ. Suppose λ itself is
a random variable following a gamma distribution with pdf
    f(λ) = (1/Γ(n)) λ^(n−1) e^(−λ), λ > 0;  0, otherwise,
where n is a fixed positive constant. Find the pmf of the random variable X.

28. The number of pages N in a fax transmission has geometric distribution with mean 4. The number of bits
k in a fax page also has geometric distribution with mean 105 bits independent of any other page and the
number of pages. Find the probability distribution of total number of bits in fax transmission.
29. It is known that an IIT bus will arrive at random at the Nilgiri hostel bus stop sometime between 8:30 A.M. and
8:45 A.M. Rahul decides that he will go at random to this location between these two times and will wait at
most 5 minutes for the bus. If he misses it, he will take the cycle rickshaw. What is the probability that he
will take the cycle rickshaw?
30. Let (X, Y ) be a two-dimensional continuous type random variables with joint pdf

f (x, y) = xe−x(y+1) for x, y > 0 and f (x, y) = 0 otherwise.

Define Z = XY . Then, find P (Z ≤ z)?


31. Let (X, Y) be a two-dimensional random variable with joint pdf
    f(x, y) = (1/(2πσ1σ2√(1 − ρ²))) e^(−Q(x,y)/2), −∞ < x, y < ∞,
    where σ1, σ2 > 0, |ρ| < 1 and
    Q(x, y) = (1/(1 − ρ²)) [((x − µ1)/σ1)² − 2ρ((x − µ1)/σ1)((y − µ2)/σ2) + ((y − µ2)/σ2)²].
    Find the conditional pdf fY/X(y/x).

32. Let X0 be an integer-valued random variable, P(X0 = 0) = 1, that is independent of the i.i.d. sequence

Z1, Z2, . . . , where Zn can take values in the set {−1, 1} such that P(Zn = −1) = 2/5, P(Zn = 1) = 3/5. Let
Xn = Xn−1 + Zn, n = 1, 2, . . ..
(a) Find P(X3 = 1)?
(b) Find the value of P(X5 = −1 | X2 = 0)?
Department of Mathematics
MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 3
Answer for selected Problems
1. (i) P[|X| = k] = 1/(2n + 1), k = 0;  2/(2n + 1), k = 1, 2, . . . , n;  0, otherwise
   (ii) P[X² = k] = 1/(2n + 1), k = 0;  2/(2n + 1), k = 1², 2², . . . , n²;  0, otherwise
   (iii) P[1/(|X| + 1) = k] = 1/(2n + 1), k = 1;  2/(2n + 1), k = 1/2, 1/3, . . . , 1/(n + 1);  0, otherwise

20
2. 0.51
3. Y is uniformly distributed random variable on the interval (a, b)

4. (i) fY(y) = (1/(|b|σ√(2π))) e^(−(1/2)((y − (a + bµ))/(bσ))²), −∞ < y < ∞
   (ii) fZ(z) = (1/√(2πz)) e^(−z/2), z > 0;  0, otherwise
2 01
er
5. fY(y) = αe^y/(e^y + 1)^(α+1), −∞ < y < ∞
t

6.

7. fY(y) = (λe^(−1)/(2√y)) (e^(λ√y) + e^(−λ√y)), 0 < y < 1/λ²;  (λe^(−1)/(2√y)) e^(−λ√y), 1/λ² < y < ∞;  0, otherwise

8. (a) Y is a continuous type random variable. (b) fY(y) = e^(−y) + (1/y²) e^(−1/y), 0 < y < 1;  0, otherwise.

9. F(y) = 0, −∞ < y < 2;  y/10, 2 ≤ y < 4;  1, 4 ≤ y < ∞

10. α = e^(−λ)
11. fY(y) = √(2/π) |y| e^(−y⁴/2), −∞ < y < ∞


12. fY(y) = 1/(4√y), 0 < y < 1;  1/(8√y), 1 < y < 9;  0, otherwise

13. fY(y) = 1/(π√(1 − y²)), −1 < y < 1;  0, otherwise

14. Z has a mixed type distribution with pmf
    P[Z = z] = 1/4, z = −1, 1;  0, otherwise,
    and density function
    fZ(z) = 1/(π(1 + z²)), −1 < z < 1;  0, otherwise.

15. P[X = x] = (nCx p^x q^(n−x)) / (∑_{i=0}^{r−1} nCi p^i q^(n−i)), x = 0, 1, . . . , r − 1;  0, otherwise

16. fX(x) = (1/(σ√(2π))) exp(−(1/2)((x − µ)/σ)²) / (Φ((β − µ)/σ) − Φ((α − µ)/σ)), α < x < β;  0, otherwise
17. exp(−3/8)

18. 1/2

19. (a) Bin(24,0.4) (b) Bin(26,0.6)


2 01
20. (a) U(0,1) (b) Exp(1)
er

21. (a) k = 1/2


t

22. 1 − exp − 31 , 30 minutes



es

23. (a) e^(−λ)


em

24.
25. MX(t) = e^(σ²t²/2);  E(Xⁿ) = 0 for n odd, and E(Xⁿ) = (n!/((n/2)! 2^(n/2))) σⁿ for n even

26. P(−1.062 < X < 0.73) = 2/3

27.
28. Y = ln(X);  fX(x) = (1/(2x√π)) exp(−(1/4)(ln x)²), 0 < x < ∞


30. X = No. of games played, P(X = k) = pk (> 0), k = 4, 5, 6, 7;  E(X) = 93/16
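This expectation can be confirmed by direct enumeration: the series ends at game k when one of the two teams wins its 4th game there, so P(X = k) = 2 C(k − 1, 3) (1/2)^k for k = 4, 5, 6, 7.

from math import comb

probs = {k: 2 * comb(k - 1, 3) * 0.5**k for k in range(4, 8)}
print(sum(probs.values()))                              # 1.0
print(sum(k * p for k, p in probs.items()), 93 / 16)    # E(X) = 5.8125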

32. (a) fY(y) = 1/(2√3), −√3 < y < √3;  0, otherwise;  P(−2 < Y < 2) = 1.
    (b) 50

34. (a) Y ~ U(0, 1) (b) 1/12 (c) (exp(t) − 1)/t

35. E(Y ) = 1, V ar(Y ) = 1 where Y =| X |


36. (a) X ~ P(µ) (b) ∑_{k=1}^{7} e^(−µ) µ^k/k!;  µ = 4

37. 11.125
38. E[X²] = λ + λ², Var(X) = λ, E[X³] = λ³ + 3λ² + λ

MTL 106 (Introduction to Probability Theory and Stochastic Processes)
Tutorial Sheet No. 3 (Function of a Random Variable)
1. X has a uniform distribution over the set of integers {−n, −(n − 1), . . . , −1, 0, 1, . . . , (n − 1), n}. Find the
   distribution of (i) |X| (ii) X² (iii) 1/(1 + |X|).

2. Let P [X ≤ 0.49] = 0.75, where X is a continuous type RV with some CDF defined over (0,1). If Y = 1 − X,
Find k so that P [Y ≤ k] = 0.25.
3. Let X be uniformly distributed random variable on the interval (0, 1). Define Y = a + (b − a)X, a < b.
Find the distribution of Y .
4. If X has N(µ, σ²), find the distribution of Y = a + bX, and Z = ((X − µ)/σ)².

5. Let X be a random variable with pdf f(x) = αθ^α/(x + θ)^(α+1), x > 0, where θ > 0 and α > 0. Find the distribution
   of the random variable Y = ln(X/θ).

6. Suppose that X is a random variable having a geometric distribution with parameter p. Find the probability
   mass functions of X² and X + 3.
7. Let X be a random variable having an exponential distribution with parameter λ > 0. Let Y = (X − 1/λ)².
   Find the pdf of Y.

8. Suppose that X is a continuous random variable with pdf fX(x) = e^(−x) for x > 0. Define Y = X, X < 1;
   1/X, X ≥ 1.
   (a) Discuss whether the distribution of Y is discrete or continuous or mixed type.
   (b) Determine the pmf/pdf as applicable to this case.
er
9. Let X be uniformly distributed random variable over the interval [0, 10]. Find the CDF of
Y = max{2, min{4, X}}.
t

10. Let X be the life length of an electron tube and suppose that X may be represented as a continuous random
    variable which is exponentially distributed with parameter λ. Let pj = P(j ≤ X < j + 1). Show that pj is
    of the form (1 − α)α^j and determine α.
em

11. Consider a nonlinear amplifier whose input X and output Y are related by its transfer characteristic
    Y = X^(1/2), X > 0;  −|X|^(1/2), X < 0.
    Find the pdf of Y if X has the N(0, 1) distribution.


12. A point X is chosen at random in the interval [−1, 3]. Find the pdf of Y = X 2 .

13. Let the phase X of a sine wave be uniformly distributed in the interval (−π/2, π/2). Define Y = sin X. Find the
distribution of Y .
14. Let X be a random variable with uniform distribution in the interval (−π/2, π/2). Define
    Z = −1, X ≤ −π/4;  tan(X), −π/4 < X < π/4;  1, X ≥ π/4.
    Find the distribution of the random variable Z.


15. Find the probability distribution of a binomial random variable X with parameter n, p, truncated to the
right at X = r, where r is a positive integer.

16. Find pdf of a doubly truncated normal N (µ, σ 2 ) random variable, truncated to the left at X = α and to the
right at X = β, where α < β.
17. Assume that, the number of years a car will run is exponential distributed with mean 8. Suppose, Mr. Alok
buys a four years running used car today, what is the probability that it will still run after 3 years?
18. Let U be a uniform distributed rv on the interval [0, 1]. Find the probability that the quadratic equation
x2 + 4U x + 1 = 0 has two distinct real roots x1 and x2 ?
19. A club basketball team will play a 50-game season. Twenty four of these games are against class A teams and
26 are against class B teams. The outcomes of all the games are independent. The team will win each game
against a class A opponent with probability 0.4 and it will win each game against a class B opponent with
probability 0.6. Let XA and XB denote, respectively, the number of victories against class A and class B
teams. Let X denote its total victories in the season.
(a) What is the distributions of XA ?
(b) What is the relationship between XA , XB and X?
(c) What is the distribution of X?

20. Let X be a continuous type random variable with strictly increasing CDF FX.
    (a) What is the distribution of FX(X)?
    (b) What is the distribution of the random variable Y = − ln(FX(X))?

21. Let X be a continuous type random variable having the pdf
    f(x) = 0, −∞ < x ≤ 0;  1/2, 0 < x ≤ 1;  k/x², 1 < x < ∞.
    (a) Find k?
    (b) Find the pdf of Y = 1/X?
t

22. In the claim office, with one employee, of a public service enterprise, it is known that the time (in minutes)
    that the employee takes to record a claim from a client is a random variable with an exponential distribution
    with mean 15 minutes. If you arrive at 12 noon at the claim office and at that moment there is no queue
    but the employee is taking a claim from a client, what is the probability that you must wait less than 5
    minutes to talk to the employee? What is your mean total time spent (including waiting time and claim time)?

23. Let X be an exponentially distributed random variable with parameter λ > 0.
    (a) Find P(|X − 1| > 1 | X > 1)
(b) Explain whether there exists a random variable Y = g(X) such that the cumulative distribution function
of Y has uncountably infinite discontinuity points. Justify your answer.
24. Let X be a continuous random variable having the probability density function (pdf)

fX(x) = kx² e^(−x+1), x > 1

(a) Find k?
(b) Determine E(X)
(c) Find the pdf of Y = X 2 ?
25. Let X be a random variable with N (0, σ 2 ). Find the moment generating function for the random variable
X. Deduce the moments of order n about zero for the random variable X from the above result.

26. The moment generating function (MGF) of a random variable X is given by MX(t) = 1/6 + (1/4)e^t + (1/3)e^(2t) + (1/4)e^(3t).
If µ is the mean and σ 2 is the variance of X, what is the value of P (µ − σ < X < µ)?
27. Let X be a rv such that P(X > x) = q^([x]), x ≥ 0;  1, x < 0, where 0 < q < 1 is a constant and [x] is the integral part
of x. Determine the pmf/pdf of X as applicable to this case.
28. Let ln(X) be a normal distributed random variable with mean 0 and variance 2. Find the pdf of X?
29. Prove that, if X is a continuous type rv such that E(X r ) exists, then E(X s ) exists for all s < r.
30. Suppose that two teams are playing a series of games, each of which is independently won by team A with
probability 0.5 and by team B with probability 0.5. The winner of the series is the first team to win four
games. Find the expected number of games that are played.

31. Let Φ be the characteristic function of a random variable X. Prove that 1 − |Φ(2u)|² ≤ 4(1 − |Φ(u)|²).
32. (a) Let X be a uniformly distributed random variable on the interval [a, b] where −∞ < a < b < ∞. Find
    the distribution of the random variable Y = (X − µ)/σ, where µ = E(X) and σ² = Var(X). Also, find
    P(−2 < Y < 2).
    (b) Suppose Shimla's temperature is modeled as a random variable which follows a normal distribution with
    mean 10 degrees Celsius and standard deviation 3 degrees Celsius. Find the mean if the temperature
    of Shimla were expressed in degrees Fahrenheit.

33. Let X be a random variable having a binomial distribution with parameters n and p. Prove that
    E(1/(X + 1)) = (1 − (1 − p)^(n+1))/((n + 1)p).
34. Let X be a continuous random variable with CDF FX(x). Define Y = FX(X).
    (a) Find the distribution of Y.
    (b) Find the variance of Y, if it exists?
    (c) Find the characteristic function of Y?

35. Suppose that X is a continuous random variable having the following pdf:
    f(x) = e^x/2, x ≤ 0;  e^(−x)/2, x > 0.
    Let Y = |X|. Obtain E(Y) and Var(Y).


36. The mgf of a r.v. X is given by MX(t) = exp(µ(e^t − 1)).
(a) What is the distribution of X? (b) Find P (µ − 2σ < X < µ + 2σ), given µ = 4.
37. Let X be a discrete type rv with moment generating function MX(t) = a + be^(2t) + ce^(4t), E(X) = 3, Var(X) = 2.
    Find E(2^X)?
38. Let X be a random variable with Poisson distribution with parameter λ. Show that the characteristic function
    of X is ϕX(t) = exp(λ(e^(it) − 1)). Hence, compute E(X²), Var(X) and E(X³).

Department of Mathematics
MTL 106 (Introduction to Probability and Stochastic Processes)
Tutorial Sheet No. 2
Answer for selected Problems
1. X(i) = i, i = 0, 1, 2
2. F = P (Ω)
3. a)No b)No c)Yes
4. (a) 0.002 (b) 0.7255
5. 5/9

6. FX(x) = αFd(x) + (1 − α)Fc(x) where α = 1/2,
   Fd(x) = 0, x < 1;  2/5, 1 ≤ x < 2;  4/5, 2 ≤ x < 3;  1, x ≥ 3;
   Fc(x) = 0, 0 ≤ x < 2;  (x² − 4)/5, 2 ≤ x < 3;  1, x ≥ 3

7. α = 1/10; β = 3/64; P(X < 3/X ≥ 2) = 1/2

8. [1 − (0.95)⁵² − ⁵²C1 (0.05)(0.95)⁵¹]


9. P[X ≥ 2] = 1 − [(1 − p)ⁿ + ⁿC1 p(1 − p)^(n−1)] ≥ 0.95, where p = 0.001

10. n ≈ 4742

11. exp(−p)
er

12. (1 − 0.001)^1200
t

14. α = 1 − p, 0 < p < 1


es

15. fX(x) = 2x/(r2² − r1²), r1 ≤ x ≤ r2;  0, otherwise
17. 0.99997
18. (a) yes (b) no
IS

19. Mean = T1(1 − e^(−1/4)); Variance = T1(1 − e^(−1/4))

20. a) 0.75 b) 0.5

21. P[Nt = k] = ⁿCk (e^(−λt))^k (1 − e^(−λt))^(n−k), k = 0, 1, 2, . . . , n;  0, otherwise
22. (a) 91.6% (b) 56.8%
23. (a) D2 (b) D2
24. ((1 − p)/λ)(1 − exp(−4λ))
28. (a) 0, (b) 1, 0
29. α = 3/5, β = 6/5

30. 1
