
EEPF Finals

EE3110 is a final exam for a probability foundations course for electrical engineering students. The 3-hour exam consists of 5 multi-part questions testing concepts including random variables, probability mass functions, maximum likelihood estimation, hypothesis testing, and more. Example distributions covered include the uniform, geometric, and normal distributions. Common estimators, test statistics, and hypotheses are also assessed. Formulas for expectations, variances, common distributions, and statistical tests are provided.


EE3110: Probability Foundations for EE

Final exam (even section), Nov 9, 2022 · Marks: 50 · Time: 3 hours

All answers should be justified with clear reasons.

1. (5 × 2 marks) Provide an example for each of the following, or argue why no example is possible.

(a) Random variable X such that E[X²] < E[X]².
(b) Function g(x) such that, for X ∼ U[0, 1], g(X) ∼ Geometric(1/2).
(c) Jointly distributed random variables X, Y such that E[X] ≠ 0, E[Y] ≠ 0, E[XY] = 0.
(d) Jointly distributed random variables X, Y such that X, Y ∼ N(0, 1), X + Y is not normal.
(e) Random variable X such that E[X] = 0, var(X) = 1/3, and, for X1, …, Xn ∼ iid X, the sample mean X̄n ∼ U[−1/n, 1/n].


2. (10 marks) In a virtual cricket league, suppose the runs scored off any delivery is distributed as P(0) = 3/8, P(1) = 1/2, P(2) = 1/8. In the first two deliveries, (1) the probability of scoring (2, 2) is 0, (2) the probability of scoring (1, 2) is equal to that of (2, 1), and (3) the covariance of runs scored is −1/16.
(a) (5 marks) Develop a joint PMF model for the runs scored in the first two deliveries. How many
parameters (or unknowns) are needed in the model (after using all provided information)?
(b) (5 marks) In a new season, the runs in the first two deliveries of 8 matches are observed to be
(0, 0), (0, 1), (1, 0), (0, 0), (0, 2), (0, 0), (0, 1), (1, 1). What is the ML estimate of the parameters of
the above joint PMF model? State your assumptions about the data.

3. (10 marks) HoD (EE) suspects that the academic performance of BTech students in EE, as measured by CGPA after 8 semesters, has deteriorated from the graduating class of 2012 to that of 2022, both in the average (which was 8.0 in 2012) and in the variability across students.

(a) (4 marks) Develop a test to statistically verify the suspicion of the HoD on average perfor-
mance. Describe all aspects of the test (sampling, null/alternative hypothesis, test statistic,
significance level computation). State all your assumptions clearly.
(b) (4 marks) Repeat the above for the suspicion that variability in performance has increased.
(c) (2 marks) Suppose Dean (AC) has the same suspicions for all BTech students of IITM. Do you
have to alter any aspect of your sampling and tests?

4. (10 marks) From the samples X1, X2 ∼ iid U[a, b], we desire to estimate θ = b − a.

(a) (6 marks) Assuming a and b are parameters, let (â_ML, b̂_ML) be the ML estimates of (a, b). Consider the estimate θ̂_plugin = b̂_ML − â_ML for θ. Find the bias and MSE of θ̂_plugin.
(b) (4 marks) Consider the class of estimators θ̂(α) = α(b̂_ML − â_ML) for θ. Find the α for which θ̂(α) is unbiased. Find the α for which θ̂(α) has the least MSE.
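For intuition only (a simulation sketch, not part of the exam, with an arbitrary choice a = 0, b = 1), the plug-in estimate can be checked numerically, using the standard fact that for U[a, b] the ML estimates are the sample minimum and maximum:

```python
import random

random.seed(2)
a, b, theta = 0.0, 1.0, 1.0  # hypothetical true parameters, theta = b - a
trials = 200_000

# For U[a, b], the ML estimates are the sample min and max, so with two
# samples theta_hat_plugin = max(X1, X2) - min(X1, X2) = |X1 - X2|.
est = [abs(random.uniform(a, b) - random.uniform(a, b)) for _ in range(trials)]
mean_est = sum(est) / trials
# E[|X1 - X2|] = (b - a)/3, so mean_est should be close to 1/3,
# suggesting the plug-in estimate is biased low.
```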

5. (10 marks) n children are chosen uniformly at random from a group of 2n children and given one
chocolate each. The group consists of n brother-sister pairs. A brother-sister pair is termed lucky if
they both got chocolates, and unlucky if neither got one. Assume n is even.

(a) (4 marks) Find the expected number of lucky brother-sister pairs. Repeat for unlucky pairs.
(b) (6 marks) Find PMF of the number of lucky brother-sister pairs. Repeat for unlucky pairs.
Distributions

U[a, b]:
  PDF: f(x) = 1/(b − a) for a < x < b
  CDF: F(x) = 0 for x < a; (x − a)/(b − a) for a < x < b; 1 for x > b
  E[X] = (a + b)/2, var(X) = (b − a)²/12

Geometric(p):
  PMF: p(k) = (1 − p)^(k−1) p, k = 1, 2, …
  CDF: F(k) = 1 − (1 − p)^⌊k⌋ for k ≥ 1; 0 otherwise
  E[X] = 1/p, var(X) = (1 − p)/p²

N(µ, σ²):
  PDF: f(x) = (1/(√(2π) σ)) e^(−(x − µ)²/(2σ²))
  CDF: F(x) = F_Z((x − µ)/σ), Z ∼ N(0, 1)
  E[X] = µ, var(X) = σ²
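The listed means and variances can be sanity-checked by simulation; a minimal Python sketch (not part of the exam, with arbitrary parameter choices):

```python
import random

random.seed(0)
N = 200_000

# U[a, b]: mean (a + b)/2 = 3.5, variance (b - a)^2 / 12 = 0.75
a, b = 2.0, 5.0
u = [random.uniform(a, b) for _ in range(N)]
mean_u = sum(u) / N
var_u = sum((x - mean_u) ** 2 for x in u) / N

# Geometric(p): mean 1/p, variance (1 - p)/p^2
p = 0.5
def geometric(p):
    # Count Bernoulli(p) trials until the first success
    k = 1
    while random.random() > p:
        k += 1
    return k
g = [geometric(p) for _ in range(N)]
mean_g = sum(g) / N
```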

Formulae

Expected value: E[g(X)] = ∫ g(x) f_X(x) dx (PDF f_X), or ∑_x g(x) p_X(x) (PMF p_X)
  E[g(X, Y)] = ∫∫ g(x, y) f_XY(x, y) dx dy (joint PDF f_XY), or ∑_{x,y} g(x, y) p_XY(x, y) (joint PMF p_XY)
[Co]variance: var(X) = E[X²] − E[X]², Cov(X, Y) = E[XY] − E[X] E[Y]
Sample statistics: X1, …, Xn ∼ iid X, X̄n = (X1 + ··· + Xn)/n, Sn² = ((X1 − X̄n)² + ··· + (Xn − X̄n)²)/(n − 1)
  E[X̄n] = E[X], var(X̄n) = var(X)/n, E[Sn²] = var(X)
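The unbiasedness of Sn² (the reason for the n − 1 denominator) is easy to see numerically; a short Python sketch, not part of the exam:

```python
import random

random.seed(1)

def sample_stats(xs):
    # Sample mean and (n-1)-denominator sample variance, as on the formula sheet
    n = len(xs)
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    return xbar, s2

# Average S_n^2 over many iid samples of size n from U[0, 1]
# (true variance 1/12): the average should be close to 1/12,
# illustrating E[S_n^2] = var(X).
n, trials = 5, 100_000
avg_s2 = sum(sample_stats([random.random() for _ in range(n)])[1]
             for _ in range(trials)) / trials
```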


Estimation: X1, …, Xn ∼ iid f_X(x; θ), θ̂: function of X1, …, Xn
  Bias(θ̂) = E[θ̂ − θ], MSE(θ̂) = E[(θ̂ − θ)²]
ML estimate: X1, …, Xn ∼ iid f_X(x; θ), θ̂_ML = arg max_θ ∏_{i=1}^n f_X(Xi; θ)
  (X1, Y1), …, (Xn, Yn) ∼ iid f_XY(x, y; θ), θ̂_ML = arg max_θ ∏_{i=1}^n f_XY(Xi, Yi; θ)
Testing: X1, …, Xn ∼ iid P, Null: P = P0, Alternative: P ∈ another set
  T: function of X1, …, Xn, Acceptance set: A, Test: Reject H0 if T ∈ Aᶜ
  Significance level: α = P(T ∈ Aᶜ | P0), Power against P1: P(T ∈ Aᶜ | P1)
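The arg-max definition of the ML estimate can be carried out directly; a Python sketch with made-up Geometric(p) data (not part of the exam) maximises the log-likelihood over a grid and recovers the closed-form p̂_ML = n / ∑ Xi:

```python
import math

data = [1, 3, 2, 1, 5, 2, 1, 1]  # hypothetical iid Geometric(p) observations

def log_likelihood(p, xs):
    # log of prod over the sample of (1 - p)^(k - 1) * p
    return sum((k - 1) * math.log(1 - p) + math.log(p) for k in xs)

# Maximise over a fine grid of p values in (0, 1)
grid = [i / 10_000 for i in range(1, 10_000)]
p_hat = max(grid, key=lambda p: log_likelihood(p, data))
# Closed form for the geometric: n / sum(X_i) = 8 / 16 = 0.5
```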
z test: X1, …, Xn ∼ N(µ, σ²), σ²: known, T = X̄n ∼ N(µ, σ²/n)
  Null: µ = µ0, Alternatives: µ < µ0, or µ > µ0, or µ ≠ µ0
  Test: Reject H0 if T < c, or T > c, or |T − µ0| > c
t test: X1, …, Xn ∼ N(µ, σ²), σ²: unknown, T = (X̄n − µ)/(Sn/√n) ∼ t_{n−1}
χ² test: X1, …, Xn ∼ N(µ, σ²), T = Sn², ((n − 1)/σ²) Sn² ∼ χ²_{n−1}
  Null: σ = σ0, Alternatives: σ < σ0, or σ > σ0, or σ ≠ σ0
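The t-test recipe above is mechanical to apply; a Python sketch (with hypothetical CGPA data and µ0 = 8.0, echoing the setup of question 3, not part of the exam):

```python
import math

# Hypothetical CGPA sample; null: mu = mu_0 = 8.0, sigma^2 unknown
xs = [7.8, 8.1, 7.5, 7.9, 8.0, 7.6, 7.7, 8.2]
mu0 = 8.0

n = len(xs)
xbar = sum(xs) / n
s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)  # S_n^2 from the formula sheet
t_stat = (xbar - mu0) / math.sqrt(s2 / n)
# Compare t_stat against the t_{n-1} critical value at the chosen
# significance level (one-sided here, since the suspicion is "deteriorated").
```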
Two sample z test: X1, …, Xn1 ∼ N(µ1, σ1²), Y1, …, Yn2 ∼ N(µ2, σ2²), T = X̄n1 − Ȳn2 ∼ N(µ1 − µ2, σ1²/n1 + σ2²/n2)
  Null: µ1 = µ2, Alternatives: µ1 < µ2, or µ1 > µ2, or µ1 ≠ µ2
  Test: Reject H0 if T < c, or T > c, or |T| > c
Two sample F test: X1, …, Xn1 ∼ N(µ1, σ1²), Y1, …, Yn2 ∼ N(µ2, σ2²), T = S²_{X,n1}/S²_{Y,n2}, (S²_{X,n1}/σ1²)/(S²_{Y,n2}/σ2²) ∼ F(n1 − 1, n2 − 1)
  Null: σ1 = σ2, Alternatives: σ1 < σ2, or σ1 > σ2, or σ1 ≠ σ2
  Test: Reject H0 if T < c_L, or T > c_R, or T ∉ [c_L, c_R]
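Computing the two-sample F statistic is likewise direct; a Python sketch with hypothetical samples standing in for the 2012 and 2022 CGPAs (not part of the exam):

```python
# Two-sample F statistic for equality of variances, per the formula sheet.
xs = [8.0, 8.2, 7.9, 8.1, 8.0, 7.8]  # hypothetical 2012 sample
ys = [7.5, 8.6, 7.2, 8.4, 6.9, 8.8]  # hypothetical 2022 sample

def sample_var(zs):
    # (n-1)-denominator sample variance
    n = len(zs)
    zbar = sum(zs) / n
    return sum((z - zbar) ** 2 for z in zs) / (n - 1)

f_stat = sample_var(xs) / sample_var(ys)
# Under the null sigma_1 = sigma_2, f_stat ~ F(n1 - 1, n2 - 1);
# reject if it falls outside [c_L, c_R] at the chosen level.
```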
