LM Ques PPR

MCQs based on Linear Models.

Uploaded by lakshyawork111
[This question paper contains 6 printed pages.]

Sr. No. of Question Paper : 4691 E
Unique Paper Code : 32371402
Name of the Paper : Linear Models
Name of the Course : B.Sc. (H) Statistics under CBCS (LOCF)
Semester : IV
Duration : 3 Hours
Maximum Marks : 75

Instructions for Candidates

1. Write your Roll No. on the top immediately on receipt of this question paper.
2. Attempt six questions in all, selecting three from each Section.
3. Use of a simple scientific calculator is allowed.

SECTION A

1. For a given model Y(n×1) = X(n×p) β(p×1) + ε(n×1), with E(ε) = 0, V(ε) = σ²I and ρ(X) = p < n, prove that the least squares estimator of β is BLUE. Also, obtain an unbiased estimator of σ². (12½)

2. (a) Let Y = (Y₁, Y₂, ..., Yₙ)' be a vector of n independent standard normal variates, and let Q₁ = Y'A₁Y and Q₂ = Y'A₂Y be distributed as χ² with n₁ and n₂ degrees of freedom respectively. Show that the necessary and sufficient condition for Q₁ and Q₂ to be independently distributed is A₁A₂ = 0.

(b) Let x, y and z denote three independent standard normal variables. Stating the appropriate theorem, find the distribution of the homogeneous expression of second degree
0.71x² + 0.86xy + 0.36y² + 0.93z² + 0.42yz − 0.28xz. (8½, 4)

3. Consider a study examining the relationship between exercise and blood pressure in middle-aged men. [The remainder of this question is illegible in the scan.]

4. (a) Suppose Xᵢ, Yᵢ, Zᵢ, i = 1, 2, 3, are 9 independent observations with common variance σ² and E(Xᵢ) = θ₁, E(Yᵢ) = θ₂, E(Zᵢ) = θ₁ − θ₂, i = 1, 2, 3. Find (i) the BLUEs of θ₁ and θ₂, (ii) cov(θ̂₁, θ̂₂), and (iii) the BLUE of θ₁ + θ₂.

(b) Fill in the missing entries of the partially completed one-way ANOVA table below (columns: Source of Variation, Degrees of Freedom, Sum of Squares, Mean Squares, Variance Ratio; the numerical entries are illegible in the scan). Then answer: (i) How many classes/levels of factor A are being compared? (ii) How many observations are analyzed? (iii) At the 0.05 level of significance, can one conclude that the classes of factor A have different effects? Why?
(Use F₀.₀₅(3, 23) = 3.42; the remaining tabulated values 2.75, 3.74 and 4.74 have illegible degrees of freedom in the scan.) (8½, 4)

SECTION B

5. (a) For the simple linear regression model Y = β₀ + β₁x + ε, given that E(εᵢ) = 0, V(εᵢ) = σ² and the εᵢ are uncorrelated:
(i) Obtain the least squares estimates of β₀ and β₁.
(ii) Verify the bias and variance properties of β̂₀ and β̂₁.
(iii) Show that cov(ȳ, β̂₁) = 0.

(b) Obtain a confidence interval for the mean response and a prediction interval for a future observation y₀, each corresponding to a specified level x₀ of the regressor variable x. Which of the two intervals is wider? (6½, 6)

6. (a) Discuss the problem of bias in regression estimates. Eight experiments are to be done at the coded levels (−1, +1) of two predictor variables X₁ and X₂. Two experimenters A and B suggest the following designs:
A : Take one observation at each of (X₁, X₂) = (−1, −1) and (1, 1), and take three observations at each of (−1, 1) and (1, −1).
B : Take two observations at each of the four sites.
If a model Y = β₀ + β₁X₁ + β₂X₂ + ε is to be fitted by least squares, but it is feared there may be some additional quadratic curvature expressed by the extra terms β₁₁X₁² + β₂₂X₂² + β₁₂X₁X₂, evaluate the anticipated biases in the estimated coefficients b₀, b₁, b₂ for each design.

(b) Discuss the problem of testing for lack of fit in the simple linear regression model. (6½, 6)

7. Write notes on any two of the following: (6½, 6)
(a) Orthogonal columns of the design matrix
(b) Stepwise regression
(c) Orthogonal polynomials

8. (a) Develop a suitable test for testing the significance of regression in the model y = β₀ + β₁x₁ + ... + βₖxₖ + ε. Further, show that the test statistic is equivalent to
F = R²(n − k − 1) / [(1 − R²)k],
where R² is the coefficient of determination.

(b) Show that, for any linear model Y = Xβ + ε,
Σᵢ V(ŷᵢ)/n = trace{X(X'X)⁻¹X'} σ²/n = pσ²/n.
Suppose that this model contains a β₀ term in the first position, and let 1 be an n×1 vector of ones. Show that (X'X)⁻¹X'1 = (1 0 ... 0 0)' and that 1'X(X'X)⁻¹X'1 = n. (6½, 6)
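Several questions in both sections turn on the same two formulas: the least-squares estimator β̂ = (X'X)⁻¹X'Y and the unbiased error-variance estimator σ̂² = RSS/(n − p). As a study aid (not part of the paper), here is a minimal Python/NumPy sketch with simulated data; the sample size, coefficients and noise level are made-up illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: n = 20 observations, p = 3 parameters (intercept + 2 regressors).
n, p = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Least-squares estimator: solve the normal equations (X'X) beta = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Unbiased estimator of sigma^2: residual sum of squares over (n - p).
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)

print(beta_hat)
print(sigma2_hat)
```

The identity X'(Y − Xβ̂) = 0 used below in the test is exactly the normal-equations orthogonality that drives the BLUE proof asked for in Question 1.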
Question bank on Linear Models

1. For the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ (i = 1, 2, ..., n), where εᵢ ~ NID(0, σ²), obtain the least squares estimates of β₀ and β₁ and establish their variance properties. [The remainder of this question is illegible in the scan.]

2. Prove that, for every estimable function, there exists a unique BLUE.

3. Consider the simple linear regression model Y = β₀ + β₁x + ε with E(ε) = 0, V(ε) = σ², and the ε's uncorrelated. (i) Obtain cov(β̂₀, β̂₁), and (ii) show that E(MSR) = σ² + β₁²Sₓₓ.

4. For the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ (i = 1, 2, ..., n), where εᵢ ~ NID(0, σ²), obtain the least squares estimates of β₀ and β₁. Show that they are unbiased, and find their variances. Also find the unbiased estimator of the error variance.

5. If Y = (Y₁, Y₂, ..., Yₙ)' is a vector of n independent standard normal variates, let Q₁ = Y'A₁Y and Q₂ = Y'A₂Y be distributed as χ² with n₁ and n₂ degrees of freedom respectively. Show that the necessary and sufficient condition for Q₁ and Q₂ to be independently distributed is A₁A₂ = 0.

6. For a given model Y(n×1) = X(n×p) β(p×1) + ε(n×1), with E(ε) = 0, V(ε) = σ²I and ρ(X) = p, prove that the least squares estimator of β is BLUE. [Questions 7–26 and most of 27 are missing from the scan; the surviving fragment of 27 reads "... m (> 1) observations per cell".]

28. What do you mean by an additive model? Discuss Tukey's test for non-additivity in the case of a two-way layout with one observation per cell.

29. Write notes on any two of the following: (i) General linear model, (ii) Role of orthogonal polynomials in fitting polynomial models in one variable, (iii) Bias in regression estimates, (iv) Orthogonal columns in the X matrix, (v) Stepwise regression method, (vi) Partial F-test, (vii) Coefficient of determination, (viii) Residual analysis.

30. Suppose Xᵢ, Yᵢ, Zᵢ, i = 1, 2, ..., n, are 3n independent observations with common variance σ² and expectations E(Xᵢ) = θ₁, E(Yᵢ) = θ₂, E(Zᵢ) = θ₁ − θ₂, i = 1, 2, ..., n. Find the BLUEs θ̂₁, θ̂₂ and cov(θ̂₁, θ̂₂), and compute the residual sum of squares. Also find the BLUE of θ₁ + θ₂.

31. Consider the model E(Y₁) = 2β₁ − β₂ − β₃, E(Y₂) = β₂ − β₃, E(Y₃) = β₂ + β₃ − 2β₄, with the usual assumptions. Find the estimable functions.

32. Consider the model E(Y₁) = 2β₁ + β₂, E(Y₂) = β₁ − β₃, E(Y₃) = β₁ − β₂ [partly illegible in the scan], with the usual assumptions. Obtain the BLUE of β₂ and its variance.

33. Consider the model E(Y₁) = β₁ + β₂, E(Y₂) = 2β₁, E(Y₃) = β₁ − β₂, with the usual assumptions. Obtain the BLUE of β₁ + β₂ and find the sum of squares due to error.

34. Consider the model E(Yᵢⱼ) = αᵢ + βⱼ, i = 1, 2; j = 1, 2. Find the condition under which ℓ₁α₁ + ℓ₂α₂ + m₁β₁ + m₂β₂ is an estimable function.

35. Suppose Y ~ N₃(0, I) and let A, B, C and D be the matrices given below (their entries are illegible in the scan). (i) Are Y'AY and 2y₁ + y₂ independent? (ii) Are Y'AY and BY independent? (iii) Are Y'AY and Y'CY independent? (iv) Are Y'AY and Y'DY independent? (v) Are Y'CY and Y'DY independent?

36. Suppose Y ~ N₃(0, I) and let
A = (1/3) [ 2 −1 −1 ; −1 2 −1 ; −1 −1 2 ].
Are Y'AY and y₁ + y₂ + y₃ independent?

37. Four objects A, B, C, D are involved in a weighing experiment. Put together they weighed Y₁ grams; when A and C are put in the left pan of the balance and B and D in the right pan, a weight of Y₂ grams was necessary in the right pan for balance. With A and B in the left pan and C and D in the right pan, Y₃ grams were needed in the right pan. Finally, with A and D in the left pan and B and C in the right pan, Y₄ grams were needed in the right pan. If the observations Y₁, Y₂, Y₃, Y₄ are all subject to uncorrelated errors with common variance σ², obtain the BLUEs of the individual weights and of the total weight of the four objects, and the variance of the estimate of the net weight of the four objects.

38. Stating clearly the underlying assumptions of the simple linear regression model through the origin, obtain the least squares estimate of the regression parameter along with its variance.

39. Develop a prediction interval for the future observation y₀ corresponding to a specified level x₀ of the regressor variable x in the simple linear regression model.

40. Suppose yᵢ (i = 1, 2, ..., n) is a random sample from a standard normal distribution. Show that Σyᵢ and Σ(yᵢ − ȳ)² are independently distributed.

41. Suppose the hypothesis of homogeneity of k treatment means is rejected in ANOVA testing for a one-way classification under the fixed effects model. How would you proceed to test the hypothesis of equality of two specific treatment means?

42. Suppose that we have fit the straight-line model ŷ = β̂₀ + β̂₁x₁, but the response is affected by a second variable x₂ such that the true regression function is E(y) = β₀ + β₁x₁ + β₂x₂. (i) Is the least-squares estimator of the slope in the original simple linear regression model unbiased? (ii) Show the bias in β̂₁.

43. Consider the simple linear regression model Y = β₀ + β₁x + ε with E(ε) = 0, V(ε) = σ², and the ε's uncorrelated. Show that (i) E(MSE) = σ² and (ii) E(MSR) = σ² + β₁²Sₓₓ.

44. Write the simple linear regression model in matrix notation. Hence obtain the least squares estimators of the unknown parameters and their variances.

45. Suppose that we are fitting a straight line and wish to make the standard error of the slope as small as possible. Suppose that the "region of interest" for x is −1 ≤ x ≤ 1. [The question is truncated in the scan.]
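The omitted-variable bias asked about in Question 42 has the well-known closed form E(β̂₁) = β₁ + β₂ S₁₂/S₁₁, where S₁₂ = Σ(x₁ᵢ − x̄₁)(x₂ᵢ − x̄₂) and S₁₁ = Σ(x₁ᵢ − x̄₁)². A small simulation sketch of this fact (all coefficients, sample sizes and the correlation structure are made-up illustrations, not from the question bank):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: the true model is E(y) = b0 + b1*x1 + b2*x2,
# but only the straight line in x1 is fitted.
n, b0, b1, b2 = 200, 1.0, 2.0, 3.0
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(scale=0.5, size=n)  # x2 correlated with x1

def fitted_slope(y, x):
    """Least-squares slope of a straight line fitted through (x, y)."""
    xc = x - x.mean()
    return np.sum(xc * (y - y.mean())) / np.sum(xc ** 2)

# Average the misspecified slope estimate over many simulated responses.
slopes = []
for _ in range(2000):
    y = b0 + b1 * x1 + b2 * x2 + rng.normal(size=n)
    slopes.append(fitted_slope(y, x1))

# Theoretical expectation of the biased slope: b1 + b2 * S12 / S11.
s12 = np.sum((x1 - x1.mean()) * (x2 - x2.mean()))
s11 = np.sum((x1 - x1.mean()) ** 2)
expected = b1 + b2 * s12 / s11
print(np.mean(slopes), expected)
```

Because x₂ is positively correlated with x₁ here, the simulated average slope sits above β₁ = 2 and close to the theoretical value β₁ + β₂S₁₂/S₁₁, which is exactly part (ii) of the question.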
