4 - The Multiple Linear Regression - Parameter Estimation
THE MULTIPLE LINEAR REGRESSION:
PARAMETER ESTIMATION
Widya Irmaningtyas
Program Studi Statistika
FMIPA UGM
INTRODUCTION
The simple model studied in the previous chapters is often inadequate in practice. The general k-variable linear regression model can be written compactly as

$$Y_i = \beta_1 X_{1i} + \beta_2 X_{2i} + \beta_3 X_{3i} + \dots + \beta_k X_{ki} + u_i = \mathbf{X}_i'\boldsymbol{\beta} + u_i \qquad (1)$$

where

$$\mathbf{X}_i = \begin{bmatrix} X_{1i} \\ X_{2i} \\ X_{3i} \\ \vdots \\ X_{ki} \end{bmatrix}, \qquad
\boldsymbol{\beta} = \begin{bmatrix} \beta_1 \\ \beta_2 \\ \beta_3 \\ \vdots \\ \beta_k \end{bmatrix}$$
THE MULTIPLE LINEAR REGRESSION MODEL
Equation (1) is a shorthand expression for the following set of n simultaneous equations:

$$\begin{aligned}
Y_1 &= \beta_1 X_{11} + \beta_2 X_{21} + \beta_3 X_{31} + \dots + \beta_k X_{k1} + u_1 \\
Y_2 &= \beta_1 X_{12} + \beta_2 X_{22} + \beta_3 X_{32} + \dots + \beta_k X_{k2} + u_2 \\
&\ \ \vdots \\
Y_n &= \beta_1 X_{1n} + \beta_2 X_{2n} + \beta_3 X_{3n} + \dots + \beta_k X_{kn} + u_n
\end{aligned} \qquad (2)$$

Let us write the system of equations (2) in an alternative but more illuminating way as follows:

$$\begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix}
=
\begin{bmatrix}
X_{11} & X_{21} & X_{31} & \dots & X_{k1} \\
X_{12} & X_{22} & X_{32} & \dots & X_{k2} \\
\vdots & \vdots & \vdots &       & \vdots \\
X_{1n} & X_{2n} & X_{3n} & \dots & X_{kn}
\end{bmatrix}
\begin{bmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_k \end{bmatrix}
+
\begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}$$

that is,

$$\underset{n \times 1}{\mathbf{y}} = \underset{n \times k}{\mathbf{X}}\ \underset{k \times 1}{\boldsymbol{\beta}} + \underset{n \times 1}{\mathbf{u}}$$
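To make the matrix form concrete, here is a minimal NumPy sketch that builds y = Xβ + u from simulated data; the sample size, coefficient values, and error standard deviation are made up purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 50, 3                                 # illustrative sizes: n observations, k regressors
X = np.column_stack([np.ones(n),             # X_{1i} = 1 supplies an intercept column
                     rng.uniform(0, 10, size=(n, k - 1))])
beta = np.array([2.0, 0.5, -1.3])            # hypothetical coefficients beta_1, ..., beta_k
u = rng.normal(0.0, 1.0, size=n)             # disturbances drawn as N(0, sigma^2 I), sigma = 1

y = X @ beta + u                             # y (n x 1) = X (n x k) beta (k x 1) + u (n x 1)
print(y.shape, X.shape, beta.shape, u.shape) # (50,) (50, 3) (3,) (50,)
```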
THE CLASSICAL OLS ASSUMPTIONS
OLS1. The n × k matrix X is deterministic, that is, it consists of a set of fixed numbers.
OLS2. The rank of the matrix X is equal to k, where k is the number of columns in X and k is less than the number of observations, n.
OLS3. $E(\mathbf{u}) = \mathbf{0}$
OLS4. $E(\mathbf{u}\mathbf{u}') = \sigma^2 \mathbf{I}$
OLS5. $\mathbf{u} \sim N(\mathbf{0}, \sigma^2 \mathbf{I})$
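These assumptions can be mimicked directly in simulation. The sketch below, using the same made-up design as before, treats X as a fixed array (OLS1), checks its column rank (OLS2), and draws disturbances from N(0, σ²I) so that OLS3-OLS5 hold by construction; all numerical values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, sigma = 50, 3, 1.0                     # illustrative sizes and error s.d.

# OLS1: X is treated as fixed -- here it is simply a known array of numbers.
X = np.column_stack([np.ones(n), rng.uniform(0, 10, size=(n, k - 1))])

# OLS2: full column rank, rank(X) = k < n.
assert np.linalg.matrix_rank(X) == k and k < n

# OLS3-OLS5: u ~ N(0, sigma^2 I) implies E(u) = 0 and E(uu') = sigma^2 I.
u = rng.normal(0.0, sigma, size=n)
print(u.mean())        # near 0 in the sample; exactly 0 only in expectation (OLS3)
print(u.var(ddof=1))   # sample variance near sigma^2 = 1 (OLS4)
```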
THE OLS ESTIMATION
The idea of OLS estimation is to minimize the sum of squared residuals associated with a regression model. To obtain the OLS estimate of $\boldsymbol{\beta}$, let us first write the k-variable sample regression

$$Y_i = \hat{\beta}_1 X_{1i} + \hat{\beta}_2 X_{2i} + \hat{\beta}_3 X_{3i} + \dots + \hat{\beta}_k X_{ki} + \hat{u}_i$$

which can be written in matrix notation as

$$\mathbf{y} = \mathbf{X}\hat{\boldsymbol{\beta}} + \hat{\mathbf{u}} \qquad (3)$$

where $\hat{\boldsymbol{\beta}}$ is a k-element column vector of the OLS estimators of the regression coefficients, and $\hat{\mathbf{u}}$ is an n × 1 column vector of n residuals.
The OLS estimators are obtained by minimizing $\sum \hat{u}_i^2$. In matrix notation, this amounts to minimizing

$$\hat{\mathbf{u}}'\hat{\mathbf{u}} = (\mathbf{y} - \mathbf{X}\hat{\boldsymbol{\beta}})'(\mathbf{y} - \mathbf{X}\hat{\boldsymbol{\beta}}) = \mathbf{y}'\mathbf{y} - 2\hat{\boldsymbol{\beta}}'\mathbf{X}'\mathbf{y} + \hat{\boldsymbol{\beta}}'\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}}$$

Differentiating this expression with respect to $\hat{\boldsymbol{\beta}}$ and setting the result equal to zero yields the normal equations $\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}} = \mathbf{X}'\mathbf{y}$, so that $\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y}$.
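A minimal sketch of this estimator in NumPy, reusing the simulated data from the earlier example; np.linalg.solve applies the normal equations directly, and np.linalg.lstsq is used only as a cross-check (all numbers are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = np.column_stack([np.ones(n), rng.uniform(0, 10, size=(n, k - 1))])
beta_true = np.array([2.0, 0.5, -1.3])                # hypothetical true coefficients
y = X @ beta_true + rng.normal(0.0, 1.0, size=n)

# Minimizing u_hat'u_hat leads to the normal equations X'X beta_hat = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's least-squares solver.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                                       # close to beta_true
print(np.allclose(beta_hat, beta_ls))                 # True
```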
The estimator of the error variance $\sigma^2$ is

$$\hat{\sigma}^2 = \frac{\sum_{i=1}^{n} \hat{u}_i^2}{n - k} = \frac{\hat{\mathbf{u}}'\hat{\mathbf{u}}}{n - k}$$
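Continuing the same illustrative simulation, the sketch below computes the residual vector and σ̂² = û'û/(n − k); since the data were generated with σ = 1, the estimate should land near 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = np.column_stack([np.ones(n), rng.uniform(0, 10, size=(n, k - 1))])
y = X @ np.array([2.0, 0.5, -1.3]) + rng.normal(0.0, 1.0, size=n)   # true sigma = 1

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)     # OLS coefficients
u_hat = y - X @ beta_hat                         # residual vector u_hat

sigma2_hat = (u_hat @ u_hat) / (n - k)           # sigma_hat^2 = u_hat'u_hat / (n - k)
print(sigma2_hat)                                # should be near sigma^2 = 1
```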