4 - The Multiple Linear Regression - Parameter Estimation

The document discusses the multiple linear regression model, which extends the simple linear regression model to include more than one explanatory variable. The multiple linear regression model relates a dependent variable Y to k explanatory variables. This can be written as a system of n equations with n observations. The Ordinary Least Squares (OLS) method is used to estimate the coefficients in the model by minimizing the sum of squared residuals. The OLS estimators of the coefficients are unbiased with variance that depends on the design matrix X. The estimator for the variance of the error term is also given.


THE MULTIPLE LINEAR REGRESSION: PARAMETER ESTIMATION
Widya Irmaningtyas
Program Studi Statistika
FMIPA UGM
INTRODUCTION
The simple model studied in the previous chapters is often inadequate in practice. Therefore, we need to extend the simple regression model to cover models involving more than one explanatory variable.

Adding more variables leads us to the discussion of multiple regression models, that is, models in which the dependent variable Y depends on two or more explanatory variables.
THE MULTIPLE LINEAR REGRESSION MODEL
The simple regression model:
$$Y_i = \beta_1 + \beta_2 X_i + u_i$$
The multiple regression model, with 2 explanatory variables:
$$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$$
The multiple regression model, with $(k-1)$ explanatory variables:
$$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + \cdots + \beta_k X_{ki} + u_i$$
In general, the multiple regression model can be written as
$$Y_i = \beta_1 X_{1i} + \beta_2 X_{2i} + \beta_3 X_{3i} + \cdots + \beta_k X_{ki} + u_i, \quad i = 1, 2, \ldots, n \quad (1)$$
In this model, we have $k$ regressors, one of which may or may not correspond to a constant, and the others to a number of explanatory variables.
Eq. (1) can be simplified in a vector form:
$$Y_i = \beta_1 X_{1i} + \beta_2 X_{2i} + \beta_3 X_{3i} + \cdots + \beta_k X_{ki} + u_i = \mathbf{X}_i'\boldsymbol{\beta} + u_i$$
where
$$\mathbf{X}_i = \begin{pmatrix} X_{1i} \\ X_{2i} \\ X_{3i} \\ \vdots \\ X_{ki} \end{pmatrix}, \qquad \boldsymbol{\beta} = \begin{pmatrix} \beta_1 \\ \beta_2 \\ \beta_3 \\ \vdots \\ \beta_k \end{pmatrix}$$
Equation (1) is a shorthand expression for the following set of n simultaneous equations:
$$\begin{aligned} Y_1 &= \beta_1 X_{11} + \beta_2 X_{21} + \beta_3 X_{31} + \cdots + \beta_k X_{k1} + u_1 \\ Y_2 &= \beta_1 X_{12} + \beta_2 X_{22} + \beta_3 X_{32} + \cdots + \beta_k X_{k2} + u_2 \\ &\;\;\vdots \\ Y_n &= \beta_1 X_{1n} + \beta_2 X_{2n} + \beta_3 X_{3n} + \cdots + \beta_k X_{kn} + u_n \end{aligned} \quad (2)$$
Let us write the system of equations (2) in an alternative but more illuminating way as follows:
$$\begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix} = \begin{pmatrix} X_{11} & X_{21} & X_{31} & \cdots & X_{k1} \\ X_{12} & X_{22} & X_{32} & \cdots & X_{k2} \\ \vdots & \vdots & \vdots & & \vdots \\ X_{1n} & X_{2n} & X_{3n} & \cdots & X_{kn} \end{pmatrix} \begin{pmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_k \end{pmatrix} + \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{pmatrix}$$
$$\underset{n \times 1}{\mathbf{y}} = \underset{n \times k}{\mathbf{X}} \; \underset{k \times 1}{\boldsymbol{\beta}} + \underset{n \times 1}{\mathbf{u}}$$
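As a concrete illustration (a hypothetical sketch, not part of the slides; the sample sizes and coefficient values are invented for the example), the matrix form $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{u}$ can be set up with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 100, 3                       # n observations, k regressors
X = np.column_stack([
    np.ones(n),                     # X_1i = 1: the constant column
    rng.normal(size=(n, k - 1)),    # two explanatory variables
])
beta = np.array([1.0, 2.0, -0.5])   # true coefficients (chosen for illustration)
u = rng.normal(scale=0.3, size=n)   # disturbances with E(u) = 0

y = X @ beta + u                    # the model y = X beta + u, one row per equation in (2)
print(X.shape, y.shape)             # (100, 3) (100,)
```

Each row of `X` is one observation $\mathbf{X}_i'$, so the matrix product reproduces all $n$ equations of system (2) at once.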
THE CLASSICAL OLS ASSUMPTIONS
OLS1. The $n \times k$ matrix $\mathbf{X}$ is deterministic, that is, it consists of a set of fixed numbers.
OLS2. The rank of matrix $\mathbf{X}$ is equal to $k$, where $k$ is the number of columns in $\mathbf{X}$ and $k$ is less than the number of observations, $n$.
OLS3. $E(\mathbf{u}) = \mathbf{0}$
OLS4. $E(\mathbf{u}\mathbf{u}') = \sigma^2 \mathbf{I}$
OLS5. $\mathbf{u} \sim N(\mathbf{0}, \sigma^2 \mathbf{I})$
THE OLS ESTIMATION
The idea of OLS estimation is to minimize the sum of squared residuals associated with a regression model. To obtain the OLS estimate of $\boldsymbol{\beta}$, let us first write the k-variable sample regression
$$Y_i = \hat{\beta}_1 X_{1i} + \hat{\beta}_2 X_{2i} + \hat{\beta}_3 X_{3i} + \cdots + \hat{\beta}_k X_{ki} + \hat{u}_i$$
which can be written in matrix notation as
$$\mathbf{y} = \mathbf{X}\hat{\boldsymbol{\beta}} + \hat{\mathbf{u}} \quad (3)$$
where $\hat{\boldsymbol{\beta}}$ is a k-element column vector of the OLS estimators of the regression coefficients, and $\hat{\mathbf{u}}$ is an $n \times 1$ column vector of n residuals.
The OLS estimators are obtained by minimizing $\sum \hat{u}_i^2$. In matrix notation, this amounts to minimizing
$$\hat{\mathbf{u}}'\hat{\mathbf{u}} = (\mathbf{y} - \mathbf{X}\hat{\boldsymbol{\beta}})'(\mathbf{y} - \mathbf{X}\hat{\boldsymbol{\beta}}) = \mathbf{y}'\mathbf{y} - 2\hat{\boldsymbol{\beta}}'\mathbf{X}'\mathbf{y} + \hat{\boldsymbol{\beta}}'\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}}$$
Using rules of matrix differentiation given in Appendix B, we obtain
$$\frac{\partial(\hat{\mathbf{u}}'\hat{\mathbf{u}})}{\partial\hat{\boldsymbol{\beta}}} = -2\mathbf{X}'\mathbf{y} + 2\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}}$$
Setting the preceding equation to zero gives
$$\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}} = \mathbf{X}'\mathbf{y}$$
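Numerically, the system $\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}} = \mathbf{X}'\mathbf{y}$ can be solved directly. A minimal sketch (simulated data invented for the example, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -0.5])        # true coefficients for the simulation
y = X @ beta + rng.normal(scale=0.3, size=n)

# Solve the normal equations X'X beta_hat = X'y.
# np.linalg.solve is numerically preferable to forming (X'X)^{-1} explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's built-in least-squares routine.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_hat, beta_lstsq))  # True
```

Both routes minimize the same sum of squared residuals, so the two coefficient vectors agree to numerical precision.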
THE ESTIMATORS OF $\boldsymbol{\beta}$
The estimators:
$$\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y}$$
The expected value and variance-covariance matrix of $\hat{\boldsymbol{\beta}}$:
$$E(\hat{\boldsymbol{\beta}}) = \boldsymbol{\beta}$$
$$\mathrm{Var}(\hat{\boldsymbol{\beta}}) = \sigma^2 (\mathbf{X}'\mathbf{X})^{-1}$$
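Unbiasedness and the variance formula can be checked by simulation (a hedged sketch with an invented design; with $\sigma^2 = 1$ the covariance of $\hat{\boldsymbol{\beta}}$ should approach $(\mathbf{X}'\mathbf{X})^{-1}$):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, reps = 50, 2, 2000
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # fixed design, per OLS1
beta = np.array([1.0, 2.0])

estimates = np.empty((reps, k))
for r in range(reps):
    u = rng.normal(scale=1.0, size=n)   # fresh disturbances each replication
    y = X @ beta + u
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)

# Average of beta_hat across replications should be close to beta (unbiasedness),
# and the sample covariance close to sigma^2 (X'X)^{-1} with sigma^2 = 1 here.
print(estimates.mean(axis=0))           # approx [1.0, 2.0]
print(np.cov(estimates.T))              # approx inv(X'X)
```

Note that $\mathbf{X}$ is drawn once and held fixed across replications, matching the deterministic-regressor assumption OLS1.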
THE ESTIMATOR OF $\sigma^2$
The estimator:
$$\hat{\sigma}^2 = \frac{\sum_{i=1}^{n} \hat{u}_i^2}{n - k} = \frac{\hat{\mathbf{u}}'\hat{\mathbf{u}}}{n - k}$$
$\hat{\mathbf{u}}'\hat{\mathbf{u}}$ is also known as the Residual Sum of Squares (RSS). Although in principle $\hat{\mathbf{u}}'\hat{\mathbf{u}}$ can be computed from the estimated residuals, in practice it can be obtained directly as follows:
$$\hat{\mathbf{u}}'\hat{\mathbf{u}} = \mathbf{y}'\mathbf{y} - \hat{\boldsymbol{\beta}}'\mathbf{X}'\mathbf{y}$$
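Both routes to the RSS, and the degrees-of-freedom correction $n - k$, can be illustrated in a short sketch (simulated data invented for the example; the true $\sigma = 0.7$ is an assumption of the simulation):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -0.5])
sigma = 0.7
y = X @ beta + rng.normal(scale=sigma, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ beta_hat                    # estimated residuals

rss = u_hat @ u_hat                         # u'u from the residuals
rss_direct = y @ y - beta_hat @ (X.T @ y)   # the shortcut y'y - beta_hat' X'y
sigma2_hat = rss / (n - k)                  # unbiased estimator of sigma^2

print(np.isclose(rss, rss_direct))          # True: the two formulas agree
print(sigma2_hat)                           # close to sigma**2 = 0.49
```

Dividing by $n - k$ rather than $n$ accounts for the $k$ estimated coefficients, which is what makes $\hat{\sigma}^2$ unbiased.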
