A matrix formulation of the multiple regression model

For more information go to:

https://online.stat.psu.edu/stat462/node/132/
A matrix formulation of the multiple regression model

• Consider the following simple linear regression function:

• yᵢ = β₀ + β₁xᵢ + εᵢ, for i = 1, ..., n
• If we actually let i = 1, ..., n, we see that we obtain n equations:
• y₁ = β₀ + β₁x₁ + ε₁
• y₂ = β₀ + β₁x₂ + ε₂
• ⋮
• yₙ = β₀ + β₁xₙ + εₙ

• As you can see, there is a pattern that emerges. By taking advantage of
this pattern, we can instead formulate the above simple linear regression
function in matrix notation:
Matrix Notation
• That is, instead of writing out the n equations, we can use matrix notation to
reduce our simple linear regression function to a short and simple statement:

• Y = Xβ + ε
• Now, what does this statement mean? Well, here's the answer:
• X is an n × 2 matrix.
• Y is an n × 1 column vector, β is a 2 × 1 column vector, and ε is an n × 1
column vector.
• The matrix X and vector β are multiplied together using the techniques
of matrix multiplication.
• And, the vector Xβ is added to the vector ε using the techniques of matrix
addition.
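
Written out explicitly, the statement Y = Xβ + ε simply stacks the n equations above into vectors and matrices. One way to sketch the layout implied by the dimensions just listed (in LaTeX) is shown below; the first column of X is all ones, matching the intercept β₀, and the second column holds the xᵢ values, matching the slope β₁:

\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}
=
\begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}
\begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix}
+
\begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{bmatrix}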
Least squares estimates in matrix notation

• β̂ = (X′X)⁻¹X′Y
• where:
• (X′X)⁻¹ is the inverse of the X′X matrix, and
• X′ is the transpose of the X matrix.
• As before, that might not mean anything to
you, if you've never studied matrix algebra —
or if you have and you forgot it all!
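
As a concrete illustration, here is a minimal NumPy sketch of that formula. The arrays x and y below are hypothetical stand-ins for real data; only the construction of X and the line computing beta_hat reflect the formula above.

import numpy as np

# Hypothetical example data: n = 5 observations of a predictor x and response y
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix X (n x 2): a column of ones for the intercept beta_0,
# next to the column of x values for the slope beta_1
X = np.column_stack([np.ones_like(x), x])

# Least squares estimates: beta_hat = (X'X)^(-1) X'Y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta_hat)  # [estimated intercept, estimated slope]

# For larger problems, np.linalg.lstsq(X, y, rcond=None) computes the same
# estimates in a numerically safer way than forming the inverse explicitly.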
