Chapters 5, 6: Regression Analysis
Dr Sunil D Lakdawala
[email protected]
Regression Line
Simple Regression
Problem
a = Ȳ - b·X̄   (the intercept; Ȳ and X̄ are the sample means of Y and X)
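A minimal sketch (not from the slides) of fitting this line with numpy; x and y below are hypothetical illustration data, not the course example:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # hypothetical predictor values
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.7])   # hypothetical response values
n = len(x)

# least-squares slope and intercept (a = Ybar - b*Xbar, as on the slide)
b = (np.sum(x * y) - n * x.mean() * y.mean()) / (np.sum(x ** 2) - n * x.mean() ** 2)
a = y.mean() - b * x.mean()
print(a, b)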
Correlation Analysis
Interpretation of r²
Coefficient of Correlation
r = ±sqrt(r²); the sign of r is the sign of the slope b
See fig 12.16
If r = 0.6, how good is the regression? How much variation in Y is explained by the regression?
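For example, if r = 0.6 then r² = 0.36, so only about 36% of the variation in Y is explained by the regression and the remaining 64% is left unexplained.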
Sb = Se / sqrt(Σ Xi² - n·X̄²)   (Sb: standard error of the slope b; Se: standard error of estimate)
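Continuing the hypothetical sketch above (x, y, a, b, n), Se and Sb can be computed as:

y_hat = a + b * x
se = np.sqrt(np.sum((y - y_hat) ** 2) / (n - 2))       # Se: standard error of estimate
sb = se / np.sqrt(np.sum(x ** 2) - n * x.mean() ** 2)  # Sb: standard error of the slope
print(se, sb)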
The Equation
Y = β0 + β1·X + ε, where ε is the random error (residual) term
Summary Output
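The slide presumably reproduces a spreadsheet-style regression summary table. A comparable output can be produced with the statsmodels package (an assumption; the library is not mentioned in the slides), reusing the hypothetical x and y from the first sketch:

import statsmodels.api as sm

X = sm.add_constant(x)        # add the intercept column
fit = sm.OLS(y, X).fit()      # ordinary least squares
print(fit.summary())          # coefficients, standard errors, t statistics, R-squared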
Residuals
A residual is the difference between the actual Y value and the Y value predicted by the regression model, for each value of the dependent variable.
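In the hypothetical sketch above, the residuals are simply:

residuals = y - (a + b * x)   # actual Y minus predicted Y
print(residuals)
print(residuals.sum())        # for a least-squares line with an intercept, this is ~0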
Residuals
Coefficient of Determination
r² in Airline Cost
r² = .899 [pg6,12,13]
This means that about 89.9% of the variability of the cost of flying a Boeing 737 airplane on a commercial flight is accounted for, or predicted, by the number of passengers.
This also means that about 10.1% of the variation in airline flight cost, Y, is unaccounted for by X, or unexplained by the regression model.
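A sketch of how r² is computed from the sums of squares, reusing the hypothetical x, y and fitted line from above:

ss_res = np.sum((y - (a + b * x)) ** 2)   # unexplained (error) sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)      # total sum of squares of Y
r2 = 1 - ss_res / ss_tot                  # coefficient of determination
print(r2)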
Correlation
Correlation is a measure of association. It measures the strength of relatedness of two variables.
For example, we may be interested in determining the correlation between
Correlation
1. The measure is applicable only if both variables being analyzed have at least an interval level of data.
2. r is a measure of the linear correlation of two variables.
3. r = +1 denotes a perfect positive relationship between two sets of variables.
4. r = -1 denotes a perfect negative correlation, which indicates an inverse relationship between two variables.
5. r = 0 means that there is no linear relationship between the two variables.
6. The coefficient of determination equals the square of the correlation coefficient, r² (see the sketch after this list).
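A quick check of these properties on the hypothetical data from the first sketch:

r = np.corrcoef(x, y)[0, 1]   # Pearson correlation coefficient
print(r, r ** 2)              # r and the coefficient of determination r²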
Problem
r² = 0.715
Significance Tests of the Regression Coefficients
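A minimal sketch of the t test for the slope (H0: β1 = 0), assuming scipy is available and reusing b, sb and n from the earlier sketches:

from scipy import stats

t = b / sb                                         # test statistic for H0: beta1 = 0
p_value = 2 * (1 - stats.t.cdf(abs(t), df=n - 2))  # two-sided p-value, n-2 degrees of freedom
print(t, p_value)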
Analysis of Residuals
Multicollinearity
Search Procedures
Non-Linear Model
Y = β0 + β1·X1 + β2·X2² + ε
Choose Y1 = X1; Y2 = X2²
Y = β0 + β1·X1 + β2·X1·X2 + ε
Choose Y1 = X1; Y2 = X1·X2
Y = β0 · β1^X
Log(Y) = Log(β0) + X·Log(β1); now it is in linear form
Similarly Y = β0 · X1^β1 and Y = 1 / (β0 + β1·X1 + β2·X2) can be converted into linear regression.
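A sketch of the exponential case Y = β0·β1^X, fitted by running the least-squares formulas on Log(Y) and transforming back (hypothetical x, y from the first sketch; y must be positive):

log_y = np.log(y)
B = (np.sum(x * log_y) - n * x.mean() * log_y.mean()) / (np.sum(x ** 2) - n * x.mean() ** 2)
A = log_y.mean() - B * x.mean()
b0, b1 = np.exp(A), np.exp(B)   # back-transform: Log(β0) = A, Log(β1) = B
print(b0, b1)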
Holiday Effect
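The slide gives no detail here; a holiday effect is usually captured by adding a 0/1 indicator (dummy) variable as an extra regressor. A hypothetical sketch, reusing x, y and statsmodels from the sketches above:

holiday = np.array([0, 0, 1, 0, 0, 1], dtype=float)   # hypothetical holiday indicator
X_h = sm.add_constant(np.column_stack([x, holiday]))
print(sm.OLS(y, X_h).fit().params)                    # last coefficient estimates the holiday effect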
Miscellaneous
Variance-Covariance Matrix
Vector X = (X1, X2, X3, …)
Let μ(i) be the arithmetic average of X(i)
σ(i,j) = Σk (X(i,k) - μ(i)) · (X(j,k) - μ(j)) / N
σ(i,i) is the variance of X(i); the matrix [σ(i,j)] is the variance-covariance matrix
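A sketch of this computation with numpy, treating the hypothetical x and y from the first sketch as two variables X(1), X(2); bias=True divides by N as in the formula above:

data = np.vstack([x, y])        # one row per variable
cov = np.cov(data, bias=True)   # variance-covariance matrix, divisor N
print(cov)                      # diagonal entries are the variances σ(i,i)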