Lecture #1
Functional forms of the regression model:

2. Log-linear: log(Y) = β₁ + β₂ log(X)
3. Log-lin:    log(Y) = β₁ + β₂ X      (slope = β₂·Y, elasticity = β₂·X)
4. Lin-log:    Y = β₁ + β₂ log(X)
5. Reciprocal: Y = β₁ + β₂ (1/X)       (slope = −β₂/X²)
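To make the transformations concrete, here is a minimal Python sketch (the simulated data and parameter values are illustrative assumptions, not from the lecture). Each of the four forms above is linear in the parameters once the variables are transformed, so a single bivariate OLS routine estimates all of them.

```python
import numpy as np

# Illustrative data (assumed): positive X and Y so the log transforms are defined.
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 10.0, size=200)
Y = np.exp(0.5 + 0.8 * np.log(X) + rng.normal(0, 0.1, size=200))  # data from a log-linear model

def ols(x, y):
    """Bivariate OLS: returns (intercept b1, slope b2)."""
    b2 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b1 = y.mean() - b2 * x.mean()
    return b1, b2

# Each functional form becomes linear in the parameters after a transformation:
print("Log-linear:", ols(np.log(X), np.log(Y)))  # log(Y) = b1 + b2 log(X)
print("Log-lin:   ", ols(X, np.log(Y)))          # log(Y) = b1 + b2 X
print("Lin-log:   ", ols(np.log(X), Y))          # Y      = b1 + b2 log(X)
print("Reciprocal:", ols(1.0 / X, Y))            # Y      = b1 + b2 (1/X)
```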
Assumptions of the Standard, or Classical, Linear Regression Model
1. The regression model is linear in the parameters: Y = β₁ + β₂X + u.
2. X values are fixed in repeated sampling.
3. Zero mean value of the disturbance uᵢ, i.e. E(uᵢ) = 0.
4. Homoscedasticity, or equal variance of uᵢ: Var(uᵢ) = σ².
5. No autocorrelation between the disturbances: the correlation between any two uᵢ and uⱼ (i ≠ j) is zero, Cov(uᵢ, uⱼ) = 0.
6. Zero covariance between uᵢ and Xᵢ: E(uᵢXᵢ) = 0.
7. The number of observations n must be greater than the number of parameters to be estimated. Alternatively, n must be greater than the number of explanatory variables.
8. Variability in X values: the X values in a given sample must not all be the same, and Var(X) must be a finite positive number.
9. The regression model is correctly specified; alternatively, there is no specification bias or error.
10. There is no perfect multicollinearity, that is, no perfect linear relationship among the explanatory variables.
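As a rough illustration of assumptions 1 through 6, the following sketch (the parameter values and data-generating process are assumed purely for illustration) simulates data from a model that satisfies them and checks the zero-mean, equal-variance, and zero-covariance conditions in the sample.

```python
import numpy as np

# Illustrative data-generating process (assumed values) satisfying the assumptions:
# linear in parameters, fixed X, E(u) = 0, Var(u) = sigma^2, no autocorrelation,
# u uncorrelated with X, n > number of parameters, and X not constant.
rng = np.random.default_rng(42)
beta1, beta2, sigma = 1.0, 0.5, 2.0
X = np.linspace(1, 20, 50)                 # fixed (non-stochastic), non-constant X
u = rng.normal(0.0, sigma, size=X.size)    # i.i.d. disturbances: mean 0, equal variance
Y = beta1 + beta2 * X + u                  # model linear in the parameters

# Empirical checks of assumptions 3, 4, and 6 in this sample
print("sample mean of u     :", u.mean())            # approx. 0
print("sample variance of u :", u.var(ddof=1))       # approx. sigma^2 = 4
print("sample cov(u, X)     :", np.cov(u, X)[0, 1])  # approx. 0
```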
Actual model

Now, the econometrician does not know the true values of the parameters α and β in the actual model
Y = α + βX + u.
We will use statistics to infer estimates of these values based on the observed data. Of course, these estimates will be a little wrong: they will differ from the true values α and β. We will denote them α̂ and β̂. The true equation can be substituted by the estimated equation
Y = α̂ + β̂X + ε,
where ε is the residual.
It is easy to compute the formulae:

β̂ = Σᵢ (Xᵢ − X̄)(Yᵢ − Ȳ) / Σᵢ (Xᵢ − X̄)²    (sums running from i = 1 to N)
α̂ = Ȳ − β̂X̄

The estimated regression line is
Ŷᵢ = α̂ + β̂Xᵢ,
where Ŷᵢ represents the fitted or predicted value.

r² = R² = (variation explained by the predictor) / (total variation in the outcome)
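A minimal numerical sketch of these formulae (the data are simulated and purely illustrative): the slope and intercept estimates and R² are computed directly from the expressions above.

```python
import numpy as np

# Illustrative data (assumed): any paired observations of X and Y would do.
rng = np.random.default_rng(1)
X = rng.normal(5.0, 2.0, size=100)
Y = 2.0 + 0.7 * X + rng.normal(0, 1.0, size=100)

# OLS slope and intercept from the formulae above
beta_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
alpha_hat = Y.mean() - beta_hat * X.mean()

# Fitted values and R^2 = explained variation / total variation
Y_fit = alpha_hat + beta_hat * X
R2 = np.sum((Y_fit - Y.mean()) ** 2) / np.sum((Y - Y.mean()) ** 2)
print(alpha_hat, beta_hat, R2)
```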
Confidence intervals for β₂ and β₁

With (n − 2) df:
Pr[β̂₂ − t(α/2)·se(β̂₂) ≤ β₂ ≤ β̂₂ + t(α/2)·se(β̂₂)] = 1 − α
Pr[β̂₁ − t(α/2)·se(β̂₁) ≤ β₁ ≤ β̂₁ + t(α/2)·se(β̂₁)] = 1 − α
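The interval for β₂ can be computed as below. This sketch assumes the same illustrative data as the previous example and uses the standard error se(β̂₂) = √(σ̂² / Σ(Xᵢ − X̄)²) with σ̂² = Σûᵢ² / (n − 2).

```python
import numpy as np
from scipy import stats

# Illustrative data (assumed) and OLS estimates as in the previous sketch
rng = np.random.default_rng(1)
X = rng.normal(5.0, 2.0, size=100)
Y = 2.0 + 0.7 * X + rng.normal(0, 1.0, size=100)
n = X.size
beta_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
alpha_hat = Y.mean() - beta_hat * X.mean()

# se(beta_2-hat) built from sigma^2-hat = sum(u_i^2) / (n - 2)
resid = Y - (alpha_hat + beta_hat * X)
sigma2_hat = np.sum(resid ** 2) / (n - 2)
se_beta = np.sqrt(sigma2_hat / np.sum((X - X.mean()) ** 2))

alpha_level = 0.05                                    # 1 - alpha = 95% confidence
t_crit = stats.t.ppf(1 - alpha_level / 2, df=n - 2)   # t(alpha/2) with (n - 2) df
print("95% CI for beta_2:", (beta_hat - t_crit * se_beta, beta_hat + t_crit * se_beta))
```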
Testing of hypotheses
ANOVA

F = (MSS of ESS) / (MSS of RSS)
  = β̂₂² Σxᵢ² / (Σûᵢ² / (n − 2))
  = β̂₂² Σxᵢ² / σ̂²

where xᵢ = Xᵢ − X̄ (deviation form) and ûᵢ are the residuals.
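A sketch of the F computation under the same illustrative data as above: with one explanatory variable, ESS has 1 df and RSS has (n − 2) df, so F = (ESS / 1) / (RSS / (n − 2)), which equals β̂₂² Σxᵢ² / σ̂².

```python
import numpy as np
from scipy import stats

# Same illustrative data and OLS estimates as in the earlier sketches (assumed)
rng = np.random.default_rng(1)
X = rng.normal(5.0, 2.0, size=100)
Y = 2.0 + 0.7 * X + rng.normal(0, 1.0, size=100)
n = X.size
beta_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
alpha_hat = Y.mean() - beta_hat * X.mean()
resid = Y - (alpha_hat + beta_hat * X)

# F = (ESS / 1) / (RSS / (n - 2)) = beta_hat^2 * sum(x_i^2) / sigma^2-hat
ess = beta_hat ** 2 * np.sum((X - X.mean()) ** 2)   # explained sum of squares
rss = np.sum(resid ** 2)                            # residual sum of squares
F = (ess / 1) / (rss / (n - 2))
p_value = stats.f.sf(F, dfn=1, dfd=n - 2)           # upper-tail p-value
print(F, p_value)
```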