Managerial Economics: Class 12 Demand Estimation

The document discusses various approaches to demand estimation including consumer surveys, observational research, consumer clinics, market experiments, and virtual shopping. It then covers regression analysis, using ordinary least squares to estimate the slope and intercept of a regression line by minimizing the sum of squared errors. An example is provided to demonstrate calculating the slope, intercept, standard error, t-statistic, and decomposition of the sum of squares in regression analysis.

Managerial Economics

CLASS 12
DEMAND ESTIMATION

INSTRUCTOR: ADNAN AHMAD


The Identification Problem
Demand Estimation:
Marketing Research Approaches

- Consumer Surveys
- Observational Research
- Consumer Clinics
- Market Experiments
- Virtual Shopping
- Virtual Management
Regression Analysis
Data (plotted as a scatter diagram of Y against X):

Year   X    Y
1      10   44
2       9   40
3      11   42
4      12   46
5      11   48
6      12   52
7      13   54
8      13   58
9      14   56
10     15   60
Regression Analysis

- Regression Line: the line of best fit.
- The regression line minimizes the sum of the squared vertical deviations (e_t) of each point from the regression line.
- Ordinary Least Squares (OLS) Method

Regression Analysis
Ordinary Least Squares (OLS)

- Ordinary least squares (OLS) is a linear least squares method for estimating the unknown parameters of a linear regression model.
- OLS chooses the parameters of a linear function of a set of explanatory variables by the principle of least squares: it minimizes the sum of the squared differences between the observed values of the dependent variable and the values predicted by the linear function.

Model:         Y_t = a + b·X_t + e_t
Fitted values: Ŷ_t = â + b̂·X_t
Residuals:     e_t = Y_t − Ŷ_t
Ordinary Least Squares (OLS)

Objective: determine the slope and intercept that minimize the sum of the squared errors:

Σ e_t² = Σ (Y_t − Ŷ_t)² = Σ (Y_t − â − b̂·X_t)²    (sums over t = 1, …, n)
Ordinary Least Squares (OLS)

Estimation Procedure

b̂ = Σ (X_t − X̄)(Y_t − Ȳ) / Σ (X_t − X̄)²    (sums over t = 1, …, n)

â = Ȳ − b̂·X̄
Ordinary Least Squares (OLS)
Estimation Example

Time   X_t   Y_t   X_t − X̄   Y_t − Ȳ   (X_t − X̄)(Y_t − Ȳ)   (X_t − X̄)²
1      10    44    −2         −6          12                     4
2       9    40    −3        −10          30                     9
3      11    42    −1         −8           8                     1
4      12    46     0         −4           0                     0
5      11    48    −1         −2           2                     1
6      12    52     0          2           0                     0
7      13    54     1          4           4                     1
8      13    58     1          8           8                     1
9      14    56     2          6          12                     4
10     15    60     3         10          30                     9
Sum   120   500                          106                    30

n = 10
X̄ = Σ X_t / n = 120 / 10 = 12
Ȳ = Σ Y_t / n = 500 / 10 = 50
Σ (X_t − X̄)² = 30
Σ (X_t − X̄)(Y_t − Ȳ) = 106

b̂ = 106 / 30 = 3.533
â = Ȳ − b̂·X̄ = 50 − (3.533)(12) = 7.60
Tests of Significance

Standard Error of the Slope Estimate
The standard error of the slope estimate (s_b̂) measures how far the estimated slope b̂ is likely to be from the true population slope.

s_b̂ = √[ Σ (Y_t − Ŷ_t)² / ((n − k) Σ (X_t − X̄)²) ] = √[ Σ e_t² / ((n − k) Σ (X_t − X̄)²) ]
Tests of Significance
Example Calculation

Time   X_t   Y_t   Ŷ_t     e_t = Y_t − Ŷ_t   e_t² = (Y_t − Ŷ_t)²   (X_t − X̄)²
1      10    44    42.90     1.10              1.2100                4
2       9    40    39.37     0.63              0.3969                9
3      11    42    46.43    −4.43             19.6249                1
4      12    46    49.96    −3.96             15.6816                0
5      11    48    46.43     1.57              2.4649                1
6      12    52    49.96     2.04              4.1616                0
7      13    54    53.49     0.51              0.2601                1
8      13    58    53.49     4.51             20.3401                1
9      14    56    57.02    −1.02              1.0404                4
10     15    60    60.55    −0.55              0.3025                9
Sum                                           65.4830               30

Σ e_t² = Σ (Y_t − Ŷ_t)² = 65.4830
Σ (X_t − X̄)² = 30

s_b̂ = √[ 65.4830 / ((10 − 2)(30)) ] = 0.52
Tests of Significance
Calculation of the t Statistic

t = b̂ / s_b̂ = 3.53 / 0.52 = 6.79

Degrees of freedom = (n − k) = (10 − 2) = 8
Critical value at the 5% level = 2.306

Since 6.79 exceeds 2.306, the estimated slope is statistically significant at the 5% level.
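Continuing the earlier sketch (it assumes xs, ys, n, x_bar, a_hat, and b_hat from that block), the standard error of the slope and the t statistic can be checked as follows; small differences from the slide values are rounding.

```python
# Minimal sketch: standard error of the slope and t statistic for the example.
y_hat = [a_hat + b_hat * x for x in xs]              # fitted values Ŷ_t
residuals = [y - yh for y, yh in zip(ys, y_hat)]     # e_t = Y_t − Ŷ_t

k = 2                                                # number of estimated parameters (a and b)
sse = sum(e ** 2 for e in residuals)                 # Σ e_t² ≈ 65.5
sxx = sum((x - x_bar) ** 2 for x in xs)              # Σ (X_t − X̄)² = 30

s_b = (sse / ((n - k) * sxx)) ** 0.5                 # ≈ 0.52
t_stat = b_hat / s_b                                 # ≈ 6.8; compare with the 5% critical value 2.306
print(f"s_b = {s_b:.2f}, t = {t_stat:.2f}")
```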
Tests of Significance
Decomposition of Sum of Squares

Total Variation = Explained Variation + Unexplained Variation

Σ (Y_t − Ȳ)² = Σ (Ŷ_t − Ȳ)² + Σ (Y_t − Ŷ_t)²
Tests of Significance

Coefficient of Determination
The proportion of the variance in the dependent variable that is predictable from the independent variable.

R² = Explained Variation / Total Variation = Σ (Ŷ_t − Ȳ)² / Σ (Y_t − Ȳ)²

R² = 373.84 / 440.00 = 0.85
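As a check, the same quantities can be computed by continuing the earlier sketch (it assumes ys, y_hat, and y_bar from the previous blocks).

```python
# Minimal sketch: decomposition of the sum of squares and R² for the example.
total_variation = sum((y - y_bar) ** 2 for y in ys)                      # Σ (Y_t − Ȳ)² = 440
explained_variation = sum((yh - y_bar) ** 2 for yh in y_hat)             # Σ (Ŷ_t − Ȳ)²
unexplained_variation = sum((y - yh) ** 2 for y, yh in zip(ys, y_hat))   # Σ e_t²

r_squared = explained_variation / total_variation                        # ≈ 0.85
print(f"R² = {r_squared:.2f}")
```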
Tests of Significance

Coefficient of Correlation
A correlation coefficient is a numerical measure of the statistical relationship between two variables.

r = √R², with the sign of b̂
−1 ≤ r ≤ 1

r = √0.85 = 0.92
Multiple Regression Analysis

Model: Y = a + b_1·X_1 + b_2·X_2 + … + b_k'·X_k'
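As a rough illustration of how such a model might be estimated, here is a sketch using numpy's least-squares solver; numpy is assumed to be available, and the second regressor X2 and its values are hypothetical, invented only to show the mechanics.

```python
# Minimal sketch: estimating Y = a + b1*X1 + b2*X2 by ordinary least squares.
import numpy as np

X1 = np.array([10, 9, 11, 12, 11, 12, 13, 13, 14, 15], dtype=float)  # X from the earlier example
X2 = np.array([5, 6, 5, 4, 6, 5, 4, 3, 4, 3], dtype=float)           # hypothetical second regressor
Y = np.array([44, 40, 42, 46, 48, 52, 54, 58, 56, 60], dtype=float)

A = np.column_stack([np.ones_like(X1), X1, X2])   # design matrix: intercept, X1, X2
coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)    # OLS estimates of (a, b1, b2)
print(coeffs)
```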


Multiple Regression Analysis

Adjusted Coefficient of Determination

R̄² = 1 − (1 − R²)·(n − 1)/(n − k)
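A small helper, shown as a sketch, that applies the adjustment formula above; the example call plugs in R² = 0.85, n = 10, and k = 2 from the simple regression.

```python
# Minimal sketch: adjusted R² from R², sample size n, and number of estimated parameters k.
def adjusted_r_squared(r_squared, n, k):
    return 1 - (1 - r_squared) * (n - 1) / (n - k)

print(adjusted_r_squared(0.85, n=10, k=2))  # ≈ 0.83
```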
Multiple Regression Analysis

Analysis of Variance and F Statistic

F = [Explained Variation / (k − 1)] / [Unexplained Variation / (n − k)]

F = [R² / (k − 1)] / [(1 − R²) / (n − k)]
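A companion sketch for the F statistic computed from R²; with R² = 0.85, n = 10, and k = 2 it gives roughly 45.

```python
# Minimal sketch: F statistic from R² (second form of the formula above).
def f_statistic(r_squared, n, k):
    return (r_squared / (k - 1)) / ((1 - r_squared) / (n - k))

print(f_statistic(0.85, n=10, k=2))  # ≈ 45.3
```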
Problems in Regression Analysis

- Multicollinearity: two or more explanatory variables are highly correlated.
- Heteroskedasticity: the variance of the error term is not independent of the Y variable.
- Autocorrelation: consecutive error terms are correlated.
Durbin-Watson Statistic

Test for Autocorrelation

d = Σ (e_t − e_{t−1})² / Σ e_t²

where the numerator sum runs over t = 2, …, n and the denominator sum over t = 1, …, n.

If d = 2, autocorrelation is absent.
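A minimal sketch of the Durbin-Watson calculation, assuming residuals is the list of e_t values computed in the earlier example code.

```python
# Minimal sketch: Durbin-Watson statistic from a sequence of regression residuals.
def durbin_watson(e):
    numerator = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    denominator = sum(v ** 2 for v in e)
    return numerator / denominator

print(durbin_watson(residuals))  # values near 2 suggest no first-order autocorrelation
```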


Steps in Demand Estimation

- Model Specification: Identify Variables
- Collect Data
- Specify Functional Form
- Estimate Function
- Test the Results
Functional Form Specifications

Linear Function:
Q_X = a_0 + a_1·P_X + a_2·I + a_3·N + a_4·P_Y + … + e

Power Function:
Q_X = a·(P_X^b1)(P_Y^b2)

Estimation Format (linear in logarithms):
ln Q_X = ln a + b_1·ln P_X + b_2·ln P_Y
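To illustrate the estimation format, here is a sketch that fits the power function by running OLS on logarithms; numpy is assumed to be available, and all price and quantity figures are made-up values used only for illustration.

```python
# Minimal sketch: estimating ln Q = ln a + b1*ln Px + b2*ln Py by ordinary least squares.
import numpy as np

Px = np.array([4.0, 4.5, 5.0, 5.5, 6.0, 6.5])          # hypothetical own price
Py = np.array([3.0, 3.2, 3.1, 3.5, 3.4, 3.6])          # hypothetical price of a related good
Q = np.array([120.0, 110.0, 100.0, 95.0, 88.0, 84.0])  # hypothetical quantity demanded

A = np.column_stack([np.ones_like(Q), np.log(Px), np.log(Py)])
coeffs, *_ = np.linalg.lstsq(A, np.log(Q), rcond=None)

ln_a, b1, b2 = coeffs
print(f"a = {np.exp(ln_a):.2f}, b1 (own-price elasticity) = {b1:.2f}, b2 = {b2:.2f}")
```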
Thank you
