Regression – definition – fitting of simple linear regression equation – testing
the significance of the regression coefficient
Regression
Regression is the functional relationship between two variables, where one variable may represent a cause and the other an effect. The variable representing the cause is known as the independent variable and is denoted by X; it is also called the predictor variable or regressor. The variable representing the effect is known as the dependent variable and is denoted by Y; it is also called the predicted variable. The relationship between the dependent and the independent variable may be expressed as a function, and such a functional relationship is termed regression. When there are only two variables the functional relationship is known as simple regression, and if the relation between the two variables is a straight line it is known as simple linear regression. When there are more than two variables and one of them depends on the others, the functional relationship is known as multiple regression. The regression line is of the form y = a + bx, where a is a constant (the intercept) and b is the regression coefficient (the slope). The values of 'a' and 'b' can be calculated by the method of least squares. An alternative way of obtaining the values of a and b is by using the formulae:
The regression equation of y on x is given by y = a + bx, where

    b = SP(xy) / SS(x) = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)²   and   a = ȳ − b x̄
The regression line indicates the average value of the dependent variable Y associated
with a particular value of independent variable X.
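As a quick illustration of the least-squares formulae above, the short Python sketch below computes b and a for a small set of hypothetical (x, y) values; the data and variable names are made up for illustration only.

```python
# Minimal sketch: fitting y = a + b*x by least squares.
# The data below are hypothetical illustrative values, not from the text.
x = [20.0, 21.5, 22.0, 23.4, 24.1, 25.0]   # independent variable X
y = [95, 104, 109, 118, 121, 130]          # dependent variable Y

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# SP(xy) = sum of products; SS(x) = sum of squares of deviations of x
sp_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
ss_x  = sum((xi - x_bar) ** 2 for xi in x)

b = sp_xy / ss_x          # regression coefficient (slope)
a = y_bar - b * x_bar     # intercept

print(f"b = {b:.4f}, a = {a:.4f}")
print(f"fitted line: y = {a:.4f} + {b:.4f} x")
```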
Assumptions
1. The values of the independent variable X are fixed and measured without error.
2. For each value of X, the corresponding values of Y are independent of one another and are normally distributed.
3. The variance of Y about the regression line is the same for every value of X.
(Figure: the regression line y = a + bx; its slope b = tan θ, where θ is the angle the line makes with the X-axis.)
To test the significance of the regression coefficient we can apply either a t-test or the analysis of variance (F-test). The ANOVA table for testing the regression coefficient is as follows:
Sources of variation          df     SS               MS                               F
Due to regression             1      SS(b)            SS(b)                            SS(b) / Se²
Deviation from regression     n-2    SS(Y) - SS(b)    Se² = [SS(Y) - SS(b)] / (n-2)
Total                         n-1    SS(Y)

where SS(Y) = Σ(y − ȳ)², SS(x) = Σ(x − x̄)², SP(xy) = Σ(x − x̄)(y − ȳ) and SS(b) = [SP(xy)]² / SS(x). The calculated F is compared with the table value of F for (1, n − 2) degrees of freedom.
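The computations behind this ANOVA, together with the equivalent t-test, can be sketched in Python as follows. The data are again hypothetical; the point is only to show that SS(b), Se², F and t are obtained exactly as in the table above (with t² = F for simple linear regression).

```python
import math

# Sketch of the ANOVA F-test and t-test for the regression coefficient,
# using the same hypothetical x, y data as the previous sketch.
x = [20.0, 21.5, 22.0, 23.4, 24.1, 25.0]
y = [95, 104, 109, 118, 121, 130]
n = len(x)

x_bar, y_bar = sum(x) / n, sum(y) / n
ss_x  = sum((xi - x_bar) ** 2 for xi in x)
ss_y  = sum((yi - y_bar) ** 2 for yi in y)
sp_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

b    = sp_xy / ss_x
ss_b = sp_xy ** 2 / ss_x            # SS due to regression, 1 df
se2  = (ss_y - ss_b) / (n - 2)      # Se^2, deviation from regression, n-2 df

f_stat = ss_b / se2                 # F with (1, n-2) df
t_stat = b / math.sqrt(se2 / ss_x)  # t with n-2 df; note t^2 == F

print(f"F = {f_stat:.3f}, t = {t_stat:.3f}, t^2 = {t_stat**2:.3f}")
```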
Uses of Regression
Regression analysis is useful in predicting the value of one variable from a given value of another variable. Such predictions are useful when it is very difficult or expensive to measure the dependent variable Y. The other use of regression analysis is to establish a causal relationship between variables. Suppose we manipulate the variable X and obtain a significant regression of the variable Y on the variable X. Then we can say that there is a causal relationship between the variables X and Y. The causal relationship between the nitrogen content of the soil and the growth rate of a plant, or between the dose of an insecticide and the mortality of the insect population, may be established in this way.
Example 1
From a paddy field, 36 plants were selected at random. The length of panicles (x) and the number of grains per panicle (y) of the selected plants were recorded. The results are given below. Fit a regression line of y on x and test the significance of the regression coefficient.
The length of panicles in cm (x) and the number of grains per panicle (y) of paddy plants.
No.    y     x        No.    y     x        No.    y     x
 8    124   24.0      20    103   21.4      32    145   24.0
 9    137   24.9      21    114   23.3      33     85   20.6
10     90   20.0      22    124   24.4      34     94   21.0
11    107   19.8      23    143   24.4      35    142   24.0
12    108   22.0      24    108   22.5      36    111   23.1
From the data, x̄ = 22.86, ȳ = 115.94 and b = SP(xy) / SS(x) = 11.5837. Since ȳ = a + b x̄,

a = ȳ − b x̄
  = 115.94 − (11.5837)(22.86)
  = 115.94 − 264.8034
  = −148.8633

The fitted regression line is y = −148.8633 + 11.5837x.
ANOVA table
The calculated F = SS(b) / Se² is compared with the table value of F for (1, n − 2) = (1, 34) degrees of freedom.

For the t-test
t = b / SE(b), where SE(b) = √(Se² / SS(x)); the calculated t is compared with the table value of t at n − 2 = 34 degrees of freedom.

The calculated t exceeds the table value; hence t is significant, and the regression of y on x is significant.
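For completeness, a library routine can fit and test the regression in one step. The sketch below (assuming SciPy is available) applies scipy.stats.linregress to the 15 plants actually listed in the partial table above; because it uses only those observations rather than all 36, its estimates will not reproduce the fitted line worked out above.

```python
from scipy import stats

# Panicle length (x, cm) and grains per panicle (y) for the 15 plants
# listed in the partial table above (plants 8-12, 20-24 and 32-36).
x = [24.0, 24.9, 20.0, 19.8, 22.0, 21.4, 23.3, 24.4, 24.4, 22.5,
     24.0, 20.6, 21.0, 24.0, 23.1]
y = [124, 137, 90, 107, 108, 103, 114, 124, 143, 108,
     145, 85, 94, 142, 111]

res = stats.linregress(x, y)   # least-squares fit of y on x

print(f"fitted line: y = {res.intercept:.4f} + {res.slope:.4f} x")
# Two-sided p-value of the t-test for H0: b = 0, with n - 2 df
print(f"p-value for the regression coefficient: {res.pvalue:.4f}")
```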
Questions
1. When the correlation coefficient r = +1, then the two regression lines
a) are perpendicular to each other b) coincide
c) are parallel to each other d) none of these
Ans: coincide
2. If one regression coefficient is greater than unity then the other must be
a) greater than unity b) equal to unity
c) less than unity d) none of these
Ans: less than unity