Chapter IV
Curve Fitting:
• Introduction
• Least Squares Regression
• Interpolation
• Fourier Approximation
Introduction
• The process of finding the equation of the curve of best fit, which may be
most suitable for predicting unknown values, is known as curve fitting.
Curve fitting thus expresses the relationship between two variables by an
algebraic equation. The following methods are used for fitting a curve:
I. Graphic method
II. Method of group averages
III. Method of moments
IV. Principle of least squares
• Of these four methods, we will discuss only the principle of least squares here.
Least Squares Regression
• The method of least squares is probably the most systematic procedure for
fitting a unique curve through a given set of data points.
• Let the curve y = a + bx + cx² + … + kxᵐ be fitted to the set of n data
points (x₁, y₁), (x₂, y₂), …, (xₙ, yₙ). At x = xᵢ the observed (or
experimental) value of the ordinate is yᵢ, and the corresponding value on
the fitting curve is ηᵢ = a + bxᵢ + cxᵢ² + … + kxᵢᵐ, which is the expected
(or calculated) value. The difference of the observed and the expected
value, eᵢ = yᵢ − ηᵢ, is called the error at x = xᵢ. Clearly some of the
errors e₁, e₂, …, eₙ will be positive and others negative. To make all
errors positive we square each of them, i.e. S = e₁² + e₂² + … + eₙ². The
curve of best fit is that for which the e's are as small as possible, i.e.
S, the sum of the squares of the errors, is a minimum. This is known as the
principle of least squares.
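The principle above can be sketched numerically. In the following Python snippet the data points and trial coefficients are hypothetical, chosen only to illustrate how S is computed for a candidate line y = a + bx.

```python
import numpy as np

# Hypothetical data points (for illustration only).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 4.4, 6.1])

def sum_of_squared_errors(a, b):
    """S = e1^2 + e2^2 + ... + en^2 for the trial line y = a + b*x."""
    errors = y - (a + b * x)          # e_i = observed - expected
    return float(np.sum(errors ** 2))

# The curve of best fit minimizes S; a better trial line gives a smaller S.
print(sum_of_squared_errors(1.0, 1.2))
```

Trying several (a, b) pairs by hand quickly motivates the normal equations derived next, which locate the minimizing pair directly.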
Fitting of Straight Line
• Let a straight line y = a + bx …(1) be fitted to the given data points
(x₁, y₁), (x₂, y₂), …, (xₙ, yₙ).
• Let ηᵢ = a + bxᵢ be the theoretical value at x = xᵢ; then the error there
is eᵢ = yᵢ − (a + bxᵢ), and the sum of squared errors is
S = Σeᵢ² = Σ(yᵢ − a − bxᵢ)² …(2)
• For S to be a minimum, ∂S/∂a = 0 and ∂S/∂b = 0, which give
Σy = na + bΣx …(3)
Σxy = aΣx + bΣx² …(4)
The equations (3) and (4) are known as the normal equations. On solving
equations (3) and (4), we get the values of a and b. Putting these values
into equation (1), we get the equation of the line of best fit.
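The two normal equations form a 2×2 linear system, so solving them is mechanical. A minimal NumPy sketch, with hypothetical data points chosen only for illustration:

```python
import numpy as np

# Hypothetical data points (illustrative only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 5.0, 3.0, 8.0, 7.0])
n = len(x)

# Normal equations:
#   sum(y)  = n*a      + b*sum(x)       ...(3)
#   sum(xy) = a*sum(x) + b*sum(x^2)     ...(4)
A = np.array([[n,       x.sum()],
              [x.sum(), (x ** 2).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])
a, b = np.linalg.solve(A, rhs)
print(a, b)  # coefficients of the best-fit line y = a + b*x
```

The same coefficients come out of `np.polyfit(x, y, 1)`, which builds and solves an equivalent least-squares system internally.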
Fitting of Parabola
• Let a parabola y = a + bx + cx² …(1) be fitted to the given data points
(x₁, y₁), (x₂, y₂), …, (xₙ, yₙ).
• Let ηᵢ = a + bxᵢ + cxᵢ² be the theoretical value at x = xᵢ; then
eᵢ = yᵢ − ηᵢ and S = Σeᵢ² …(2)
• Setting ∂S/∂a = 0, ∂S/∂b = 0 and ∂S/∂c = 0 gives
Σy = na + bΣx + cΣx² …(3)
Σxy = aΣx + bΣx² + cΣx³ …(4)
Σx²y = aΣx² + bΣx³ + cΣx⁴ …(5)
The equations (3), (4) and (5) are known as the normal equations. On
solving equations (3), (4) and (5), we get the values of a, b and c.
Putting these values into equation (1), we get the equation of the parabola
of best fit.
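The same procedure in code: build the 3×3 system of normal equations from power sums and solve it. The data below are hypothetical, chosen so that the points lie exactly on y = 1 − x + x², so the fit should recover those coefficients.

```python
import numpy as np

# Hypothetical data lying exactly on y = 1 - x + x^2 (illustrative only).
x = np.array([-1.0, 0.0, 1.0, 2.0])
y = np.array([3.0, 1.0, 1.0, 3.0])
n = len(x)

# Power sums appearing in the normal equations (3)-(5).
Sx, Sx2, Sx3, Sx4 = ((x ** k).sum() for k in (1, 2, 3, 4))
Sy, Sxy, Sx2y = y.sum(), (x * y).sum(), (x ** 2 * y).sum()

A = np.array([[n,   Sx,  Sx2],
              [Sx,  Sx2, Sx3],
              [Sx2, Sx3, Sx4]])
rhs = np.array([Sy, Sxy, Sx2y])
a, b, c = np.linalg.solve(A, rhs)
print(a, b, c)  # should recover a = 1, b = -1, c = 1
```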
Example 1. Find the best-fit values of a and b so that y = a + bx fits the data given
in the table.
Putting these values in the normal equations, we get
16.9 = 5a + 10b
47.1 = 10a + 30b
On solving these two equations we get a = 0.72, b = 1.33.
So the required line is y = 0.72 + 1.33x.
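As a quick check, the two normal equations of this example can be solved with NumPy:

```python
import numpy as np

# Normal equations from Example 1:
#   16.9 = 5a + 10b
#   47.1 = 10a + 30b
A = np.array([[5.0, 10.0],
              [10.0, 30.0]])
rhs = np.array([16.9, 47.1])
a, b = np.linalg.solve(A, rhs)
print(round(a, 2), round(b, 2))  # 0.72 1.33
```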
Putting all the values in the normal equations, we get
3060 = 6a + 21b
6450 = 21a + 91b
Solving these equations, we get
a = 1361.97 and b = −243.42.
Hence the fitted equation is y = 1361.97 − 243.42x.
Example 3 . Fit a second-degree parabola to the following data
taking x as the independent variable.
Substituting these values in the normal equations and solving for a, b and c, we get
a = −0.923, b = 3.520, c = −0.267.
Hence the fitted equation is y = −0.923 + 3.520x − 0.267x².
Interpolation
Fourier approximation
Introduction
One important application of linear algebra is in the representation
and approximation of functions. For example, suppose E(t) is a
function that represents the amount of acoustical energy received
by a microphone at time t. Since sound energy is in the form of
sound waves of varying frequencies, it seems plausible to represent
E(t) as a linear combination of simple waves. In mathematics,
simple waves are represented using sine and cosine functions of the
form sin(kt) and cos(kt), where k is a nonnegative integer, and the
greater the k value, the higher the wave frequency. Then E(t) can be
approximated by a linear combination of these simple waves, of the form
E(t) ≈ a₀ + Σₖ₌₁ⁿ [aₖ cos(kt) + bₖ sin(kt)],
where the coefficients a₀, aₖ, bₖ measure how much of each simple wave is
present in E(t).
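This representation can be sketched numerically. The snippet below is a minimal illustration, not part of the original notes: it samples a hypothetical signal built from two simple waves over one period [0, 2π] and recovers their amplitudes with the standard Fourier coefficient integrals, approximated by Riemann sums.

```python
import numpy as np

# Hypothetical signal: a mix of two simple waves (illustrative only).
t = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)
E = 2.0 * np.sin(t) + 0.5 * np.cos(3.0 * t)
dt = t[1] - t[0]

def a_k(k):
    """Cosine coefficient: a_k = (1/pi) * integral of E(t) cos(kt) dt."""
    return float((E * np.cos(k * t)).sum() * dt / np.pi)

def b_k(k):
    """Sine coefficient: b_k = (1/pi) * integral of E(t) sin(kt) dt."""
    return float((E * np.sin(k * t)).sum() * dt / np.pi)

print(round(b_k(1), 6), round(a_k(3), 6))  # recovers the amplitudes 2.0 and 0.5
```

Coefficients for frequencies not present in the signal come out near zero, which is exactly the orthogonality of the sine and cosine waves at work.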