
Lecture 17 - Curve fitting Examples

Polynomial interpolation
Consider the example in the textbook on p.70.

Instead of calculating the polynomial manually, we can let Python do it with numpy.polyfit (or np.polyfit if numpy is imported as usual).

There are 3 parameters that we need to give to polyfit:

- the x-values (a list)
- the y-values (a list)
- the degree of the polynomial, in this case n − 1.

The function returns the coefficients of the polynomial in decreasing order:


p₀x³ + p₁x² + p₂x + p₃.

In [ ]: import numpy as np
import matplotlib.pyplot as plt

x=[-1,0,2,3]
y=[2,1,-2,0]

p_coef=np.polyfit(x,y,3)

print(p_coef)

plt.plot(x,y,'ro')

# Now let's use p_coef to also sketch the polynomial


#X=
#f= lambda X:

#plt.plot(X,f(X))
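One way to fill in the commented lines above (a minimal sketch; the plotting range −1.5 to 3.5 is just a choice that covers the data points):

X = np.linspace(-1.5, 3.5, 100)   # assumed plotting range covering the data points
# p_coef is in decreasing order, so the cubic is built coefficient by coefficient
f = lambda X: p_coef[0]*X**3 + p_coef[1]*X**2 + p_coef[2]*X + p_coef[3]

plt.plot(X, f(X))
plt.show()

Writing out the lambda by hand like this is exactly the work that np.poly1d (described below) saves us.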

Compare this to the calculated values in the textbook, where the formula was used.

There is an easier way to set up the polynomial in order to sketch it. (If it is a higher-order polynomial, we don't want to build a lambda function for it manually!)

We can use np.poly1d(). The function takes a list of coefficients (in decreasing order) and returns a polynomial. This is basically the same as the lambda function, but it is built in. You can use it the same way you would use a lambda function.

Try this for the above example.

In [ ]: p=np.poly1d(p_coef)
print(p)
plt.plot(X,p(X))   # X is the plotting range defined when sketching in the previous cell
Least squares polynomial fit
We still use numpy.polyfit, but this time we pass a different degree instead of n − 1. (You can call the help function on polyfit; it is actually set up to find the least squares polynomial.)

Consider the example in the textbook. First, we do polynomial interpolation again (so use n − 1).

In [ ]: x=[0.5, 0.7, 0.9, 1, 1.2, 1.3, 1.4, 1.6, 1.9, 2, 2.3, 2.4, 2.7, 2.8, 2.9, 3]
y=[2.2, 2.3, 1.5, 1.7, 0.9, 1.1, 0.8, 0.6, 0.5, 0.6, 1.2, 1.3, 2.9, 3.6, 4.7, 5.4]

plt.plot(x,y,'ro')

n=len(x)
p_coef=np.polyfit(x,y,n-1)
p=np.poly1d(p_coef)
print(p)

X=np.linspace(0,3.2,100)
plt.plot(X,p(X))

We cannot really see what happens between the points on the graph. To see the behaviour, we need to adjust the y-axis. (We use the figure-and-axes way of plotting to adjust it.)

In [ ]: fig, ax = plt.figure(), plt.axes()


ax.plot(x,y,'ro')
ax.set_ylim(-5,6)
ax.plot(X,p(X))
plt.show()

Is this a good approximation of the given data?

What polynomial seems like a better fit?

In [ ]:

In [ ]:
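One possible answer, as a sketch: fit a low-degree polynomial instead of the degree n − 1 interpolating polynomial. The degree 2 below is only an assumption; degrees 3 or 4 are also worth trying.

# Least squares fit with a low degree instead of n − 1
p2_coef = np.polyfit(x, y, 2)   # degree 2 is an assumed choice; experiment with others
p2 = np.poly1d(p2_coef)
print(p2)

fig, ax = plt.figure(), plt.axes()
ax.plot(x, y, 'ro')
ax.set_ylim(-5, 6)
ax.plot(X, p2(X))
plt.show()

A low-degree least squares fit follows the overall trend of the data without the wild oscillations of the degree n − 1 interpolating polynomial.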
