
Lecture 27

Numerical Differentiation
Approximating derivatives from data
Suppose that a variable y depends on another variable x, i.e. y = f (x), but we only know the values of f at
a finite set of points, e.g., as data from an experiment or a simulation:
$$(x_1 , y_1), (x_2 , y_2), \ldots, (x_n , y_n).$$
Suppose then that we need information about the derivative of f(x). One obvious idea would be to approximate $f'(x_i)$ by the Forward Difference
$$ f'(x_i) \approx y_i' = \frac{y_{i+1} - y_i}{x_{i+1} - x_i}. $$
This formula follows directly from the definition of the derivative in calculus. An alternative would be to use a Backward Difference
$$ f'(x_i) \approx \frac{y_i - y_{i-1}}{x_i - x_{i-1}}. $$
Since the errors for the forward difference and backward difference tend to have opposite signs, it would
seem likely that averaging the two methods would give a better result than either alone. If the points are
evenly spaced, i.e. $x_{i+1} - x_i = x_i - x_{i-1} = h$, then averaging the forward and backward differences leads to
a symmetric expression called the Central Difference
$$ f'(x_i) \approx y_i' = \frac{y_{i+1} - y_{i-1}}{2h}. $$
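For concreteness, here is a minimal Python sketch of the three difference quotients applied to tabular data (the function names and the sine example are illustrative only, not part of the lecture):

```python
import numpy as np

def forward_difference(x, y, i):
    # (y[i+1] - y[i]) / (x[i+1] - x[i])
    return (y[i + 1] - y[i]) / (x[i + 1] - x[i])

def backward_difference(x, y, i):
    # (y[i] - y[i-1]) / (x[i] - x[i-1])
    return (y[i] - y[i - 1]) / (x[i] - x[i - 1])

def central_difference(x, y, i):
    # (y[i+1] - y[i-1]) / (x[i+1] - x[i-1]); equals (y[i+1] - y[i-1]) / (2h) on an even grid
    return (y[i + 1] - y[i - 1]) / (x[i + 1] - x[i - 1])

# Quick check on samples of sin(x) with spacing h = 0.1; the true value is cos(1) = 0.5403...
x = np.linspace(0.0, 2.0, 21)
y = np.sin(x)
i = 10                                   # x[i] = 1.0
print(forward_difference(x, y, i))       # approximately 0.4974
print(backward_difference(x, y, i))      # approximately 0.5814
print(central_difference(x, y, i))       # approximately 0.5394
```

Notice that the forward and backward errors have opposite signs, and the central difference lands far closer to the true value.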

Errors of approximation
We can use Taylor polynomials to derive the accuracy of the forward, backward and central difference
formulas. For example, the usual form of the Taylor polynomial with remainder (sometimes called Taylor's Theorem) is
$$ f(x+h) = f(x) + hf'(x) + \frac{h^2}{2}f''(c), $$
where $c$ is some (unknown) number between $x$ and $x+h$. Letting $x = x_i$, $x + h = x_{i+1}$ and solving for $f'(x_i)$ leads to
$$ f'(x_i) = \frac{f(x_{i+1}) - f(x_i)}{h} - \frac{h}{2}f''(c). $$
Notice that the quotient in this equation is exactly the forward difference formula. Thus the error of the forward difference is $(h/2)f''(c)$, which means it is $O(h)$. Replacing $h$ in the above calculation by $-h$ gives the error for the backward difference formula; it is also $O(h)$.

[Figure 27.1: The three difference approximations of $y_i'$, shown as the slopes of the lines through $(x_{i-1}, y_{i-1})$, $(x_i, y_i)$, and $(x_{i+1}, y_{i+1})$ for the backward, forward, and central differences.]


For the central difference, the error can be found from the third degree Taylor polynomials with remainder
$$ f(x_{i+1}) = f(x_i + h) = f(x_i) + hf'(x_i) + \frac{h^2}{2}f''(x_i) + \frac{h^3}{3!}f'''(c_1) \quad\text{and} $$
$$ f(x_{i-1}) = f(x_i - h) = f(x_i) - hf'(x_i) + \frac{h^2}{2}f''(x_i) - \frac{h^3}{3!}f'''(c_2), $$
where $x_i \le c_1 \le x_{i+1}$ and $x_{i-1} \le c_2 \le x_i$. Subtracting these two equations and solving for $f'(x_i)$ leads to
$$ f'(x_i) = \frac{f(x_{i+1}) - f(x_{i-1})}{2h} - \frac{h^2}{3!}\,\frac{f'''(c_1) + f'''(c_2)}{2}. $$

This shows that the error for the central difference formula is $O(h^2)$. Thus, central differences are significantly better and so: It is best to use central differences whenever possible.

There are also central difference formulas for higher order derivatives. These all have error of order $O(h^2)$:
$$ f''(x_i) \approx y_i'' = \frac{y_{i+1} - 2y_i + y_{i-1}}{h^2}, $$
$$ f'''(x_i) \approx y_i''' = \frac{1}{2h^3}\left[ y_{i+2} - 2y_{i+1} + 2y_{i-1} - y_{i-2} \right], \quad\text{and} $$
$$ f^{(4)}(x_i) \approx y_i^{(4)} = \frac{1}{h^4}\left[ y_{i+2} - 4y_{i+1} + 6y_i - 4y_{i-1} + y_{i-2} \right]. $$
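As a quick illustration (a Python sketch, assuming f can be sampled wherever needed; not part of the original lecture), applying these formulas to f(x) = sin x at x = 1 and halving h shows each error shrinking by roughly a factor of four, as expected for $O(h^2)$ methods:

```python
import numpy as np

def central_derivatives(f, x, h):
    """Central difference approximations of f'(x), f''(x), f'''(x), f''''(x), all O(h^2)."""
    ym2, ym1, y0, yp1, yp2 = [f(x + m * h) for m in (-2, -1, 0, 1, 2)]
    d1 = (yp1 - ym1) / (2 * h)
    d2 = (yp1 - 2 * y0 + ym1) / h**2
    d3 = (yp2 - 2 * yp1 + 2 * ym1 - ym2) / (2 * h**3)
    d4 = (yp2 - 4 * yp1 + 6 * y0 - 4 * ym1 + ym2) / h**4
    return np.array([d1, d2, d3, d4])

# Exact derivatives of sin at x = 1: cos(1), -sin(1), -cos(1), sin(1).
exact = np.array([np.cos(1.0), -np.sin(1.0), -np.cos(1.0), np.sin(1.0)])
for h in (0.1, 0.05, 0.025):
    errors = np.abs(central_derivatives(np.sin, 1.0, h) - exact)
    print(h, errors)    # each error drops by about a factor of 4 when h is halved
```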

Partial Derivatives
Suppose $u = u(x, y)$ is a function of two variables that we only know at grid points $(x_i, y_j)$. We will use the notation
$$ u_{i,j} = u(x_i, y_j) $$
frequently throughout the rest of the lectures. We can suppose that the grid points are evenly spaced, with
an increment of h in the x direction and k in the y direction. The central difference formulas for the partial
derivatives would be
$$ u_x(x_i, y_j) \approx \frac{1}{2h}\left(u_{i+1,j} - u_{i-1,j}\right) \quad\text{and}\quad u_y(x_i, y_j) \approx \frac{1}{2k}\left(u_{i,j+1} - u_{i,j-1}\right). $$
The second partial derivatives are
$$ u_{xx}(x_i, y_j) \approx \frac{1}{h^2}\left(u_{i+1,j} - 2u_{i,j} + u_{i-1,j}\right) \quad\text{and}\quad u_{yy}(x_i, y_j) \approx \frac{1}{k^2}\left(u_{i,j+1} - 2u_{i,j} + u_{i,j-1}\right), $$
and the mixed partial derivative is
$$ u_{xy}(x_i, y_j) \approx \frac{1}{4hk}\left(u_{i+1,j+1} - u_{i+1,j-1} - u_{i-1,j+1} + u_{i-1,j-1}\right). $$
Caution: Notice that we have indexed $u_{i,j}$ so that as a matrix each row represents the values of u at a certain $x_i$ and each column contains values at $y_j$. The arrangement in the matrix does not coincide with the usual orientation of the xy-plane.
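A minimal Python sketch of these five stencils, assuming u is stored as a NumPy-style 2D array with the row-is-x, column-is-y convention just described (0-based indices; the function names are illustrative, not from the lecture):

```python
# u is a 2D array with u[i, j] = u(x_i, y_j); i and j must be interior indices.

def ux(u, i, j, h):
    return (u[i + 1, j] - u[i - 1, j]) / (2 * h)

def uy(u, i, j, k):
    return (u[i, j + 1] - u[i, j - 1]) / (2 * k)

def uxx(u, i, j, h):
    return (u[i + 1, j] - 2 * u[i, j] + u[i - 1, j]) / h**2

def uyy(u, i, j, k):
    return (u[i, j + 1] - 2 * u[i, j] + u[i, j - 1]) / k**2

def uxy(u, i, j, h, k):
    return (u[i + 1, j + 1] - u[i + 1, j - 1] - u[i - 1, j + 1] + u[i - 1, j - 1]) / (4 * h * k)
```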
Let's consider an example. Let the values of u at $(x_i, y_j)$ be recorded in the matrix
$$ (u_{ij}) = \begin{pmatrix} 5.1 & 6.5 & 7.5 & 8.1 & 8.4 \\ 5.5 & 6.8 & 7.8 & 8.3 & 8.9 \\ 5.5 & 6.9 & 9.0 & 8.4 & 9.1 \\ 5.4 & 9.6 & 9.1 & 8.6 & 9.4 \end{pmatrix} \tag{27.1} $$
Assume the indices begin at 1, with $i$ the index for rows and $j$ the index for columns. Suppose that $h = .5$ and $k = .2$. Then $u_y(x_2, y_4)$ would be approximated by the central difference
k = .2. Then uy (x2 , y4 ) would be approximated by the central difference
uy (x2 , y4 )

8.9 7.8
u2,5 u2,3

= 2.75.
2k
2 0.2

The partial derivative $u_{xy}(x_2, y_4)$ is approximated by
$$ u_{xy}(x_2, y_4) \approx \frac{u_{3,5} - u_{3,3} - u_{1,5} + u_{1,3}}{4hk} = \frac{9.1 - 9.0 - 8.4 + 7.5}{4 \cdot 0.5 \cdot 0.2} = -2. $$
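As a check, a short NumPy snippet (illustrative; note that Python indices start at 0, so u[1, 4] below is the $u_{2,5}$ of the text) reproduces both values:

```python
import numpy as np

u = np.array([[5.1, 6.5, 7.5, 8.1, 8.4],
              [5.5, 6.8, 7.8, 8.3, 8.9],
              [5.5, 6.9, 9.0, 8.4, 9.1],
              [5.4, 9.6, 9.1, 8.6, 9.4]])
h, k = 0.5, 0.2

uy_24  = (u[1, 4] - u[1, 2]) / (2 * k)                          # (8.9 - 7.8) / 0.4 = 2.75
uxy_24 = (u[2, 4] - u[2, 2] - u[0, 4] + u[0, 2]) / (4 * h * k)  # (9.1 - 9.0 - 8.4 + 7.5) / 0.4 = -2.0
print(uy_24, uxy_24)
```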

Exercises
27.1 Suppose you are given the data in the following table.
t    0     .5    1.0    1.5    2.0
y    0    .19    .26    .29    .31

a. Give the forward, backward and central difference approximations of $f'(1)$.
b. Give the central difference approximations for $f''(1)$, $f'''(1)$ and $f^{(4)}(1)$.
27.2 Suppose values of $u(x, y)$ at points $(x_i, y_j)$ are given in the matrix (27.1). Suppose that $h = .1$ and $k = .5$. Approximate the following derivatives by central differences:
a. $u_x(x_2, y_4)$
b. $u_{xx}(x_3, y_2)$
c. $u_{yy}(x_3, y_2)$
d. $u_{xy}(x_2, y_3)$
