
This notebook contains an excerpt from Python Programming and Numerical Methods - A Guide for Engineers and Scientists; the content is also available at Berkeley Python Numerical Methods.

The copyright of the book belongs to Elsevier. We also have this interactive book online for a better learning experience. The code is released under the MIT license. If you find this content useful, please consider supporting the work on Elsevier or Amazon!

< 20.2 Finite Difference Approximating Derivatives | Contents | 20.4 Numerical Differentiation with Noise >

Approximating Higher Order Derivatives

It is also possible to use the Taylor series to approximate higher order derivatives (e.g., \(f''(x_j), f'''(x_j)\), etc.). For example, taking the Taylor series around \(a = x_j\) and then computing it at \(x = x_{j-1}\) and \(x = x_{j+1}\) gives

\[ f(x_{j-1}) = f(x_j) - hf^{\prime}(x_j) + \frac{h^2f''(x_j)}{2} - \frac{h^3f'''(x_j)}{6} + \cdots\]

and

\[f(x_{j+1}) = f(x_j) + hf^{\prime}(x_j) + \frac{h^2f''(x_j)}{2} + \frac{h^3f'''(x_j)}{6} + \cdots.\]

If we add these two equations together, we get

\[f(x_{j-1}) + f(x_{j+1}) = 2f(x_j) + h^2f''(x_j) + \frac{h^4f''''(x_j)}{12} + \cdots,\]

which, after some rearrangement, gives the approximation

\[f''(x_j) \approx \frac{f(x_{j+1}) - 2f(x_j) + f(x_{j-1})}{h^2},\]

and this approximation is \(O(h^2)\).
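As a sketch of this formula in practice, the following snippet (the function name `second_derivative` and the choice of \(f(x) = \sin x\) are illustrative, not from the book) applies the central difference approximation and checks that the error shrinks like \(h^2\): halving \(h\) should roughly quarter the error, since the exact second derivative of \(\sin x\) is \(-\sin x\).

```python
import numpy as np

def second_derivative(f, x, h):
    # Central difference approximation of f''(x):
    # (f(x+h) - 2 f(x) + f(x-h)) / h^2
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

x = 1.0
exact = -np.sin(x)  # exact second derivative of sin at x

# Halving h should reduce the error by about a factor of 4 (O(h^2))
for h in [0.1, 0.05, 0.025]:
    approx = second_derivative(np.sin, x, h)
    print(f"h = {h:5.3f}, approximation = {approx:.8f}, "
          f"error = {abs(approx - exact):.2e}")
```

Note that \(h\) cannot be made arbitrarily small: the numerator subtracts nearly equal floating point numbers, so round-off error eventually dominates the truncation error.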
