2.4. Taylor Series Expansion#
Note
Important things to retain from this block:
Understand how to compute the Taylor series expansion (TSE) of a given function around a given point, and its limitations
Understand how numerical derivatives are derived from TSE
Understand that the magnitude of the error depends on the expression used
Things you do not need to know:
Any kind of Taylor expansion of a function by heart
Definition#
The basic idea behind Taylor series is to approximate any kind of function in the neighborhood of a certain point with a polynomial. This approximation is useful because polynomials are “friendly” functions: it is easy to evaluate them, compute their derivatives or integrals.
The Taylor series expansion (TSE) of an arbitrary function \(f(x)\) around \(x=x_i\) is a polynomial given by

$$f(x) = f(x_i) + f'(x_i)(x - x_i) + \frac{f''(x_i)}{2!}(x - x_i)^2 + \frac{f'''(x_i)}{3!}(x - x_i)^3 + \ldots = \sum_{n=0}^{\infty} \frac{f^{(n)}(x_i)}{n!}(x - x_i)^n$$
Let’s have a look at the structure and meaning of the individual terms.
The first term simply equals the value of the function that we want to approximate. The following terms include derivatives of the function with increasing order.
The second term makes the slope of the polynomial equal to the slope of our function \(f\). (You can verify this by taking the first derivative of the Taylor series expansion with respect to \(x\) and evaluating it at \(x_i\).)
Each additional term sets a higher derivative of the Taylor series equal to the derivative of the function we want to approximate.
The interactive plot below visualizes this concept. It shows the Taylor series approximation of a sine function for different \(x_i\), with a varying number of terms. Change the order of the Taylor series approximation and the value of \(x_i\) and answer the following questions:
How well does the polynomial approximate \(\sin(x)\)?
How does the result depend on the number of terms used?
How does it vary with distance from \(x_i\)?
```python
import numpy as np
import math
import matplotlib.pyplot as plt
from ipywidgets import widgets, interact


def sine_derivative(x, n):
    """Compute the nth derivative of sin(x)."""
    match np.mod(n, 4):
        case 0:
            return np.sin(x)
        case 1:
            return np.cos(x)
        case 2:
            return -np.sin(x)
        case 3:
            return -np.cos(x)


def taylor_poly_sine(x, xi, order):
    """Compute the Taylor series expansion of sin(x) around xi."""
    return sum(
        (x - xi) ** n / math.factorial(n) * sine_derivative(xi, n)
        for n in range(order + 1)
    )


def taylor_plot(order, xi):
    x = np.linspace(-2 * np.pi, 2 * np.pi, 100)
    tse = taylor_poly_sine(x, xi=xi, order=order)
    plt.plot(x, np.sin(x), label="sin(x)")
    plt.plot(x, tse, label=f"Taylor series expansion (order {order})")
    plt.scatter(xi, np.sin(xi), marker="o", color="k")
    plt.annotate("$x_i$", xy=(xi + 0.2, np.sin(xi) - 0.1))
    plt.ylim(-5, 5)
    plt.legend(loc="upper right")
    plt.show()


interact(
    taylor_plot,
    order=widgets.IntSlider(
        value=0,
        min=0,
        max=10,
        step=1,
        description="order",
    ),
    xi=widgets.FloatSlider(
        value=0,
        min=-2,
        max=2,
        step=0.1,
        description="xi",
    ),
);
```
Relevant conclusions
The 1st-order approximation, which depends only on the first derivative, is a straight line.
The more terms we use (the larger the order), the better we can approximate the function.
The further we move from the expansion point (e.g., \(x_i=0\) in the plots), the larger the error.
The Taylor series expansion is exact as long as we include infinitely many terms. We, however, are limited to a truncated expression: an approximation of the real function. For example, we can write the Taylor series approximation truncated after the third-order term as:

$$f(x_i+\Delta x) = f(x_i) + f'(x_i)\,\Delta x + \frac{f''(x_i)}{2!}\,\Delta x^2 + \frac{f'''(x_i)}{3!}\,\Delta x^3 + \mathcal{O}(\Delta x^4)$$
Here, we defined \(\Delta x=x-x_i\), where \(\Delta x\) is the distance between the point we “know” and the desired point. \(\mathcal{O}(\Delta x^4)\) means that we do not take into account the terms associated with \(\Delta x^4\) and higher powers; this is the order of the truncation error. From here we can also conclude that the larger the step \(\Delta x\), the larger the error!
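To make the effect of truncation concrete, here is a minimal sketch (the helper name `exp_taylor3` is our own illustration, not part of the book's code) that evaluates a third-order truncated TSE of \(e^x\) around \(x_i=0\) and measures the error for decreasing step sizes:

```python
import math

def exp_taylor3(dx):
    """TSE of e^x around x_i = 0, truncated after the third-order term."""
    return 1 + dx + dx**2 / 2 + dx**3 / 6

for dx in [0.4, 0.2, 0.1]:
    error = abs(math.exp(dx) - exp_taylor3(dx))
    print(f"dx = {dx}: error = {error:.2e}")
```

Halving \(\Delta x\) reduces the error by roughly a factor of \(2^4=16\), consistent with the truncation error \(\mathcal{O}(\Delta x^4)\).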
Tip
We will use \(\Delta x\) more frequently from this point on, so it is good to recognize now, using the equations above, that it is a different way of representing the differential increment, for example, \(f(x_{i+1})=f(x_i+\Delta x)\) or \(f(x_{i+2})=f(x_i+2\Delta x)\).
Now let’s see some examples of how we can use the Taylor series expansion to approximate specific functions.
Compute \(e^{0.2}\) using 3 terms of TSE around \(x_i=0\).
Solution
We want to evaluate \(e^x=e^{x_i+\Delta x}=e^{0.2}\). Therefore, \(\Delta x=0.2\). The value of \(f(x_i=0)=e^0=1\). For this exponential, all derivatives have the same value: \(f'(x_i=0)=f''(x_i=0)=f'''(x_i=0)=e^0=1\). The TSE looks like this:

$$e^{0.2} \approx f(x_i) + f'(x_i)\,\Delta x + \frac{f''(x_i)}{2!}\,\Delta x^2 = 1 + 0.2 + \frac{0.2^2}{2} = 1.22$$
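As a quick sanity check (this snippet is ours, not part of the original solution), we can compare the three-term sum with the exact value:

```python
import math

dx = 0.2
tse = 1 + dx + dx**2 / 2   # f(0) + f'(0)*dx + f''(0)/2! * dx^2
print(tse)                 # 1.22
print(math.exp(0.2))       # 1.2214..., the exact value
```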
Compute the TSE polynomial truncated until \(\mathcal{O}(\Delta x^6)\) for \(f(x)=\sin(x)\) around \(x_i=0\).
Solution
Applying the definition of TSE, with \(\sin(0)=0\) and \(\cos(0)=1\):

$$\sin(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} + \mathcal{O}(\Delta x^6)$$
Note that \(x\) and \(\Delta x\) can be written interchangeably when \(x_i=0\), since \(\Delta x=x-x_i\). If this were not the case, for example if \(x_i=\pi\), the result would be completely different.
Compute the TSE polynomial truncated until \(\mathcal{O}(\Delta x^6)\) for \(f(x)=\sin(x)\) around \(x_i=\frac{\pi}{2}\).
Solution
Applying the definition of TSE with \(\Delta x = x-\frac{\pi}{2}\), and using \(\sin(\frac{\pi}{2})=1\) and \(\cos(\frac{\pi}{2})=0\):

$$\sin(x) = 1 - \frac{\Delta x^2}{2!} + \frac{\Delta x^4}{4!} + \mathcal{O}(\Delta x^6)$$
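The sketch below (our own illustration, with hypothetical variable names) compares both truncated expansions at a point close to \(\frac{\pi}{2}\), showing how much the choice of \(x_i\) matters:

```python
import numpy as np

x = 1.8                  # close to pi/2 ≈ 1.5708, but far from 0
dx = x - np.pi / 2

tse_around_0 = x - x**3 / 6 + x**5 / 120         # expansion around x_i = 0
tse_around_pi_half = 1 - dx**2 / 2 + dx**4 / 24  # expansion around x_i = pi/2

print(np.sin(x))           # 0.9738... (exact)
print(tse_around_0)        # 0.9855... -- noticeably off, x is far from 0
print(tse_around_pi_half)  # 0.9738... -- accurate, x is close to pi/2
```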
Use of TSE to define the first derivative#
Forward and backward difference#
You may have noticed that for the Taylor series expansion we only ever need to evaluate our function \(f\) and its derivatives at the point \(x_i\). Let’s see how we can use this to approximate the value of the function at a point \(x_{i+1} = x_i + \Delta x\) that is just a little bit to the right of \(x_i\). This will help us to come up with an approximation for \(f'(x_i)\).
The Taylor series expansion of \(f\) evaluated at \(x_{i+1}\) is:

$$f(x_{i+1}) = f(x_i) + f'(x_i)\,\Delta x + \frac{f''(x_i)}{2!}\,\Delta x^2 + \frac{f'''(x_i)}{3!}\,\Delta x^3 + \ldots$$
Now, we can solve this expression for the first derivative:

$$f'(x_i) = \frac{f(x_{i+1}) - f(x_i)}{\Delta x} - \frac{f''(x_i)}{2!}\,\Delta x - \frac{f'''(x_i)}{3!}\,\Delta x^2 - \ldots$$
By truncating this expression to avoid the calculation of the second (and higher) derivatives, we find the forward difference:

$$f'(x_i) = \frac{f(x_{i+1}) - f(x_i)}{\Delta x} + \mathcal{O}(\Delta x)$$
This is the same expression you saw for the forward numerical derivative in Section 2.2! Now we have the added knowledge that this comes with an error of the order of the step.
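A minimal sketch of this scheme (the function name `forward_difference` is our own) confirms the first-order behavior: halving \(\Delta x\) roughly halves the error.

```python
import numpy as np

def forward_difference(f, x, dx):
    """First-order forward difference approximation of f'(x)."""
    return (f(x + dx) - f(x)) / dx

# Derivative of sin at x = 1; the exact value is cos(1)
exact = np.cos(1.0)
for dx in [0.1, 0.05, 0.025]:
    approx = forward_difference(np.sin, 1.0, dx)
    print(f"dx = {dx}: error = {abs(approx - exact):.2e}")
```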
In a similar way, we can derive an expression for the backward difference. Use a Taylor series approximation to derive the backward difference for the first derivative \(f'(x_i)\) with a first order error \(\mathcal{O}(\Delta x)\).
Solution
We can get an expression for the backward difference by evaluating the Taylor series expansion at the point \(x_{i-1} = x_i - \Delta x\):

$$f(x_{i-1}) = f(x_i) - f'(x_i)\,\Delta x + \frac{f''(x_i)}{2!}\,\Delta x^2 - \frac{f'''(x_i)}{3!}\,\Delta x^3 + \ldots$$

Solving for the first derivative and truncating gives:

$$f'(x_i) = \frac{f(x_i) - f(x_{i-1})}{\Delta x} + \mathcal{O}(\Delta x)$$
Central difference#
The central difference can be found by summing the forward and backward difference expressions of the derivative and dividing by 2:

Forward difference:

$$f'(x_i) = \frac{f(x_{i+1}) - f(x_i)}{\Delta x} - \frac{f''(x_i)}{2!}\,\Delta x + \mathcal{O}(\Delta x^2)$$

Backward difference:

$$f'(x_i) = \frac{f(x_i) - f(x_{i-1})}{\Delta x} + \frac{f''(x_i)}{2!}\,\Delta x + \mathcal{O}(\Delta x^2)$$

Summing both expressions yields:

$$2 f'(x_i) = \frac{f(x_{i+1}) - f(x_{i-1})}{\Delta x} + \mathcal{O}(\Delta x^2)$$

After dividing by 2, we get:

$$f'(x_i) = \frac{f(x_{i+1}) - f(x_{i-1})}{2\Delta x} + \mathcal{O}(\Delta x^2)$$
Note that the second derivative terms cancel each other out, so the error is of the order of the step size squared! This means that the central difference is more accurate than the forward or backward difference. You can also see this intuitively in the figure of the previous chapter.
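The difference in accuracy is easy to observe numerically. In this sketch (helper names are ours), the forward error shrinks linearly with \(\Delta x\) while the central error shrinks quadratically:

```python
import numpy as np

def forward_difference(f, x, dx):
    return (f(x + dx) - f(x)) / dx

def central_difference(f, x, dx):
    return (f(x + dx) - f(x - dx)) / (2 * dx)

exact = np.cos(1.0)  # derivative of sin at x = 1
for dx in [0.1, 0.05, 0.025]:
    err_fwd = abs(forward_difference(np.sin, 1.0, dx) - exact)
    err_cen = abs(central_difference(np.sin, 1.0, dx) - exact)
    print(f"dx = {dx}: forward = {err_fwd:.1e}, central = {err_cen:.1e}")
```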
TSE to define second derivatives#
There are equations that require second derivatives. One example is the diffusion equation. The 1-D diffusion equation reads:

$$\frac{\partial f}{\partial t} = \nu \frac{\partial^2 f}{\partial x^2}$$

where \(\nu\) is the diffusion coefficient. For the moment we will use TSE only to find a numerical expression for the second derivative \(\frac{\partial^2 f}{\partial x^2}\).
The procedure is simple but cumbersome. The general idea is to isolate the second derivative in the TSE without leaving a dependency on other derivatives. Below you can find more details about the algebraic manipulation (if you are curious), but you do not need to know it. Here is the result:

$$f''(x_i) = \frac{f(x_{i+2}) - 2 f(x_{i+1}) + f(x_i)}{\Delta x^2} + \mathcal{O}(\Delta x)$$
This is the forward difference approximation of the second derivative. Two aspects come to mind:
one additional point is needed to compute the second derivative and
the error (of the simplest second derivative) is also of the order of the step.
There are also backward and central approximations of the second derivative (not shown here).
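A short sketch of the forward scheme for the second derivative (again with our own helper name) shows it converging to the exact value as \(\Delta x\) decreases:

```python
import numpy as np

def second_derivative_forward(f, x, dx):
    """Forward difference approximation of f''(x), with error O(dx)."""
    return (f(x + 2 * dx) - 2 * f(x + dx) + f(x)) / dx**2

# Second derivative of sin at x = 1; the exact value is -sin(1) ≈ -0.8415
for dx in [0.1, 0.05, 0.025]:
    approx = second_derivative_forward(np.sin, 1.0, dx)
    print(f"dx = {dx}: approx = {approx:.4f}")
```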
Deriving the forward difference approximation for the second derivative
First define a TSE for a point two steps away from \(x_i\):

$$f(x_{i+2}) = f(x_i) + 2\Delta x\, f'(x_i) + \frac{(2\Delta x)^2}{2!} f''(x_i) + \mathcal{O}(\Delta x^3)$$

Now multiply by two the TSE for a point one step away from \(x_i\):

$$2 f(x_{i+1}) = 2 f(x_i) + 2\Delta x\, f'(x_i) + 2\,\frac{\Delta x^2}{2!} f''(x_i) + \mathcal{O}(\Delta x^3)$$

By subtracting the first expression from the second one, the first derivative disappears:

$$2 f(x_{i+1}) - f(x_{i+2}) = f(x_i) - \Delta x^2 f''(x_i) + \mathcal{O}(\Delta x^3)$$

By solving for \(f''\) we obtain the forward expression:

$$f''(x_i) = \frac{f(x_{i+2}) - 2 f(x_{i+1}) + f(x_i)}{\Delta x^2} + \mathcal{O}(\Delta x)$$
Higher-accuracy Finite-Difference Approximations#
So far we have found expressions with a relatively large error, \(\mathcal{O}(\Delta x)\), with the exception of the central difference approximation. Sometimes a higher accuracy is desired, for which better expressions can be found. The procedure is similar to the algebraic manipulation used to find the forward approximation of the second derivative: a series of TSEs is defined at varying distances from \(x_i\) and, after algebraic manipulation, a more accurate expression is found. For example, the forward approximation of the first derivative:

$$f'(x_i) = \frac{-3 f(x_i) + 4 f(x_{i+1}) - f(x_{i+2})}{2\Delta x} + \mathcal{O}(\Delta x^2)$$
The error magnitude has improved to \(\mathcal{O}(\Delta x^2)\) at the expense of using one more point. The accuracy can be improved even further by using more points. It is important to note that central differences are more accurate than forward and backward differences when using the same number of points.
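The gain in accuracy can be checked numerically. This sketch (function names are ours) compares the two-point and three-point forward schemes; the error of the latter shrinks quadratically:

```python
import numpy as np

def forward_1st_order(f, x, dx):
    return (f(x + dx) - f(x)) / dx

def forward_2nd_order(f, x, dx):
    """Three-point forward difference, error O(dx^2)."""
    return (-3 * f(x) + 4 * f(x + dx) - f(x + 2 * dx)) / (2 * dx)

exact = np.cos(1.0)  # derivative of sin at x = 1
for dx in [0.1, 0.05]:
    err1 = abs(forward_1st_order(np.sin, 1.0, dx) - exact)
    err2 = abs(forward_2nd_order(np.sin, 1.0, dx) - exact)
    print(f"dx = {dx}: 1st order = {err1:.1e}, 2nd order = {err2:.1e}")
```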
The following exercise is about deriving the expression for the forward approximation with higher accuracy.
Derive a finite-difference equation for the first derivative \(f'(x)\) with a 2nd-order error \(\mathcal{O}(\Delta x^2)\), using the nodes \(x_i\), \(x_i+\Delta x\) and \(x_i+2\Delta x\) (that is, \(x_i\), \(x_{i+1}\) and \(x_{i+2}\)).
The finite-difference equation will have the following form:

$$f'(x_i) = \frac{\alpha f(x_i) + \beta f(x_{i+1}) + \gamma f(x_{i+2})}{\Delta x} + \mathcal{O}(\Delta x^2)$$
Use the Taylor series of \(f(x_i+\Delta x)\) and \(f(x_i+2\Delta x)\) to find \(\alpha\), \(\beta\) and \(\gamma\).
Tip: You want to find an expression for the first-order derivative with a second-order error, and \(1+2=3\). This means you need to truncate the Taylor series expansion at the third order.
Solution
Taylor series for \(f(x_i+\Delta x)\) and \(f(x_i+2\Delta x)\):

$$f(x_i+\Delta x) = f(x_i) + \Delta x\, f'(x_i) + \frac{\Delta x^2}{2!} f''(x_i) + \mathcal{O}(\Delta x^3)$$

$$f(x_i+2\Delta x) = f(x_i) + 2\Delta x\, f'(x_i) + \frac{(2\Delta x)^2}{2!} f''(x_i) + \mathcal{O}(\Delta x^3)$$

Multiply \(f(x_i+\Delta x)\) by 4 and expand the term \(\frac{(2\Delta x)^2}{2!}f''(x_i)\) in the TSE of \(f(x_i+2\Delta x)\):

$$4 f(x_i+\Delta x) = 4 f(x_i) + 4\Delta x\, f'(x_i) + 2\Delta x^2 f''(x_i) + \mathcal{O}(\Delta x^3)$$

$$f(x_i+2\Delta x) = f(x_i) + 2\Delta x\, f'(x_i) + 2\Delta x^2 f''(x_i) + \mathcal{O}(\Delta x^3)$$

Now take \(4f(x_i+\Delta x)-f(x_i+2\Delta x)\), so that the second derivative terms cancel:

$$4 f(x_i+\Delta x) - f(x_i+2\Delta x) = 3 f(x_i) + 2\Delta x\, f'(x_i) + \mathcal{O}(\Delta x^3)$$

Rearrange for \(f'(x_i)\):

$$-2\Delta x\, f'(x_i) = 3 f(x_i) - 4 f(x_i+\Delta x) + f(x_i+2\Delta x) + \mathcal{O}(\Delta x^3)$$

Divide by \(-2 \Delta x\):

$$f'(x_i) = \frac{-\frac{3}{2} f(x_i) + 2 f(x_i+\Delta x) - \frac{1}{2} f(x_i+2\Delta x)}{\Delta x} + \mathcal{O}(\Delta x^2)$$
The solution is: \(\alpha=-\frac{3}{2}\), \(\beta= 2\), \(\gamma= -\frac{1}{2}\)
Attribution
This chapter is written by Jaime Arriaga Garcia, Anna Störiko, Justin Pittman and Robert Lanzafame. Find out more here.