# Taylor series


In mathematics, the Taylor series is a representation of a function as an infinite sum of terms calculated from the values of its derivatives at a single point. It is common practice to use a finite number of terms of the series to approximate a function.

# Unidimensional Taylor Series

The Taylor series of a function f(x) that is infinitely differentiable in a neighbourhood of a point $a$ is the power series

$f(a)+\frac {f'(a)}{1!} (x-a)+ \frac{f''(a)}{2!} (x-a)^2+\frac{f^{(3)}(a)}{3!}(x-a)^3+ \cdots$

which can be written more compactly as

$\sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} \, (x-a)^{n}.$
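As a concrete illustration of truncating this series, the sketch below (an assumption of this edit, not part of the original article) sums the first few terms of the Taylor series of the exponential function about $a = 0$, where every derivative equals 1, so the $n$-th term is simply $x^n / n!$:

```python
import math

def taylor_exp(x, n_terms):
    """Partial sum of the Taylor series of exp about a = 0.

    Each derivative of exp at 0 is 1, so the n-th term of the
    series f(a) + f'(a)(x-a) + f''(a)/2! (x-a)^2 + ... reduces
    to x**n / n!.
    """
    return sum(x**n / math.factorial(n) for n in range(n_terms))

# Ten terms already approximate e = exp(1) to better than 1e-6.
approx = taylor_exp(1.0, 10)
```

Adding more terms shrinks the truncation error, since the omitted tail of the series for exp at x = 1 is bounded by roughly the first omitted term, 1/10!.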

# Multidimensional Taylor Series

A second-order Taylor series expansion of a scalar-valued function of more than one variable can be written compactly as

$T(\mathbf{x}) = f(\mathbf{x}_0) + \nabla f(\mathbf{x}_0)(\mathbf{x} - \mathbf{x}_0) + \frac{1}{2!} (\mathbf{x} - \mathbf{x}_0)^T \,\nabla^2 f(\mathbf{x}_0)\,(\mathbf{x} - \mathbf{x}_0) + \cdots,$

where $\nabla f(\mathbf{x}_0)\!$ is the gradient of $\,f$ evaluated at $\mathbf{x} = \mathbf{x}_0$ and $\nabla^2 f(\mathbf{x}_0)\!$ is the Hessian of $\,f$ evaluated at $\mathbf{x} = \mathbf{x}_0$.
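The second-order expansion above can be evaluated directly once the gradient and Hessian at $\mathbf{x}_0$ are known. The example function and point below are assumptions chosen for illustration; because the chosen function is itself quadratic, its second-order Taylor expansion reproduces it exactly:

```python
def quadratic_taylor(f_x0, grad, hess, x0, x):
    """Second-order Taylor approximation
    T(x) = f(x0) + grad . (x - x0) + 1/2 (x - x0)^T H (x - x0),
    with grad and hess evaluated at x0."""
    d = [xi - x0i for xi, x0i in zip(x, x0)]
    linear = sum(g * di for g, di in zip(grad, d))
    quadratic = 0.5 * sum(
        d[i] * hess[i][j] * d[j]
        for i in range(len(d))
        for j in range(len(d))
    )
    return f_x0 + linear + quadratic

# Illustrative choice: f(x, y) = x^2 + 3xy, expanded about x0 = (1, 2).
# Then grad f = (2x + 3y, 3x) = (8, 3) and the Hessian is constant.
x0 = [1.0, 2.0]
f_x0 = 1.0**2 + 3 * 1.0 * 2.0          # f(x0) = 7
grad = [2 * 1.0 + 3 * 2.0, 3 * 1.0]    # (8, 3)
hess = [[2.0, 3.0], [3.0, 0.0]]
approx = quadratic_taylor(f_x0, grad, hess, x0, [1.5, 2.5])
```

Since f has no terms above second order, the expansion is exact here: the value at (1.5, 2.5) agrees with f(1.5, 2.5) = 13.5. For a general smooth function, the quadratic model is only accurate near $\mathbf{x}_0$.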