
The Taylor series of a real/complex-valued function $f(x)$ that is infinitely differentiable at real/complex value $a$ is as follows:

$$f(x) = \sum_{n=0}^{\infty}\frac{f^{(n)}(a)}{n!}(x-a)^n$$

I know we use this in a lot of places but I have no idea why they're essential. If we are capable of taking derivatives of a function at a point, and we are capable of evaluating the function at a point, why do we need a summation or power series that only gives an approximation?

user525966

2 Answers

4

From a theoretical standpoint, sometimes a Taylor series is all you have for a function. For instance, many differential equations can be solved by computing the Taylor coefficients of the solution term by term (the power-series method).

As for practical uses, when you ask a computer to evaluate $\sin(4.3)$, say, then evaluating the Taylor series of the sine function with $x = 4.3$ up to some predetermined degree is essentially what the computer does, because multiplication and addition are very easy for it (most modern processors have built-in special-purpose multiply and add circuits that do this really quickly; this is what FLOPS measures).
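The point of the answer — that a truncated Taylor series needs only multiplication and addition — can be sketched in a few lines of Python. The function name and the term count are my own choices for illustration; this is not what any particular math library actually does internally.

```python
import math

def sin_taylor(x, terms=20):
    """Sum the first `terms` nonzero terms of the Maclaurin series
    sin(x) = x - x^3/3! + x^5/5! - ...  using only * and +."""
    total = 0.0
    term = x  # the k = 0 term is x itself
    for k in range(terms):
        total += term
        # next term: multiply by -x^2 / ((2k+2)(2k+3)),
        # which updates both the power and the factorial at once
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total

print(sin_taylor(4.3))  # agrees with math.sin(4.3) to many digits
print(math.sin(4.3))
```

Note the running-term trick: no factorials or powers are computed from scratch, so each term costs one multiply and one divide. Real math libraries first reduce the argument to a small range and then use short, tuned approximations, but the principle is the same.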

Arthur
  • Is evaluating such a function like $\sin(4.3)$ only possible with Taylor series? – user525966 Feb 22 '18 at 07:56
  • @user525966 No, but for a computer it is by far the fastest way to do it, and probably the same is true for humans. – Arthur Feb 22 '18 at 07:57
  • What would we have to do without Taylor series for sines and exponentials etc? – user525966 Feb 22 '18 at 07:58
  • @user525966 For sine, you could draw it geometrically and measure, for instance. For exponentials, you can evaluate $e^{\pi}$ by calculating $e^3, \sqrt[10]{e^{31}}, \sqrt[100]{e^{314}}$ and so on (just multiplication of $e$ with itself and taking roots here), until you get the precision that you want. So there are alternatives. There are also likely other series than the Taylor series you could use. However, all of these approaches are specific to each case and not a general technique, while Taylor series always work (as long as the series converges). – Arthur Feb 22 '18 at 08:00
  • If a Taylor series is just an approximation, how do we know how accurate the result is, or whether it's trustworthy? – user525966 Feb 22 '18 at 08:05
  • @user525966 If you read the full statement of Taylor's theorem, it gives you a pretty good estimate of the error from taking finitely many terms. (It actually tells you exactly what the error is, in some sense, but most of the time it's impossible to calculate, so you estimate that error term and get an upper bound for the error, which most of the time is all you need.) – Arthur Feb 22 '18 at 08:07
  • Calculators usually use CORDIC, not Taylor series. – Yuriy S Feb 22 '18 at 08:23
  • @YuriyS I didn't know that. Cool. – Arthur Feb 22 '18 at 08:42
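The error estimate discussed in the comments above can be checked numerically. For sine, every derivative is bounded by $1$, so Taylor's theorem with the Lagrange remainder bounds the truncation error after the degree-$d$ polynomial by $|x|^{d+1}/(d+1)!$; since the degree-$d$ and degree-$(d+1)$ polynomials coincide for sine (the even coefficients vanish), one may even use $|x|^{d+2}/(d+2)!$. A sketch (function names are my own):

```python
import math

def sin_partial(x, n_terms):
    """Partial Maclaurin sum of sin(x) with n_terms nonzero terms
    (highest degree used is 2*n_terms - 1)."""
    total, term = 0.0, x
    for k in range(n_terms):
        total += term
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total

x = 4.3
for n_terms in (3, 5, 8):
    degree = 2 * n_terms - 1
    actual_err = abs(sin_partial(x, n_terms) - math.sin(x))
    # Lagrange bound: all derivatives of sine are bounded by 1, and the
    # even coefficient of degree d+1 is zero, so the remainder is at
    # most |x|^(d+2)/(d+2)!.
    bound = abs(x) ** (degree + 2) / math.factorial(degree + 2)
    print(degree, actual_err, bound)  # actual_err stays below bound
```

This is the practical upshot of the theorem: before summing a single term, you can decide how many terms guarantee the accuracy you need.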
3

The usual situation:

You "know" $f$, $f'$, $f''$, $\dots$,

but you don't know how to calculate $f(x)$ at an arbitrary point $x$.

However, you do know how to calculate $f(a)$ and the derivatives $f^{(n)}(a)$ at some concrete point(s) $a$.

So you can write the Taylor series at $a$,

and use it to approximate $f(x)$, at least for $x$ near $a$.

Example: $f(x) = e^x$. $$\forall n\in\Bbb N: f^{(n)} = f,$$ $$\forall n\in\Bbb N: f^{(n)}(0) = f(0) = e^0 = 1,$$ $$f(x) = \sum_{n=0}^\infty\frac{x^n}{n!}.$$
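The $e^x$ series above is easy to turn into code, because each term is the previous one times $x/(n+1)$, so no factorials or powers need to be computed explicitly. A minimal sketch (names and term count are my own choices):

```python
import math

def exp_taylor(x, terms=30):
    """Sum the first `terms` terms of e^x = sum_{n>=0} x^n / n!."""
    total, term = 0.0, 1.0  # term starts at x^0 / 0! = 1
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # turn x^n/n! into x^(n+1)/(n+1)!
    return total

print(exp_taylor(1.0))  # close to e
print(math.e)
```

This is exactly the situation the answer describes: we "know" all the derivatives of $f$ at $0$ (they are all $1$), and the series lets us approximate $f(x) = e^x$ anywhere from that information alone.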