
Take $f(x) = x^2$ and $p = 3$ for simplicity's sake. Then the derivative $f'$ allows us to define a linear approximation of $f$ at $p$ in the following way: $l(x) = f'(p)(x - p) + f(p)$. Often enough it is stated that $l(x)$ is the best linear approximation one could possibly find; however, I have never seen how one compares one linear approximation to another in order to decide which is better.

For example, take $l(x) = 3x$ instead. It approximates $f$ at $p$ perfectly: $f(p) = 3^2 = 9 = l(p) = 3 \times 3$, and its error function $e(x) = f(p) - l(x)$ also approaches $0$ as $x$ approaches $p$ from either side.

So both of them agree with $f$ perfectly at the point $p$, and both have arbitrarily small errors on some interval $(p - \delta, p + \delta)$, $\delta > 0$. How do I convince myself, then, that one is indeed better than the other?
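To make the comparison concrete, here is a small numerical check (plain Python; the names `tangent` and `other` are just my labels for the two candidate lines, nothing standard):

```python
# Compare f(x) = x^2 with two candidate lines near p = 3.
# Both lines agree with f at p, and both absolute errors shrink as x -> p,
# which is why the pointwise error alone cannot decide between them.

def f(x):
    return x ** 2

def tangent(x):   # l(x) = f'(3)(x - 3) + f(3) = 6x - 9
    return 6 * x - 9

def other(x):     # the competing line l(x) = 3x, which also passes through (3, 9)
    return 3 * x

for h in (0.1, 0.01, 0.001):
    x = 3 + h
    print(h, abs(f(x) - tangent(x)), abs(f(x) - other(x)))

# Sample output (rounded):
#   0.1    0.01       0.31
#   0.01   0.0001     0.0301
#   0.001  0.000001   0.003001
```

Both error columns shrink to $0$, which is exactly why looking at the error values alone does not settle the question for me.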

Zazaeil

2 Answers


The derivative $f^\prime$ of a function $f$ satisfies the requirement that $$f(x+h) - f(x) = f^\prime(x)h + o(|h|)$$ where $o$ is a function which satisfies $\lim_{h\rightarrow 0} \frac{o(|h|)}{|h|} = 0.$

Geometrically this means that the tangent to the graph of $f$ at $x$ has slope $f^\prime(x)$.

I suggest you convince yourself that your function $l(x)$ does not have this property with respect to $f(x) = x^2$ at $x=3$. The statement you wrote down about $e$ approaching $0$ (it only works for $x=3$, by the way) is just the statement that both functions $f$ and $l$ are continuous at $x=3$ and evaluate to the same value there (their graphs intersect, but the graph of the linear function $l$ is not tangent to that of $f$ at that point). For an answer to your question you should visit the question for which PrincessEev provided a link in a comment.
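A minimal numerical sketch of that check, with $f(x) = x^2$ and $x = 3$ hard-coded and the two slope values $6$ and $3$ standing for the tangent line and for $l(x) = 3x$ respectively:

```python
# Check the defining property f(3 + h) - f(3) = a*h + o(|h|) numerically
# for f(x) = x^2: the quotient below must tend to 0 as h -> 0.

def f(x):
    return x ** 2

for a in (6, 3):   # 6 = f'(3); 3 is the slope of the competing line l(x) = 3x
    print("slope a =", a)
    for h in (0.1, -0.01, 0.001):
        ratio = (f(3 + h) - f(3) - a * h) / abs(h)
        print("  h =", h, " (f(3+h) - f(3) - a*h)/|h| =", ratio)

# For a = 6 the quotient shrinks with h (0.1, 0.01, 0.001, ...),
# for a = 3 it stays near +/-3, so only the slope 6 gives a tangent line.
```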

Thomas

Let us consider a function $f$ which is continuous at $p$ and a linear function $l(x) = ax +b$. When should we regard $l$ as a linear approximation of $f$ at $p$?

Let us introduce the approximation error function $$\varepsilon(x) = f(x) - l(x) \tag{1}$$ which is continuous at $p$. With some justification we may call $l$ a linear approximation of $f$ at $p$ if $\varepsilon(p) = 0$. That is, we require $l(p) = f(p)$, which results in $$l(x) = a(x-p) + f(p). \tag{2}$$ There are infinitely many linear functions of this form, but does one of them deserve to be called the best linear approximation? This requires us to define what it means to be an optimal linear approximation. A reasonable approach is to look at the relative approximation error $$\rho(h) = \frac{\varepsilon(p+h)}{h} \tag{3}$$ which is defined for $h \ne 0$. The best that can happen is that $$\lim_{h \to 0} \rho(h) = 0. \tag{4}$$

This means $$\lim_{h \to 0} \frac{f(p+h) - l(p+h)}{h} = \lim_{h \to 0} \frac{f(p+h) - f(p) - ah}{h} = 0. \tag{5}$$ In other words, $(4)$ is satisfied if and only if $a$ has the property $$a = \lim_{h \to 0} \frac{f(p+h) - f(p)}{h}. \tag{6}$$ This means that $f$ is differentiable at $p$. Precisely in this case $f$ has a linear approximation at $p$ such that the relative approximation error $\rho(h)$ goes to $0$ as $h$ goes to $0$. As we have seen, there is only one such linear approximation; it is $$l(x) = f'(p)(x-p) + f(p). \tag{7}$$
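To see $(3)$–$(7)$ in action for the example from the question ($f(x) = x^2$, $p = 3$), here is a small Python sketch; the slope values below are arbitrary sample choices, not anything canonical:

```python
# Relative approximation error rho(h) = (f(p + h) - l(p + h)) / h from (3),
# with f(x) = x^2, p = 3, and l(x) = a*(x - p) + f(p) as in (2).

def f(x):
    return x ** 2

p = 3.0
for a in (3.0, 5.0, 6.0, 7.0):          # arbitrary sample slopes; f'(p) = 6
    rhos = [(f(p + h) - (a * h + f(p))) / h for h in (0.1, 0.01, 0.001)]
    print("a =", a, " rho(h):", [round(r, 6) for r in rhos])

# rho(h) approaches the constant f'(p) - a = 6 - a,
# so it tends to 0 exactly when a = 6, i.e. for the tangent line in (7).
```

For every other slope $a$ the relative error $\rho(h)$ settles near the nonzero constant $f'(p) - a$, which is the quantitative sense in which the tangent line $(7)$ beats every competitor, including $l(x) = 3x$ from the question.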