13

The question I'd like to ask is this:

If $f''(0)$ exists, does $f'$ exist in a neighborhood of $0$?

Of course, under the standard definition of $f''(0)$, we have already assumed that $f'$ exists in a neighborhood of $0$. So instead:

Question: Is there a standard way to define $f''(0)$ as a limit expression that does not include $f'$ in it, and if so, can we deduce from the fact that $f''(0)$ exists that $f'$ exists in a neighborhood of $0$?

Details

If I know what $f'(0)$ is, I can make $f''(0)$ be the constant (if it exists) such that $$ \lim\limits_{h \to 0} \frac{f(h) - \left[ f(0) + f'(0) h + \frac{1}{2} f''(0) h^2\right]}{h^2} = 0. $$ i.e., the Taylor polynomial approximates $f$ to second order. Then, I could just plug in for $f'(0)$ the expression $\frac{f(h) - f(0)}{h}$. But this doesn't work; everything cancels out. Is there a different standard way to define $f''$ without using $f'$?
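To spell out the cancellation: if I substitute $\frac{f(h) - f(0)}{h}$ for $f'(0)$, the numerator becomes $$ f(h) - \left[ f(0) + \frac{f(h) - f(0)}{h}\, h + \frac{1}{2} f''(0) h^2 \right] = -\frac{1}{2} f''(0) h^2, $$ so the whole expression reduces to $-\frac{1}{2} f''(0)$, and requiring the limit to be $0$ just forces $f''(0) = 0$ no matter what $f$ is. The condition carries no information about $f$.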

I should probably put some more work into answering this myself, but first I wanted to see if this is a standard or well-known question.

  • 2
    Check out this answer : http://math.stackexchange.com/a/776271/97045 where he gets to "Comment on Bos's work", there is some consideration made towards the definition of a second derivative. – DanielV May 04 '14 at 01:26
  • @DanielV: Thank you! You are quite right in some sense. If a curve is already known to be smooth, then the limit I gave in the answer there does connect to the second derivative, though only directly for $h_1=h_2$. But I have to agree with Paramanand Singh, commenting on the answer by Mauricio G Tec, that these fractions suggest a method for calculating the second derivative rather than defining it. Perhaps such a definition is possible, perhaps not. I am still contemplating :) – String May 06 '14 at 09:38
  • Just to clarify, Leibniz and his contemporaries did not consider the issue of differentiality as a whole. They were well aware of issues regarding singularities, but a fully-fledged concept of differentiability seems to me to have been irrelevant to the concepts of the 17th century mathematics. They only studied fairly well behaved relations, after all. – String May 06 '14 at 09:44

3 Answers

15

To the question in the title the answer is Yes. One such definition is commonly used in numerical differentiation, specifically in finite difference methods. For example:

$$ f''(x) = \lim_{h\to 0} \frac{f(x+h)-2f(x)+f(x-h)}{h^2} $$
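A quick sketch of why this agrees with the usual second derivative, assuming $f''(x)$ exists in the standard sense so that Taylor's theorem with the Peano remainder applies: $$ f(x \pm h) = f(x) \pm f'(x)h + \tfrac{1}{2} f''(x) h^2 + o(h^2), $$ hence $$ \frac{f(x+h)-2f(x)+f(x-h)}{h^2} = f''(x) + o(1) \to f''(x) \quad (h \to 0). $$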

In response to the very interesting comments: indeed, this centered definition has the same weakness as defining $f'(x)$ by the symmetric quotient $$f'(x)=\lim_{h\to 0} \frac{f(x+h)-f(x-h)}{2h},$$ since for any odd function the centered second difference at $0$ is zero even if the function is not continuous there. Here is another try:

$$f''(x) = \lim_{h\to 0} \frac{f(x)-2f(x+h)+f(x+2h)}{h^2}$$

I wonder whether this formula resolves the issue. In the finite-difference literature it is known as a forward difference, and the first formula as a centered difference. A similar "backward" expression can be given, and of course one expects them all to agree.
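The same Taylor sketch, again assuming $f''(x)$ exists in the standard sense, shows that the forward version also recovers it: $$ f(x+2h) - 2f(x+h) + f(x) = f''(x) h^2 + o(h^2), $$ so the quotient tends to $f''(x)$. Whether the mere existence of this limit forces $f'$ to exist near $x$ is a separate question; see the comments below.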

About the other question: it might be possible to prove that $f'(0)$ exists when the set of points of discontinuity of $f''$ is of measure zero in an interval containing $0$, since this characterizes Riemann-integrable functions.

Mauricio Tec
  • 2,624
  • 5
    However, from the fact that this limit exists at $x=0$ we can not conclude that $f'$ exists anywhere, let alone in a neighbourhood of $0$. – Robert Israel Mar 24 '14 at 06:11
  • 11
    In fact, $f$ might not even be continuous at $0$. Define $f$ arbitrarily on $(0,\infty)$, take $f(0) = 0$ and $f(-x) = -f(x)$. Then $\dfrac{f(0+h) - 2 f(0) + f(0-h)}{h^2} = 0$. – Robert Israel Mar 24 '14 at 06:16
  • 4
    Hmm @RobertIsrael. Perhaps if we make two variables, $h_1$ and $h_2$, and take the limit (in $\mathbb{R}^2$) as both go to zero. – Caleb Stanford Mar 24 '14 at 06:21
  • 3
    The formulas which you have mentioned can't be used to define the derivative (first and second order). Instead they can be used to calculate derivatives on the assumption that the function is differentiable. Hence we may say that there is no way to define second derivative directly in terms of original function, but there is a way to calculate it (provided it exists) via a direct formula involving the original function. – Paramanand Singh Mar 24 '14 at 08:37
  • 1
    Indeed, some additional requirements, as suggested by the last comment in the answer, seem to be necessary. What if we define an equivalence relation $$x\sim y\iff \exists n\in\mathbb Z:x=2^n y$$ and then the function $$f(x)=\max\{k\in[0,1)\ \mid\ k\sim x\}\cdot x^3\,?$$ This function has $$\lim_{h\rightarrow 0}\frac{f(0)-2 f(h)+f(2h)}{h^2}=0$$ but has a very strange behaviour, with infinitely many discontinuity points accumulating around $0$, I think ... – String May 03 '14 at 23:25
  • @String Indeed, I have defined a similar function here. – Caleb Stanford May 04 '14 at 05:32
  • @Goos: Very nice! Your function works much better. I did not yet succeed inventing a function discontinuous at $x=0$. What you did there was what I was getting at. Your exchange of comments with the OP there about introducing different $\Delta x$'s can be elaborated upon, but it may not be entirely obvious how. I will get back to that later! – String May 04 '14 at 07:57
  • Any of these limits will demonstrate its “gotchas” on specific non-smooth functions, but, notably, for any truly smooth function they all result in the same value of $f''$. – Incnis Mrsi Nov 02 '14 at 11:21
4

This is considered in section 20 of Spivak's Calculus book. The definition the OP proposes for $f''(0)$ fails because, as they already suggest in the question, we won't be able to show that $f'$ exists in a neighborhood of $0$.

The example Spivak gives is $$f(x)=\begin{cases} 0 & x\in\mathbb{Q}\\ x^3 & x\in\mathbb{R}\setminus\mathbb{Q}\end{cases},$$ together with the polynomial $p(x)=0$. Since $$\lim_{x\rightarrow 0}\frac{f(x)-p(x)}{x^2}=0,$$ the OP's definition would give $f''(0)=0$. This cannot be right, however: $f$ is differentiable nowhere except at $0$, so $f'$ does not exist in any neighborhood of $0$.
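To check the limit: $|f(x)| \le |x|^3$ for every $x$, so $$ \left|\frac{f(x)-p(x)}{x^2}\right| \le \frac{|x|^3}{x^2} = |x| \to 0 \quad (x \to 0). $$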

Ivan Burbano
  • 1,258
1

What about something like so:

The second derivative of $f(x)$ is the limit, if it exists (i.e. if the value is independent of how $h_1$ and $h_2$ go to zero):

$$f''(x)=\lim_{\begin{matrix}h_1\rightarrow 0\\h_2\rightarrow 0\end{matrix}}\frac{f(x+h_1+h_2)+f(x)-f(x+h_1)-f(x+h_2)}{h_1h_2}.$$

That is, $f''(x)$ would be the value at $(0,0)$, if it exists, of a function of two variables: $$F_x(h_1,h_2)=\frac{f(x+h_1+h_2)+f(x)-f(x+h_1)-f(x+h_2)}{h_1h_2}.$$
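A sketch of why this agrees with the usual second derivative, under the stronger assumption that $f$ is $C^2$ near $x$: set $g(t) = f(t+h_2) - f(t)$, so that the numerator is $g(x+h_1) - g(x)$. Two applications of the mean value theorem give $$ F_x(h_1,h_2) = \frac{g(x+h_1)-g(x)}{h_1 h_2} = \frac{g'(\xi)}{h_2} = \frac{f'(\xi+h_2)-f'(\xi)}{h_2} = f''(\eta) $$ for some $\xi$ between $x$ and $x+h_1$ and some $\eta$ between $\xi$ and $\xi+h_2$. Continuity of $f''$ then gives $F_x(h_1,h_2) \to f''(x)$ as $(h_1,h_2) \to (0,0)$. Whether the existence of this two-variable limit conversely forces $f'$ to exist near $x$ is exactly the point raised in the comments below.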

The Quark
  • 267
  • awesome! Thanks for reviving my old unanswered question! :) Now, can we prove this is equivalent to the usual definition of f''? In particular, can we show that if the second derivative exists, the first derivative does too? – Caleb Stanford Jul 18 '23 at 18:49
  • @CalebStanford Maybe by integrating $f''(x)$ twice and showing that the result equals $f(x)$ up to an undetermined linear part. Then the first integration would give $f'(x)$. This is just a rough idea. – The Quark Jul 18 '23 at 21:46