7

This question is intended to rekindle this old question, which was apparently very hard and didn't receive a satisfactory answer. I'm aware that the hope of a definitive answer is quite slim, but I'm still very curious to know:

Suppose $f:U\to \Bbb R$ is a function, where $U$ is an open subset of $\Bbb R^n$, $x_0\in U$, and the partial derivatives $\partial_i|_{x_0}f,\,i=1,\cdots,n$ exist. What regularity conditions on $\partial_i|_{x_0}f,\,i=1,\cdots,n$ are necessary and sufficient for $f$ to be differentiable at $x_0$?

The mere existence of $\partial_i|_{x_0}f$ is far from enough. The continuity of all of them is sufficient but stronger than necessary. The weakest sufficient condition I know of is this one, namely the continuity of all but one of them, which is nevertheless still stronger than necessary.

Given that we can easily construct a function that is differentiable at $x_0$ yet has all of its partial derivatives discontinuous at $x_0$, such as $$f(x,y)=\begin{cases}(x^2+y^2)\sin\left(\dfrac{1}{\sqrt{x^2+y^2}}\right) & \text{ if $(x,y) \ne (0,0)$}\\0 & \text{ if $(x,y) = (0,0)$},\end{cases}$$ I believe this problem is intrinsically hard.
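Indeed, for this example a quick computation gives, away from the origin, $$\partial_x f(x,y)=2x\sin\left(\frac{1}{\sqrt{x^2+y^2}}\right)-\frac{x}{\sqrt{x^2+y^2}}\cos\left(\frac{1}{\sqrt{x^2+y^2}}\right),$$ which has no limit as $(x,y)\to(0,0)$ (along the positive $x$-axis it equals $2x\sin(1/x)-\cos(1/x)$), while $\partial_x f(0,0)=\lim_{h\to 0}h\sin(1/|h|)=0$; by symmetry the same holds for $\partial_y f$. On the other hand, $|f(x,y)|\le x^2+y^2=o\left(\sqrt{x^2+y^2}\right)$, so $f$ is differentiable at the origin with zero gradient.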

Has any research been done that can shed some light on this complicated problem?

Vim
  • 13,640

2 Answers

4

This is not a direct answer to your question, but I searched high and low in Dieudonné's Treatise on Analysis, and he gives the following exercise (in the Banach space setting, but no different):

Suppose $f\colon\Bbb R^2\to\Bbb R$ is continuous. $f$ is differentiable at $(a,b)$ if and only if

  1. The partial derivatives of $f$ exist at $(a,b)$.
  2. For every $\epsilon>0$ there is $\delta>0$ so that $|s|,|t|<\delta$ imply $$|f(a+s,b+t)-f(a+s,b)-f(a,b+t)+f(a,b)|\le \epsilon(|s|+|t|).$$

(He goes on to remark that the latter condition is satisfied if, say, $\dfrac{\partial f}{\partial x}(a,b)$ exists and there's a neighborhood $V$ of $(a,b)$ so that $\dfrac{\partial f}{\partial y}$ exists on $V$ and is continuous.)
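For reference, a sketch of the "if" direction (the standard telescoping argument; Dieudonné leaves the equivalence as an exercise): write $$\begin{aligned} &f(a+s,b+t)-f(a,b)-s\frac{\partial f}{\partial x}(a,b)-t\frac{\partial f}{\partial y}(a,b)\\ &\quad=\bigl[f(a+s,b+t)-f(a+s,b)-f(a,b+t)+f(a,b)\bigr]\\ &\qquad+\Bigl[f(a+s,b)-f(a,b)-s\frac{\partial f}{\partial x}(a,b)\Bigr]+\Bigl[f(a,b+t)-f(a,b)-t\frac{\partial f}{\partial y}(a,b)\Bigr]. \end{aligned}$$ The first bracket is at most $\epsilon(|s|+|t|)$ by condition 2, and the other two are $o(|s|)$ and $o(|t|)$ by condition 1, so the left-hand side is $o(|s|+|t|)$, which is differentiability at $(a,b)$. The "only if" direction follows from the same identity read in reverse.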

Ted Shifrin
  • 115,160
  • Regarding the final remark you make: I seem to need only the existence information and the continuity of the $y$ derivative to obtain condition 2. Am I missing something? Why did I not need the $x$ partial derivative information at all? – me10240 Oct 26 '18 at 19:54
  • @me10240: Yes, but note this is only an "if" and not an "only if" statement. This is a standard result. You do the standard argument, being sure to vary $x$ first and then vary $y$. – Ted Shifrin Oct 26 '18 at 20:15
  • I am sorry, I got lost at your second remark. I used the MVT for the first two terms, then adjusted with $tD_yf(a,b)$. Then I used continuity of $D_yf(a,b)$ for one term, and existence of $D_yf(a,b)$ for the other. So I do not understand what you mean by vary $x$. – me10240 Oct 26 '18 at 20:23
  • If you use the MVT you'll need the existence of $D_xf(x,b)$ for all $x$. Just use the linear approximation at $x=a$ to estimate $f(a+s,b)-f(a,b)$. – Ted Shifrin Oct 26 '18 at 20:26
  • $f(a+s, b+t)-f(a+s,b) = tD_yf(a+s, y_1)$, which also involves the $y$ derivative and its existence in $V$. Thereafter, I estimate $| D_yf(a+s, y_1) - D_yf(a, b)|$ from continuity of the $y$ derivative at $(a,b)$, and $f(a, b+t) - f(a,b) -tD_yf(a,b)$ from existence of the $y$ derivative. My point was, am I making a mistake somewhere, since I never used any information about $D_xf$; everything I used had to do with $D_yf$? – me10240 Oct 26 '18 at 20:35
  • @me10240: If you pay careful attention, you don't get the precise estimate in the statement. What are the errors in your estimates? – Ted Shifrin Oct 27 '18 at 00:50
  • I did get only $\epsilon|t|$ on the RHS, but I thought that is sufficient. Could you give me a hint on where I went wrong, and how to do it correctly? – me10240 Oct 27 '18 at 00:58
  • I wanted to make my question clear in case we have been talking at cross-purposes. If I am given, as you said, that $\frac{\partial f}{\partial x}(a,b)$ exists and there's a neighborhood $V$ of $(a,b)$ so that $\frac{\partial f}{\partial y}$ exists on $V$ and is continuous, I am NOT trying to prove differentiability of $f$ under the given conditions; I was trying to derive condition 2 that you mentioned above from Dieudonné. – me10240 Oct 27 '18 at 02:36
  • @me10240: Yes, I understand. I am very suspicious about your proof, but I understand. I'm thinking about possible counterexamples. – Ted Shifrin Oct 27 '18 at 17:06
  • @me10240: As I suspected, your control over the $y$-partial derivative may be very badly behaved with respect to $x$. Try your proof carefully with $f(x,y) = \sqrt{|x|}(y+1)$ at the origin. – Ted Shifrin Oct 27 '18 at 20:30
  • I was adding too many questions as comments, so I have asked a fresh question here: https://math.stackexchange.com/questions/2974295/a-necessary-condition-for-differentiability-in-multivariable-calculus Please take a look; it would be nice if you could write your explanation as an answer there. – me10240 Oct 28 '18 at 05:38
2

I also am not aware of any necessary and sufficient condition on the partial derivatives that implies differentiability, and the theorem in the link is the best result I know. One thing that is important to observe is that differentiability at $x_0$ implies that all directional derivatives $\frac{\partial f}{\partial v}(x_0)$ exist and that $$ \frac{\partial f}{\partial v}(x_0)=\sum_{i=1}^n \frac{\partial f}{\partial x_{i}} (x_0)v_i $$ for every $v\in\mathbb{R}^n$. However, this condition is only necessary and not sufficient. On the other hand, if $f$ is Lipschitz continuous, then it also becomes sufficient. This is all I know about this.
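To see that the condition above is not sufficient on its own, consider the standard example $$f(x,y)=\frac{x^3y}{x^6+y^2}\ \text{ for }(x,y)\ne(0,0),\qquad f(0,0)=0.$$ Every directional derivative at the origin exists and equals $0$, so the formula holds with zero gradient, yet $f$ is not even continuous at the origin (along the curve $y=x^3$ it is identically $\frac12$), hence not differentiable there (and, consistently, not Lipschitz near the origin).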

Gio67
  • 20,905
  • By Lipschitz do you mean a pointwise Lipschitz condition at $x_0$ or a global one on an open neighborhood? Also, would you kindly provide some references for your last claim? – Vim May 01 '17 at 02:36
  • I was thinking of an open neighborhood, but maybe a local one, $|f(x)-f(x_0)|\le L|x-x_0|$ for all $x$ near $x_0$, could be enough. – Gio67 May 01 '17 at 02:42