Let $f: \mathbb{R}^2 \rightarrow \mathbb{R}$ be a function such that the partial derivatives with respect to $x$ and $y$ exist and one of them is continuous. Prove that $f$ is differentiable.
-
Hint: show that this reduces to the case when the function depends on one variable only. – Julien Apr 25 '13 at 19:43
-
This is an interesting question. But you should have put more effort into asking it: motivation, personal thoughts, and so on. It is currently a very hot topic on meta: a lot of people think that questions which are plain copy/paste from homework or from a book should be banned. That is the reason for the downvotes, and for the close votes. – Julien Apr 25 '13 at 22:42
-
If you are curious about the downvotes, see this thread and that thread. – Julien Apr 25 '13 at 22:43
2 Answers
In short: the problem reduces to the easy case where $f$ depends on one variable only. The next paragraph gives the formula that does the reduction.
It suffices to show that $f$ is differentiable at $(0,0)$ under the additional assumption that $\frac{\partial f}{\partial x}(0,0)=\frac{\partial f}{\partial y}(0,0)=0$. First pass from $(x_0,y_0)$ to $(0,0)$ by considering the function $g(x,y)=f(x+x_0,y+y_0)$. Then work on $h(x,y)=g(x,y)-x\frac{\partial g}{\partial x}(0,0)-y\frac{\partial g}{\partial y}(0,0)$, where $\frac{\partial g}{\partial x}(0,0)=\frac{\partial f}{\partial x}(x_0,y_0)$ and $\frac{\partial g}{\partial y}(0,0)=\frac{\partial f}{\partial y}(x_0,y_0)$.
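Spelling out the reduction: $h$ has vanishing partial derivatives at $(0,0)$, and the two error terms coincide, since $$ h(x,y)-h(0,0)=f(x_0+x,y_0+y)-f(x_0,y_0)-x\frac{\partial f}{\partial x}(x_0,y_0)-y\frac{\partial f}{\partial y}(x_0,y_0). $$ Hence $f$ is differentiable at $(x_0,y_0)$ if and only if $h$ is differentiable at $(0,0)$ (necessarily with zero derivative), and the continuity assumption on $\frac{\partial f}{\partial x}$ transfers to $\frac{\partial h}{\partial x}$.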
So let us assume that $\frac{\partial f}{\partial x}$ exists and is continuous on $\mathbb{R}^2$ (only continuity in an open neighborhood of $(0,0)$ is really needed for the local argument), that $\frac{\partial f}{\partial y}$ exists at $(0,0)$, and that $\frac{\partial f}{\partial x}(0,0)=\frac{\partial f}{\partial y}(0,0)=0$. We need to show that $f$ is differentiable at $(0,0)$. Note that the derivative must be $0$ given our assumptions.
Now observe that for every $x,y$, we have, by the fundamental theorem of calculus (applied to $s\longmapsto f(s,y)$, whose derivative $s\longmapsto\frac{\partial f}{\partial x}(s,y)$ is continuous):
$$ f(x,y)=f(0,y)+\int_0^x \frac{\partial f}{\partial x}(s,y)ds. $$
I let you check that $(x,y)\longmapsto f(0,y)$ is differentiable at $(0,0)$ with zero derivative, using only $\frac{\partial f}{\partial y}(0,0)=0$: for $y\neq0$ the corresponding quotient is at most $\frac{|f(0,y)-f(0,0)|}{|y|}$, which tends to $0$, and it vanishes when $y=0$. For the other term, just note that it is $0$ at $(0,0)$ and that for every $0<\sqrt{x^2+y^2}\leq r$ $$ \frac{1}{\sqrt{x^2+y^2}}\Big|\int_0^x \frac{\partial f}{\partial x}(s,y)\,ds\Big|\leq \frac{|x|}{\sqrt{x^2+y^2}}\sup_{\sqrt{s^2+t^2}\leq r}\Big| \frac{\partial f}{\partial x}(s,t)\Big|\leq \sup_{\sqrt{s^2+t^2}\leq r}\Big| \frac{\partial f}{\partial x}(s,t)\Big|. $$ By continuity of $\frac{\partial f}{\partial x}$ at $(0,0)$ and since $\frac{\partial f}{\partial x}(0,0)=0$, the right-hand side tends to $0$ as $r\to0$, so the left-hand side tends to $0$ as $(x,y)$ tends to $(0,0)$. This proves that the function $(x,y)\longmapsto \int_0^x \frac{\partial f}{\partial x}(s,y)\,ds$ is differentiable at $(0,0)$ with zero derivative. And this concludes the proof.
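As a quick numerical sanity check (not part of the proof), consider the illustrative function $f(x,y)=x^2+y^2\sin(1/y)$ for $y\neq0$ and $f(x,y)=x^2$ for $y=0$; this example is my own choice, not taken from the answer above. Here $\frac{\partial f}{\partial x}=2x$ is continuous, while $\frac{\partial f}{\partial y}$ exists everywhere but is discontinuous along $y=0$, so the classical "both partials continuous" criterion does not apply at the origin, yet the hypotheses above do. Both partials vanish at $(0,0)$, so the theorem predicts $|f(h,k)-f(0,0)|/\sqrt{h^2+k^2}\to0$; the sketch below checks this numerically.

```python
import numpy as np

# Illustrative test function (my own example, not from the answer):
# f_x = 2x is continuous everywhere, while f_y exists everywhere but is
# discontinuous along the line y = 0.
def f(x, y):
    return x**2 + (y**2 * np.sin(1.0 / y) if y != 0.0 else 0.0)

# Both partials vanish at (0, 0), so differentiability there means
# |f(h, k) - f(0, 0)| / sqrt(h^2 + k^2) -> 0 as (h, k) -> (0, 0).
for n in range(1, 8):
    t = 10.0 ** (-n)
    for h, k in [(t, t), (t, -t), (t, 0.0), (0.0, t)]:
        ratio = abs(f(h, k) - f(0.0, 0.0)) / np.hypot(h, k)
        print(f"h={h:+.1e}, k={k:+.1e}:  remainder/norm = {ratio:.3e}")
```

The printed ratios are bounded by a constant times $\sqrt{h^2+k^2}$ and tend to $0$, consistent with differentiability at the origin with zero derivative.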

-
Well, this is one of the reasons I think we shouldn't close questions like these! – Kasper Apr 25 '13 at 22:52
-
@Kasper Yes, I'm glad the question was asked. I only knew that if all partial derivatives are continuous, then the function is differentiable, and I first looked for a counterexample... So I've learned something again today. – Julien Apr 25 '13 at 22:57
-
Can this be generalized to $f: \mathbb{R}^n \rightarrow \mathbb{R}$ with all but one continuous partial derivative? – Marco Flores Jan 26 '15 at 01:32
-
Shouldn't it be that one of the partial derivatives and the function are continuous at the required point? – Akshit Sep 23 '16 at 05:50
-
@Akshit The 'counterexample' you mentioned is when both partial derivatives exist at the point $(x_0,y_0)$ and the one-variable function $y\to f_x(x_0,y)$ is continuous. Is $f_x$ actually continuous at $(x_0,y_0)$? – Cyriac Antony Apr 09 '18 at 15:23
-
@MarcoFlores The generalization to $f: \mathbb{R}^n \to \mathbb{R}$ can be found in Apostol (quoted here). – Guillaume F. Jul 16 '18 at 03:47
-
@MarcoFlores: And the theorem in Apostol only requires continuity of the $n-1$ partials at the point, not in a neighbourhood. – Hans Lundmark Aug 23 '18 at 14:28
I shall try to give a proof similar to the one given in Thomas' Calculus for the statement "If both partial derivatives exist and are continuous in a neighborhood of the point $(x_0,y_0)$, then $f$ is differentiable at that point."
Let $f_y$ exist at $(x_0,y_0)$, and let $f_x$ exist in a neighborhood of $(x_0,y_0)$ and be continuous at $(x_0,y_0)$. It suffices to prove that $f(x_0+h,y_0+k)=f(x_0,y_0)+hf_x(x_0,y_0)+kf_y(x_0,y_0)+h\epsilon_1+k\epsilon_2$, where $\epsilon_1\to0$ and $\epsilon_2\to0$ as $(h,k)\to(0,0)$, for all $(x_0+h,y_0+k)$ in a neighborhood of $(x_0,y_0)$.
$f(x_0+h,y_0+k)-f(x_0,y_0)=f(x_0+h,y_0+k)-f(x_0,y_0+k)+f(x_0,y_0+k)-f(x_0,y_0)$.
Since $\frac{\partial f}{\partial y}$ exists at $(x_0,y_0)$, $$\frac{\partial f}{\partial y}(x_0,y_0)=\lim_{k\to0}\frac{f(x_0,y_0+k)-f(x_0,y_0)}{k}$$ and hence $$\lim_{k\to0}\frac{f(x_0,y_0+k)-f(x_0,y_0)-k\frac{\partial f}{\partial y}(x_0,y_0)}{k}=0.$$ Defining $\epsilon_2$ as the fraction in the last equation (and $\epsilon_2:=0$ when $k=0$), we get $f(x_0,y_0+k)-f(x_0,y_0)=k\frac{\partial f}{\partial y}(x_0,y_0)+k\epsilon_2$, where $\epsilon_2\to0$ as $k\to0$. As $\epsilon_2$ is a function of $k$ alone, it is clear that $\epsilon_2\to0$ as $(h,k)\to(0,0)$.
To handle the first difference, for a fixed $k$ we define $g(x)=f(x,y_0+k)$. Since $\frac{\partial f}{\partial x}$ exists in a neighborhood of $(x_0,y_0)$, $g$ is differentiable, and hence continuous, on an interval around $x_0$ (for $h$ and $k$ small enough), with $g'(x)=\frac{\partial f}{\partial x}(x,y_0+k)$. Applying the mean value theorem to $g$ on the interval with endpoints $x_0$ and $x_0+h$, we obtain a point $c_h$ between $x_0$ and $x_0+h$ such that $g(x_0+h)-g(x_0)=g'(c_h)h$. That is, $f(x_0+h,y_0+k)-f(x_0,y_0+k)=h\frac{\partial f}{\partial x}(c_h,y_0+k)$. Now as both $h$ and $k$ approach $0$, $c_h\to x_0$ and $y_0+k\to y_0$. We claim that the difference $\epsilon_1:=\frac{\partial f}{\partial x}(c_h,y_0+k)-\frac{\partial f}{\partial x}(x_0,y_0)$ approaches $0$ as $(h,k)\to(0,0)$.
To prove that $\epsilon_1\to0$ as $(h,k)\to(0,0)$, it suffices, by continuity of $\frac{\partial f}{\partial x}$ at $(x_0,y_0)$, to show that $(c_h,y_0+k)\to(x_0,y_0)$ as $(h,k)\to(0,0)$. Since $c_h$ lies between $x_0$ and $x_0+h$, we have $|c_h-x_0|\leq|h|\leq\sqrt{h^2+k^2}$, so $c_h\to x_0$; and clearly $y_0+k\to y_0$.
(I was not quite convinced that the double limit even exists. I asked it as a question and @Theoretical Economist gave this answer. It is his explanation that you find above.)
Thus $f(x_0+h,y_0+k)-f(x_0,y_0+k)=h\frac{\partial f}{\partial x}(x_0,y_0)+h\epsilon_1$ where $\epsilon_1\to0$ as $(h,k)\to(0,0)$.
Thus we get $f(x_0+h,y_0+k)-f(x_0,y_0)=hf_x(x_0,y_0)+kf_y(x_0,y_0)+h\epsilon_1+k\epsilon_2$, where $\epsilon_1\to0$ and $\epsilon_2\to0$ as $(h,k)\to(0,0)$, for all $(x_0+h,y_0+k)$ in a suitable neighborhood of $(x_0,y_0)$.
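To pass from this increment form to differentiability itself, divide by $\sqrt{h^2+k^2}$ and use $|h|\leq\sqrt{h^2+k^2}$ and $|k|\leq\sqrt{h^2+k^2}$: $$ \frac{\big|f(x_0+h,y_0+k)-f(x_0,y_0)-hf_x(x_0,y_0)-kf_y(x_0,y_0)\big|}{\sqrt{h^2+k^2}}\leq|\epsilon_1|+|\epsilon_2|\to0 \quad\text{as }(h,k)\to(0,0), $$ which is exactly the definition of differentiability of $f$ at $(x_0,y_0)$.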

-
You proved the increment theorem; I think differentiability implies the increment theorem. Is the converse true? – MEET PATEL Aug 15 '22 at 13:08