26

A function is "differentiable" if it has a derivative. A function is "continuous" if it has no sudden jumps in it.

Until today, I thought these were merely two equivalent definitions of the same concept. But I've read some stuff today which seems to be claiming that this is not the case.

The obvious next question is "why?" Apparently somebody has already asked:

Are Continuous Functions Always Differentiable?

Several answers were given, but I don't understand any of them. In particular, Wikipedia and one of the replies above both claim that $|x|$ has no derivative. Can anyone explain this extremely unexpected result?

Edit: Apparently some people dislike the fact that this is non-obvious to me. To be clear: I am not saying that the result is untrue. (I'm sure many great mathematicians have analysed the question very carefully and are quite sure of the answer.) I am saying that it is extremely perplexing. (As a general rule, mathematics has a habit of doing that. Which is one of the reasons why we demand proof of everything.)

In particular, can anyone explain precisely why the derivative of $|x|$ at zero is not simply zero? After all, the function is neither increasing nor decreasing, which ought to mean the derivative is zero. Alternatively, the expression

$$\frac{|x + a| - |x - a|}{a}$$

becomes closer and closer to zero as $a$ becomes closer to zero when $x=0$. (In fact, it is exactly zero for all $a$!) Is that not how derivatives work?
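Here is a quick numerical sketch of what I mean (Python, just evaluating the expression above together with the ordinary one-sided quotients):

```python
# Evaluate the expression from the question at x = 0 for f(x) = |x|,
# alongside the ordinary one-sided difference quotients.
f = abs
x = 0.0
for a in (0.1, 0.01, 0.001, 0.0001):
    symmetric  = (f(x + a) - f(x - a)) / a   # the expression above: always 0
    from_right = (f(x + a) - f(x)) / a       # +1 for every a > 0
    from_left  = (f(x - a) - f(x)) / (-a)    # -1 for every a > 0
    print(f"a={a}: symmetric={symmetric}, right={from_right}, left={from_left}")
# The expression from the question is identically 0, while the one-sided
# quotients stay at +1 and -1 no matter how small a becomes.
```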

Several answers have suggested that the derivative is not defined here "because there would be a jump in the derivative at that point". This seems to assert that a continuous function must never have a discontinuous derivative; I'm not convinced that this is the case. Can anyone confirm or refute this argument?

  • 2
    What do you think the derivative of $|x|$ at $0$ is? The intuition is that "differentiable" means no sudden jumps, and no sharp corners. – mdp May 03 '12 at 15:09
  • 14
    Roughly speaking, a function is differentiable at a point if, the closer you zoom into that point, the more the function resembles a line (the slope of which is the derivative at that point). No matter how far you zoom in on $|x|$ at $x = 0$, it never resembles a line: it always looks like a corner. – Qiaochu Yuan May 03 '12 at 15:14
  • 1
    It turns out that in a well-defined sense, "most" continuous functions are nowhere differentiable. The intuition is not entirely reliable! That is a very strong argument for the detail with which one states and proves theorems. – André Nicolas May 03 '12 at 15:22
  • 17
    Here is some unasked-for advice: when several independent authorities agree on some fact that seems mistaken to me, it has often been my experience that the mistake is mine, not theirs. Then there is an opportunity to learn something new, which is not well-served by labeling the claim as "nonsensical". In my experience it has been more useful to ask "People say x, but this seems wrong to me because of y; what have I misunderstood?" Sometimes I reject the answer anyway, but usually it turns out that not all these people are dummies, and some of them understand better than I do. – MJD May 03 '12 at 15:29
  • 1
    @Qiaochu : I've always endeavored to give my students this intuition when teaching differentiability, but the concept of "zoom in" can be a bit subtle. When teaching with graphing calculators, you can easily get tripped up if you zoom in the "wrong way". For example, if you shrink the $x$ window but neglect to shrink the $y$ window, the absolute value function starts to look more and more like a straight (horizontal) line. More generally, what you see is sensitive to precisely how the horizontal and vertical rescaling is done. – Will Orrick May 03 '12 at 16:12
  • 1
    I think that mathematics is different than intuition. First of all, a function is continuous when it satisfies the definition of continuity. "Sudden jumps" may occur in the world of continuous functions! Moreover, a function is differentiable when it satisfies the definition of differentiability. The map $| \cdot |$ does not, and therefore it is not differentiable at $0$. I believe that "intuition" is highly overrated, when teaching analysis: have you ever spoken of intuition, when teaching/learning abstract algebra? – Siminore May 03 '12 at 16:27
  • 9
    @Mark Dominus's unasked for advice is excellent. As a professional mathematician reading the question, I had a visceral negative reaction to the word "nonsensical" which my conscious brain struggled to dampen. To paraphrase: something which doesn't (yet) make sense to you need not be nonsense. To assume -- even at the level of vocabulary -- otherwise is going to close a lot of doors that would otherwise have remained open. – Pete L. Clark May 03 '12 at 16:37
  • 1
    To answer the question in your final paragraph, the problem with the "derivative" of $|x|$ is not that it has a jump at $0$, but that there's no sensible way of giving it a value at $0$ at all, so it's undefined at that point. It's the undefinedness, not the lack of continuity that's the problem. It turns out though (i.e. it's a theorem) that the derivative of any differentiable function is continuous. You might be interested in finding out about "weak derivatives" - $|x|$ does have a weak derivative, and it's what you expect (apart from that you can give it any value you like at $0$). – mdp May 04 '12 at 09:38
  • 3
    @Matt: It's not true that the derivative of any differentiable function is continuous. If it were, we wouldn't need the term "continuously differentiable". An example of a differentiable function with discontinuous derivative is $x^2\sin(1/x)$, whose derivative exists everywhere but is discontinuous at $x=0$ (see also here). A derivative need not even be Riemann-integrable; see e.g. Volterra's function. However, a differentiable function is continuous. – joriki May 04 '12 at 12:21
  • @joriki Thanks for pointing that out, I had forgotten that such things may happen. Clearly my analysis is rusty, I definitely knew that example back in the day! – mdp May 04 '12 at 12:25
  • And here I was thinking that an integral is the opposite of a derivative. Note to self: Never, ever play with calculus. o_O – MathematicalOrchid May 04 '12 at 12:26
  • 3
    Nobody dislikes that it is not obvious to you. But what you said was that it was "nonsensical", which does not mean "not obvious" or "extremely perplexing". – MJD May 04 '12 at 12:56

8 Answers

36

Let's be clear: continuity and differentiability begin as a concept at a point. That is, we talk about a function being:

  1. Defined at a point $a$;
  2. Continuous at a point $a$;
  3. Differentiable at a point $a$;
  4. Continuously differentiable at a point $a$;
  5. Twice differentiable at a point $a$;
  6. Continuously twice differentiable at a point $a$;

and so on, until we get to "analytic at the point $a$" after infinitely many steps.

I'll concentrate on the first three and you can ignore the rest; I'm just putting it in a slightly larger context.

A function is defined at $a$ if it has a value at $a$. Not every function is defined everywhere: $f(x) = \frac{1}{x}$ is not defined at $0$, $g(x)=\sqrt{x}$ is not defined at negative numbers, etc. Before we can talk about how the function behaves at a point, we need the function to be defined at the point.

Now, let us say that the function is defined at $a$. The intuitive notion we want to refer to when we talk about the function being "continuous at $a$" is that the graph does not have any holes, breaks, or jumps at $a$. Now, this is intuitive, and as such it makes it very hard to actually check or test functions, especially when we don't have their graphs. So we need a definition that is mathematical, and that allows for testing and falsification. One such definition, apt for functions of real numbers, is:

We say that $f$ is continuous at $a$ if and only if three things happen:

  1. $f$ is defined at $a$; and
  2. $f$ has a limit as $x$ approaches $a$; and
  3. $\lim\limits_{x\to a}f(x) = f(a)$.

The first condition guarantees that there are no holes in the graph; the second condition guarantees that there are no jumps at $a$; and the third condition that there are no breaks (e.g., taking a horizontal line and shifting a single point one unit up would be what I call a "break").

Once we have this condition, we can actually test functions. It will turn out that everything we think should be "continuous at $a$" actually is according to this definition, but there are also functions that might seem like they ought not to be "continuous at $a$" under this definition but are. For example, the function $$f(x) = \left\{\begin{array}{ll} 0 & \text{if }x\text{ is a rational number,}\\ x & \text{if }x\text{ is not a rational number.} \end{array}\right.$$ turns out to be continuous at $a=0$ under the definition above, even though it has lots and lots of jumps and breaks. (In fact, it is continuous only at $0$, and nowhere else).
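To see why this function passes the test at $a=0$: since $|f(x)|\leq |x|$ for every $x$, given $\varepsilon\gt 0$ we can take $\delta=\varepsilon$ and get $$|f(x)-f(0)| = |f(x)|\leq |x|\lt \varepsilon\quad\text{whenever }0\lt|x|\lt\delta,$$ which is exactly the limit condition in the definition above.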

Well, too bad. The definition is clear, powerful, usable, and captures the notion of continuity, so we'll just have to let a few undesirables into the club if that's the price for having it.

We say a function is continuous (as opposed to "continuous at $a$") if it is continuous at every point where it is defined. We say a function is continuous everywhere if it is continuous at each and every point (in particular, it has to be defined everywhere). This is perhaps unfortunate terminology: for instance, $f(x) = \frac{1}{x}$ is not continuous at $0$ (it is not defined at $0$), but it is a continuous function (it is continuous at every point where it is defined), but not continuous everywhere (not continuous at $0$). Well, language is not always logical, we just learn to live with it (witness "flammable" and "inflammable", which mean the same thing).

Now, what about differentiability at $a$? We say a function is differentiable at $a$ if the graph has a well-defined tangent at the point $(a,f(a))$ that is not vertical. What is a tangent? A tangent is a line that affords the best possible linear approximation to the function, in such a way that the relative error goes to $0$. That's a mouthful, you can see this explained in more detail here and here. We exclude vertical tangents because the derivative is actually the slope of the tangent at the point, and vertical lines have no slope.

Turns out that, intuitively, in order for there to be a tangent at the point, we need the graph to have no holes, no jumps, no breaks, and no sharp corners or "vertical segments".

From that intuitive notion, it should be clear that in order to be differentiable at $a$ the function has to be continuous at $a$ (to satisfy the "no holes, no jumps, no breaks"), but it needs more than that. The example of $f(x) = |x|$ is a function that is continuous at $x=0$, but has a sharp corner there; that sharp corner means that you don't have a well-defined tangent at $x=0$. You might think the line $y=0$ is the tangent there, but it turns out that it does not satisfy the condition of being a good approximation to the function, so it's not actually the tangent. There is no tangent at $x=0$.

To formalize this we end up using limits: the function has a non-vertical tangent at the point $a$ if and only if $$\lim_{h\to 0}\frac{f(a+h)-f(a)}{h}\text{ exists}.$$ What this does is just saying "there is a line that affords the best linear approximation with a relative error going to $0$." Once you check, it turns out it does capture what we had above in the sense that every function that we think should be differentiable (have a nonvertical tangent) at $a$ will be differentiable under this definition. Again, turns out that it does open the door of the club for functions that might seem like they ought not to be differentiable but are. Again, that's the price of doing business.
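As a quick sanity check on this definition (a small numerical sketch, assuming Python; the function $x^2$ and the point $a=1$ are arbitrary choices of mine), the quotient for a function we expect to be differentiable settles down to a single number:

```python
# Difference quotients for f(x) = x**2 at a = 1; f'(1) should be 2.
def quotient(f, a, h):
    return (f(a + h) - f(a)) / h

f = lambda x: x ** 2
for h in (0.1, -0.1, 0.01, -0.01, 0.001, -0.001):
    print(h, quotient(f, 1.0, h))
# Output: 2.1, 1.9, 2.01, 1.99, 2.001, 1.999 (approximately) -- the quotients
# close in on the single value 2 from both sides, so the limit exists.
```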

A function is differentiable if it is differentiable at each point of its domain. It is differentiable everywhere if it is differentiable at every point (in particular, $f$ is defined at every point).

Because of the definitions, continuity is a prerequisite for differentiability, but it is not enough. A function may be continuous at $a$, but not differentiable at $a$.

In fact, functions can get very wild. In the late 19th century, it was shown that you can have functions that are continuous everywhere, but that do not have a derivative anywhere (they are "really spiky" functions).

Hope that helps a bit.


Added. You ask about $|x|$ and specifically, about considering $$\frac{|x+a|-|x-a|}{a}$$ as $a\to 0$.

I'll first note that you actually want to consider $$\frac{f(x+a)-f(x-a)}{2a}$$ rather than over $a$. To see this, consider the simple example of the function $y=x$, where we want the derivative to be $1$ at every point. If we consider the quotient you give, we get $2$ instead: $$\frac{f(x+a)-f(x-a)}{a} = \frac{(x+a)-(x-a)}{a} = \frac{2a}{a} = 2.$$ You really want to divide by $2a$, because that's the distance between the points $x+a$ and $x-a$.

The problem is that this is not always a good way of finding the tangent; if there is a well-defined tangent, then the quotient $$\frac{f(x+a)-f(x-a)}{2a}$$ will give the correct answer. However, it turns out that there are situations where this gives you an answer, but not the right answer, because there is no tangent.

Again: the tangent is defined to be the unique line, if one exists, in which the relative error goes to $0$. The only possible candidate for a tangent at $0$ for $f(x) = |x|$ is the line $y=0$, so the question is why this is not the tangent; the answer is that the relative error does not go to $0$. That is, the ratio between how big the error is if you use the line $y=0$ instead of the function (which is the value $|x|-0$) and the size of the input (how far we are from $0$, which is $x$) is always $1$ when $x\gt 0$, $$\frac{|x|-0}{x} = \frac{x}{x} = 1\quad\text{if }x\gt 0,$$ and is always $-1$ when $x\lt 0$: $$\frac{|x|-0}{x} = \frac{-x}{x} = -1\quad\text{if }x\lt 0.$$ That is: this line is not a good approximation to the graph of the function near $0$: even as you get closer and closer and closer to $0$, if you use $y=0$ as an approximation your error continues to be large relative to the input: it's not getting better and better relative to the size of the input. But the tangent is supposed to make the error get smaller and smaller relative to how far we are from $0$ as we get closer and closer to zero. That is, if we use the line $y=mx$, then it must be the case that $$\frac{f(x) - mx}{x}$$ approaches $0$ as $x$ approaches $0$ in order to say that $y=mx$ is "the tangent to the graph of $y=f(x)$ at $x=0$". This is not the case for any value of $m$ when $f(x)=|x|$, so $f(x)=|x|$ does not have a tangent at $0$. The "symmetric difference" that you are using is hiding the fact that the graph of $y=f(x)$ does not flatten out as we approach $0$, even though the line you are using is horizontal all the time. Geometrically, the graph does not get closer and closer to the line as you approach $0$: it's always a pretty bad error.
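A small numerical sketch of that relative error (assuming Python; the candidate slopes are arbitrary choices of mine) shows it refusing to shrink for $f(x)=|x|$, while it does shrink for a function that really does have a tangent:

```python
# Relative error (f(x) - m*x) / x of a candidate tangent line y = m*x at 0.
def relative_error(f, m, x):
    return (f(x) - m * x) / x

for m in (0.0, 0.5, 1.0):                      # a few candidate slopes
    errors = [relative_error(abs, m, x) for x in (0.1, -0.1, 0.001, -0.001)]
    print("f(x) = |x|, m =", m, errors)
# For each m the error is 1 - m for x > 0 and -1 - m for x < 0: it never
# tends to 0, so no line y = m*x is a tangent to |x| at 0.

errors = [relative_error(lambda x: x * x, 0.0, x) for x in (0.1, -0.1, 0.001, -0.001)]
print("f(x) = x^2, m = 0:", errors)
# Here the error equals x itself and does go to 0: y = 0 really is the
# tangent to y = x^2 at the origin.
```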

Arturo Magidin
  • 398,050
  • 2
    I know this is slightly pedantic, but you forgot that there exist $C^\infty$ functions that are not $C^\omega$. – kahen May 03 '12 at 17:42
  • 2
    @kahen: It's in the "and so on until we get to" (i.e., not so much forgot as didn't want to get into it). (-; – Arturo Magidin May 03 '12 at 17:51
  • 1
    I see. "Deliberately didn't mention" it is then. Fair enough. – kahen May 03 '12 at 18:10
  • 1
    Best answer yet. – MathematicalOrchid May 04 '12 at 08:57
  • I think it is worth pointing out that some authors allow a function to be differentiable at an accumulation point of the domain even when it is not defined there. – Michael Greinecker May 04 '12 at 09:15
  • @kahen: what's the difference between $C^\infty$ and $C^\omega$? What's the definition of the latter one? – Damian Sobota May 05 '12 at 12:25
  • An alternative way to think about the derivative of $f$ at $a$ is: "Is there a function $u_a(x)$, continuous at a, such that $f(x) \equiv u_a(x) (x - a) + f(a)$? If yes, then $f'(x) = u_a(a)$". Notice that $u_a(x)$ is uniquely defined everywhere except at $a$ where we want it to be continuous so we can pick a "nice" solution. It's easy to prove that continuity of $u_a$ implies continuity of $f$ so: every differentiable function is continuous. – WorldSEnder Oct 06 '18 at 02:12
  • is a "kink" behaviuor i.e like in the absolute value function the only obstruction we can have for a contious function that does not jump to be differentiable? or are there other behaviuors? – user123124 Apr 30 '20 at 09:46
  • @user1: Vertical tangents also provide obstructions: $y=x^{1/3}$ at $x=0$. – Arturo Magidin Apr 30 '20 at 15:10
  • Right, if you are up for a challenge! Try answering this: https://math.stackexchange.com/questions/3651609/all-the-ways-a-continuous-function-can-fail-to-be-differentiable/3651651#3651651 – user123124 May 01 '20 at 10:57
  • What is the difference between a jump and a break? – Alex Safayan Sep 12 '22 at 05:09
  • 1
    @Alex the paragraph after the definition describes what each of them are. – Arturo Magidin Sep 12 '22 at 12:50
12

The derivative, in simple words, is the slope of the function at the point. If you consider $|x|$ at $x > 0$ the slope is clearly $1$ since there $|x| = x$. Similarly, for $x<0$ the slope is $-1$. Thus, if you consider $x = 0$ then you cannot define the slope at that point, i.e. right and left directional derivatives do not agree at $x = 0$. So that's why the function is not differentiable at $x=0$.

Just to extend a perfect comment by Qiaochu to a more striking example: the sample path of a Brownian motion is continuous but nowhere differentiable (see the "Brownian scaling" illustration in Wikipedia's article on the Wiener process).

Note also that this curve exhibits a self-similarity property, so if you zoom in, it looks the same and never starts to resemble a line. Also, Brownian motion can be considered as a measure (even a probability distribution) on the space of continuous functions. The set of differentiable functions has measure zero under it. So one can say that it is very unlikely that a continuous function is differentiable (I guess that is what André meant in his comment).

SBF
  • 36,041
  • 1
    Your graph doesn't look to me like the usual Brownian motion: the variance of $y(x+h) - y(x)$ appears to be increasing as $x$ increases. – Robert Israel May 03 '12 at 17:44
  • @Robert: I changed the illustration, took it from Wikipedia's article on Wiener process. Hope that it shows that zooming does not lead us to a line. – SBF May 04 '12 at 09:49
  • 1
    So, to be clear, you're saying that all functions with infinite fine detail lack a derivative? – MathematicalOrchid May 04 '12 at 10:54
  • @MathematicalOrchid: to be clear, would you tell what do you mean with an infinite fine detail? – SBF May 04 '12 at 11:01
  • Touché. :-) OK, well how about this: Any function that possesses fractal self-similarity on all scales. – MathematicalOrchid May 04 '12 at 11:09
  • @MathematicalOrchid A straight line possesses it on all scales :) I think any other (continuous) curve would not be differentiable (but this is not a formal statement, of course) – SBF May 04 '12 at 11:50
  • 2
    Mathematics: Splitting hairs since 2,000 BC ;-) – MathematicalOrchid May 04 '12 at 11:59
7

The absolute value function has a derivative everywhere except at $x=0$. The reason there is no derivative at $x=0$ is that if the definition of the derivative is applied from the left, $$ \lim_{h\rightarrow0-}\frac{\lvert 0+h\rvert-\lvert0\rvert}{h}=-1, $$ you get a different answer than if it is applied from the right, $$ \lim_{h\rightarrow0+}\frac{\lvert 0+h\rvert-\lvert0\rvert}{h}=1. $$ Intuitively, the derivative is the slope of the tangent line, which changes abruptly at $x=0$. The graph of the derivative of $\lvert x\rvert$ would have a jump at $x=0$, and so would be discontinuous there. On the other hand, the absolute value function itself is continuous everywhere, including at $x=0$.

This example illustrates the fact that continuity does not imply differentiability. On the other hand, differentiability does imply continuity. Intuitively, this is because, in order for the quotient $\dfrac{f(x+h)-f(x)}{h}$ to have a limit as $h\rightarrow0$, we must have $f(x+h)\rightarrow f(x)$ as $h\rightarrow0$, which is the limit definition of continuity.

Edit: To respond to your edit, you ask a good question! What you are seeing is that you can get different answers to the question "What is the derivative of $\lvert x\rvert$ at $x=0$?" depending on how you set up the difference quotient before taking the limit: the limit from the left gives $-1$, the limit from the right gives $1$, and a symmetric quotient gives $0$. You can get other answers as well. For example, if we position our small interval around $x=0$ asymmetrically, $$ \lim_{h\rightarrow0^+}\frac{\lvert0+\frac{2}{3}h\rvert-\lvert0-\frac{1}{3}h\rvert}{h}, $$ we get $1/3$ (and $-1/3$ if $h$ approaches $0$ from the left).
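A short numerical sketch (assuming Python; the offsets $\tfrac23 h$ and $\tfrac13 h$ are the ones from the quotient above) makes this menu of possible answers concrete:

```python
# Different ways of setting up the difference quotient for f(x) = |x| at 0.
f, h = abs, 1e-6

print((f(0 + h) - f(0)) / h)               # from the right:        1.0
print((f(0 - h) - f(0)) / (-h))            # from the left:        -1.0
print((f(0 + h) - f(0 - h)) / (2 * h))     # symmetric:             0.0
print((f(0 + 2*h/3) - f(0 - h/3)) / h)     # asymmetric (above):    0.333...
# A point of differentiability would give the same value for all of these;
# here they disagree, which is the lack of robustness described below.
```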

If, at a certain point, all of these methods give the same answer, we say that the function is differentiable at that point. In some sense, the definition of derivative is robust at such points, since we can make natural modifications to it and still get the same result. On the other hand, if, by fine-tuning the procedure, we can get different answers, then we say that the function is not differentiable at that point - the definition of derivative is not so robust there, since by making natural modifications, we can get different answers.

You can probably imagine that at points where we have that robustness, we can prove all sorts of strong statements about the behavior of the function. At points where we don't have it, we can't prove so much. Hence it makes sense to invent a term to capture this distinction between the two types of point.

Will Orrick
  • 18,220
5

You got many answers to the general question. I'd like to spend a few words on your quotient $$\frac{|x+a|-|x-a|}{a}.$$ Let's use a more conventional notation: $$\frac{|x+h|-|x-h|}{h}.$$ Now, it is pretty easy to prove that $$\lim_{h \to 0} \frac{|h|-|-h|}{h}=0.$$ You ask why this does not imply that $x \mapsto |x|$ is differentiable at $x=0$. The answer is, on one hand, simple: you did not use the correct definition of derivative :-)

On the other hand, "your" definition is used in mathematics, under different names. In general, we can consider $$\lim_{h \to 0} \frac{f(x_0+h)-f(x_0-h)}{2h}, \tag{1}$$ and this limit coincides with $f'(x_0)$ provided that $f$ is differentiable at $x_0$. However, (1) may exist even though $f$ is not differentiable at $x_0$. The limit (1) is often called the symmetric derivative of $f$ at $x_0$.

Siminore
  • 35,136
2

Here is a function which can be written explicitly, in a simple form (i.e. not in terms of Brownian Motion). Take the following: $$f(x) = \sum_{n=0}^\infty \alpha^n \cos(\beta^n\pi x)$$ where $\alpha \in (0,1)$, $\alpha \beta \geq 1$. This is an example of a function which is continuous everywhere, but differentiable nowhere.

To prove continuity everywhere: each partial sum is continuous (being a finite sum of continuous functions), and the Weierstrass M-test shows that the series converges uniformly. The function, being the uniform limit of a sequence of continuous functions, is therefore continuous.

Proving that $f$ is nowhere differentiable is a bit more complicated. The most direct approach is to show explicitly that, at every point, the limit in the definition of the derivative does not exist. Another possible proof arises from Fourier analysis and roughly goes as follows: the function is expressed explicitly as a uniformly convergent Fourier cosine series; differentiate the partial sums termwise and show that the resulting sequence does not converge. Details are left out here, but this is another approach.
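For illustration, a rough numerical sketch of the direct approach (assuming Python; the choices $\alpha=\tfrac12$, $\beta=3$, and the truncation at $50$ terms are mine, and satisfy $\alpha\in(0,1)$, $\alpha\beta\geq 1$):

```python
import math

# Truncated Weierstrass-type series: f(x) = sum_{n=0}^{N} ALPHA**n * cos(BETA**n * pi * x).
ALPHA, BETA, N = 0.5, 3, 50

def f(x):
    return sum(ALPHA ** n * math.cos(BETA ** n * math.pi * x) for n in range(N + 1))

# Difference quotients at x = 0 with h = 3**(-k): rather than settling down to
# a limit, their magnitudes keep growing (roughly like (ALPHA * BETA)**k = 1.5**k),
# so the defining limit cannot exist there.
for k in range(1, 13):
    h = BETA ** (-k)
    print(k, (f(h) - f(0.0)) / h)
```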

This function is actually an extension of the original construction by Weierstrass, and the desired properties of this function were established by G.H. Hardy.

One of the comments alluded to the fact that in some well-defined sense, almost every continuous function is nowhere differentiable. If we restrict ourselves to the case of functions which are continuous on the compact interval $[0,1]$, this is in the sense of (classical) Wiener measure, but is likely well beyond the scope of this question.

(See this. Another example of a continuous, but nowhere differentiable function is the Blancmange Function.)

There's another interesting example, but this one might be even harder to justify than the Weierstrass function. The Devil's Staircase (i.e. the Cantor-Lebesgue function) is a function which is continuous, but is not differentiable at any point in the Cantor set. Further, the derivative is zero wherever it is defined.

This function can actually be further generalized to create a strictly monotone continuous function whose derivative exists almost everywhere, and whose derivative is zero where defined. (The set of points where the derivative is not defined contains the Cantor set).
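For illustration, here is a small sketch of the Cantor-Lebesgue function (assuming Python; the ternary-digit construction and the truncation depth are choices of mine):

```python
def cantor(x, depth=40):
    """Approximate the Cantor-Lebesgue ("Devil's Staircase") function on [0, 1]
    by reading off the ternary digits of x."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    y, scale = 0.0, 0.5
    for _ in range(depth):
        x *= 3.0
        digit = int(x)               # next ternary digit of x: 0, 1 or 2
        x -= digit
        if digit == 1:               # once a digit 1 appears, the value is determined
            return y + scale
        y += scale * (digit // 2)    # digit 2 contributes scale, digit 0 contributes nothing
        scale *= 0.5
    return y

print(cantor(1/3), cantor(1/4), cantor(0.5))   # approximately 0.5, 0.333..., 0.5
# On the middle-third interval (1/3, 2/3) the function is constant, so the
# difference quotient there is exactly 0: the derivative is 0 off the Cantor set.
h = 1e-4
print((cantor(0.5 + h) - cantor(0.5)) / h)     # 0.0
```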

2

So, the answer to this question really depends on your notion of differentiability. Let us start with the classical notion: a function $f(x)$ is differentiable at a point $x_0$ if the following limit exists: $$\lim_{h\rightarrow 0}\frac{f(x_0+h)-f(x_0)}{h}.$$ The other answers give a good explanation of why $|x|$ is not differentiable (in the classical way).

That said, there are other notions of differentiability that one may contemplate! For example, for the function you mentioned, $f(x)=|x|$, one may assign a derivative of $0$ at $x=0$ if we use a suitable generalization of the derivative. One way to do this is to note that the derivative of $|x|$ is $\pm 1$ according to whether $x$ is greater than or less than zero. But what happens at zero? Well, if you write this derivative function (which is undefined at $0$ for the moment) as a Fourier series and evaluate the series at $x=0$, the "derivative" you obtain is zero. (There are surely better ways of doing this, such as approximating the sign function by smooth functions.)
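A rough sketch of that Fourier idea (assuming Python, and using the standard Fourier sine series of the sign function on $(-\pi,\pi)$; the interval and the number of terms are choices of mine):

```python
import math

def sign_fourier_partial(x, n_terms=200):
    """Partial sum of the Fourier series of sign(x) on (-pi, pi):
    (4/pi) * sum over odd k of sin(k*x)/k."""
    return (4 / math.pi) * sum(math.sin(k * x) / k for k in range(1, 2 * n_terms, 2))

print(sign_fourier_partial(1.0), sign_fourier_partial(-1.0))  # close to +1 and -1
print(sign_fourier_partial(0.0))                              # exactly 0.0
# Every partial sum vanishes at x = 0 (each sine term is 0 there), so this
# regularisation assigns the value 0 to the "derivative" of |x| at the corner.
```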

If we are speaking about non-classical notions of differentiability, one may even differentiate discontinuous functions in the distributional sense. For instance, the (again non-classical) second derivative of $|x|$ is twice the Dirac delta distribution, which is no longer a real-valued function but a certain kind of limit of real-valued functions. The derivative of $|x|$ mentioned above is also of distributional type.

The Wikipedia article on distributional derivatives: http://en.wikipedia.org/wiki/Distribution_%28mathematics%29

Baby Dragon
  • 2,823
1

The basic concept of a derivative is slope: the derivative gives you the slope at any given point on the graph. So, as Matt hinted at, look at a graph that has a sharp corner on it. What is the slope at that corner? There isn't one; it is undefined. Having a sharp corner does not make the function discontinuous, but it does mean the function is not differentiable there. So to be differentiable, a function must be both smooth and continuous.

Hope that helps

Joseph Skelton
  • 1,159
1

Two ways in which a continuous function can fail to be differentiable (assuming it is a function whose input and output are each a real number); a short numerical sketch of both follows the list:

  • By having a vertical tangent, as in the case of $f(x) = \sqrt[3]{x}$ (the cube-root function), which has a vertical tangent at $x=0$.
  • By having a "sharp corner" in its graph, as in the case of $f(x)=|x|$, which has a sharp corner at $x=0$. At that point the slope abruptly changes from $-1$ to $+1$.
  • 1
    I think you mean "at least two ways". As written, your answer seems to suggest that every continuous function has a one-sided derivative in $[-\infty,\infty]$...which is certainly not true, as I'm sure you know. – Pete L. Clark May 04 '12 at 12:54
  • @PeteL.Clark : If one construes "sharp corner" somewhat broadly, I think that covers it. The "sawtooth" function that is nowhere differentiable and everywhere continuous has lots of sharp corners. – Michael Hardy May 04 '12 at 18:26
  • 1
    So you're saying that the function $f(x) = x \sin (1/x)$ for $x \neq 0$ and $f(0) = 0$ has a "sharp corner" at $x = 0$? If so, what's your definition of "sharp corner"? (I assumed you meant that the left and right handed derivatives both exist but are unequal, because that's what's happening in the example you give.) – Pete L. Clark May 04 '12 at 19:01
  • OK, I've rephrased it, since I'd rather not get into what the term "sharp corner" ought to mean. – Michael Hardy May 04 '12 at 21:44
  • Okay, then: +1. – Pete L. Clark May 04 '12 at 22:09