35

In calculus of one variable I read:

A function $f$ is differentiable on an open interval if it is differentiable at every number of the interval.

I wonder why the definition assumes that the interval is open. The same assumption appears in Rolle's theorem, the Mean Value Theorem, etc.

Do we have a notion of a function being differentiable on a closed interval $[a,b]$?

Ted Shifrin
  • 115,160
palio
  • 11,064

3 Answers

55

The problem is one of consistent definitions. Intuitively we can make sense of "differentiable on a closed interval", but it requires a slightly more careful phrasing of the definition of "differentiable at a point". I don't know which book you are using, but I am betting that it contains (some version of) the following (naive) definition:

Definition A function $f$ is differentiable at $x$ if $\lim_{y\to x} \frac{f(y) - f(x)}{y-x}$ exists and is finite.

To make sense of the limit, the textbook will often explicitly require that $f$ be defined on an open interval containing $x$. And if the definition of differentiability at a point requires $f$ to be defined on an open interval around the point, then differentiability on a set can only be stated for sets in which every point has such an open interval contained in the set. To illustrate, consider a function $f$ defined only on $[0,1]$, and suppose you try to determine whether $f$ is differentiable at $0$ by naively applying the above definition. Since $f(y)$ is undefined for $y<0$, the limit

$$ \lim_{y\to 0^-} \frac{f(y) - f(0)}{y} $$

is undefined, and hence the derivative cannot exist at $0$ using one particular reading of the above definition.

For this purpose some people use the notion of semi-derivatives or one-sided derivatives when dealing with boundary points. Others simply adopt the convention that, when speaking of closed intervals, the derivative at a boundary point is defined using the appropriate one-sided limit.
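
Concretely (a standard formulation, spelled out here rather than quoted from any particular textbook), the one-sided derivatives at the endpoints of $[a,b]$ are $$ f'_+(a) = \lim_{h\to 0^+} \frac{f(a+h)-f(a)}{h}, \qquad f'_-(b) = \lim_{h\to 0^-} \frac{f(b+h)-f(b)}{h}, $$ and under this convention "$f$ is differentiable on $[a,b]$" means that $f$ is differentiable on $(a,b)$ and both one-sided derivatives above exist.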


Your textbook is not just being pedantic, however. If one wishes to study multivariable calculus, the definition of differentiability that requires taking limits in all directions is much more robust than one built on one-sided limits. The main problem is that in one dimension, given a boundary point, there is clearly a "left" and a "right", and each occupies "half" of the available directions; this is no longer the case for domains in higher dimensions. Consider the domain

$$ \Omega = \{ (x,y) : y \leq \sqrt{|x|} \} \subsetneq \mathbb{R}^2$$

[figure: the region $\Omega$, bounded above by the curve $y = \sqrt{|x|}$]

A particular boundary point of $\Omega$ is the origin. However, from the origin, almost all directions point into $\Omega$ (the only one that doesn't is the one pointing straight up, in the positive $y$ direction). So the total derivative cannot be defined at the origin if a function $f$ is defined only on $\Omega$. But if you try to loosen the definition and consider only those directional derivatives that are defined, they may not patch together nicely at all. (A canonical example is the function $$f(x,y) = \begin{cases} 0 & y \leq 0 \\ \text{sgn}(x)\, y^{3/2} & y > 0\end{cases}$$ where $\text{sgn}(x)$ returns $+1$ if $x > 0$, $-1$ if $x < 0$, and $0$ if $x = 0$. Its graph looks like what happens when you tear a piece of paper.)

[figure: the graph of $f$, which looks like a torn sheet of paper]
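
To see the tearing quantitatively (a short computation not spelled out in the original answer), take the pair of boundary points $(t^2, t)$ and $(-t^2, t)$ of $\Omega$ for small $t > 0$: $$ \frac{|f(t^2,t) - f(-t^2,t)|}{|(t^2,t) - (-t^2,t)|} = \frac{2t^{3/2}}{2t^2} = \frac{1}{\sqrt{t}} \to \infty \quad \text{as } t \to 0^+. $$ So $f$ is not even Lipschitz near the origin, and in particular it cannot be the restriction of a continuously differentiable function on any neighbourhood of the origin, even though every directional derivative of $f$ at the origin that makes sense within $\Omega$ equals $0$.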


But note that this is mainly a failure of the original naive definition of differentiability (which, however, may be pedagogically more convenient). A much more general notion of differentiability can be defined:

Definition Let $S\subseteq \mathbb{R}$, and let $f$ be an $\mathbb{R}$-valued function defined on $S$. Let $x\in S$ be a limit point of $S$. Then we say that $f$ is differentiable at $x$ if there exists a linear function $L$ such that for every sequence of points $x_n\in S$ different from $x$ but converging to $x$, we have that $$ \lim_{n\to\infty} \frac{f(x_n) - f(x) - L(x_n-x)}{|x_n - x|} = 0 $$

This definition is a mouthful (and rather hard to teach in an introductory calculus course), but it has several advantages:

  1. It readily includes the case of the closed intervals.
  2. It doesn't even need intervals. For example, you can let $S$ be the set $\{0\} \cup \{1/n\}$, where $n$ ranges over all positive integers. Then $0$ is a limit point, and so you can ask whether a function defined on this set is differentiable at the origin.
  3. It easily generalises to higher dimensions, and vector valued functions. Just let $f$ take values in $\mathbb{R}^n$, and let the domain $S\subseteq \mathbb{R}^d$. The rest of the definition remains unchanged.
  4. It captures, geometrically, the essence of differentiation, which is "approximation by tangent planes".
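
As a quick check that this definition really does include the closed-interval case (a worked example, not part of the original answer): take $S = [0,1]$, $f(x) = x^2$, and $x = 0$. With the candidate linear map $L(h) = 0$, any sequence $x_n \in (0,1]$ converging to $0$ gives $$ \frac{f(x_n) - f(0) - L(x_n - 0)}{|x_n - 0|} = \frac{x_n^2}{x_n} = x_n \to 0, $$ so $f$ is differentiable at the endpoint $0$ in this sense, with derivative $0$, matching the one-sided derivative there.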

For this definition, you can easily add

Definition If $S\subseteq \mathbb{R}$ is such that every point $x\in S$ is a limit point of $S$, and $f$ is a real valued function on $S$, we say that $f$ is differentiable on $S$ if $f$ is differentiable at all points $x\in S$.

Note how this looks very much like the statement you quoted in your question. In the definition of pointwise differentiability we replaced the condition "$x$ is contained in an open neighborhood" by "$x$ is a limit point". And in the definition of differentiability on a set we just replaced the condition "every point has an open neighborhood" by "every point is a limit point". (This is what I meant by consistency: how you define pointwise differentiability necessarily affects how you define set differentiability.)


If you go on to study differential geometry, this issue resurfaces in the definitions of "manifolds", "manifolds with boundary", and "manifolds with corners".

Willie Wong
  • 73,139
  • 2
    I don't know if this is the right place to ask, but is there a way to bookmark an answer? – Prince Kumar Dec 06 '16 at 15:53
  • 1
    There is a link (if you right click on "share" and copy the link address) at the bottom of the post. That is a direct link to this answer. // For alternatives, it is better to ask on our [[meta]]. – Willie Wong Dec 06 '16 at 19:25
  • I'm a little bit confused by the definition that uses a linear function and has an absolute value bar in the denominator. I posted my question here (https://math.stackexchange.com/questions/4448552/derivative-at-a-boundary-or-limit-point-heine-definition-of-limit); can someone help me clarify this? Thank you. – wsz_fantasy May 12 '22 at 11:44
7

A function is differentiable on a set $S$ if it is differentiable at every point of $S$. This is the definition I have seen in beginning/classic calculus texts, and it mirrors the definition of continuity on a set.

So $S$ could be an open interval, closed interval, a finite set, in fact, it could be any set you want.

So yes, we do have a notion of a function being differentiable on a closed interval.

The reason Rolle's theorem talks about differentiability on the open interval $(a,b)$ is that it is a weaker assumption than requiring differentiability on $[a,b]$.

Theorems normally try to make their assumptions as weak as possible, so as to be more generally applicable.

For instance, the function:

$$f(x) = \begin{cases} x \sin \frac{1}{x} & x > 0 \\ 0 & x = 0 \end{cases}$$

is continuous at $0$, and differentiable everywhere except at $0$.
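
Indeed (a short check not spelled out in the original answer), the difference quotient of $f$ at $0$ is $$ \frac{f(h) - f(0)}{h} = \sin \frac{1}{h}, $$ which has no limit as $h \to 0^+$, so not even a one-sided derivative exists at $0$.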

You can still apply Rolle's theorem to this function on, say, the interval $[0,\frac{1}{\pi}]$: note that $f(0) = 0$ and $f(\frac{1}{\pi}) = \frac{1}{\pi}\sin\pi = 0$, and the hypotheses only require differentiability on the open interval $(0,\frac{1}{\pi})$. If the statement of Rolle's theorem required differentiability on the closed interval, then you could not apply it to this function.

Aryabhata
  • 82,206
  • 2
    How do you define differentiable? Generally the first definition involves taking the $\lim$ from both left and right, and if you are on the boundary, you are missing one side. – Willie Wong Mar 30 '12 at 07:52
  • 1
    @WillieWong: You can define differentiable by taking the intersection of the neighbourhood with the domain. So if the domain is $[a,b]$, you get the one sided limit. What is the problem with that? Are you saying, I am potentially answering the wrong question? It is possible. – Aryabhata Mar 30 '12 at 07:55
  • The problem with that is that this notion doesn't generalize well. In higher dimensions with domains with non-Lipschitz boundaries, for example, you can run into difficulties. And more fundamentally this is precisely the difference between a manifold and a manifold-with-boundary. – Willie Wong Mar 30 '12 at 08:06
  • @WillieWong: I presume the problem OP has is more basic. I do understand that this might not generalize. I do hope that you will add an answer with the generalization you have in mind! – Aryabhata Mar 30 '12 at 08:08
  • So why do we have a notion of a continuous function on a closed interval, that is, $f$ is continuous on $[a,b]$ if $f$ is continuous on $(a,b)$, $\lim_{x\to a^+}f(x)=f(a)$, and $\lim_{x\to b^-}f(x)=f(b)$? What prevents us from defining a similar notion of differentiability at the endpoints? – palio Mar 30 '12 at 08:11
  • @palio: Nothing prevents us, and that is what the answer says (or were you addressing the comment at Willie?). Willie's objections are for the more general cases. – Aryabhata Mar 30 '12 at 08:15
  • Ok so authors define differentiability on open intervals just for generalization considerations. – palio Mar 30 '12 at 08:22
  • 1
    @palio: As I said, you can always define differentiability over an arbitrary set. It is only that most theorems don't require differentiability at the end points. That is the reason the authors don't consider the closed interval when stating those theorems. Of course, the reason could be more fundamental, I suggest you read Willie's answer. – Aryabhata Mar 30 '12 at 08:23
  • "you can always define differentiability over an arbitrary set" not true. At the very least, for the definition of "derivative" to make sense, you need to be able to take limits. In other words, if $S$ is a subset of $\mathbb{R}$, you can only define the derivative at $S\cap \mathop{der}(S)$ (where $\mathop{der}$ is the derived set). Otherwise consider the one-point set $S = {0} = [0,0]$. There's no way to make sense of the notion of a derivative for a function defined just on $S$. – Willie Wong Mar 30 '12 at 08:34
  • @WillieWong: I am talking about the definition of differentiability over a set. You can impose the requirement you mention when talking about the derivative at a point. For a given set, as part of the definition, we talk about differentiability at each point, and your restrictions pop up there. I am not claiming that you can have an arbitrary domain for the function and then talk about limits etc. (btw, you could, but not sure if it would be of any use). It is just a matter of definition; for instance, we talk of $C^{1}[a,b]$, etc. – Aryabhata Mar 30 '12 at 08:55
  • We also talk about functions being differentiable at rationals, or finding functions which are differentiable on a given set of measure $0$ etc. By talking about differentiability on a set, we do not automatically restrict the domain... Of course, as I said earlier, depends on how you define differentiability on a set. – Aryabhata Mar 30 '12 at 09:02
  • 2
    $f\in C^1(F)$ where $F$ is closed almost always means that $f\in C^1(\mathrm{int}(F))$ and the derivatives extend continuously up to boundary. See this mathoverflow question for a funny example because of this convention. My point (see also the third part of my answer) is that how "differentiability over a set" can be defined is intimately tied to how "differentiability at a point" is defined. It only makes sense to ask "is $f$ differentiable on $S$" if a priori it is possible to say for each point of $S$ if $f$ is differentiable. – Willie Wong Mar 30 '12 at 09:03
  • @WillieWong: I am following the simple definition: A function is differentiable on a set if it is differentiable at every point of the set: exactly the way we define continuity on a set. Perhaps it is outdated (it is certainly not abnormal), but I think OP's question pertains to that, rather than some deeper problem. btw, I did read your answer, and you already have my upvote! – Aryabhata Mar 30 '12 at 09:06
  • @Arya: "A function is differentiable on a set if it is differentiable at every point of the set" -agreed. But by my reading of the OP's question, this sentence is somewhat "passing the buck" :) I see I've been reading your answer with my interpretation of the question in mind, which is probably why I had this nagging feeling. Under your (implied) interpretation, I agree that your answer is perfectly reasonable. – Willie Wong Mar 30 '12 at 09:15
  • @WillieWong: I agree, that is why I had hoped you would add an answer :-) – Aryabhata Mar 30 '12 at 09:18
3

Nice question, Palio. I was searching the net for the same question and its answer; here is what I understand. Take the example of the derivative: by definition it is $\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}$, where $h$ can be positive or negative. For the term $f(x+h)$ to make sense, the function $f$ must be defined "around" the point $x$, which mathematically means defining the function on an open interval.

Vikram
  • 5,580
  • Don't know if this helped Palio but I found this to be the simplest explanation that makes sense. – curryage Mar 29 '14 at 09:40
  • 1
    @curryage, thanx :) – Vikram Mar 29 '14 at 10:48
  • 1
    I was also thinking of the same, but then why can continuity be defined on [a,b]? For continuity as well, we take the limit tending to x, right? So we should define the function 'around' x in this case? – PGupta Oct 17 '18 at 10:23
  • @PGupta, continuity of a function F: R-->R, defined as an everywhere differentiable function with closed intervals depends on "one-sided limits" like Willie Wong has described. A different approach will be needed to determine continuity if it is stated that the real line is not connected, however. – ten1o Feb 06 '21 at 13:10