44

Exactly the title: can you take the derivative of a function at infinity?

I asked my maths teacher, and while she thought it was an original question, she didn't know the answer, and I couldn't find anything online about this.

Maybe this is just me completely misunderstanding derivatives and functions at infinity, but to me, a high schooler, it makes sense that you can. For example, I'd imagine that a function with a horizontal asymptote would have a derivative of zero at infinity.

Han
  • 1
    Are you asking in the context of standard calculus, or also in a more advanced context (e.g. complex analysis)? – user Apr 21 '18 at 17:25
  • 1
    @gimusi standard calculus - I've only just learnt about limits and derivatives, and we haven't even touched on the subject of complex numbers in class, so nothing very advanced – Han Apr 21 '18 at 17:30
  • 2
    Thanks, maybe you should clarify that in your OP since many users didn’t understand that and are giving answers based on more advanced topics. – user Apr 21 '18 at 17:49
  • 18
    @gimusi The question, as stated, is quite clear: a student, who has just learned about limits and derivatives, wants to know if a function can meaningfully be said to have a derivative at infinity. There is already a very good answer which addresses this question. Please don't ask the original poster to edit their question so that your answer more closely aligns with the question. It would be better to edit your answer so that it more accurately addresses the question. – Xander Henderson Apr 21 '18 at 17:52

1 Answer

65

In a very natural sense, you can! If $\lim_{x \to \infty} f(x) = \lim_{x \to -\infty} f(x) = L$ is some real number, then it makes sense to define $f(\infty) = L$, where we identify $\infty$ and $-\infty$ in something called the one-point compactification of the real numbers (making it look like a circle).

In that case, $f'(\infty)$ can be defined as $$f'(\infty) = \lim_{x \to \infty} x \big(f(x) - f(\infty)\big).$$ When you learn something about analytic functions and Taylor series, it will be helpful to notice that this is the same as differentiating $f(1/x)$ at zero.

Notice that this is actually not the same as $\lim_{x \to \infty} f'(x)$.
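As a quick check, take $f(x) = 1/x$ (an example also worked out in the comments below). Then $f(\infty) = 0$, and

$$f'(\infty) = \lim_{x \to \infty} x \left(\frac{1}{x} - 0\right) = 1, \qquad \text{whereas} \qquad \lim_{x \to \infty} f'(x) = \lim_{x \to \infty} \left(-\frac{1}{x^2}\right) = 0.$$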

These ideas actually show up quite a bit in the study of analytic capacity, so this is a rather useful notion to have.


I wanted to expand this answer a bit to explain why this is the "correct" generalization of differentiation at infinity, and to address some points raised in the comments.

Although $\lim_{x \to \infty} f'(x)$ might feel like the natural object to study, it is quite badly behaved. There are functions which decay very quickly to zero and have horizontal asymptotes, but where $f'$ is unbounded as we tend to infinity; consider something like $\sin(x^a) / x^b$ for various $a, b$. Furthermore, $\lim_{x \to \infty} f'(x) = 0$ is not sufficient to guarantee a horizontal asymptote, as $\sqrt{x}$ shows.
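To make the first point concrete, one possible choice of parameters is $a = 3$, $b = 1$:

$$f(x) = \frac{\sin(x^3)}{x} \to 0 \text{ as } x \to \infty, \qquad f'(x) = 3x\cos(x^3) - \frac{\sin(x^3)}{x^2},$$

so $f$ has the horizontal asymptote $y = 0$ while $f'(x)$ is unbounded. In the other direction, $\frac{d}{dx}\sqrt{x} = \frac{1}{2\sqrt{x}} \to 0$, yet $\sqrt{x} \to \infty$ and there is no horizontal asymptote.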

So why should we consider the definition I proposed above? Consider the natural change of variables interchanging zero and infinity*, swapping $x$ and $1/x$. Then if $g(x) := f(1/x)$ we have the relationship

$$\lim_{x \to 0} \frac{g(x) - g(0)}{x} = \lim_{x \to \infty} x \big(f(x) - f(\infty)\big).$$
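(To verify this identity, just write $t = 1/x$: then

$$\frac{g(x) - g(0)}{x} = \frac{f(1/x) - f(\infty)}{x} = t\big(f(t) - f(\infty)\big),$$

and $x \to 0$ corresponds exactly to $t \to \infty$ in the one-point sense.)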

That is to say, $g'(0) = f'(\infty)$. Now via this change of variables, neighborhoods of zero for $g$ correspond to neighborhoods of $\infty$ for $f$. So if we think of the derivative as a measure of local variation, we now have something that actually plays the correct role.

Finally, we can see from this that this definition of $f'(\infty)$ gives the coefficient $a_1$ in the Laurent series $\sum_{i \ge 0} a_i x^{-i}$ of $f$. Again, this corresponds to our idea of what the derivative really is.
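Explicitly, if $f(x) = a_0 + \frac{a_1}{x} + \frac{a_2}{x^2} + \cdots$ for large $x$, then $f(\infty) = a_0$, and

$$x\big(f(x) - f(\infty)\big) = a_1 + \frac{a_2}{x} + \frac{a_3}{x^2} + \cdots \to a_1 \quad \text{as } x \to \infty,$$

which is precisely the definition above.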

* This is one of the reasons why I used the one-point compactification above. Otherwise, everything that follows must be a one-sided limit or a one-sided derivative.

  • 7
    If the downvoter would share the reason for their vote, I'd greatly appreciate it. If there are any improvements to be suggested, I would be happy to hear them. –  Apr 21 '18 at 18:21
  • 3
    Do you mind elaborating a bit for a tad more advanced readers? $f(\infty)$ is hiding another limit, so a natural question is whether it is OK to do it first, at least if you know the outer limit converges? Also, do you really need to compactify the field? It seems the definition does not need it. +1 anyway. – kabanus Apr 21 '18 at 19:00
  • 1
    @kabanus Sure, and thanks. So the context I'm most familiar with this is in the Riemann sphere, where the compactification does make things a lot nicer. Strictly speaking, it's not necessary (and certainly not in $\mathbb{R}$) - you can define directionally based derivatives like $f'(+\infty)$ and $f'(-\infty)$ without identifying the endpoints. –  Apr 21 '18 at 19:02
  • @T.Bongers I downvoted since your answer does not address the OP, who is asking for a clarification in the context of basic calculus and not in a more advanced context. – user Apr 21 '18 at 19:14
  • 2
    @gimusi For what it's worth, my answer doesn't actually require any advanced concepts beyond knowing what $\lim_{x \to \infty}$ means. In fact, if you strip out the part about compactifications, we can simply define $$f'(+\infty) = \lim_{x \to \infty} x \big(f(x) - \lim_{y \to \infty} f(y)\big)$$ and likewise for $f'(-\infty)$. This represents an exact analogue of the progression between $\lim_{x \to a}$ and $\lim_{x \to \infty}$: rather than looking at close-by points and studying the variation of a function, we zoom out and look at the long-term behavior. –  Apr 21 '18 at 19:17
  • @T.Bongers Your answer doesn't address the main point of the OP, which is "Can you differentiate a function at infinity?"; you didn't give an answer to that point taking into account the high school context of the OP. – user Apr 21 '18 at 19:20
  • 8
    I don't get what one-point compactification is or Taylor series are, but aside from that, the answer is pretty understandable for me. – Han Apr 21 '18 at 19:27
  • 19
    @gimusi If a kindergartener asked "Can I subtract 5 from 3?", it would be very reasonable to answer "No. If you have three apples and try to give away five of them, you can't do that. Hence 5 can't be taken from 3." This would be an okay answer in the context of kindergarten mathematics. However, a better answer would be to point out that numbers can be generalized beyond cardinalities of sets, giving us negative numbers. This answer is the equivalent: it points out that the naive answer is "No," but that there is a very reasonable way of making things make sense. That answer is good. – Xander Henderson Apr 21 '18 at 19:31
  • 1
    @XanderHenderson I agree with you; indeed, in this answer I can't see the first part, which should be a "No, we can't", and I only read the second part, which is "In a very natural sense, you can!". For this reason I think that it is an incomplete answer. – user Apr 21 '18 at 19:35
  • 3
    Why is the derivative of $f(1/x)$ at zero more natural than $\lim_{x\to\infty} f'(x)$ as the expression for "derivative at infinity"? This seems counter-intuitive. – Ruslan Apr 21 '18 at 21:05
  • 3
    Even at a finite point (say $0$), $f'(0)$ is not the same as $\lim_{x\to 0} f'(x)$. The latter might not exist even when $f'(0)$ exists. @Ruslan –  Apr 21 '18 at 21:41
  • 2
    In what sense does this generalize the concept of a derivative? I can see how you could decompose a function into a series of $1/x^n$, but you could decompose it into a whole lot of other series too, like $e^{-nx}$. Why is this one most analogous to a derivative? – Owen Apr 21 '18 at 23:41
  • 3
    @Owen It likely requires a little bit of topology to make this rigorous, but the basic idea of the derivative is that you "zoom in" on a point in the domain of a function and analyze the behaviour of the function as you zoom in. A function is differentiable at that point if, as you zoom in, you start to see a straight line. In the language of topology, we are looking at arbitrary neighborhoods of points. So then the question becomes, "What does a neighborhood of infinity look like?" The most general model is the Alexandrov (or one point) compactification. (cont below) – Xander Henderson Apr 22 '18 at 00:11
  • 2
    (cont from above) In this model, "neighborhoods" of infinity look like $\mathbb{R} \setminus [-a,a]$, where $a\in\mathbb{R}$, $a> 0$. That is, a neighborhood of infinity is the complement of a neighborhood of zero (more or less). Here $f'(\infty)$ (i.e. the derivative at infinity) is equivalent to the derivative of $f(1/x)$ at zero. We could also work with the extended real numbers (a two-point compactification, if you will), where you have two derivatives at infinity: $f'(-\infty)$ and $f'(+\infty)$, given by the one-sided derivatives of $f(1/x)$ at zero from the left and right, respectively. – Xander Henderson Apr 22 '18 at 00:11
  • 6
    See the OP's last sentence, "a function with a horizontal asymptote would have a derivative of zero at infinity". This answer's definition doesn't agree with that even in the simplest case, $f(x) = 1/x$ . $\lim_{x\to\infty} f'(x) = 0$ , but $\lim_{x\to\infty} x(f(x)-f(\infty)) = 1$ . – mr_e_man Apr 22 '18 at 04:22
  • 3
    @mr_e_man You're right, this definition of the derivative is not consistent with the asker's intuition, because that intuitive idea turns out not to be the typical generalization that is actually used. –  Apr 22 '18 at 04:26
  • 8
    As someone unfamiliar with analytic capacity, I find this answer completely unilluminating. There is a big gap between the first paragraph, which gives a quite reasonable interpretation of $f(\infty)$, and the second, where the formula for $f'(\infty)$ is simply asserted with no explanation. Sure, it makes sense that a derivative should have a term of the form $f(x)-f(\infty)$ in it, but why is it multiplied by $x$? Why should we interpret it as differentiating $f(1/x)$ at $x=0$, when $f'(y)$ is not the same as differentiating $f(1/x)$ at $x=1/y$ for finite $y$? –  Apr 22 '18 at 09:12
  • 4
    Sure, I get that it's a standard definition that's used in some specialized areas, but if you're going to introduce it to a student, surely you need to explain how it arises and why it deserves to be interpreted as a generalization of the traditional derivative. –  Apr 22 '18 at 09:15
  • 7
    It's not clear why you would choose the one-point compactification over the two-point compactification $[-\infty, +\infty]$. I don't see why you would want to exclusively consider functions that exhibit the same behavior around $+\infty$ and $-\infty$. – Najib Idrissi Apr 22 '18 at 09:46
  • @NajibIdrissi Do you think that this answer addresses the OP? – user Apr 22 '18 at 10:13
  • 5
    @gimusi ...yes? – Najib Idrissi Apr 22 '18 at 11:23
  • 2
    I'm having trouble with the notation. For any function $f$ where the following limit is defined, you define $f(\infty)$ as the limit of the function as its argument approaches $\infty$. $f'$ is a function where that limit may be defined. You then define $f'(\infty)$ as something other than what you just defined it to mean. Am I misinterpreting your answer? Can you clarify where I'm going wrong? – hvd Apr 22 '18 at 11:35
  • 3
    @hvd the same notation is used for two different things; that's not ideal but not uncommon. Generally it is better to think of "the derivative of $f$ at some point $a$" as an object in its own right, as opposed to thinking of it as the value of some derivative-function $f'$ at $a$. – quid Apr 22 '18 at 12:39
  • @T.Bongers this is a great answer and I'm not the downvoter but the only reason I can see why somebody might downvote this is that (even though you've dumbed it down skilfully) it may still be too technical for some people in the audience. I think a linked picture of the extended real line would do wonders to help a school mathematician understand what you mean. – it's a hire car baby Apr 22 '18 at 13:09
  • I will add that a question about the definition of analytic capacity and $f'(\infty)$ can be found here: https://math.stackexchange.com/q/432298 – Martin Sleziak Apr 22 '18 at 16:37
  • @NajibIdrissi You're right, the one-point compactification is really unnecessary here. As I said in some other comment, I'm most familiar with this in the context of the Riemann sphere where it really is far more natural to require a single derivative at infinity, rather than a bunch of directional derivatives. So I'm used to thinking of the single point at infinity with no direction connected to it. The other advantage is that by using a one-point compactification rather than two points, we can identify $f'(\infty)$ with $\frac{d}{dx} f(1/x)$ at zero - a two-sided derivative, not one-sided. –  Apr 22 '18 at 17:43
  • 1
    Directly after calculus, many identify periodic functions with functions $S^1 \to \mathbb C$ (or to $\mathbb R$). This makes sense to me, but here, there is something a bit odd to me about requiring that $\lim_{x \to -\infty} f(x)$ be the same as $\lim_{x \to \infty}f(x)$. Another thing strange to me is that in complex analysis it is perfectly natural to speak of the derivative at $\infty$ when $\lim_{z \to \infty} f(z)=\infty$, so the requirement that the limit converges here also feels unnatural to me. Could you help me out of this dilemma? – Andres Mejia Apr 22 '18 at 18:09
  • @quid I appreciate the clarification but it still leaves me confused. Without taking $f'(x)$ as evaluating some derivative-function $f'$, I wouldn't even know where to begin to make sense of e.g. $f''(x)$, and with the definitions in this answer, it's unclear to me what $f''(\infty)$ should mean. Or do second derivatives not come up in this context? – hvd Apr 22 '18 at 21:45
  • @hvd once you have established what "the derivative of $f$ at some point $a$" means you can introduce a function that assigns to each $a$ "the derivative of $f$ at $a$" and then consider this derivative-function. The expanded version of the answer should clarify things. – quid Apr 22 '18 at 22:45
  • @quid The definition of $f''(\infty)$ involves $f'(\infty)$, but I suspect in that particular case, it does use the first definition ($\lim_{x \to \infty} f'(x)$) rather than the second one. I still think this answer would be improved by not using the same notation to mean two different things, even if both are used in practice. – hvd Apr 23 '18 at 05:04
  • Why use $1/x$ as the change of variables? Why not $1/x^2$ or $-\log(x)$? Just because this definition is useful in one context (complex analysis) doesn't mean it's the appropriate definition in others, even less so for a high-school student. – Bananach May 01 '18 at 16:52
  • This is interesting. As suggested in the edit, consider $f(x)=\sin(x^4)/x^2$ . This has a horizontal asymptote, $f'(x)$ is unbounded, and $\lim_{x\to\infty} x(f(x)-0)=0$ . But the function $g'(x)=\frac{d}{dx}f(1/x)$ is unbounded as well! In particular, $\lim_{x\to 0} g'(x)$ does not exist; it oscillates everywhere. So $g'$, though defined, is discontinuous at this point. It seems that $\lim_0 g'$ is as "badly behaved" as $\lim_\infty f'$, at least in this case. – mr_e_man May 02 '18 at 02:17
  • With $f(x) = \sin(x^a)/x^b$ (both parameters positive) and $g(x) = f(1/x)$, I find that $\lim_\infty f'=0$ iff $a < b+1$, and $\lim_0 g'=0$ iff $a < b-1$ . (They oscillate otherwise.) Note that $b-1 < b+1$, so if $g'$ has a limit, then $f'$ does also. But there are some cases, such as $a=b$, when $f'$ has a limit and $g'$ does not. – mr_e_man May 02 '18 at 07:43
  • It is not possible to shift complex-analytic concepts to real analysis without losing some precision. The answer is somewhat, if not completely, artificial. The complex derivative at infinity has its own precise meaning; passing it to the real world does not preserve this meaning at all. Derivatives of complex functions and real functions have a different nature. If a complex function is to become real, we define all its values to be real, not simply reduce its domain to the real numbers. And it is kind of obvious that people think that the asymptotic derivative and the derivative at infinity are related. They are not. –  May 02 '18 at 08:55
  • I just happened to read this post. The answer seems reasonable, but how does one interpret the derivative at infinity as a rate of change? For example, $f(x)=1/x$, $f'(\infty)=1$: what rate has a limit equal to $1$? If $f'(\infty)>0$, then the function is decreasing for sufficiently large $x>0$, correct? – Haoran Chen Mar 14 '23 at 10:21
  • Can you express this using epsilon notation? I want to formalize the notation. – fatFeather Jan 22 '24 at 08:53