25

I'm starting university math and I'm struggling to understand how to prove uniform continuity. I think I understand the idea of finding a $\delta$ such that $|x-x_0|<\delta$ implies $|f(x)-f(x_0)|<\epsilon$, but all the examples I have found so far have been very vague in explaining how the two quantities relate to each other.

I have figured out that the smallest $\delta$ has something to do with the steepest part of the function $f$, in such a way that if $\delta$ satisfies $\epsilon$ on the steepest climb or descent, it will satisfy it everywhere else too.

But the problem I'm facing is that I don't always understand how I'm supposed to figure out the relation between these two variables.

I am able to solve an example like $f(x) = 5x+8$ like so: with $x_0 \geq 0$ and $x=x_0+\delta$, $|f(x)-f(x_0)| = |5(x_0+\delta)+8-5x_0-8| = 5|\delta|$, and thus $5|\delta|<\epsilon$ gives $\delta < \frac{\epsilon}{5}$. This seems easy and reasonable.

Here is an example that I can't crack: $f:[0,\infty[\rightarrow\mathbb{R}, f(x)=x^2$

So what I did first was define $x_0 \geq 0, x=x_0+\delta$

Then I wrote $|f(x)-f(x_0)|<\epsilon$ where $|f(x)-f(x_0)|$ is $|(x_0+\delta)^2 - x_0^2| = |x_0^2 + 2x_0\delta + \delta^2 - x_0^2| = |2x_0\delta + \delta^2|$

At this point many of the examples on the net are saying that I can break this in two parts, $|2x_0\delta| < \frac{\epsilon}{2}$ and $|\delta^2| < \frac{\epsilon}{2}$ and solve them separately. So I get $\delta < \sqrt{\frac{\epsilon}{2}}$ and $\delta < \frac{\epsilon}{4x_0}$

So what am I supposed to do with these two deltas I got? And why am I supposed to break it in parts? Shouldn't I get a single value for the $\delta$?

I understand that because $x^2$ grows at an increasing speed, for a given $\epsilon$ there is no single $\delta$ that works for every $x_0$ (and thus it's not uniformly continuous). But I don't know how I'm supposed to get there.

Also, if I restrict $f(x)=x^2$ to $f:[0,5]\rightarrow\mathbb{R}$, how can I then show that it is uniformly continuous?

Many of the documents I've found by googling "uniform continuity" seem to take shortcuts, and I get lost.

If someone can explain this in a "layman way" clearly I would be very grateful!

supset
  • 251

3 Answers

12

Here is another big cache of special cases. Suppose a function $f$ is differentiable on an interval and that its derivative is bounded. If $$M = \sup |f'|,$$ then, by the mean value theorem, for any $x, y$ in the interval we have $$|f(x) - f(y)| \le M|x - y|.$$
So if you specify an $\epsilon > 0$, then $\delta = \epsilon / M$ works in the definition of uniform continuity.
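
For example, take the restricted version of the question's function, $f(x) = x^2$ on $[0,5]$: here $f'(x) = 2x$, so $$M = \sup_{x \in [0,5]} |2x| = 10, \qquad |f(x)-f(y)| \le 10\,|x-y| \quad \text{for all } x, y \in [0,5],$$ and $\delta = \epsilon/10$ works for every pair of points at once, with no dependence on where in the interval you are.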

More generally, if $f$ is continuous on a closed bounded interval, it is uniformly continuous there. This is a consequence of the Heine-Borel theorem.

ncmathsadist
  • 49,383
  • 2
    So this sounds like the thing I wrote in my post: for a given $\epsilon$, the $\delta$ that works everywhere is related to the supremum of $f$'s derivative. Unfortunately this fact does not explain how I can find the $\delta$ for a particular function (like in my post). :( – supset Jun 08 '11 at 21:54
  • 2
    Observe that what you're actually proving is that a differentiable function with bounded derivative is Lipschitz-continuous (and that Lipschitz-continuity implies uniform continuity -- see http://en.wikipedia.org/wiki/Lipschitz_continuity ) – Bruno Stonek Jun 08 '11 at 22:09
10

Let's start with showing continuity for $f(x)=x^2$, and then trying to show uniform continuity.

With thanks to Theo Buehler, you want to find an expression for $\delta$ such that for all $h$ where $|h|<\delta$, $|f(x_0) - f(x_0+h)| < \epsilon$. This way $x_0+h$ stands for all values that are within $\delta$ of $x_0$.

What you are doing for $f(x) = x^2$ is a valid way of doing things: with $x_0 \geq 0$, you find that $|f(x_0) - f(x_0 + h)| = |2x_0h + h^2|$. The goal is to make this value less than epsilon.

To do this, we can use the triangle inequality: in general, $|a| + |b| \geq |a+b|$. Set $a := 2x_0h$ and $b := h^2$, and you'll have $|2x_0h + h^2| \leq |2x_0h| + |h^2|$.

If $|2x_0h| + |h^2| < \epsilon$, then $|2x_0h + h^2|$ is less than $\epsilon$. A quick way to make this true is to require $|2x_0h| < \epsilon/2$ and $|h^2| = |h|^2 < \epsilon/2$. If both parts are less than $\epsilon/2$, then their sum will be less than $\epsilon$, and you can trace the argument back to say that $|f(x_0 + h) - f(x_0)|<\epsilon$.

Take the statements and rearrange them to get $|h| < \frac{\epsilon}{4|x_0|}$ and $|h| < \sqrt{\epsilon/2}$. To make both of these things true, let $|h|$ be less than the smaller of the two bounds, $|h| < \min\left(\frac{\epsilon}{4|x_0|},\sqrt{\epsilon/2}\right)$. Set the right hand side of the inequality to be your $\delta$. Then for $|h| < \delta$, $|f(x_0+h) - f(x_0)|<\epsilon$, which is the definition of (general, not uniform) continuity.
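
Chasing the inequalities back makes the choice explicit: if $x_0 \neq 0$ and $|h| < \min\left(\frac{\epsilon}{4|x_0|},\sqrt{\epsilon/2}\right)$, then $$|f(x_0+h) - f(x_0)| = |2x_0h + h^2| \le 2|x_0||h| + |h|^2 < 2|x_0|\cdot\frac{\epsilon}{4|x_0|} + \frac{\epsilon}{2} = \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon.$$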

(If $x_0 = 0$, then $|2x_0h|=0$, so you only need to make $|h| < \sqrt{\epsilon/2}$.)

To understand why this implies continuity but not uniform continuity, review your statement that $|f(x_0) - f(x_0 + h)| = |2x_0h + h^2|$. For uniform continuity you would need a single $\delta$ such that $|f(x_0) - f(x_0 + h)|$ is bounded by $\epsilon$ for all $x_0$ and all $|h| < \delta$. But what happens as you make $x_0$ bigger? No matter what $\delta > 0$ you try, I can take $h = \delta/2$ and $x_0=\frac{2\epsilon}{\delta}-\frac{\delta}{4}$ to make $|f(x_0) - f(x_0 + h)| = |2x_0h + h^2| = 2\epsilon$, which exceeds the $\epsilon$ bound. In other words, the $\delta$ found in the paragraph above depends on $x_0$: if you make $x_0$ larger while keeping $h$ fixed, $|f(x_0) - f(x_0 + h)|$ also tends to infinity.
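
Plugging those choices in verifies the claim: $$2x_0h + h^2 = 2\left(\frac{2\epsilon}{\delta} - \frac{\delta}{4}\right)\cdot\frac{\delta}{2} + \frac{\delta^2}{4} = 2\epsilon - \frac{\delta^2}{4} + \frac{\delta^2}{4} = 2\epsilon,$$ so that single pair of points already breaks the $\epsilon$ bound for this $\delta$ (as long as $\delta^2 \le 8\epsilon$, which keeps $x_0 \ge 0$).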

In general, if you are proving (general or uniform) continuity from the definition, you are trying to manipulate inequalities to find $\delta$ in terms of $\epsilon$ and $x_0$. It can seem a bit counter-intuitive, but it gets easier with practice.

For your instance of restricting $f(x)=x^2$ to the set $[0,5]$, you can appeal to the Heine-Cantor theorem, which states that in metric spaces, continuous functions whose domain is a compact set are uniformly continuous.

You can also use the machinery that you created above. Given an $x_0 \neq 0$, $\delta = \min\left(\frac{\epsilon}{4|x_0|},\sqrt{\frac{\epsilon}{2}}\right)$, and for $x_0 = 0$, $\delta = \sqrt{\frac{\epsilon}{2}}$. On $[0,5]$, you can minimize $\frac{\epsilon}{4|x_0|}$ by maximizing $x_0$, so let $x_0 = 5$. Therefore you can find a delta that works for all $x_0 \in [0,5]$ by setting $\delta := \min \left(\frac{\epsilon}{20},\sqrt{\frac{\epsilon}{2}}\right)$.
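
As a final check that this $\delta$ really is uniform: for any $x_0 \in [0,5]$ and any $|h| < \min\left(\frac{\epsilon}{20},\sqrt{\epsilon/2}\right)$, $$|2x_0h| \le 10|h| < 10\cdot\frac{\epsilon}{20} = \frac{\epsilon}{2} \quad\text{and}\quad |h|^2 < \frac{\epsilon}{2},$$ so $|f(x_0+h)-f(x_0)| \le |2x_0h| + |h|^2 < \epsilon$, with a $\delta$ that no longer depends on $x_0$.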

t.b.
  • 78,116
Michael Chen
  • 4,191
  • Here's one thing that's bothering me a bit in your answer and the question. The reasoning is easy to make correct but it could be made much cleaner. In general, it doesn't suffice to just estimate $|f(x_0 + \delta) - f(x_0)|$ but rather you need to consider $|f(x_0 + h) - f(x_0)|$ for all $h$ with $|h| \lt \delta$ (and all $x_0$ in the case of uniform continuity). I'm sure this is clear to you, but I'm not sure that this really transpires from your answer. – t.b. Jun 09 '11 at 11:58
  • @Theo Buehler: Right, sorry. I have been accustomed to take that shortcut, but your way is better. I've reworked my answer to make that true. – Michael Chen Jun 09 '11 at 14:01
  • 1
    That's much nicer, thank you. This should be really helpful. – t.b. Jun 09 '11 at 14:09
5

Firstly, you must understand that uniform continuity, unlike continuity, is a global condition on the function over its whole domain. That is, given an $\epsilon>0$, there exists a $\delta>0$ which depends only on that $\epsilon$, such that for any pair of points $x$ and $y$ in the domain, if $|x-y|<\delta$ then $|f(x)-f(y)|<\epsilon$. An equivalent way to look at this -- one that brings this global character more to the front -- is via sequences: if two sequences $\langle x_n \rangle$ and $\langle y_n \rangle$ in the domain (they need not be bounded or convergent) are equivalent, that is, their difference tends to $0$, then $\langle f(x_n) \rangle$ and $\langle f(y_n) \rangle$ are equivalent as well. An advantage of such sequential characterizations is that they are often easier to work with than direct epsilonics.

Now consider $f(x)=x^2$ on $\mathbb R$. It's easy to see that $\langle n \rangle$ and $\langle n+\frac{1}{n} \rangle$ are equivalent. But when we apply $f$, we get the sequences $\langle n^2 \rangle$ and $\langle n^2+2+\frac{1}{n^2} \rangle$, whose difference tends to $2$; that is, they are not equivalent, so $f$ is not uniformly continuous.

Also, remember it helps to understand certain facts such as:

  1. A continuous function on a compact interval is uniformly continuous.
  2. A Lipschitz continuous function is uniformly continuous.
  3. If a function is uniformly continuous, it maps Cauchy sequences to Cauchy sequences (see the example below).
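
For instance, fact 3 gives a quick way to rule out uniform continuity: $f(x) = \frac{1}{x}$ on $]0,1]$ is continuous, but the Cauchy sequence $x_n = \frac{1}{n}$ is mapped to $f(x_n) = n$, which is not Cauchy, so $f$ cannot be uniformly continuous there.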
Dactyl
  • 2,834