
The full proof can be found, for example, here, but I am interested in the case when

$$\lim_{x \to a} g(x) = + \infty \tag{15}$$

Rudin says:

If $a < x < y < c$, then Theorem 5.9 shows that there is a point $t \in (x, y)$ such that $$\frac{f(x) - f(y)}{g(x) - g(y)} = \frac{f'(t)}{g'(t)} < r. \tag{18}$$ ... Next, suppose (15) holds. Keeping $y$ fixed in (18), we can choose a point $c_1 \in (a, y)$ such that $g(x) > g(y)$ and $g(x) > 0$ if $a < x < c_1$.

Why is it safe to assume that such a point $c_1$ can be chosen? How can such a point exist if $g(x)$ is monotonically increasing?

Thanks!

Thomas Andrews
S11n

1 Answer


$\lim\limits_{x \to a} g(x) = \infty$ means that for every $M \in \mathbb{R}$, there is a $\delta > 0$ such that $g(x) > M$ whenever $a < x < a + \delta$ (the limit is one-sided, since $g$ is only defined on $(a, c)$). In your instance choose $M := \max\{0, g(y)\}$ and $c_1 := \min\{a + \delta, y\}$; then $g(x) > g(y)$ and $g(x) > 0$ for all $x \in (a, c_1)$.

As for your second question: $g$ cannot be monotonically increasing if $\lim\limits_{x \to a} g(x) = \infty$. Note that the singularity is to the left of $x$ and $y$, so $g$ blows up as $x$ decreases toward $a$.
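As a concrete sanity check, here is a small numerical sketch. The choice $g(x) = 1/(x - a)$ with $a = 0$ and $y = 1$ is an illustrative assumption (not Rudin's general setting): it satisfies (15) and is decreasing near $a$, and the script verifies that the chosen $c_1$ does what the proof needs.

```python
# Hypothetical example: g(x) = 1/(x - a) satisfies (15),
# i.e. g(x) -> +infinity as x -> a from the right.
a = 0.0
y = 1.0          # the fixed point y in (a, c)

def g(x):
    return 1.0 / (x - a)

# Pick any M >= max{0, g(y)}; for this particular g,
# g(x) > M exactly when x - a < 1/M.
M = max(0.0, g(y)) + 1.0   # here M = 2
delta = 1.0 / M            # g(x) > M whenever a < x < a + delta
c1 = min(a + delta, y)     # ensure c1 lies in (a, y)

# Every x in (a, c1) satisfies both conditions required in the proof.
for x in [a + c1 * t for t in (0.1, 0.5, 0.9)]:
    assert g(x) > g(y) and g(x) > 0

print("c1 =", c1)   # prints: c1 = 0.5
```

Any smaller $c_1$ works equally well; the proof only needs the existence of one such point.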

Klaus
  • 10,578
  • Yes, I missed the point that this is for the case when $g(x)$ is decreasing, and an analogous proof works if $g(x)$ is increasing. Thanks! – S11n Sep 14 '22 at 13:07