The full proof can be found, for example, here, but I am interested in the case where
$ \tag{15} \lim_{x \to a} g(x) = + \infty$
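For concreteness (this is my own illustration, not from Rudin), one function satisfying (15) with $a = 0$ is
$$g(x) = \frac{1}{x} \quad (0 < x < c), \qquad \lim_{x \to 0^{+}} \frac{1}{x} = +\infty,$$
so $g$ is large and positive for $x$ close to $a$.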
Rudin says:
If $a < x < y < c$, then Theorem 5.9 shows that there is a point $t \in (x, y)$ such that $$\tag{18} \frac{f(x) - f(y)}{g(x) - g(y)} = \frac{f'(t)}{g'(t)} < r.$$ [...] Next, suppose (15) holds. Keeping $y$ fixed in (18), we can choose a point $c_1 \in (a, y)$ such that $g(x) > g(y)$ and $g(x) > 0$ if $a < x < c_1$.
Why is it safe to assume that we can choose such a point $c_1$? How can such a point exist if $g$ is monotonically increasing?
Thanks!