
After introducing the definition of "a function is non-decreasing at $x_{0}$":

The function is said to be non-decreasing at $x_{0}$ if for all $x$-values in some interval about $x_{0}$ it is true that when $x_{0}<x$ then $y_{0} \leq y$, and when $x_{0}>x$ then $y_{0} \geq y$.
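
In symbols (reading $y_{0}=f(x_{0})$ and $y=f(x)$, which I take to be the intended meaning), the definition says there is some $\delta>0$ such that for every $x$ with $|x-x_{0}|<\delta$,

$$x>x_{0}\implies f(x)\geq f(x_{0})\qquad\text{and}\qquad x<x_{0}\implies f(x)\leq f(x_{0}).$$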

Can we extend the intermediate value theorem to the following?

If $f$ is a continuous function on a closed interval $[a, b]$ with $f(a)<f(b)$, and if $y_{0}$ is any value strictly between $f(a)$ and $f(b)$, that is $f(a)<y_{0}<f(b)$, then $y_{0}=f(c)$ for some $c$ in $(a, b)$ and $\boldsymbol{f}$ is non-decreasing at $c$.

According to the book [1] this is wrong, but I am unable to read the relevant section because some of the notation and concepts there, such as residual subset and first-category subset, are new to me; I am only familiar with the material in textbooks like Thomas' Calculus and Introduction to Calculus and Analysis by Richard Courant and Fritz John. Is there a way to understand the section "Functions Whose Graphs “Cross No Lines”" with minimal prerequisites?

[1] Brian S. Thomson, Judith B. Bruckner, Andrew M. Bruckner, Elementary Real Analysis, Second Edition, Section 13.14. Full text: http://classicalrealanalysis.info/documents/TBB-AllChapters-Landscape.pdf

iMath
  • You cannot ensure $f$ is nondecreasing. For a simple counterexample just take $[a,b]=[0,1]$ and $f(x) = 1-x$. The function is strictly decreasing at every point in the open interval. – Arturo Magidin May 20 '22 at 02:03
  • @ArturoMagidin Thanks for the tip, I just corrected the post. – iMath May 20 '22 at 02:12
  • Still false, but less elementarily so. There are functions that are everywhere continuous and bounded, but nowhere monotone. See here. Then just pick appropriate $a$ and $b$. – Arturo Magidin May 20 '22 at 02:40
