59

It's very basic, but I'm having trouble finding a way to prove this inequality

$\log(x)<x$

when $x>1$

($\log(x)$ is the natural logarithm)

I can think about the two graphs, but I can't find another way to prove it. Besides that, I don't understand why it should not hold if $x<1$.

Can anyone help me?

Thanks in advance.

J. W. Tanner
Gianolepo
  • This depends on your definition of logarithm. Anyway, $\log x < x$ is true for all $x$. – Crostul Dec 26 '15 at 12:25
  • https://math.stackexchange.com/questions/652581/showing-fracx1x-log1xx-for-all-x0-using-the-mean-value-theorem – Guy Fsone Feb 09 '18 at 14:07
  • https://math.stackexchange.com/questions/324345/intuition-behind-logarithm-inequality-1-frac1x-leq-log-x-leq-x-1 – Guy Fsone Feb 09 '18 at 14:08
  • Related: https://math.stackexchange.com/questions/741600/prove-that-logx-x-for-x-0-x-in-mathbbn https://math.stackexchange.com/questions/1396776/prove-log1xx-for-x0 https://math.stackexchange.com/questions/380963/prove-that-log-x-x-for-all-x-0 – Frank Aug 29 '19 at 00:36

14 Answers

48

I thought it might be instructive to present a proof that relies on standard tools only. We begin with the limit definition of the exponential function

$$e^x=\lim_{n\to \infty}\left(1+\frac xn\right)^n$$

It is easy to show that the sequence $e_n(x)=\left(1+\frac xn\right)^n$ increases monotonically for $x>-1$. To show this we simply analyze the ratio

$$\begin{align} \frac{e_{n+1}(x)}{e_n(x)}&=\frac{\left(1+\frac x{n+1}\right)^{n+1}}{\left(1+\frac xn\right)^n}\\\\ &=\left(1+\frac{-x}{(n+x)(n+1)}\right)^{n+1}\left(1+\frac xn\right) \tag 1\\\\ &\ge \left(1+\frac{-x}{n+x}\right)\left(1+\frac xn\right)\tag 2\\\\ &=1 \end{align}$$

where in going from $(1)$ to $(2)$ we used Bernoulli's Inequality. Note that $(2)$ is valid whenever $n>-x$, i.e., $x>-n$.

Since $e_n(x)$ increases monotonically and is bounded above by $e^x$, we have

$$e^x\ge \left(1+\frac xn\right)^n \tag 3$$

for all $n\ge 1$. In particular, taking $n=1$ in $(3)$ shows that for $x>-1$ we have

$$e^x\ge 1+x \tag 4$$

Since $e^x>0$ for all $x$, then $(4)$ is true for $x\le -1$ also. Therefore, $e^x\ge 1+x$ for all $x$.
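As a quick numerical sanity check, not part of the proof: the short Python snippet below (standard `math` module only, sample points chosen arbitrarily) watches $e_n(x)$ increase with $n$ while staying below $e^x$.

```python
import math

# Sanity check (not a proof): for fixed x > -1, the sequence
# e_n(x) = (1 + x/n)^n increases with n and stays below e^x.
for x in (-0.5, 0.5, 2.0, 10.0):
    prev = -math.inf
    for n in (1, 2, 5, 10, 100, 1000):
        e_n = (1 + x / n) ** n
        assert prev <= e_n <= math.exp(x)
        prev = e_n
    print(f"x = {x:5}: e_1000(x) = {prev:.6f}  vs  e^x = {math.exp(x):.6f}")
```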

ASIDE:

From $(4)$ we note that $e^{-x}\ge 1-x$. If $x<1$, then since $e^x\,e^{-x}=1$, $e^x\le \frac{1}{1-x}$. Thus, for $x<1$ we can write

$$1+x\le e^x\le \frac{1}{1-x}$$

Taking the logarithm of both sides of $(4)$ (for $x>-1$, so that $1+x>0$) produces the coveted inequality

$$\log(1+x)\le x \tag 5$$

Interestingly, substituting $x=-z/(z+1)$ into $(4)$ reveals

$$\log(1+z)\ge \frac{z}{z+1}$$

for $z>-1$. Putting it all together we have for $x>0$

$$\frac{x-1}{x}\le \log x\le x-1<x$$
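As a numerical sanity check (not a proof), here is a short Python snippet, using only the standard `math` module and arbitrary sample points, that tests the final chain of inequalities:

```python
import math

# Sanity check (not a proof): (x-1)/x <= log x <= x-1 < x for x > 0.
for x in (0.1, 0.5, 1.0, 2.0, 10.0, 100.0):
    lower, middle, upper = (x - 1) / x, math.log(x), x - 1
    assert lower <= middle <= upper < x
    print(f"x = {x:6}: {lower:+.4f} <= {middle:+.4f} <= {upper:+.4f} < {x}")
```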

Mark Viola
  • Hi Mark, there is a recent posting that links to this answer of yours. I took the liberty of commenting on it in a wiki-answer. I hope I did justice to your arguments here (and in a follow-up link). – Mittens Jul 31 '21 at 13:15
  • Hi Mark, to prove the limit exists we need $e_n(x)$ monotonically increasing and bounded. The proof that $e_n(x)$ is monotonically increasing looks good to me, but how did you get $e_n(x)$ is bounded above by $e^x$ if we haven't yet proved $e^x$ exists? (This is the goal - prove $e^x$ exists.) – EthanAlvaree Aug 17 '21 at 12:23
  • @ethanalvaree Hi Ethan. It is easy to show that the limit is bounded above. Use the binomial theorem and apply simple estimates to show that the limit superior is bounded by the series for $e^x$. Then apply the ratio test. – Mark Viola Aug 17 '21 at 12:35
  • Hi Mark, thank you, I was able to show $\left(1+\frac{x}{n}\right)^n = \sum_{k=0}^n \binom{n}{k} (1)^{n-k}\left(\frac{x}{n}\right)^k \le \sum_{k=0}^{n} \frac{1}{k!}x^k$. However, I am teaching basic limits, so the ratio test isn't available until it's proved (much) later. For now, is there any other way to prove the series converges? Perhaps using partial sums and a barebones $\epsilon-\delta$ proof of convergence? Do you think it would be fruitful for me to open a new question for this? Thanks! – EthanAlvaree Aug 18 '21 at 10:32
  • @EthanAlvaree Hi Ethan. You're welcome. For fixed $x$ there exists a number $K$ such that $k!\ge (2|x|)^k$ for $k>K$. So, $$\left|\sum_{k=0}^\infty \frac{x^k}{k!}\right|\le\left| \sum_{k=0}^K \frac{x^k}{k!}\right|+\sum_{k=K+1}^\infty \frac{1}{2^k}=\left|\sum_{k=0}^K \frac{x^k}{k!}\right|+\frac{1}{2^K}$$ – Mark Viola Aug 18 '21 at 14:02
  • Thanks again, Mark. I wanted to inquire about two things. 1. Why are the absolute values needed in your previous comment, and would it suffice to drop them? 2. I noticed you assumed $x \ge -1$ in your answer. Which step depends on $x \ge -1$, and what can we say in the case $x < -1$? Thanks again. I opened a separate question, if you have time to contribute: https://math.stackexchange.com/questions/4232341/why-does-a-boundedness-proof-of-expx-depend-on-the-sign-of-x – EthanAlvaree Aug 25 '21 at 02:02
47

You may just differentiate $$ f(x):=\log x-x, \quad x\geq1, $$ giving $$ f'(x)=\frac1x-1=\frac{1-x}x<0 \quad \text{for}\quad x>1. $$ Since $$ f(1)=-1<0 $$ and $f$ is strictly decreasing, we get $$ f(x)<0, \quad x>1, $$ that is, $$ \log x -x <0, \quad x>1. $$
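A minimal numerical sketch (not part of the argument), checking in Python (standard `math` module, arbitrary sample points) that $f(x)=\log x - x$ is negative and strictly decreasing for $x\ge1$:

```python
import math

# Sanity check (not a proof): f(x) = log x - x is negative and
# decreasing for x >= 1.
xs = [1.0, 1.5, 2.0, 5.0, 10.0, 100.0]
vals = [math.log(x) - x for x in xs]
assert all(v <= -1 for v in vals)                  # f(x) <= f(1) = -1 < 0
assert all(a > b for a, b in zip(vals, vals[1:]))  # strictly decreasing
print([round(v, 4) for v in vals])
```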

Olivier Oloa
22

If you define the logarithm as $$\log(x)=\int_{1}^{x}{\frac{1}{t}\,dt},$$ note that $$\frac{1}{t} \le 1 \; \text{ for }t\ge 1.$$ Hence, for $x\ge1$, $$ \log(x)=\int_{1}^{x}{\frac{1}{t}\,dt} \le \int_{1}^{x}\!{1}\,dt =x-1 < x.$$ If $0< x\le 1$, then you simply get $$\log(x)=\int_{1}^{x}{\frac{1}{t}\,dt}=- \int_{x}^{1}{\frac{1}{t}\,dt}\le 0 < x.$$
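As a numerical illustration (not a proof), one can approximate the defining integral with a midpoint Riemann sum in Python and compare it against the bound $x-1$; the step count `n` and the sample points are arbitrary choices:

```python
import math

# Sanity check (not a proof): approximate log x = ∫_1^x dt/t with a
# midpoint Riemann sum and compare against the bound x - 1 < x.
def log_by_midpoint(x, n=100_000):
    h = (x - 1) / n
    return sum(h / (1 + (k + 0.5) * h) for k in range(n))

for x in (1.5, 2.0, 5.0, 20.0):
    approx = log_by_midpoint(x)
    assert approx <= x - 1 < x
    print(f"x = {x:4}: integral ≈ {approx:.6f}, math.log = {math.log(x):.6f}, x-1 = {x-1}")
```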

Mittens
XPenguen
17

I am assuming you know the derivative of $\log$.

Let $f(x)=\log x -x$. Then $$f'(x) = \frac 1x -1<0\ \ \forall x>1.$$ Moreover, $f(1) = -1<0$. So you have a function that starts negative at $x=1$ and decreases afterwards, since its derivative is negative for $x>1$. This means that $$f(x) = \log(x) - x <0\ \ \forall x>1,$$ which is what you wanted to show.

Hanno
15

Taylor series give $$e^x = 1 + x + \frac{x^2}{2} + \frac{x^3}{6} + \cdots$$

Hence $e^x \ge 1+x > x$ for $x\geq0$. Since $\log$ is increasing, applying it to $e^x > x$ gives $x = \log(e^x) > \log(x)$ for $x>0$.
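A short numerical sanity check (not a proof) of the two steps, written in Python with the standard `math` module at arbitrary sample points:

```python
import math

# Sanity check (not a proof): e^x >= 1 + x > x for x >= 0,
# hence x = log(e^x) > log(x) for x > 0.
for x in (0.0, 0.5, 1.0, 3.0, 10.0):
    assert math.exp(x) >= 1 + x > x
    if x > 0:
        assert x > math.log(x)
print("all checks passed")
```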

wythagoras
12

You even have $\;\log x \le x-1$, because $\log$ is a concave function, and the line with equation $y=x-1$ is the tangent to the graph of $\log$ at $(1,0)$. Hence: $$\log x \le x-1 <x. $$

Bernard
9

Define $f(x) = \log x - x$. Now $f'(x) = \frac{1}{x}-1$ which is negative if $x > 1$. Thus $f$ is strictly decreasing on the interval $(1, \infty)$.

Now since $f(1) = \log 1 - 1 = 0-1 = -1$, we must have $f(x) < -1$ on $(1, \infty)$. Thus $\log x - x < -1 < 0$ on $(1, \infty)$. This implies $\log x < x$ when $x > 1$.

desos
5

When $x=1$, $\log x=0<1=x$. Further, for $x>1$ we have $\frac{d}{dx}\log x=\frac{1}{x}<1=\frac{d}{dx}x$.

This shows that $x$ is larger than $\log x$ at $x=1$ and that $x$ grows faster than $\log x$ for $x>1$. Hence $x>\log x$ for $x\ge 1$.

4

Different ways of doing this exercise certainly depend on what you wish to assume. Suppose we take $\log x$ to be a continuous non-constant map $f:\mathbb{R}^+ \to \mathbb{R}$ satisfying $$ f(xy) = f(x)+f(y). \tag{1} $$ This immediately gives $f(1)=0$ and $f(x)+f\left(\frac1{x}\right)=0$, so $f$ is a non-trivial abelian group homomorphism; in particular there exists $c$ with $f(c) \ne 0$.

$(1)$ implies that for any integers $m,n \ne 0$ we have $$ f\left(\sqrt[n]{c^m}\right)=f\left(c^{\frac{m}{n}}\right)= \frac{m}{n}\, f(c). \tag{2} $$ Since for $c \ne 1$ the set $\{c^{\frac{m}{n}}\}_{m,n \in \mathbb{Z}\setminus \{0\}}$ is dense in $\mathbb{R}^+$, we have, by continuity, $$ f(c^r)=r\, f(c) $$ for any $r \in \mathbb{R}$.

$(2)$, together with the density of $\operatorname{Im}(f)$ in $\mathbb{R}$, implies that $f$ is order-preserving or order-reversing, depending on the sign of $f(c)$ and on whether $c \gt 1$. Thus, to rule out the order anti-isomorphisms, we require one further assumption: that $f((1,\infty)) \subseteq (0,\infty)$.

Suppose $f$ had a fixed point $\zeta \gt 1$, i.e. a point for which, as real numbers, $$ f(\zeta) = \zeta. $$ We will show this leads to a contradiction.

Since $f(1)=0$ and $f$ is strictly monotonic and continuous, the equation $f(x)=1$ has a unique solution, say $x=e \gt 1$.

Since $\operatorname{Im}(f)\subset \operatorname{Dom}(f)$, we may define a sequence of functions $f_n$ with $\operatorname{Dom}(f_{n+1})=F_{n+1} = \operatorname{Im}(f_n)$ and $f_{n+1}=f|_{F_{n+1}}$. Renaming $f$ as $f_0$, we get a sequence $F_n$ with $$ F_{n}=(e^n,\infty), \qquad \bigcap_n F_n = \emptyset, $$ but $\zeta \in \operatorname{Im}(f_n)$ for all $n$, a contradiction.

Since $f$ has no fixed point and $f(1)=0 \lt 1$, continuity (the intermediate value theorem) forces $f(x) \lt x$ everywhere, and we have our result.
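A numerical sketch (not part of the argument): taking $f=\log$ as a model, one can check the functional equation $(1)$ and the rational-power identity $(2)$ in Python at arbitrary sample values:

```python
import math

# Sanity check (not a proof): log satisfies f(xy) = f(x) + f(y) and
# f(c^(m/n)) = (m/n) f(c), the properties (1) and (2) used above.
assert math.isclose(math.log(2.0 * 3.0), math.log(2.0) + math.log(3.0))
c = 3.0
for m, n in ((1, 2), (3, 4), (-2, 5)):
    assert math.isclose(math.log(c ** (m / n)), (m / n) * math.log(c))
print("functional equation checks passed")
```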

David Holden
3

Note that the second derivative of $\ln(x)$ is $-\frac{1}{x^2}$, which is always negative. This means that any tangent line to the graph $y=\ln(x)$ lies on or above the graph, equality only being achieved at the point of tangency. In particular, the tangent line at $(1,0)$, namely $y=x-1$, satisfies $x-1\ge\ln(x)$. Since $x>x-1$, we get $x>\ln(x)$ for any $x>0$.
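A quick numerical check (not a proof, Python's standard `math` module, arbitrary sample points) that the tangent line at $(1,0)$ dominates the graph of $\ln$, with the gap vanishing only at $x=1$:

```python
import math

# Sanity check (not a proof): the tangent line y = x - 1 at (1, 0)
# lies on or above the concave graph of ln.
for x in (0.25, 0.5, 1.0, 2.0, 4.0, 50.0):
    gap = (x - 1) - math.log(x)
    assert gap >= 0
    print(f"x = {x:5}: (x-1) - ln(x) = {gap:.6f}")
```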

Hrhm
2

$\log_{10}x<x$ is equivalent to $x<10^{x}$, which we can see directly by observation, and it is true for all $x$. Or go directly for derivatives!
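A one-line numerical check (not a proof) of the equivalent statement $x<10^{x}$ at a few arbitrary sample points, using Python's standard `math` module:

```python
import math

# Sanity check (not a proof): log10(x) < x, equivalently x < 10^x.
for x in (0.01, 0.5, 1.0, 10.0, 100.0):
    assert math.log10(x) < x < 10 ** x
```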

wythagoras
1

If you're familiar with the Taylor Theorem, this is a short proof of the inequality:

Define $f(x)=\log(x)$ for $x>0$. I'm going to assume that the natural logarithm is the (only) function which satisfies $f(1)=0$ and $f'(x)=\dfrac{1}{x}$ (this is one of the many definitions we can give; others would be to say it's the inverse function of the exponential function $e^{x}$, or that it is the integral $\int_{1}^{x}\dfrac{1}{t}\,dt$, but that's not the problem now).

The second derivative of $f$ is $f''(x)=-\dfrac{1}{x^{2}}$, continuous in $(0,\infty)$.

Let's take a point $x>1$. Since $f$ is a $\mathcal{C}^{2}$ function (its first and second derivatives exist and are continuous), the Taylor Theorem guarantees that there exists $c\in (1,x)$ which satisfies

$f(x)=f(1)+f'(1)(x-1)+\dfrac{f''(c)}{2}(x-1)^{2}$

Then

$\log(x)=\log(1)+\dfrac{1}{1}(x-1)-\dfrac{1}{2c^{2}}(x-1)^{2}=x-1-\dfrac{1}{2c^{2}}(x-1)^{2}$

But since $c^{2}>0$ and $(x-1)^{2}>0$, we know that

$\log(x)<x-1<x$, which is what we wanted to prove.

Even though the question only considers $x>1$, the result is also true when $x\leq 1$. The case $x<1$ can be handled in the same manner, and the case $x=1$ is just a matter of checking directly.
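As a numerical sanity check (not a proof), one can solve the remainder identity above for $c$ in Python and confirm that it indeed lands in $(1,x)$; the sample points are arbitrary:

```python
import math

# Sanity check (not a proof): solving log x = (x-1) - (x-1)^2/(2 c^2)
# for c, the Taylor remainder point, and confirming c lies in (1, x).
for x in (1.5, 2.0, 5.0, 20.0):
    c = math.sqrt((x - 1) ** 2 / (2 * (x - 1 - math.log(x))))
    assert 1 < c < x
    print(f"x = {x:4}: c = {c:.6f} lies in (1, {x})")
```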

  • Typesetting hint: use a backslash (\log) to get $\log$ instead of $log$ (which looks like a product of $l,o,g$). The same goes for $\sin, \ker$, etc. – Théophile Jul 10 '18 at 19:29
1

You can just use Lagrange's theorem (the mean value theorem) on the interval $[1,x]$: $$ f(x) - f(1) = f'(\xi) (x-1), \quad \xi \in (1,x), $$

which for $f=\log$ yields $ \log x = \frac{1}{\xi} (x-1) < x - 1 < x$, since $\xi>1$.
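A brief numerical check (not a proof, standard `math` module, arbitrary sample points): solving $\log x=(x-1)/\xi$ for $\xi$ and confirming it lies strictly inside $(1,x)$, as the mean value theorem asserts:

```python
import math

# Sanity check (not a proof): the mean value point xi = (x-1)/log(x)
# lies in (1, x), so log x = (x-1)/xi < x - 1 < x.
for x in (1.5, 2.0, 5.0, 100.0):
    xi = (x - 1) / math.log(x)
    assert 1 < xi < x
    print(f"x = {x:5}: xi = {xi:.6f} lies in (1, {x})")
```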

PierreCarre
1

This is equivalent to proving that $e^x>x$.
It is obvious that $e^x>x$ if $x<0$, since the LHS is positive and the RHS is negative.
Suppose that for some $a\ge 0$ the inequality $e^a\le a$ holds.
Then $a\ge e^a\ge 1$, since $e^a\ge e^0$ because $a\ge 0$. But now we can see that $a\ge 1$, so again $a\ge e^a\ge e^1$, and hence $a\ge e$. Continuing to apply the same observation, we conclude that $a\ge e^{e}$, then $a\ge e^{e^{e}}$, and so on, which means that $a$ dominates an unbounded sequence, a contradiction.
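A numerical illustration (not a proof) of the runaway iteration in this argument, in Python: the tower $e, e^{e}, e^{e^{e}},\dots$ already overflows double precision after a few steps, so any such $a$ would have to exceed astronomically large numbers.

```python
import math

# Illustration (not a proof): iterating exp from 1 gives e, e^e, e^(e^e), ...
# which blows up almost immediately; a fourth step would overflow a float.
t = 1.0
for step in range(3):
    t = math.exp(t)
    print(f"iterate {step + 1}: {t:,.4f}")
```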