
Does anybody have a proof of the concavity of $\log x$ that does not use calculus?

Yul Inn
  • Use Jensen's inequality. – Dr. Sonnhard Graubner Feb 22 '16 at 16:24
  • A function $f$ is concave if for any $x_0, x_1$ in its domain and $t \in [0,1]$, $$ f((1 - t) x_0 + t x_1) \geq (1 - t) f(x_0) + t f(x_1). $$

    Show that $$ \log ((1-t) x_0 + t x_1) \geq (1-t) \log (x_0) + t \log(x_1), $$ i.e. show that

    $$ \log ((1-t) x_0 + t x_1) \geq \log (x_0 ^ {1-t} x_1 ^ t) $$

    – Shailesh Feb 22 '16 at 16:36
  • What I am looking for is a proof of your last inequality. Using Jensen's inequality to prove that requires assuming that $\log{x}$ is concave. – Yul Inn Feb 22 '16 at 16:52
  • Possibly interesting / useful: If you can prove the convexity of the exponential function, you can recover the concavity of the logarithm relatively easily. – πr8 Feb 22 '16 at 17:02
  • Yes, I'd be just as interested in a proof of convexity of the exponential function without calculus :-). – Yul Inn Feb 22 '16 at 17:14
  • Because $\log$ is continuous, it's enough to show that $\frac 1 2(\log x_1+\log x_2)\le \log(\frac{x_1+x_2}2)$ which, after exponentiation, is just $(x_1-x_2)^2\ge 0$. – A.S. Feb 22 '16 at 18:03 (This exponentiation step is written out just after the comment list below.)
  • That's good, thank you. I was trying to avoid limits, but this answer and the AM-GM answer below (along the same lines as yours) have enough of the "flavor" of what I wanted. – Yul Inn Feb 22 '16 at 21:30
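
To spell out the exponentiation step in A.S.'s comment above (this uses only the rules $\log(ab)=\log a+\log b$ and $\log(a^{1/2})=\tfrac12\log a$, plus the fact that $\log$ is strictly increasing), for positive $x_1, x_2$:
$$\begin{aligned}
\tfrac12\bigl(\log x_1+\log x_2\bigr)\le\log\frac{x_1+x_2}{2}
&\iff \log\sqrt{x_1x_2}\le\log\frac{x_1+x_2}{2}\\
&\iff \sqrt{x_1x_2}\le\frac{x_1+x_2}{2}\\
&\iff 4x_1x_2\le(x_1+x_2)^2\\
&\iff 0\le(x_1-x_2)^2.
\end{aligned}$$
As the comment notes, midpoint concavity together with the continuity of $\log$ then yields the inequality for every $t\in[0,1]$.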

1 Answer


A proof by Cauchy induction, which does not involve calculus, that the arithmetic mean is no less than the geometric mean is given here. Let $\alpha$ be any real number between $0$ and $1$. Then there is a sequence of natural numbers $m(n)$, with $0\leqslant m(n)\leqslant n$ ($n=1,2,...$), such that the rational numbers $m(n)/n$ converge to $\alpha$ as $n\to\infty$.

For given $n$ and positive reals $x$ and $y$, consider the arithmetic and geometric means of the $n$ positive reals $x_1,...,x_n$, where $x_1=\cdots=x_{m(n)}=x$ and $x_{m(n)+1}=\cdots=x_n=y$. The AM-GM inequality for $x_1,...,x_n$ is $$\frac{x_1+\cdots+x_n}{n}\geqslant(x_1\cdots x_n)^{1/n},$$ or $$\frac{m(n)}{n}x+\left(1-\frac{m(n)}{n}\right)y\geqslant x^{m(n)/n}y^{1-m(n)/n}.$$ By the continuity of the arithmetical functions involved, we have in the limit as $n\to\infty$ that $$\alpha x+(1-\alpha)y\geqslant x^{\alpha}y^{1-\alpha}.$$ Now take the logarithm, and we are home.
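
To make that final step explicit: since $\log$ is increasing and turns products and powers into sums and multiples, applying it to both sides of the last display gives, for $x,y>0$ and $0\leqslant\alpha\leqslant 1$, $$\log\bigl(\alpha x+(1-\alpha)y\bigr)\ \geqslant\ \log\bigl(x^{\alpha}y^{1-\alpha}\bigr)=\alpha\log x+(1-\alpha)\log y,$$ which, with $x_0=x$, $x_1=y$ and $t=1-\alpha$, is exactly the concavity inequality stated in the comments on the question.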

John Bentin
  • That's nice! Any ideas for an even more elementary proof that doesn't involve limits? – Yul Inn Feb 22 '16 at 21:14
  • @YulInn: The most natural and elementary definition of the natural logarithm (or of its inverse, the exponential) is via calculus, or at least as a limit. The handicap of not using calculus is artificial; so it is not surprising that the above proof is circuitous. If only rational numbers featured as base and index, then limits would not be needed in the proof. Limits in one sense or another are essential in constructing the reals from the rationals. Do you expect a purely algebraic proof? – John Bentin Feb 22 '16 at 22:06 (A rationals-only version is written out after these comments.)
  • I was looking for a proof/explanation that I could present to pre-calculus students who understand $p^q$ where $p>0$ and $q$ are rationals and that the log function is the inverse of the exponential function but who perhaps haven't seen functions such as $e^x$ or $\ln{x}$. Your observations are very helpful and, as a result, I think I will have to content myself with an argument for rationals only like the one you gave above. Thank you. – Yul Inn Feb 22 '16 at 22:37
  • @YulInn: You might consider adding your explanation about the teaching background as an edit to the question. Many people on this site expect a context for each question; and the lack of context probably explains the votes to close the question. – John Bentin Feb 23 '16 at 09:14
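
Following up on the last two comments: for rational weights the limiting step can be skipped entirely. This only restates the $m(n)/n$ step of the answer above, but it may be all that is needed for a pre-calculus audience. For integers $0\leqslant m\leqslant n$ and positive reals $x,y$, applying the AM-GM inequality to $m$ copies of $x$ and $n-m$ copies of $y$ gives $$\frac{m}{n}\,x+\Bigl(1-\frac{m}{n}\Bigr)y\ \geqslant\ x^{m/n}\,y^{1-m/n},$$ and taking logarithms as before gives $$\log\Bigl(\frac{m}{n}\,x+\Bigl(1-\frac{m}{n}\Bigr)y\Bigr)\ \geqslant\ \frac{m}{n}\log x+\Bigl(1-\frac{m}{n}\Bigr)\log y,$$ i.e. concavity of $\log$ at every rational $t\in[0,1]$, with no limits involved.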