2

I am a first-term computer science student.

I have to prove the following inequality:

$$ \frac {x_1 + \cdots+ x_n}{n} \ge \sqrt[n]{x_1 x_2 \cdots x_n}$$

Each $x_k$ can be any positive real number: $x_k \in \mathbb{R},\ x_k \gt 0$.
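(Not a proof, of course, but a quick numerical sanity check — a small Python sketch of my own, with function names I made up — supports the claim for random positive inputs:)

```python
import math
import random

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # computed via logarithms to avoid overflow for long lists
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    xs = [random.uniform(1e-3, 1e3) for _ in range(n)]
    # small tolerance only for floating-point rounding
    assert arithmetic_mean(xs) >= geometric_mean(xs) - 1e-9
```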

I tried to bring it into a form resembling Bernoulli's inequality, but then realized that this approach is pretty much a dead end.

So now I am having a hard time proving it by induction.

The base case of the induction is clear ($n = 1$):

$ \large\frac {x_1}{1} \ge x_1$

Next step: $n \rightarrow n+1$.

To start, let's write the two sides of the inequality as $$x_a(n) = \frac{1}{n} \sum_{k=1}^n x_k, \qquad x_g(n) = \prod_{k=1}^n x_k^\frac{1}{n}.$$

So now we have to prove $x_a(n+1) \ge x_g(n+1)$, where $$x_a(n+1) = \frac{1}{n+1} \sum_{k=1}^{n+1} x_k, \qquad x_g(n+1) = \prod_{k=1}^{n+1} x_k^\frac{1}{n+1}.$$

Can you just give me a strategy for this?

Thanks.

  • 1
This is a famous inequality. Proofs are everywhere: books, Google, here, there, in your notes, etc. –  Nov 06 '14 at 10:25
  • 1
    http://en.wikipedia.org/wiki/Inequality_of_arithmetic_and_geometric_means –  Nov 06 '14 at 10:28
  • 2
    The standard way of proof is to first prove it when $n$ is a power of $2$ by induction. Then show that any other case follows from the next power of two (e.g. the case $n = 5$ follows from the case $n = 8$). Search around for "Arithmetic mean geometric mean" or "AM-GM". – Arthur Nov 06 '14 at 10:29
  • One (my favourite, in fact) way to realize this inequality is by noting that the minimization of the $\Bbb R$-perimeter (total length of the edges) of a hypercuboid in $\Bbb R^n$ with fixed volume is a hypercube in $\Bbb R^n$. – Balarka Sen Nov 06 '14 at 10:31
  • It is known as AM-GM Inequality.. It's an important and famous inequality.. – Argha Nov 06 '14 at 10:38
Thank you. So I read there is indeed a proof using the binomial inequality. Interesting. But it is way more complex than I thought. – Falco Winkler Nov 07 '14 at 08:42
Thank you, I found this thread as well. – Falco Winkler Feb 20 '15 at 20:56

5 Answers

4

Hints: The standard way to solve this: put $f(x) = x_1 x_2 \cdots x_n$. Let

$$ D = \{ x \in \mathbb{R}^n : x_1 + \cdots + x_n = n,\ x_i \geq 0 \}$$

Now, use Lagrange multipliers to find the maximum of $f$ on the set $D$.

Conclude.
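To fill in the hint (my sketch, not part of the original answer): at an interior critical point with all $x_i > 0$, the Lagrange condition $\nabla f = \lambda \nabla g$ with $g(x) = x_1 + \cdots + x_n$ reads

$$ \frac{\partial f}{\partial x_i} = \frac{x_1 x_2 \cdots x_n}{x_i} = \lambda \quad (i = 1, \dots, n), $$

so all the $x_i$ are equal, and the constraint forces $x_i = 1$ and $f_{\max} = 1$ (boundary points with some $x_i = 0$ give $f = 0$, not the maximum). For arbitrary $y_i > 0$, applying this to the scaled point $x_i = \frac{n y_i}{y_1 + \cdots + y_n} \in D$ gives

$$ y_1 y_2 \cdots y_n \le \left( \frac{y_1 + \cdots + y_n}{n} \right)^{n}, $$

which is AM-GM after taking $n$-th roots.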

4

There are many ways; here is another one:

Note that the inequality is homogeneous (we can scale all variables by any positive value and the inequality remains unchanged), so WLOG we can set $x_1 x_2\dots x_n = 1$. Then we only need to show that $x_1+x_2 + \dots + x_n \ge n$ under this condition.

Consider the function $f(t) = t-1-\log t$. Note that our inequality is the same as $f(x_1)+f(x_2)+\dots + f(x_n)\ge 0$, so it is enough to show $f(t)\ge 0$ for positive $t$. This is easily done using one variable calculus or if you know the series expansion of $\log (1+t)$ or $e^t$.
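Spelling out the sum (my addition, not in the original answer): with the normalization $x_1 x_2 \cdots x_n = 1$,

$$ \sum_{k=1}^{n} f(x_k) = \sum_{k=1}^{n} \left( x_k - 1 - \log x_k \right) = \left( \sum_{k=1}^{n} x_k \right) - n - \log\left( x_1 x_2 \cdots x_n \right) = \sum_{k=1}^{n} x_k - n, $$

so $f(t) \ge 0$ for all $t > 0$ is exactly the statement $x_1 + x_2 + \dots + x_n \ge n$.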

Macavity
  • 46,381
2

An elementary way:
By squaring both relations on the left side of the implication, it's easy to see this holds: $$x+y=x'+y', |x-y|\ge|x'-y'|\implies x'y'\ge xy.$$ Now set $a=(x_1+\ldots+x_n)/n$. Find $x_i\ne a$ so that $|x_i-a|$ is the smallest possible. If $x_i>a$, take any $x_j<a$, then by using the above inequality you can replace $x_i$ by $a$ and $x_j$ by $x_j{+}(x_i{-}a)\le a$; similarly for $x_i<a$. Repeat until all $x_i = a$.

user2345215
  • 16,422
2

I think the easiest way to prove this is to use the concavity of $\ln$:

$(\ln(x))'' = -\frac{1}{x^2} < 0 \implies \ln$ is concave.

Then you know (Jensen's inequality for the concave function $\ln$) that if $a_k \ge 0$ and $\sum_{k=1}^n a_k = 1$: $\ln\left(\sum_{k=1}^n a_k x_k\right) \geq \sum_{k=1}^n a_k \ln(x_k)$

Now set $a_k = \frac{1}{n}$ for $k = 1, \dots, n$, and see what you get, knowing that $a \ln(x) = \ln(x^a)$ and $\sum_{k=1}^n \ln(x_k) = \ln(x_1 x_2 \cdots x_n)$.
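Carrying out that substitution (my sketch of the final step):

$$ \ln\left( \frac{x_1 + \cdots + x_n}{n} \right) \ge \sum_{k=1}^n \frac{1}{n} \ln(x_k) = \ln\left( (x_1 x_2 \cdots x_n)^{\frac{1}{n}} \right), $$

and since $\exp$ is increasing, exponentiating both sides gives AM $\ge$ GM.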

mvggz
  • 1,965
1

Another Solution:

First, a result (Young's inequality): for any non-negative reals $x, y$ and any $\alpha, \beta \ge 0$ with $\alpha + \beta = 1$, we have

$$ x^{\alpha}y^{\beta} \leq \alpha x + \beta y $$

To show this, put $f(t) = (1 - \beta) + \beta t - t^{\beta} $. Show this function decreases on $[0,1]$ and then replace $t$ with $\frac{y}{x}$ (taking $y \le x$ WLOG). Now, as for your problem, we prove the weighted version $x_1^{\alpha_1} \cdots x_n^{\alpha_n} \le \alpha_1 x_1 + \cdots + \alpha_n x_n$ (for weights $\alpha_i \ge 0$ with $\alpha_1 + \cdots + \alpha_n = 1$) by induction on $n$; your inequality is the case $\alpha_i = \frac{1}{n}$. The base case is Young's inequality, and now let us suppose the result holds for $n$; we show it holds for the case $n+1$. Indeed,

$$x_1^{\alpha_1}\cdots x_{n+1}^{\alpha_{n+1}} = \left[ x_1^{ \frac{\alpha_1}{\alpha_1 + \cdots + \alpha_n}}\cdots x_n^{ \frac{\alpha_n}{\alpha_1 + \cdots + \alpha_n}} \right]^{\alpha_1 + \cdots + \alpha_n}x_{n+1}^{\alpha_{n+1}} \leq (\alpha_1 + \cdots + \alpha_n )\, x_1^{ \frac{\alpha_1}{\alpha_1 + \cdots + \alpha_n}}\cdots x_n^{ \frac{\alpha_n}{\alpha_1 + \cdots + \alpha_n}} + \alpha_{n+1}x_{n+1}$$

$$ \leq (\alpha_1 + \cdots + \alpha_n)\left[ \frac{\alpha_1}{\alpha_1 + \cdots + \alpha_n}x_1 + \cdots + \frac{\alpha_n}{\alpha_1 + \cdots + \alpha_n}x_n\right] + \alpha_{n+1}x_{n+1}$$

by the induction hypothesis (the inner exponents $\frac{\alpha_i}{\alpha_1 + \cdots + \alpha_n}$ sum to $1$). But the above is obviously equal to

$$ = \alpha_1 x_1 + .... + \alpha_{n+1}x_{n+1} $$

Hence, your result follows by mathematical induction.
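For the monotonicity claim in the first step (my sketch, taking $0 < \beta < 1$ and WLOG $y \le x$, so that $t = \frac{y}{x} \in [0,1]$):

$$ f'(t) = \beta - \beta t^{\beta - 1} = \beta \left( 1 - t^{\beta - 1} \right) < 0 \quad \text{for } 0 < t < 1, $$

since $\beta - 1 < 0$ makes $t^{\beta - 1} > 1$ there. Hence $f(t) \ge f(1) = 0$ on $[0,1]$, and substituting $t = \frac{y}{x}$ and multiplying through by $x$ gives $x^{\alpha} y^{\beta} \le \alpha x + \beta y$.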