
The question arose from the following proof:

$f(x)=x^3+x+1$ is injective:

Let $a,b\in \mathbb{R}$ and $f(a)=f(b)$:

$f(a)=f(b)\Leftrightarrow a^3+a+1=b^3+b+1\Leftrightarrow a^3+a=b^3+b \Rightarrow a=b$

Is the step $a^3+a=b^3+b\Rightarrow a=b$ mathematically rigorous enough as written? I don't know the concrete mathematical argument for why $a^3+a=b^3+b\Rightarrow a=b$ is true.

cedric

3 Answers


What do you mean by "rigorous enough"? There is no argument at all in your step $a^3 + a = b^3 + b \Longrightarrow a = b$; some reasoning should be included there.

Method 1. Use calculus to show $x^3 + x$ is an increasing function. That it should be an increasing function is obvious by staring at a graph of it, and calculus allows you to explain why the function is increasing.
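
One way to make the calculus step explicit (the standard derivative test, spelled out): $$ \frac{d}{dx}\left(x^3 + x\right) = 3x^2 + 1 \geq 1 > 0 \ \text{ for all } x \in \mathbb{R}, $$ so $x^3 + x$ is strictly increasing on $\mathbb{R}$, and a strictly increasing function takes each value at most once, hence $a^3 + a = b^3 + b$ forces $a = b$.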

Method 2. Use algebra to show $x^3 + x$ is an increasing function. If $x < y$, then we want to show $x^3 + x < y^3 + y$. Write $y = x + h$ for $h > 0$. Then $$ y^3 + y = (x+h)^3 + (x+h) = x^3 + 3x^2h + 3xh^2 + h^3 + x+h, $$ so $$ (y^3 + y) - (x^3 + x) = 3x^2h + 3xh^2 + h^3 + h. $$ We want to show the right side is positive for all $h > 0$, no matter what $x$ is. View the right side as a quadratic polynomial in $x$: $$ (3h)x^2 + (3h^2)x + (h^3 + h). $$ Its discriminant is $(3h^2)^2 - 4(3h)(h^3 + h) = -3h^4 - 12h^2 < 0$, so that quadratic polynomial in $x$ has no real roots: its values as $x$ varies are either always positive or always negative. At $x = 0$ its value is $h^3 + h$, which is positive, so its values at all real $x$ are positive.
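
As a quick sanity check of the algebra (a concrete instance, not part of the proof), take $x = -2$ and $h = 1$, so $y = -1$: $$ (y^3 + y) - (x^3 + x) = (-1 - 1) - (-8 - 2) = 8, \qquad (3h)x^2 + (3h^2)x + (h^3 + h) = 12 - 6 + 2 = 8 > 0, $$ and the discriminant $-3h^4 - 12h^2 = -15$ is indeed negative.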

Method 3. Use algebra to show that if $a^3 + a = b^3 + b$ then $a = b$. Rewrite the equality as $a^3 - b^3 = b - a$. Factoring the left side, $$ (a-b)(a^2 + ab + b^2) = b-a = -(a-b). $$ If $a \not= b$ then the term $a-b$ is nonzero, so you can divide by it to get $$ a^2 + ab + b^2 = -1. $$ If $a$ or $b$ is $0$ then this equation becomes $b^2 = -1$ or $a^2 = -1$, which have no solution (in real numbers). If $a$ and $b$ are both nonzero and have the same sign, then all three terms on the left side are positive, so their sum is not $-1$. If $a$ and $b$ are both nonzero with opposite signs, then rewrite the left side: $$ (a+b)^2 - ab = -1. $$ Here $(a+b)^2 \geq 0$ and $-ab > 0$ (since $ab$ is negative due to the opposite signs), so again such an equation is impossible.
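
The rewrite used in the opposite-sign case is just an expansion, worth checking once: $$ (a+b)^2 - ab = a^2 + 2ab + b^2 - ab = a^2 + ab + b^2. $$ For instance, with $a = 1$ and $b = -2$ both sides equal $1 - 2 + 4 = 3$, which is positive rather than $-1$, consistent with the case analysis.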

KCd

If $a^3+a=b^3+b$, then $$(a^3-b^3)+(a-b)=0 \quad \Leftrightarrow $$ $$(a-b)(a^2+ab+b^2+1)=0.$$ However, $a^2+ab+b^2+1=(a+\frac{b}{2})^2+\frac{3b^2}{4}+1>0$. This immediately implies that $a-b=0$, i.e. $a=b$. Please let me know whether it makes sense.
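
For completeness, the completing-the-square step can be verified by expanding: $$ \left(a + \tfrac{b}{2}\right)^2 + \tfrac{3b^2}{4} + 1 = a^2 + ab + \tfrac{b^2}{4} + \tfrac{3b^2}{4} + 1 = a^2 + ab + b^2 + 1, $$ and a square plus a nonnegative term plus $1$ is strictly positive.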

RFZ

Since

$\quad a^3+a=b^3+b \text{ implies } (a = 0 \land b = 0) \; \displaystyle{\Large \lor} \; \frac{a^2+1}{b^2+1} = \frac{b}{a}$

we can write, with $a \ne 0$

$$a^3+a=b^3+b \; \text{ iff } \; \frac{a^2+1}{b^2+1} = \frac{b}{a}$$
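
A sketch of why the displayed equivalence holds (the answer leaves the algebra implicit): for $a \ne 0$, $$ a^3 + a = b^3 + b \;\Longleftrightarrow\; a(a^2+1) = b(b^2+1) \;\Longleftrightarrow\; \frac{a^2+1}{b^2+1} = \frac{b}{a}, $$ where the last step divides both sides by $a(b^2+1)$, which is nonzero since $a \ne 0$ and $b^2 + 1 > 0$; dividing by a nonzero quantity is reversible, so the step is an equivalence.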

Examining the fraction, you see that $a$ and $b$ must have the same sign: the left side is positive because $a^2+1$ and $b^2+1$ are both positive, so $\frac{b}{a} > 0$.

Again, examining the fraction, assuming $0 \lt a \lt b$ leads to a contradiction (spelled out below, after the remaining cases).

Similarly, assuming any of

$\quad 0 \lt b \lt a\;$ or
$\quad b \lt a \lt 0\;$ or
$\quad a \lt b \lt 0$

leads to a contradiction.
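
For the first case the contradiction can be spelled out as follows (the other three cases follow the same pattern, with the inequalities reversed or the signs flipped): if $0 \lt a \lt b$, then $$ \frac{b}{a} > 1 \qquad \text{while} \qquad a^2 + 1 < b^2 + 1 \;\Longrightarrow\; \frac{a^2+1}{b^2+1} < 1, $$ so the two sides of the displayed equation cannot be equal.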

CopyPasteIt