
Consider any three consecutive positive integers. Prove that the cube of the largest cannot be the sum of the cubes of the other two.

Work: I tried to prove this by contradiction.

I let the three integers be $k$, $k+1$, and $k+2$, and set up the equation $(k+2)^3 = (k+1)^3 + k^3$.

Expanding both sides gives $k^3+6k^2+12k+8 = 2k^3+3k^2+3k+1$. Ultimately, I end up at a dead end: $k^3-3k^2+9k = 7$.

I do not know where to go from here, and it seems I took the wrong route at the beginning. I am currently learning about linear combinations, the division algorithm, and the Euclidean algorithm, but I do not see any way to use those on this problem.

mrQWERTY

2 Answers


Small trick: consider three consecutive integers of the form $(a-1)$, $a$, $(a+1)$, and suppose the cube of the largest is the sum of the cubes of the other two.

$(a + 1)^3 = a^3 + (a - 1)^3 \implies a^3 + 3a^2 + 3a + 1 = 2a^3 - 3a^2 + 3a - 1\implies a^3 - 6a^2 - 2 = 0 \implies a^2(a - 6) = 2$
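(If you want to double-check that algebra before going on, here is a throwaway symbolic check; the Python/sympy sketch below is purely an illustration and not part of the argument.)

```python
import sympy as sp

a = sp.symbols('a')

# Difference of the two sides of (a + 1)^3 = a^3 + (a - 1)^3.
# If the equality held for some integer a, this difference would be zero.
print(sp.expand((a + 1)**3 - a**3 - (a - 1)**3))
# prints -a**3 + 6*a**2 + 2, i.e. the same condition a^3 - 6a^2 - 2 = 0 as above
```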

From here, you can reach a contradiction in many ways. Here's one:

Now $a^2 \ge 0$, so for the product $a^2(a-6)$ to equal the positive number $2$ we need $a - 6 > 0$, i.e. $a \ge 7$. But then $a^2 \ge 49$, so the product is at least $49$ and can never equal $2$, a contradiction.
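(The inequality above already settles the matter for every integer $a$; the following is only a finite sanity check, sketched in Python, if you want to see it numerically.)

```python
# Brute-force check over a small range of a (a >= 2 keeps a - 1 positive).
# This is only a sanity check; the inequality argument above covers all a.
hits = [a for a in range(2, 10_001) if (a + 1)**3 == a**3 + (a - 1)**3]
print(hits)  # [] -- no three consecutive positive integers satisfy the equation
```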

Ishfaaq

You have a cubic (actually, you made an arithmetic error; it should be $k^3 - 3k^2 - 9k = 7$); rewrite it as $k^3-3k^2-9k-7=0$. You want an integer root of this cubic. By the rational root theorem, any integer root must be a divisor of $7$, so there are only four candidates ($\pm 1$ and $\pm 7$), and checking them shows that none is a root.
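(If you want to run that check mechanically, here is a small Python sketch; the four candidates $\pm 1, \pm 7$ are exactly the divisors of the constant term.)

```python
def p(k):
    """The cubic k^3 - 3k^2 - 9k - 7 from above."""
    return k**3 - 3*k**2 - 9*k - 7

# By the rational root theorem, any integer root must divide the constant term 7.
for k in (1, -1, 7, -7):
    print(k, p(k))  # values -18, -2, 126, -434 -- none is zero, so no integer roots
```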

By the way, the arithmetic is slightly easier if you use $k+1$, $k$, and $k-1$.

rogerl