26

Let $ a, b, c, d $ be natural numbers such that $ ab=cd $. Prove that $ a^2+b^2+c^2+d^2 $ is not a prime.

I am clueless on this one. I tried contradiction, but didn't get anywhere.

Can you help?

Edit: I understand natural numbers to be strictly positive, excluding $0$.
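Edit 2: As a quick sanity check (a brute-force sketch of my own, not a proof; the bound and names are arbitrary), I searched small quadruples with $ab=cd$ in Python, and no prime value of $a^2+b^2+c^2+d^2$ turns up:

```python
# Brute-force sanity check (a sketch, not a proof): search all small
# quadruples with a*b == c*d and test whether the square sum is prime.

def is_prime(n):
    """Trial division; adequate for the small values searched here."""
    if n < 2:
        return False
    k = 2
    while k * k <= n:
        if n % k == 0:
            return False
        k += 1
    return True

LIMIT = 20  # arbitrary small search bound

hits = [(a, b, c, d)
        for a in range(1, LIMIT) for b in range(1, LIMIT)
        for c in range(1, LIMIT) for d in range(1, LIMIT)
        if a * b == c * d and is_prime(a*a + b*b + c*c + d*d)]

print(hits)  # prints [] -- consistent with the claim
```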

TooTone
  • 6,343
    The only thing I can think of is to add $0=2ab-2cd$, so that you have the sum of two squares: $(a+b)^2+(c-d)^2$. Not sure where to go from there, however. But perhaps it gives you an idea. – Harald Hanche-Olsen Feb 18 '14 at 22:03
  • What if $a=c=0$ and $b=d=1$? It looks like your problem's text could use a rewrite. Note: I was taught in school that $0 \in \mathbb{N}$, hence the need to use the symbol $\mathbb{N}^*$ for the set of natural numbers without $0$. – DanielC Feb 18 '14 at 22:28
  • I didn't know that 0 is a natural number. – Stefan4024 Feb 18 '14 at 22:28
  • 2
    @Stefan: I didn't know that 0 is not a natural number. – TMM Feb 18 '14 at 22:31
  • @TMM Now you know ;) – Stefan4024 Feb 18 '14 at 22:32
  • According to Wikipedia, it's a debatable subject :) http://en.wikipedia.org/wiki/Natural_number – DanielC Feb 18 '14 at 22:33
  • @DanielC In many years of experience in math contests, and even more years of studying math, I've never encountered the convention that $0$ is a natural number. As far as I know, the set of natural numbers together with zero is denoted by $\mathbb{N_0}$. – Stefan4024 Feb 18 '14 at 22:48
  • 3
    @Stefan4024 the modern usage is almost entirely the convention that $0\in\mathbb{N}$, with something like $\mathbb{N}^+$, $\mathbb{N}^{>0}$, $\mathbb{Z}^+$, or even just $\mathbb{N}\setminus\{0\}$ denoting the set of positive integers. Of course there are some that do not use this convention. This convention is used, among other reasons, because it makes $\mathbb{N}$ into a commutative monoid, which has nicer properties than the semigroup obtained if $0$ is not included. – Dan Rust Feb 18 '14 at 23:53

5 Answers

32

Since $d=\frac{ab}{c}$, we obtain

$$ a^2+b^2+c^2+d^2=\frac{(a^2+c^2)(b^2+c^2)}{c^2}=\left(\left(\frac{a}{a'}\right)^2+b'^2\right)\left(\left(\frac{b}{b'}\right)^2+a'^2\right), $$

where $c=a'b'$ with $a' \mid a$ and $b' \mid b$. Such a factorization of $c$ exists: since $d=\frac{ab}{c}$ is an integer, $c \mid ab$, so we may take $a'=\gcd(a,c)$ and $b'=c/a'$; then $b'$ divides $(a/a')b$ and is coprime to $a/a'$, hence $b' \mid b$. Both factors on the right are integers that are at least $2$, so the sum is composite.
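As a concrete illustration (numbers chosen here as an example; they are not in the original answer): take $a=6$, $b=4$, $c=8$, $d=3$, so that $ab=cd=24$. Then $a'=\gcd(6,8)=2$, $b'=8/2=4$, and

$$a^2+b^2+c^2+d^2=36+16+64+9=125=\left(3^2+4^2\right)\left(1^2+2^2\right)=25\cdot 5.$$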

Ma Ming
  • 7,482
21

Suppose not, i.e. assume that $p:=a^2+b^2+c^2+d^2$ is prime. Using $ab=cd$, note that $p=(a+b)^2+(c-d)^2=(a-b)^2+(c+d)^2$. That is, we have expressed $p$ in two ways as a sum of two squares.

But since $p$ is prime, it can be expressed as a sum of two squares in at most one way, up to interchanging the two squares. This corresponds to the fact that $p$ has a unique prime factorization in the ring of Gaussian integers $\mathbb{Z}[i]$: writing $p=a^2+b^2$ gives the factorization $p=(a+bi)(a-bi)$.
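For instance, $13=3^2+2^2$ corresponds to the Gaussian factorization $13=(3+2i)(3-2i)$, and up to order and signs this is the only way to write the prime $13$ as a sum of two squares.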

(If $c=d$, then $p=(a+b)^2$ is a perfect square greater than $1$, hence not prime; similarly if $a=b$.) Therefore either $a+b=|a-b|$ and $|c-d|=c+d$, which is impossible for positive integers, or $a+b=c+d$ and $|a-b|=|c-d|$, which implies $\{a,b\}=\{c,d\}$ and hence $p=2(a^2+b^2)$, so $2\mid p$. This is a contradiction, because $a,b,c,d\ge 1$ gives $p\ge 4>2$.
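To illustrate (with example numbers of my own choosing): $a=6$, $b=4$, $c=8$, $d=3$ satisfies $ab=cd=24$ and gives $p=125$, with

$$125=(6+4)^2+(8-3)^2=10^2+5^2 \quad\text{and}\quad 125=(6-4)^2+(8+3)^2=2^2+11^2,$$

two genuinely different representations as a sum of two squares, so $125$ cannot be prime.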

J.R.
  • 17,904
  • If you want to reach a contradiction from $p > 2$, you should first exclude the case $p = 2$ (and then assume that $p > 2$). – TMM Feb 18 '14 at 22:28
  • 2
    This answer is an overkill for this question. – achille hui Feb 18 '14 at 22:29
  • It seems we are talking about natural numbers without $0$, so $p>2$ is clear. @TMM – J.R. Feb 18 '14 at 22:30
  • @TooOldForMath Yes, the update excludes $0$, so now it is clear. (If $0$ is allowed then there are many solutions with $b = d = 0$.) – TMM Feb 18 '14 at 22:33
  • 6
    @achillehui: I don't feel that the Gaussian integers are a very strong tool and they really trivialize this problem. – J.R. Feb 18 '14 at 22:35
3

Let $N := a^2 + b^2 + c^2 + d^2$.

Note that $bd(a^2 + c^2) = ac(b^2 + d^2)$: indeed, using $ab=cd$, we have $a^2bd = ad\cdot ab = ad\cdot cd = acd^2$ and $bc^2d = bc\cdot cd = bc\cdot ab = ab^2c$. This common value is a common multiple of $a^2+c^2$ and $b^2+d^2$, and since $ac < a^2+c^2$ it is strictly smaller than their product, so that
$$\operatorname{lcm}(a^2 + c^2,\, b^2 + d^2) < (a^2 + c^2)(b^2 + d^2).$$
This tells us that $a^2 + c^2$ and $b^2 + d^2$ are not coprime, and since $\gcd(a^2+c^2,\, b^2+d^2) \le a^2+c^2 < N$, we get
$$ 1 < \gcd(a^2 + c^2,\, b^2 + d^2) < N. $$

Finally, this $\gcd$ divides $(a^2+c^2) + (b^2+d^2) = N$, so $N$ has a nontrivial divisor, which completes the proof.
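With the same illustrative numbers as above ($a=6$, $b=4$, $c=8$, $d=3$): $a^2+c^2=100$ and $b^2+d^2=25$; the common multiple is $bd\cdot 100=1200=ac\cdot 25<100\cdot 25$, and $\gcd(100,25)=25$ with $1<25<125$ and $25 \mid 125 = N$.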

Tsamps
  • 31
-3

Quick and dirty proof:

We start with the information given: $a^2 + b^2 + c^2 + d^2 = k$ such that $k$ is prime (we will contradict this), and $ab = cd$.

We rewrite the terms of $k$ in the following way:
$$\left(\frac{cd}{b}\right)^2 + b^2 + \left(\frac{ab}{d}\right)^2 + \left(\frac{ab}{c}\right)^2 = \frac{c^2 d^2}{b^2} + b^2 + b^2\left(\frac{a}{d}\right)^2 + b^2\left(\frac{a}{c}\right)^2. \tag{1}$$
The last three terms in $(1)$ are multiples of $b^2$. We have already shown, in those last three terms, that $c^2$ and $d^2$ are both multiples of $b^2$. Thus the first term of $(1)$ is a multiple of $b^4$, since we have the product of $c^2$ and $d^2$ in the numerator. Dividing by $b^2$ leaves it as a multiple of $b^2$. Hence all terms of $k$ are multiples of $b^2$, and $k$ cannot be a prime.

P.S. I have no idea how to format equations on here. Truly your forgiveness I implore. P.P.S. If anyone notices a mistake please let me know!

  • 2
    I'm afraid your proof is not correct. $a^2+b^2+c^2+d^2$ need not be a multiple of $b^2$, nor of $a^2,c^2$ or $d^2$ (note that $\frac ad$ and $\frac ac$ need not be integers). – Nils Matthes Feb 20 '14 at 08:35
  • I thought there was something fishy with that 'proof'. Cheers. – Dion Bridger Feb 20 '14 at 17:41
-4

A rather odd way to perhaps get there...

$ab = cd$.

No prime $> 2$ is even.

The sum of any even number of odd numbers is even.

The square of an odd number is odd, and the square of an even number is even.

So either $1$ or $3$ of $a, b, c, d$ must be odd if we are possibly to get a prime.

It cannot be that only one of them is odd, or else $ab \neq cd$. So we need three of them odd.

If $a, b, c$ are odd then $ab/c = d$, which is a contradiction. QED :)