
I've encountered the following problem:

Let $(x,y,z)$ be a Pythagorean triple of positive integers such that $\gcd(x,y)=\gcd(x,z)=\gcd(z,y)=1$ and $y$ is odd. Prove that there exist integers $u>v>0$ with $\gcd(u,v)=1$ such that $\left(\frac xz,\frac yz\right) = \left(\frac{u^2-v^2}{u^2+v^2},\frac{2uv}{u^2+v^2}\right)$, and deduce that $(x,y,z) = \left(\frac{u^2-v^2}{2},\,uv,\,\frac{u^2+v^2}{2}\right)$.

Remark: you're allowed to use the fact that all rational solutions $(a,b)$ to $a^2+b^2 = 1$ are given by the set $\left\{\left(\frac{u^2-v^2}{u^2+v^2},\frac{2uv}{u^2+v^2}\right) : u,v\in \mathbb{Z} \right\}$.

The first step is very easy, since $$x^2+y^2=z^2 \implies \left( \frac xz \right)^2 +\left( \frac yz \right)^2 = 1.$$ Then, using the statement from the remark, there are $u,v \in \mathbb{Z}$ such that $\left(\frac xz,\frac yz\right) = \left(\frac{u^2-v^2}{u^2+v^2},\frac{2uv}{u^2+v^2}\right)$. It is easy to see that $u,v$ are either both positive or both negative, since $$1=\operatorname{sgn}\left(\frac yz\right)=\operatorname{sgn}\left(\frac{2uv}{u^2+v^2}\right)=\operatorname{sgn}(2uv)=\operatorname{sgn}(u)\operatorname{sgn}(v).$$ If both are negative we can set $a=-u$, $b=-v$; then $a,b>0$ and $\left(\frac xz,\frac yz\right) = \left(\frac{a^2-b^2}{a^2+b^2},\frac{2ab}{a^2+b^2}\right)$, so we may assume that $u,v>0$.
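As a quick sanity check (a throwaway sketch, not part of the proof), the parametrization from the remark can be verified with exact rational arithmetic; the sample range for $u,v$ is arbitrary:

```python
from fractions import Fraction

# Check that (a, b) = ((u^2 - v^2)/(u^2 + v^2), 2uv/(u^2 + v^2))
# satisfies a^2 + b^2 = 1 for a few sample integer pairs (u, v).
for u in range(1, 6):
    for v in range(1, u):
        a = Fraction(u*u - v*v, u*u + v*v)
        b = Fraction(2*u*v, u*u + v*v)
        assert a*a + b*b == 1, (u, v)
print("a^2 + b^2 = 1 holds for all sampled (u, v)")
```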

It is given that $y$ is odd, and we know that $$y = z\cdot\frac{2uv}{u^2+v^2},$$ i.e. $y(u^2+v^2)=2uvz$. The right-hand side is even while $y$ is odd, so $u^2+v^2$ must be even. It follows immediately that $u,v$ are either both odd or both even.
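For what it's worth, the parity claim can be checked exhaustively on residues mod 2 (again a trivial sketch, not needed for the argument):

```python
# u^2 + v^2 is even exactly when u and v have the same parity:
# it suffices to check all residue combinations mod 2.
for u in (0, 1):
    for v in (0, 1):
        even = (u*u + v*v) % 2 == 0
        same_parity = (u % 2) == (v % 2)
        assert even == same_parity
print("u^2 + v^2 even  <=>  u, v have the same parity")
```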

I couldn't solve the rest of the question: I didn't manage to show that both $u,v$ are odd, nor that $\gcd(u,v)=1$. I also couldn't show that $(x,y,z) = \left(\frac{u^2-v^2}{2},\,uv,\,\frac{u^2+v^2}{2}\right)$.

Any suggestions?


1 Answer


The following has been called the fundamental theorem of arithmetic: for integers $a,b,c$ with $c\ne 0$, if $c\mid ab$ and $\gcd(c,a)=1$, then $c\mid b$. From this a great deal follows, including unique prime decomposition, from which we have: if $a,b$ are co-prime positive integers and $ab$ is the square of an integer, then $a,b$ are squares of integers.

Now for your question. We have $$y^2=z^2-x^2=(z+x)(z-x)$$ with $y$ odd, and with $x,y,z$ pairwise co-prime.

First, we deduce that $\gcd(z+x,\,z-x)=1$. For suppose instead that $p$ is prime and $p$ divides both $z+x$ and $z-x$. Then $$p\mid (z+x)+(z-x)=2z \quad \text{and} \quad p\mid (z+x)-(z-x)=2x.$$ But $p$ must be odd, because $p\mid (z+x)(z-x)=y^2$ and $y^2$ is odd. So $p\mid z$ and $p\mid x$, which is impossible because $\gcd(z,x)=1$.

Second, because $z+x$ and $z-x$ are co-prime positive integers whose product is a square ($y^2$), there are positive integers $u,v$ with $$z+x =u^2 \quad \text{and} \quad z-x=v^2.$$ This gives $$z=\frac{(z+x)+(z-x)}{2}=\frac{u^2+v^2}{2}, \qquad x=\frac{(z+x)-(z-x)}{2}=\frac{u^2-v^2}{2},$$ $$\text{and} \quad y^2=(z+x)(z-x)=u^2v^2, \quad \text{so} \quad y=uv.$$ We have $u>v$ because $u^2-v^2=2x>0$. (In particular both $u,v$ are odd, since $uv=y$ is odd.)

Finally, to show that $u,v$ are co-prime, let $w=\gcd(u,v)$. We have $w\mid uv=y$, so $w$ is odd. And $w\mid (u^2+v^2)=2z$ and $w\mid (u^2-v^2)=2x$, so $w\mid z$ and $w\mid x$ (because $w$ is odd). So $w\mid \gcd(z,x)=1$.
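If it helps to see the recipe in action, here is a small numerical check of the conclusion (the helper name `primitive_triples` and the search bound are mine, purely for illustration): it enumerates primitive Pythagorean triples with $y$ odd, recovers $u=\sqrt{z+x}$ and $v=\sqrt{z-x}$, and confirms the claimed identities.

```python
from math import gcd, isqrt

def primitive_triples(limit):
    """Yield primitive Pythagorean triples (x, y, z) with y odd and z <= limit."""
    for z in range(2, limit + 1):
        for x in range(1, z):
            y2 = z*z - x*x
            y = isqrt(y2)
            if y*y != y2:
                continue
            # gcd(x, y) = 1 forces pairwise co-primality, since any common
            # prime factor of x and z (or y and z) would also divide y (or x).
            if y % 2 == 1 and gcd(x, y) == 1:
                yield x, y, z

for x, y, z in primitive_triples(200):
    u, v = isqrt(z + x), isqrt(z - x)
    assert u*u == z + x and v*v == z - x   # z+x and z-x are perfect squares
    assert u > v > 0 and gcd(u, v) == 1    # u > v > 0 and co-prime
    assert u % 2 == 1 and v % 2 == 1       # both odd, since uv = y is odd
    assert (x, y, z) == ((u*u - v*v) // 2, u*v, (u*u + v*v) // 2)
print("All primitive triples with y odd match ((u^2-v^2)/2, uv, (u^2+v^2)/2)")
```

For instance, the triple $(x,y,z)=(4,3,5)$ gives $z+x=9=3^2$ and $z-x=1=1^2$, so $u=3$, $v=1$, and indeed $(x,y,z)=\left(\frac{9-1}{2},\,3\cdot 1,\,\frac{9+1}{2}\right)$.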