A few days ago, I told my friend that I wanted to learn trigonometry by the end of $2018$. His immediate reply was that I should go on if I could solve this mathematics problem:
Given $a, b \in \mathbb N$ such that $\frac{a^2+b^2}{ab+1} \in \mathbb N$, prove that $\sqrt{\frac{a^2+b^2}{ab+1}} \in \mathbb N$.
Over the last few days, I went about trying to see if I could prove it (my friend has a tendency to occasionally trick me with unsolvable math problems). This is what I proved:
- If $a \oplus b = 0$ (that is, $a = b$), it works.
- If $b = a^3$ and $a \neq 1$, it works.
- More generally, for any $n \neq 0$, taking $a = x^n$ and $b = x^{n+2}$ will work.
To recap, one number must be two powers of the base greater than the other (e.g. $32$ and $128$, i.e. $2^5$ and $2^7$).
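As a sanity check on the pattern above (not a proof, just a brute-force search over small pairs, with names of my own choosing), here is a sketch that lists every pair $(a, b)$ with $a \le b < 200$ whose quotient is a natural number, and verifies that the quotient is always a perfect square:

```python
import math

def quotient(a, b):
    """Return (a^2 + b^2) / (ab + 1) if it is a natural number, else None."""
    num, den = a * a + b * b, a * b + 1
    return num // den if num % den == 0 else None

hits = []
for a in range(1, 200):
    for b in range(a, 200):  # b >= a; the expression is symmetric in a, b
        q = quotient(a, b)
        if q is not None:
            # In every case found, the quotient is a perfect square.
            assert math.isqrt(q) ** 2 == q
            hits.append((a, b, q))

print(hits)
```

Interestingly, besides the $(x, x^3)$ pairs such as $(2, 8)$ and $(3, 27)$, the search also turns up pairs like $(8, 30)$ and $(30, 112)$, both giving quotient $4$, which do not fit the families I listed, so my recap above can't be the whole story.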
Or at least, that's what I thought... When I sent him an email, he said that I had missed two very seldom-noticed things, and that everyone he had ever given the problem to had missed them as well. This either means he intentionally made the problem too hard for my intelligence (which is -spoiler alert- very low compared to you people), or he is just making fun of me for not noticing that his problem is impossible. Either way, the many mathematical geniuses here who are much more experienced than me at algebra could probably solve this easily. Could someone tell me what I'm missing? Thanks in advance. (And DON'T tell my friend about this!!)