In Ivan Niven's book "An Introduction to the Theory of Numbers", there is a question in the first chapter that has been puzzling me.
Given $p$ is an odd prime and $(a,b) = 1$ where $(a,b) = \gcd(a,b)$, show that $$ \left(a+b,\frac{a^p + b^p}{a+b} \right) = 1 \mbox{ or }p$$
To gain intuition, I started with the $p=3$ case. I was able to prove the result by eventually showing that the gcd above equals $(a+b,3)$. My guess, therefore, is that in general we should be able to show that this gcd equals $(a+b,\ k(a+b)+pu)$ for some integer $k$ and some $u$ with $(u,a+b)=1$. However, I have not been successful in this endeavor. Perhaps there is an alternate strategy...
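
For concreteness, here is roughly the computation that settles the $p=3$ case (assuming I haven't slipped up):
$$\frac{a^3+b^3}{a+b} = a^2 - ab + b^2 = (a+b)^2 - 3ab,$$
so that
$$\left(a+b,\ \frac{a^3+b^3}{a+b}\right) = (a+b,\ 3ab) = (a+b,\ 3),$$
where the last step uses $(a+b,\ ab) = 1$, which follows from $(a,b)=1$. I am hoping something like this identity generalizes to odd primes $p$.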
I would appreciate any hints/tips/ideas on how to proceed. If I get it, I'll post my method as an answer.