I'm trying to solve this problem:
Suppose $a$ and $b$ are positive integers such that $a^2 + b^2$ is even and $a^3 + b^3$ is a multiple of 3. What is the largest positive integer that must divide $a^5 + b^5$?
My work so far is as follows:
If $a^2 + b^2$ is even, then $a$ and $b$ must have the same parity: $a$ even forces $b$ even, and $a$ odd forces $b$ odd. That is, $a \equiv b$ (mod 2).
If $a^3 + b^3$ is a multiple of 3: by Fermat's Little Theorem, $a^3 \equiv a$ (mod 3), and likewise $b^3 \equiv b$ (mod 3). As a result, we can conclude that $a + b \equiv 0$ (mod 3).
However, I don't know how to handle $a^5 + b^5$. I've tried multiplying out $(a^3 + b^3)(a^2 + b^2) = (a^5 + b^5) + a^2b^2(a + b)$, but that didn't seem to help. Can someone give me a few pointers on how to proceed?
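As a sanity check that the identity above was expanded correctly, here is a quick brute-force verification over small positive integers (a numeric spot check only, not a proof):

```python
# Check (a^3 + b^3)(a^2 + b^2) == (a^5 + b^5) + a^2 b^2 (a + b)
# for all small positive a, b.
for a in range(1, 20):
    for b in range(1, 20):
        lhs = (a**3 + b**3) * (a**2 + b**2)
        rhs = (a**5 + b**5) + a**2 * b**2 * (a + b)
        assert lhs == rhs, (a, b)
print("identity verified for 1 <= a, b < 20")
```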
Note: Testing a few examples, I think the answer is 6 ($(a, b) = (3, 3)$ yields $486$, $(a, b) = (2, 4)$ yields $1056$, ...). Rather than just the answer to this problem, I'm more concerned about the proof methods I should develop to tackle similar problems in the future. I'd greatly appreciate it if someone could provide a proof, relying mainly on theorems/lemmas from modular arithmetic, that leads to the answer to this problem.
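For what it's worth, the example-testing above can be automated: take the gcd of $a^5 + b^5$ over all small pairs $(a, b)$ satisfying both hypotheses. The gcd of finitely many cases is only an upper bound on the answer (a proof still has to show $6$ divides $a^5 + b^5$ in general), but it supports the conjecture:

```python
from math import gcd

# Brute-force sketch: gather a^5 + b^5 over all small (a, b) with
# a^2 + b^2 even and a^3 + b^3 divisible by 3, and take the gcd.
g = 0
for a in range(1, 50):
    for b in range(1, 50):
        if (a**2 + b**2) % 2 == 0 and (a**3 + b**3) % 3 == 0:
            g = gcd(g, a**5 + b**5)
print(g)  # prints 6
```

(For instance, $\gcd(486, 1056) = 6$ already, using the two pairs above.)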