Show that $\sqrt n$ is irrational if $n$ is not a perfect square, using the method of infinite descent.
I know how to prove this via a proof by contradiction using the Fundamental Theorem of Arithmetic, but now I'm asked to prove it by infinite descent. Then the very next problem asks: "Why does the method of the text fail to show that $\sqrt n$ is irrational if $n$ is a perfect square?" I'm confused by this. Any hints or solutions are greatly appreciated.
I was thinking of the standard argument: let $\sqrt n = \frac{a}{b}$ with $\gcd(a,b)=1$, and then through some algebra arrive at a common factor of both $a$ and $b$, contradicting $\gcd(a,b)=1$; for a descent one would presumably apply this step over and over again. But then I don't understand how the next problem can ask me to explain why this method fails.
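For reference, here is a sketch of one common way to phrase the descent on denominators (this may or may not be the method your text has in mind). It assumes only that $b$ is chosen as the smallest positive denominator, not that $\gcd(a,b)=1$:

```latex
% Sketch of one descent step (a standard phrasing; the text's method may differ).
% Suppose \sqrt{n} = a/b with b the SMALLEST positive denominator possible,
% and let k = \lfloor \sqrt{n} \rfloor, so that k < \sqrt{n} < k+1,
% the inequalities being strict precisely because n is not a perfect square.
\[
\sqrt{n}
  \;=\; \frac{\sqrt{n}\,(\sqrt{n}-k)}{\sqrt{n}-k}
  \;=\; \frac{n - k\sqrt{n}}{\sqrt{n}-k}
  \;=\; \frac{n - k\,(a/b)}{(a/b) - k}
  \;=\; \frac{nb - ka}{a - kb}.
\]
% Since k < a/b < k+1, we get kb < a < kb + b, hence 0 < a - kb < b:
% a representation of \sqrt{n} with a strictly smaller positive denominator,
% contradicting the minimality of b. Repeating this is the "infinite descent".
```

Note where the strict inequality $k < \sqrt n$ is used; examining what happens to that step when $\sqrt n = k$ exactly may be relevant to the follow-up problem.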