If something increases from $50$ to $200$, I know by common sense that it is a $400\%$ increase.
I can get this using $\dfrac{200}{50}\times 100\% = 400\%$.
If something increases from $50$ to $52$, I know by common sense that it is a $4\%$ increase.
But if I apply the same logic, I get $\dfrac{52}{50}\times 100\% = 104\%$.
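For concreteness, here is a tiny Python sketch (just my own illustration of the two calculations above, with the values I expect written as comments):

```python
# Reproducing the two calculations from the question.

old, new = 50, 200
print(new / old * 100)   # 400.0 -- matches the 400% I expect

old, new = 50, 52
print(new / old * 100)   # 104.0 -- but common sense says 4%
```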
What is the problem with my logic?