Every mathematical system is based on a set of assumptions (axioms). If you're using the ordinary math that people generally agree on, then you go with Mohammad's answer: division by zero is undefined and you can't use it. But let's say you want to make up your own math in which division by zero is defined, while everything else about math stays the same. One of the first things people ask about a new system is whether it's consistent. If you can take your assumptions, apply valid logic to them, and arrive at something that contradicts one of those assumptions or an earlier result, then your new system is inconsistent. Generally speaking, that means the system is flawed: applying logic to it won't get you reliable results.
Suppose, for instance, I try to say that 1+1=1. In regular math that's wrong, but in a made-up system it could be right. One such system is the one where 1 is the only number: 1+1=1, 1-1=1, 1*1=1, 1/1=1. This system is consistent.
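To make that concrete, here's a sketch of that one-number system written out (the set name S is just my own label for it):

```latex
% A sketch of the one-number system: the universe is just {1},
% so every operation has no choice but to return 1.
\[
S = \{1\}, \qquad 1+1 = 1,\quad 1-1 = 1,\quad 1 \cdot 1 = 1,\quad 1/1 = 1.
\]
% Every equation you can write reduces to 1 = 1, so no chain of
% reasoning can ever produce a contradiction, e.g.:
\[
(1+1)\cdot(1-1) = 1 \cdot 1 = 1 = 1/1.
\]
```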
So what if division by zero is defined? In that case, I'm not sure I follow your original argument. Are you saying that 0/0 = 0, so that the zero on the right side of a*0 = 0 becomes a*0 = 0/0, and then substituting a = 0/0 turns the whole thing into (0/0)*0 = 0/0? I guess that all works if 0 is the only number, but you'll run into trouble as soon as you allow anything else. For instance, in the real numbers a/a = 1 whenever it's defined; if 0/0 is now defined, that rule gives 0/0 = 1, but if 0/0 is also 0 then we've hit a contradiction.
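Spelled out step by step (my own arrangement of the argument above, using only the symbols already in play):

```latex
% Assume 0/0 is defined and the usual rules still hold.
\[
\frac{a}{a} = 1 \ \text{whenever } a/a \text{ is defined}
\;\;\Longrightarrow\;\; \frac{0}{0} = 1.
\]
% But the proposal in the question was 0/0 = 0, so:
\[
0 = \frac{0}{0} = 1,
\]
% and 0 = 1 contradicts the assumption that 0 and 1 are
% different numbers, so the system is inconsistent.
```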
No matter how people have tried to define division by zero, as long as the rest of the usual rules of arithmetic are kept, it creates an inconsistency. That's why it's left undefined.
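One way to see why (this is the standard argument from the usual rules, not something from the question): if you give 0 an inverse and keep the other rules, every number is forced to equal 0, which collapses you right back into a one-number system like the one above.

```latex
% Suppose 1/0 = x for some number x, and keep the usual rules:
% b * (1/b) = 1 whenever 1/b is defined, and a * 0 = 0 for all a.
\[
1 = 0 \cdot \frac{1}{0} = 0 \cdot x = 0.
\]
% Then for any number a,
\[
a = a \cdot 1 = a \cdot 0 = 0,
\]
% so every number equals 0: the only consistent way to divide by
% zero while keeping the other rules is a one-element system.
```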
Now on to whether it's a paradox. I didn't find an official definition of "mathematical paradox," but I did find this general one on Google:
"a seemingly absurd or self-contradictory statement or proposition that when investigated or explained may prove to be well founded or true."
So no: it's not a paradox, it's just inconsistent.