Suppose $a=b$.
Multiplying by $a$ on both sides gives $a^2 = ab$. Then we subtract $b^2$ on both sides, and get
$a^2-b^2 = ab-b^2$.
Obviously, $(a-b)(a+b) = b(a-b)$, so dividing by $a - b$, we find
$a+b = b$.
Now, suppose $a=b=1$. Then $1=2$ :)
The mistake is that from the fact that $$xy = zy$$ you cannot conclude that $x=z$. You can conclude that $x=z$ only if $y \neq 0$, i.e., $$xy = zy \implies x=z \color{red}{\text{ or } y=0}$$
In your case, you had $$(a-b)(a+b) = b(a-b)$$ Since $a = b$, we have $a-b = 0$ and hence you cannot cancel $a-b$ from both sides, i.e., from the fact that $$(a-b)(a+b) = b(a-b)$$ we can only conclude that $$a+b = b \color{red}{\text{ or } a-b=0}$$ The $\color{red}{\text{latter}}$ is what is valid in your case, since you started with the assumption $a=b$, i.e., $a-b=0$.
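A quick numeric sketch of this point (my own illustration, not from the proof): with $a = b$, both sides of the equation are $0$, so the equation holds, but the "cancellation" step is a division by $a - b = 0$, which fails outright.

```python
# With a = b, the equation (a-b)(a+b) = b(a-b) is just 0 = 0,
# but cancelling a - b means dividing by zero.
a = b = 1

lhs = (a - b) * (a + b)   # a^2 - b^2
rhs = b * (a - b)         # ab - b^2
assert lhs == rhs         # the equation itself is fine: 0 == 0

try:
    # "Cancelling" a - b amounts to dividing both sides by it:
    cancelled = (lhs / (a - b)) == (rhs / (a - b))
except ZeroDivisionError:
    print("cannot divide by a - b when a == b")
```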
A simple example: if $A = 0$, then $A \cdot 5 = A \cdot 7$, and dividing by $A$ would give $5 = 7$. So we may divide by $A$ only if $A$ is not equal to zero.