Given that $f(0,0) = 0$:
How is it possible that $$\frac{\partial f}{\partial x}(0,0)=\lim_{h\to 0}\frac1h\big(f(0+h,0)-f(0,0)\big)=\lim_{h\to0}\frac1h\cdot 0=0\,?$$
I assume they did
$$\begin{aligned}
\frac{\partial f}{\partial x}(0,0)&=\lim_{h\to0}\frac1h\big(f(0+h,0)-f(0,0)\big)\\
&=\lim_{h\to 0}\frac1h\cdot\lim_{h\to0}\big(f(0+h,0)-f(0,0)\big)\\
&=\lim_{h\to0}\frac1h\cdot 0=0.
\end{aligned}$$
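To spell out where my "zero times infinity" comes from (the split above is only my own reading of the book's step; the book itself shows just the one line quoted at the top), the two factors in that product would be
$$\lim_{h\to0}\big(f(0+h,0)-f(0,0)\big)=0 \qquad\text{and}\qquad \lim_{h\to0}\frac1h.$$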
But $$\lim_{h\to0}\frac1h=\infty,$$ so how does zero times infinity equal zero? I thought multiplication by infinity was undefined.
Edit: The suggested post doesn't answer my question, since my assumption is wrong. I wanted help with understanding how the partial derivative with respect to $x$ is $0$. Since this site requires us to show our own work, I had to include my take on it, which is clearly wrong. The title has also been edited to better suit the actual problem.