Statement: Suppose $f$ and $g$ are functions with domain $\mathbb{R}$. We want to show that if
$$\lim_{x \rightarrow a} f(x) = b\ \ \text{and}\ \lim_{x \rightarrow b} g(x) = c,\ \text{then}\ \lim_{x \rightarrow a} g(f(x)) = c.$$
I know that this statement is false: take $a=b=c=0$, $f(x)=0$ for all $x$, and $g(x)=\begin{cases}1&x= 0\\0&x\neq0\end{cases}$.
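Spelling the counterexample out (this is just a direct check, with $a=b=c=0$):
$$\lim_{x \to 0} f(x) = 0 = b \quad\text{and}\quad \lim_{x \to 0} g(x) = 0 = c \ \ (\text{since } g(x)=0 \text{ for all } x\neq 0),$$
$$\text{yet } g(f(x)) = g(0) = 1 \text{ for every } x, \ \text{so } \lim_{x \to 0} g(f(x)) = 1 \neq c.$$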
But I don't understand where the following attempt to prove the statement fails.
"Proof": Since $\lim_{x \rightarrow b} g(x) = c$, $$\forall\epsilon>0,\ \exists\delta_1,\ 0<|x-b|<\delta_1\Rightarrow |g(x)-c|<\epsilon.\tag 1$$ Since $\lim_{x \rightarrow a} f(x) = b$, we can choose $\delta_2$ such that $$0<|x-a|<\delta_2\Rightarrow |f(x)-b|<\delta_1.$$ Now, using implication $(1)$, $$\forall\epsilon>0,\ \exists\delta_2,\ 0<|x-a|<\delta_2\Rightarrow |f(x)-b|<\delta_1 \Rightarrow |g(f(x))-c|<\epsilon,$$ which is $\lim_{x \rightarrow a} g(f(x)) = c$.
The only suspicious point that I see is that, in order to use $(1)$, it should be the case that $f(x)\neq b$, since the hypothesis of $(1)$ requires $0<|f(x)-b|$. But the counterexample seems to rely on the discontinuity of $g$ at $b$, so I am not sure whether this is really where the proof breaks down.