Consider the Heaviside function, which is undefined at zero, i.e. $$H(t)=\begin{cases} 1&t>0 \\ 0&t<0.\end{cases}$$ Now consider a sequence of $H^1(\Omega)$-functions $u_n\to u$ in the norm of the Sobolev space, where $\Omega$ is a bounded domain, with $u_n(x) \in [-1,1]$ for a.e. $x\in \Omega$. Let $f$ be a smooth test function. Can we say that $$\int_\Omega f\cdot H(u_n) \nabla u_n \to \int_\Omega f \cdot H(u) \nabla u\,?$$
Of course one would try to apply Lebesgue's dominated convergence theorem (passing to an a.e. convergent subsequence and arguing by sub-subsequences). The issue is that I'm not sure how to find an integrable bound. The set where $u$ is zero, and where $H(u)$ is consequently undefined, could have positive measure, and on that set I don't see a good bound. On the other hand, on the set where $u$ is zero the gradient $\nabla u$ vanishes a.e. as well (a standard property of $H^1$-functions), so on that set the integrand can be bounded regardless of how $H(0)$ is defined.
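The vanishing-gradient fact used above can be stated precisely; it is a standard lemma on level sets of Sobolev functions (often attributed to Stampacchia):

```latex
\begin{lemma}[Stampacchia]
Let $u \in H^1(\Omega)$ and $c \in \mathbb{R}$. Then
\[
  \nabla u = 0 \quad \text{a.e. on } \{x \in \Omega : u(x) = c\}.
\]
\end{lemma}
% Applied with c = 0: on the problematic set \{u = 0\} the limit
% integrand f \cdot H(u)\nabla u vanishes a.e., independently of
% which value one assigns to H(0).
```

In particular, the value chosen for $H(0)$ does not affect the limit integral.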
Having said that, I would estimate $|H(u_n)\nabla u_n|^2\leq |\nabla u_n|^2$. The latter sequence converges in $L^1(\Omega)$, since $u_n \to u$ in $H^1(\Omega)$ implies $\nabla u_n \to \nabla u$ in $L^2(\Omega)$. Hence a generalized version of Lebesgue's theorem (with a sequence of dominating functions converging in $L^1$) yields $H(u_n)\nabla u_n \to H(u)\nabla u$ in $L^2(\Omega)$, and consequently we can pass to the limit in the integral.
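For reference, the "generalized version of Lebesgue's theorem" invoked here is the dominated convergence theorem with a varying sequence of dominating functions; a sketch of how it would be applied (the choice of $f_n$ and $g_n$ below is one possible way to set it up, and the a.e. convergence of $f_n$ still has to be checked separately):

```latex
\begin{theorem}[Generalized dominated convergence]
Let $f_n \to f$ a.e. in $\Omega$, and suppose $|f_n| \le g_n$ a.e.,
where $g_n \to g$ a.e. and $g_n \to g$ in $L^1(\Omega)$. Then
\[
  f_n \to f \quad \text{in } L^1(\Omega).
\]
\end{theorem}
% Possible application, along an a.e. convergent subsequence:
%   f_n = |H(u_n)\nabla u_n - H(u)\nabla u|^2,
%   g_n = 2|\nabla u_n|^2 + 2|\nabla u|^2 \to 4|\nabla u|^2 in L^1(\Omega),
% using |H| \le 1 and \nabla u_n \to \nabla u in L^2(\Omega).
% On \{u = 0\} one has \nabla u = 0 a.e., which gives f_n \to 0
% a.e. there as well.
```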
Is my reasoning correct? Does anyone have an idea of how to attack the problem without the gradient factor?