I show that $E_s$, given by
$$E_s = \prod_{i=1}^s (1-\alpha X_i^2)^2,$$
diverges to infinity almost surely for any $\alpha > 0$ when the $X_i$ are i.i.d. standard Cauchy. The proof of convergence to infinity in probability is similar.
Define
$$Y_i=2\log(|1-\alpha X_i^2|).$$
The expectation of $Y_i$ is finite and given by
$$\mathbb E(Y_i)=2\log(1+\alpha).$$
To see this, write $$Y_i=2\log(|1-\sqrt{\alpha}X_i|)+2\log(|1+\sqrt{\alpha}X_i|),$$
note that $1\pm\sqrt{\alpha}X_i$ is Cauchy with location $1$ and scale $\sqrt{\alpha}$, and then use the result given here for the expectation of the log of the absolute value of a Cauchy-distributed random variable, which yields $\mathbb E\log(|1\pm\sqrt{\alpha}X_i|)=\tfrac12\log(1+\alpha)$.
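As a quick sanity check (a Monte Carlo sketch, not part of the proof), one can verify numerically the Cauchy log-moment used above, namely that $\mathbb E\log|1-\sqrt{\alpha}X| = \tfrac12\log(1+\alpha)$ for standard Cauchy $X$; the sample size `n` and the test value $\alpha = 2$ are arbitrary choices:

```python
import math
import random

random.seed(0)
alpha = 2.0
n = 1_000_000

total = 0.0
for _ in range(n):
    # Standard Cauchy draw via the inverse CDF: tan(pi * (U - 1/2)).
    x = math.tan(math.pi * (random.random() - 0.5))
    total += math.log(abs(1 - math.sqrt(alpha) * x))
estimate = total / n

# Compare with the closed form (1/2) * log(1 + alpha).
print(estimate, 0.5 * math.log(1 + alpha))
```

The sample mean should settle near $\tfrac12\log 3 \approx 0.549$; the log-singularity at $X = 1/\sqrt{\alpha}$ is integrable, so the law of large numbers applies.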
Thus,
$$E_s= \exp \left[ s \left( \frac{1}{s} \sum_{i=1}^s Y_i \right)
\right] \to \infty \, \text{a.s.}$$
since, by the strong law of large numbers,
$$\frac{1}{s} \sum_{i=1}^s Y_i \to 2\log(1+\alpha)>0 \quad \text{a.s.}$$
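The divergence mechanism can be illustrated numerically (a sketch only; $\alpha = 1$ and the sample size are arbitrary): track the running average $\frac{1}{s}\sum_{i=1}^s Y_i$ in log space, since the product $E_s$ itself overflows almost immediately.

```python
import math
import random

random.seed(1)
alpha = 1.0
s_max = 200_000

total = 0.0
for s in range(1, s_max + 1):
    # Standard Cauchy draw via the inverse CDF.
    x = math.tan(math.pi * (random.random() - 0.5))
    total += 2 * math.log(abs(1 - alpha * x * x))
avg = total / s_max

# If the running average settles at a positive constant c, then
# log E_s = s * average ~ c * s grows linearly, so E_s blows up.
print(avg, 2 * math.log(1 + alpha))
```

For $\alpha = 1$ the running average should stabilize near $2\log 2 \approx 1.386$, and the key point is only that it is positive.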
The above analysis yields the following condition for $E_s$ to converge to zero almost surely:
$$\mathbb E \left( \log(|1-\alpha X^2|) \right )=\mathbb E \bigg [ \log(|1-\sqrt{\alpha}X|)+\log(|1+\sqrt{\alpha}X|) \bigg ] <0,$$
which is much weaker than finiteness of the fourth raw moment of $X$. The condition is necessary and sufficient when $\mathbb E \left( \log(|1-\alpha X^2|) \right )$ is finite and non-zero.
For the standard normal distribution, using the result given here and following the same method as above, we have
$$\mathbb E \left( \log(|1-\alpha X^2|) \right )=\log(|\alpha|)+\frac{1}{\alpha}\times {}_2F_2\left(\left.{1,1\atop\frac{3}{2},2}\right|-\frac{1}{2\alpha}\right)-(\log\left(2\right)+\gamma),$$
which is negative for $$0<\alpha \le 2.421249$$ and positive for larger $\alpha$.
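The sign change can be bracketed by Monte Carlo (a sketch; the threshold $\approx 2.421249$ itself is too delicate to pin down this way, so I only check the sign well below and well above it, at the arbitrary points $\alpha = 1$ and $\alpha = 5$):

```python
import math
import random

random.seed(2)

def mc_log_expect(alpha, n=400_000):
    """Monte Carlo estimate of E[log|1 - alpha * X^2|], X standard normal."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(0.0, 1.0)
        total += math.log(abs(1 - alpha * x * x))
    return total / n

below = mc_log_expect(1.0)  # well inside (0, 2.421249]: expect a negative value
above = mc_log_expect(5.0)  # well above the threshold: expect a positive value
print(below, above)
```

The log-singularity at $X^2 = 1/\alpha$ is integrable, so both estimates are well behaved; the expectation is negative at $\alpha = 1$ and positive at $\alpha = 5$, consistent with a sign change in between.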