The claim in question is a corollary of a standard SLLN for martingale difference sequences (MDS).
SLLN for MDS
The statement of the SLLN for MDS is as follows.
Let $N_t$ be a martingale difference sequence (MDS) such that $\sum\limits_{t=1}^{\infty} \frac{E[N_t^2]}{t^2} < \infty$. Then
$$
\frac{1}{n} \sum_{t=1}^n N_t \rightarrow 0 \;\;a.s.
$$
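As a quick numerical illustration (not part of the proof), one can simulate a single path of a hypothetical MDS satisfying the summability condition, e.g. $N_t = t^{1/4}\,\epsilon_t$ with $\epsilon_t$ i.i.d. standard normal, so that $E[N_t^2] = t^{1/2}$ and $\sum_t \frac{E[N_t^2]}{t^2} = \sum_t t^{-3/2} < \infty$; the running averages should drift toward $0$.

```python
import numpy as np

# Hypothetical example of an MDS satisfying the summability condition:
# N_t = t**0.25 * eps_t with eps_t i.i.d. N(0, 1), so E[N_t^2] = sqrt(t)
# and sum_t E[N_t^2] / t^2 = sum_t t**(-1.5) < infinity.
rng = np.random.default_rng(0)

n = 1_000_000
t = np.arange(1, n + 1)
N = t ** 0.25 * rng.standard_normal(n)  # martingale differences

# running sample averages (1/n) * sum_{t <= n} N_t along one path
means = np.cumsum(N) / t
for k in (10**3, 10**4, 10**5, 10**6):
    print(k, means[k - 1])
```

A single path only illustrates the almost-sure statement, of course; the condition itself constrains second moments, not individual realizations.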
(In this case, the martingale difference sequence $N_t$ is obtained by differencing the martingale $X_t$: $N_t = X_t - X_{t-1}$. By orthogonality of martingale increments, $E[N_t^2] = E[X_t^2] - E[X_{t-1}^2]$.
Then, since $X_0 = 0$, summation by parts gives
\begin{align*}
\sum_{t=1}^n \frac{E[N_t^2]}{t^2} &= \sum_{t=1}^n \frac{E[X_t^2] - E[X_{t-1}^2]}{t^2} \\
&= \frac{E[X_n^2]}{n^2} + \sum_{t = 2}^{n} E[X_{t-1}^2] \left( \frac{1}{(t-1)^2} - \frac{1}{t^2} \right).
\end{align*}
The assumption $E[X_{t}^2] = O(t)$, together with $\frac{1}{(t-1)^2} - \frac{1}{t^2} = \frac{2t-1}{t^2 (t-1)^2} = O\!\left(\frac{1}{t^3}\right)$, implies that
$$
E[X_{t-1}^2] \left( \frac{1}{(t-1)^2} - \frac{1}{t^2} \right) = O\!\left(\frac{1}{t^2}\right).
$$
Therefore $\sum\limits_{t=1}^{\infty} \frac{E[N_t^2]}{t^2} < \infty$.
)
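The summation-by-parts identity can be sanity-checked numerically. A minimal sketch, with the illustrative (hypothetical) choice $a_t = E[X_t^2] = t$, i.e. the $O(t)$ case with $a_0 = E[X_0^2] = 0$, so the second sum runs from $t = 2$:

```python
import numpy as np

# Sanity check of summation by parts with the illustrative choice
# a_t = E[X_t^2] = t and a_0 = 0, so E[N_t^2] = a_t - a_{t-1} = 1.
n = 1000
a = np.arange(n + 1, dtype=float)            # a[t] = t

t = np.arange(1, n + 1)
lhs = np.sum((a[1:] - a[:-1]) / t ** 2)      # sum_{t=1}^n (a_t - a_{t-1}) / t^2

s = np.arange(2, n + 1)
rhs = a[n] / n ** 2 + np.sum(a[s - 1] * (1.0 / (s - 1) ** 2 - 1.0 / s ** 2))

print(lhs, rhs)  # the two sides agree to floating-point precision
```

With this choice the left side is $\sum_{t=1}^{n} 1/t^2$, which stays below $\pi^2/6$, consistent with the convergence claim.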
In turn, the SLLN for MDS can be shown via two arguments, both standard devices for results of this type: one via the martingale convergence theorem, the other via Kolmogorov's martingale maximal inequality.
Via Martingale Convergence Theorem
(The previous answer is a variation of this argument.)
If $\sum\limits_{t=1}^{\infty} \frac{E[N_t^2]}{t^2} < \infty$, the martingale $Y_n = \sum\limits_{t = 1}^n \frac{N_t}{t}$, $n \geq 1$, is bounded in $L^2$, therefore converges almost surely (and in $L^2$).
Therefore, by Kronecker's lemma,
$$
\frac{1}{n}\sum_{t = 1}^n N_t \stackrel{a.s.}{\rightarrow} 0
$$
as $n \rightarrow \infty$.
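Kronecker's lemma itself (if $\sum_t \frac{x_t}{t}$ converges, then $\frac{1}{n}\sum_{t \leq n} x_t \rightarrow 0$) can be illustrated with a deterministic sequence. A minimal sketch with the hypothetical choice $x_t = (-1)^t$, for which $\sum_t \frac{x_t}{t}$ is the alternating harmonic series:

```python
import numpy as np

# Kronecker's lemma illustrated with x_t = (-1)^t:
# sum_t x_t / t converges (alternating harmonic series, to -log 2),
# and the lemma forces the averages (1/n) * sum_{t <= n} x_t to 0.
n = 10**6
t = np.arange(1, n + 1)
x = (-1.0) ** t

partial = np.cumsum(x / t)   # partial sums of sum_t x_t / t
avg = np.cumsum(x) / t       # Cesaro averages, tending to 0

print(partial[-1], avg[-1])
```

In the proof above, Kronecker's lemma is applied pathwise: on the almost-sure event where $Y_n = \sum_{t \leq n} N_t / t$ converges, the averages $\frac{1}{n}\sum_{t \leq n} N_t$ must vanish.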
Via Maximal Inequality
Consider again the $L^2$-martingale $Y_n = \sum\limits_{t = 1}^n \frac{N_t}{t}$, $n \geq 1$.
Let $\sigma^2_t = \frac{E[ N_t^2 ]}{t^2}$.
By the maximal inequality, for all $n > 0$ and for all $\epsilon > 0$,
$$
P( \sup_{m \geq n} | Y_m - Y_n | \geq \epsilon ) \leq \frac{K}{\epsilon^2} \sum_{t > n} \sigma^2_t
$$
for some constant $K$ independent of $n$.
Since $\sum\limits_{t=1}^{\infty} \sigma^2_t < \infty$, the tail sums $\sum\limits_{t > n} \sigma^2_t$ vanish as $n \rightarrow \infty$, and therefore
$$
P( \inf_n \sup_{m \geq n} | Y_m - Y_n | \geq \epsilon ) = 0
$$
for all $\epsilon > 0$. In other words, the sequence $Y_n$, $n \geq 1$, is Cauchy, therefore converges, with probability $1$.
Again by Kronecker's lemma,
$$
\frac{1}{n}\sum_{t = 1}^n N_t
$$
converges to zero as $n \rightarrow \infty$ with probability $1$.