It is a result due to Kolmogorov and Khintchine that if $X_{n}$ is a sequence of independent random variables such that $\sum_{n}Var(X_{n})<\infty$, then $\sum_{k=1}^{\infty}\left(X_{k}-E(X_{k})\right)$ converges almost surely to some random variable which is finite almost surely.
A short sketch of the proof is given below.
Let $\displaystyle S_{n}=\sum_{k=1}^{n}X_{k}$, and WLOG assume that $E(X_{k})=0$ for each $k$ (else consider $X_{k}-E(X_{k})$). Then for $m<n$, by independence,
$$||S_{n}-S_{m}||_{L^{2}}^{2}=\sum_{k=m+1}^{n}Var(X_{k})\leq\sum_{k=m+1}^{\infty}Var(X_{k})\xrightarrow{m\to\infty}0,$$
which means that $S_{n}$ is $L^{2}$ Cauchy, hence Cauchy in probability. Now, by a theorem due to Lévy, a sequence of partial sums of independent random variables that is Cauchy in probability is also Cauchy almost surely, and the claim follows.
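To see the criterion in action, here is a minimal numerical sketch (my own illustration, not part of the argument): take $X_{k}=\varepsilon_{k}/k$ with i.i.d. Rademacher signs $\varepsilon_{k}=\pm 1$, so $E(X_{k})=0$ and $\sum_{k}Var(X_{k})=\sum_{k}1/k^{2}<\infty$, and watch the partial sums settle down.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# X_k = eps_k / k with i.i.d. Rademacher signs eps_k, so that
# Var(X_k) = 1/k^2 and sum_k Var(X_k) = pi^2/6 < infinity.
k = np.arange(1, N + 1)
eps = rng.choice([-1.0, 1.0], size=N)
S = np.cumsum(eps / k)  # partial sums S_n

# The tail of S_n barely moves, consistent with a.s. convergence.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"S_{n:<6} = {S[n - 1]:+.6f}")
```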
This idea is also similar to what Sangchul Lee's answer suggests. Basically, since $E(X_{k})=0$, what we have shown implies that $S_{n}$ is an $L^{2}$ Cauchy martingale, which converges to a random variable $X_{\infty}\in L^{1}$ and is hence finite almost surely.
Anyway, getting back to your problem:
First, see that $\frac{1}{n}\sum_{i = 1}^{n} g_i^2\xrightarrow{a.s.}1$ by the SLLN, since the $g_i^2$ are i.i.d. with mean $E(g_1^{2})=1$.
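As a quick sanity check of this step (my own illustration, assuming the $g_i$ are i.i.d. standard Gaussians as in the problem):

```python
import numpy as np

rng = np.random.default_rng(1)
g = rng.standard_normal(1_000_000)

# SLLN: the running average of g_i^2 should approach E(g_1^2) = 1.
for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: mean of g_i^2 = {np.mean(g[:n] ** 2):.4f}")
```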
Next, your condition directly implies that $\sum_{i<j}Var( a_{ij}g_ig_j)=\sum_{i<j}a_{ij}^{2}<\infty$.
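Spelling out that variance computation (with the $g_i$ standard Gaussian as above): for $i<j$, independence of $g_i$ and $g_j$ and $E(g_ig_j)=E(g_i)E(g_j)=0$ give
$$Var(a_{ij}g_ig_j)=a_{ij}^{2}E(g_i^{2}g_j^{2})=a_{ij}^{2}E(g_i^{2})E(g_j^{2})=a_{ij}^{2}.$$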
Hence $\sum_{1 \leq i < j \leq n} a_{ij}g_ig_j$ converges to an almost surely finite random variable by the result stated above (strictly speaking, via the martingale form of the argument, since the products $g_ig_j$ are uncorrelated but not mutually independent).
Hence $\frac{1}{\sqrt{n}}\sum_{1 \leq i < j \leq n} a_{ij}g_ig_j\xrightarrow{a.s.}0$ follows immediately.
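As a final illustration (again my own sketch, with the hypothetical square-summable choice $a_{ij}=1/(ij)$), the scaled sum visibly shrinks toward $0$:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 2_000
g = rng.standard_normal(N)

# a_ij = 1/(i*j) for i < j, so sum_{i<j} a_ij^2 < infinity.
# Build T_n = sum_{1 <= i < j <= n} a_ij g_i g_j incrementally:
# adding index j contributes (g_j / j) * sum_{i<j} g_i / i.
w = g / np.arange(1, N + 1)                          # w_i = g_i / i
prefix = np.concatenate(([0.0], np.cumsum(w)[:-1]))  # sum_{i<j} g_i / i
T = np.cumsum(w * prefix)                            # T_n for n = 1..N

# T_n converges a.s., so T_n / sqrt(n) -> 0.
for n in (10, 100, 1_000, 2_000):
    print(f"n = {n:>5}: T_n/sqrt(n) = {T[n - 1] / np.sqrt(n):+.6f}")
```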
For a more detailed reference on the Kolmogorov–Khintchine criterion, see Sidney Resnick, *A Probability Path*, Theorem 7.3.3.