I am faced with the same problem discussed in this thread, but I would like to know whether the martingale approach works.
Question: Suppose we are given a sequence of independent random variables $\{X_n\}$ and denote $S_n = \sum_{j=1}^n X_j$. Assume that $S_n$ converges in distribution. Show that $S_n$ converges almost surely.
My attempt: By independence, $$ Z_n(t) = \frac{e^{itS_n}}{\mathbb{E}[e^{itS_n}]} $$ is a martingale for each fixed $t$ (whenever the denominators are nonzero). Using the uniform convergence of the characteristic functions of $\{S_n\}$ on bounded intervals, we can show that there exists some $\delta>0$ such that, for $t \in [-\delta,\delta]$, the set $\{|\mathbb{E}[e^{itS_n}]| : n \in \mathbb{N}\}$ is bounded away from zero. Since $|e^{itS_n}| = 1$, this implies that $\{Z_n(t) : n \in \mathbb{N}\}$ is an $L^1$-bounded martingale. Hence, by Doob's martingale convergence theorem, $\{Z_n(t)\}$ converges almost surely to some $Z(t)$.
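(For completeness, the martingale property is a direct computation: writing $\mathcal{F}_n = \sigma(X_1,\dots,X_n)$ and using the independence of $X_{n+1}$ from $\mathcal{F}_n$,

$$\mathbb{E}\left[Z_{n+1}(t)\mid\mathcal{F}_n\right] = \frac{e^{itS_n}\,\mathbb{E}[e^{itX_{n+1}}]}{\mathbb{E}[e^{itS_n}]\,\mathbb{E}[e^{itX_{n+1}}]} = Z_n(t),$$

since $\mathbb{E}[e^{itS_{n+1}}] = \mathbb{E}[e^{itS_n}]\,\mathbb{E}[e^{itX_{n+1}}]$, again by independence.)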
My problem: I don't know how to continue the derivation. If we put $\phi(t) = \lim_n \mathbb{E}[e^{itS_n}]$, then $e^{itS_n} \stackrel{a.s.}{\longrightarrow} Z(t)\phi(t)$ for each $t \in [-\delta,\delta]$. But can we then conclude that $S_n$ converges almost surely? I am confused, as I have forgotten almost everything about complex analysis, and it doesn't seem legitimate to take an arbitrary $t$ and apply a logarithm here.
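As a sanity check of the statement itself (not of the proof), here is a small simulation for the illustrative example $X_j = \varepsilon_j 2^{-j}$ with i.i.d. signs $\varepsilon_j = \pm 1$, where $S_n$ converges in distribution; the theorem then says each sample path should stabilize:

```python
import numpy as np

# Hypothetical example: X_j = eps_j * 2**(-j) with i.i.d. signs eps_j = +-1.
# Here S_n converges in distribution, so the theorem predicts that the
# partial sums also converge almost surely, i.e. pathwise.
rng = np.random.default_rng(0)

n_paths, n_terms = 1000, 30
signs = rng.choice([-1.0, 1.0], size=(n_paths, n_terms))
steps = signs * 2.0 ** -np.arange(1, n_terms + 1)
partial_sums = np.cumsum(steps, axis=1)  # S_1, ..., S_30 along each row

# Pathwise tail fluctuation: |S_30 - S_15| <= sum_{j=16}^{30} 2^{-j} < 2^{-15}
# on every path, consistent with almost-sure convergence.
tail_fluct = np.abs(partial_sums[:, -1] - partial_sums[:, 14]).max()
print(tail_fluct)  # bounded by 2**-15 ~ 3.05e-05
```

This only illustrates the claim on one concrete sequence, of course; it says nothing about the general argument.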
Any hint will be greatly appreciated!