
Let $\{X,X_1,X_2,\dots\}$ be i.i.d. random variables with mean zero. Assume that $E[|X|\log(1+|X|)] < \infty$, and prove that the series $\sum_{n=1}^\infty (X_n/n)$ converges almost surely.

My attempt:

$S_n = \sum_{i=1}^n(X_i/i)$ is a martingale: $E[S_{n+1}|\mathscr{F}_n] = E[S_n + (X_{n+1}/(n+1)) | \mathscr{F}_n] = S_n + E[X_{n+1}]/(n+1) = S_n$, since $X_{n+1}$ is independent of $\mathscr{F}_n$ and has mean zero.
Therefore, if $\langle S \rangle_{\infty} < \infty$, then we have that $\lim_{n \to \infty}S_n$ exists almost surely.
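The precise result I am invoking is the standard one for square-integrable martingales: if $(M_n)$ is an $L^2$-martingale with predictable quadratic variation $$\langle M\rangle_n = \sum_{k=1}^{n} E\bigl[(M_k-M_{k-1})^2 \mid \mathscr{F}_{k-1}\bigr],$$ then $M_n$ converges almost surely on the event $\{\langle M\rangle_\infty < \infty\}$.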

However, $\langle S \rangle_n = \sum_{i=1}^n (E[X_i^2]/i^2)$, and we are not given that $X$ has a finite second moment. I'm guessing this is where I need to use $E[|X|\log(1+|X|)] < \infty$, but I'm not completely sure how that applies.

I thought that since $\log(1+|X|) \sim |X|$ we could say $E[|X|\log(1+|X|)] < \infty \Rightarrow E[X^2] < \infty$, but I don't think that's enough to conclude $\sum_{i=1}^\infty (E[X_i^2]/i^2) < \infty$.

dp1221

2 Answers


I wish I had something better; unfortunately, I can only prove convergence in probability. (Note that $\log(1+|X|)$ is certainly not equivalent to $|X|$, and you cannot conclude that $\mathbb{E}[X^2] < \infty$; however, should $X$ be in $L^2$, then $\mathbb{E}[S_n^2] = \sum_{1 \leq k \leq n}{\frac{\mathbb{E}[X^2]}{k^2}}$, which is a bounded function of $n$, and we get convergence from the $L^2$-martingale convergence theorem.)
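To make both parenthetical remarks concrete: a symmetric density such as $f(x)=|x|^{-3}1_{|x| \geq 1}$ has $\mathbb{E}[|X|\log(1+|X|)] < \infty$ but $\mathbb{E}[X^2]=\infty$; and in the $L^2$ case, independence and $\mathbb{E}[X]=0$ make the increments $X_k/k$ orthogonal, so $$\mathbb{E}[S_n^2]=\sum_{1 \leq k \leq n}{\frac{\mathbb{E}[X_k^2]}{k^2}}=\mathbb{E}[X^2]\sum_{1 \leq k \leq n}{\frac{1}{k^2}} \leq \frac{\pi^2}{6}\,\mathbb{E}[X^2].$$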

Let $t \in \mathbb{R}$, and define $a_n(t)=\mathbb{E}[e^{itX/n}]$ and $d_n(t)=\mathbb{E}\bigl[|e^{itX/n}-1-itX/n|\bigr]$; since $\mathbb{E}[X]=0$, we have $|a_n(t)-1| \leq d_n(t)$. The martingale argument works when $X$ is bounded, so we can assume that almost surely, $X=0$ or $|X| > c$ for some small $c$.

Let $F(u)=\left|\frac{e^{iu}-1-iu}{u}\right|$ (with $F(0)=0$); it is bounded, and $d_n(t)=\frac{|t|}{n}\mathbb{E}[|X|F(t|X|/n)]$. Write $d_n(t)=b_n(t)+c_n(t)$ with $b_n(t)=\frac{|t|}{n}\mathbb{E}[|X|F(t|X|/n)1_{|X| \geq n}]$ and $c_n(t)=\frac{|t|}{n}\mathbb{E}[|X|F(t|X|/n)1_{|X| < n}]$. As $\log(1+|X|) \sim \sum_{n \leq |X|}{\frac{1}{n}}$, the variable $Y=\sum_{n \geq 1}{\frac{|X|1_{|X| \geq n}}{n}}=|X|\sum_{n \leq |X|}{\frac{1}{n}}$ has finite expectation, so that $\sum_n{b_n(t)} \leq |t|\|F\|_{\infty}\mathbb{E}[Y] < \infty$ (and by the same computation we find $\sum_{n \geq m}{b_n(t)} \leq B_m|t|$ where $B_m$ goes to zero).
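Spelling out the computation for $Y$: since $\sum_{1 \leq n \leq x}{\frac{1}{n}} \leq 1+\log(1+x)$ for $x > 0$, we have $$\mathbb{E}[Y] \leq \mathbb{E}\bigl[|X|\bigl(1+\log(1+|X|)\bigr)\bigr] = \mathbb{E}[|X|]+\mathbb{E}[|X|\log(1+|X|)] < \infty,$$ and $\sum_{n \geq m}{b_n(t)} \leq |t|\,\|F\|_{\infty}\,\mathbb{E}\Bigl[\sum_{n \geq m}{\frac{|X|}{n}1_{|X| \geq n}}\Bigr] \rightarrow 0$ as $m \rightarrow \infty$ by dominated convergence (the inner sum is dominated by $Y$ and vanishes once $m > |X|$).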

Next, $\sum_{n \geq 1}{c_n(t)} = |t|\mathbb{E}\left[|X|\sum_{n > |X|}{\frac{1}{n}F(t|X|/n)}\right]$. It is easy to see that $F(x) \leq Cx$ for some constant $C$ and all $x \geq 0$, so that $|X|\sum_{n > |X|}{\frac{1}{n}F(t|X|/n)} \leq C|t|\,|X|^2\sum_{n > |X|}{\frac{1}{n^2}} \leq C|t|(1+|X|)$, and therefore $\sum_{n \geq 1}{c_n(t)} \leq C\,t^2\,\mathbb{E}[1+|X|] < \infty$.
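To justify the bounds on $F$: from $e^{iu}-1-iu = i\int_0^u{(e^{is}-1)\,ds}$ and $|e^{is}-1| \leq |s|$, we get $$|e^{iu}-1-iu| \leq \frac{u^2}{2}, \qquad \text{hence} \qquad F(u) \leq \frac{|u|}{2},$$ so $C=\tfrac12$ works above; and for $|u| \geq 1$, $F(u) \leq \frac{|e^{iu}|+1+|u|}{|u|} \leq 3$, so $\|F\|_{\infty} \leq 3$.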

Note that the computations above show that $d_n(t) \leq D_n(|t|+t^2)$ with $D_n$ summable.
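Concretely, one may take $$D_n = \|F\|_{\infty}\,\mathbb{E}\Bigl[\frac{|X|}{n}1_{|X| \geq n}\Bigr] + \frac{1}{2}\,\mathbb{E}\Bigl[\frac{|X|^2}{n^2}1_{|X| < n}\Bigr],$$ since $b_n(t) \leq |t|$ times the first term and (using $F(u) \leq |u|/2$) $c_n(t) \leq t^2$ times the second, so $d_n(t) \leq D_n(|t|+t^2)$; the first term is summable because $\mathbb{E}[Y] < \infty$, and the second because $\mathbb{E}\bigl[|X|^2\sum_{n > |X|}{n^{-2}}\bigr] \leq \mathbb{E}[1+|X|] < \infty$.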

In particular, let $0 < r < 0.1$ and $T=S_n-S_m$ for some $n > m$. On $\{|T| \geq r\}$ we have $\int_{-1}^{1}{e^{itT}\,dt}=2\,\mathrm{sinc}(T) \leq 2\,\mathrm{sinc}(r)$ (as $\mathrm{sinc}$ is decreasing on $[0,\pi]$ and at most $1/\pi < \mathrm{sinc}(r)$ beyond), and the variable $2-\int_{-1}^1{e^{itT}\,dt}=2(1-\mathrm{sinc}(T))$ is nonnegative, so Markov's inequality gives $$P(|T| \geq r) \leq P\left(2-\int_{-1}^1{e^{itT}\,dt} \geq 2(1-\mathrm{sinc}(r))\right) \leq \frac{\int_{-1}^1{\bigl(1-\mathbb{E}[e^{itT}]\bigr)\,dt}}{2(1-\mathrm{sinc}(r))} \leq \frac{2\sum_{p > m}{D_p}}{1-\mathrm{sinc}(r)}.$$
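For the last inequality: by independence, $\mathbb{E}[e^{itT}]=\prod_{m < p \leq n}{a_p(t)}$, and since each $|a_p(t)| \leq 1$, telescoping gives, for $|t| \leq 1$, $$\Bigl|1-\prod_{m < p \leq n}{a_p(t)}\Bigr| \leq \sum_{m < p \leq n}{|1-a_p(t)|} \leq \sum_{p > m}{d_p(t)} \leq \sum_{p > m}{D_p(|t|+t^2)} \leq 2\sum_{p > m}{D_p},$$ so $\int_{-1}^1{\bigl(1-\mathbb{E}[e^{itT}]\bigr)\,dt} \leq 4\sum_{p > m}{D_p}$.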

In other words, $\sup_{n \geq m}\,P(|S_n-S_m| > r) \rightarrow 0$ as $m \rightarrow \infty$.

In particular, it follows that every subsequence of $S_n$ has a further subsequence that converges a.s. Assume now that $S_n$ doesn't converge almost surely. By martingale theory (for $L^1$: an $L^1$-bounded martingale converges a.s.), there must exist a subsequence $S_{n_k}$ with $\mathbb{E}[|S_{n_k}|] \rightarrow \infty$.

Aphelli
  • For the sum of independent random variables, a.s.-convergence, convergence in probability, and convergence in distribution are all equivalent. So, establishing convergence in probability is sufficient. – Sangchul Lee Aug 28 '20 at 20:16
  • @Sangchul Lee: my memories of probability are a bit remote – do you know where I can find a proof of your claim? – Aphelli Aug 28 '20 at 20:21
  • Referring to Durrett's Probability Theory and Examples (4.1 ed.), Exercises 2.5.9–2.5.10 provide an outline of the proof that if a sum of independent RVs converges in probability, then it converges a.s. This makes use of a variant of the Lévy–Ottaviani inequality (Exercise 2.5.9). You can also find a proof in this posting. – Sangchul Lee Aug 28 '20 at 20:29

Here is a proof using Kolmogorov's three-series theorem: Let $Y_n = \frac{X_n}{n} \mathbf{1}_{\{|X_n|\leq n\}}$. Then by the theorem, it suffices to show that all of the following three series converge:

$$ \sum_{n=1}^{\infty} \mathsf{P}(|X_n| > n), \qquad \sum_{n=1}^{\infty} \mathsf{E}[Y_n], \qquad \sum_{n=1}^{\infty} \mathsf{Var}[Y_n]. $$
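(To see why it is enough to control the truncated variables: since $\sum_{n=1}^{\infty} \mathsf{P}(X_n/n \neq Y_n) = \sum_{n=1}^{\infty} \mathsf{P}(|X_n| > n) < \infty$ by the first series, the Borel–Cantelli lemma gives $X_n/n = Y_n$ for all but finitely many $n$ almost surely, so $\sum_n X_n/n$ and $\sum_n Y_n$ converge or diverge together.)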

  1. First, we have $$ \sum_{n=1}^{\infty} \mathsf{P}(|X_n| > n) = \sum_{n=1}^{\infty} \mathsf{P}(|X| > n) \leq \mathsf{E} [|X|] < \infty $$

  2. Next, using $\mathsf{E}[X] = 0$, we have $\mathsf{E}[Y_n]=-\mathsf{E}[\frac{X}{n}\mathbf{1}_{\{|X|>n\}}]$ and hence \begin{align*} \sum_{n=1}^{\infty} \left| \mathsf{E}[Y_n] \right| &\leq \sum_{n=1}^{\infty} \mathsf{E}\biggl[\frac{|X|}{n}\mathbf{1}_{\{|X|>n\}} \biggr] = \mathsf{E}\biggl[|X|\biggl( \sum_{n=1}^{\infty} \frac{1}{n}\mathbf{1}_{\{|X|>n\}} \biggr) \biggr] \\ &\leq \mathsf{E}[|X| (1 + \log(1+|X|))] < \infty. \end{align*} Here, we utilized the fact that $ \sum_{n=1}^{\infty} \frac{1}{n}\mathbf{1}_{\{ x > n\}} \leq 1 + \log(1+x) $ holds for all $x > 0$; a short verification is given after the list.

  3. Finally, \begin{align*} \sum_{n=1}^{\infty} \mathsf{Var}(Y_n) &\leq \sum_{n=1}^{\infty} \mathsf{E}\biggl[\frac{|X|^2}{n^2}\mathbf{1}_{\{|X|\leq n\}} \biggr] = \mathsf{E}\biggl[|X|^2\biggl( \sum_{n=1}^{\infty} \frac{1}{n^2}\mathbf{1}_{\{|X|\leq n\}} \biggr) \biggr] \\ &\leq \mathsf{E}[1+|X|] < \infty. \end{align*} Here, we utilized the fact that $ \sum_{n=1}^{\infty} \frac{1}{n^2}\mathbf{1}_{\{ x \leq n\}} \leq \frac{1}{x^2} + \frac{1}{x} $ holds for all $x > 0$; this is also verified after the list.
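For completeness, here are quick verifications of the two elementary bounds quoted in steps 2 and 3, via standard integral comparisons. For $x > 1$, $$ \sum_{n=1}^{\infty} \frac{1}{n}\mathbf{1}_{\{x > n\}} = \sum_{1 \leq n < x} \frac{1}{n} \leq 1 + \int_1^x \frac{dt}{t} = 1 + \log x \leq 1 + \log(1+x), $$ while for $0 < x \leq 1$ the sum is empty. Similarly, for $x > 0$, writing $m = \lceil x \rceil \geq x$, $$ \sum_{n=1}^{\infty} \frac{1}{n^2}\mathbf{1}_{\{x \leq n\}} = \sum_{n \geq m} \frac{1}{n^2} \leq \frac{1}{m^2} + \int_m^{\infty} \frac{dt}{t^2} = \frac{1}{m^2} + \frac{1}{m} \leq \frac{1}{x^2} + \frac{1}{x}. $$ (Step 1 similarly uses $\sum_{n \geq 1} \mathsf{P}(|X| > n) = \mathsf{E}\bigl[\sum_{n \geq 1}\mathbf{1}_{\{|X| > n\}}\bigr] \leq \mathsf{E}[|X|]$, since at most $|X|$ positive integers $n$ satisfy $n < |X|$.)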

Therefore, by the three-series theorem, $\sum_{n=1}^{\infty} X_n/n$ converges almost surely.

Sangchul Lee