I am checking the proof of the construction of a Brownian motion on $[0,\pi]$. We show that
\begin{gather*}
t \mapsto B^m_t = \frac{t}{\sqrt{\pi}}X_0 + \sqrt{\frac{2}{\pi}}\sum_{n=1}^{2^m-1}X_n \frac{\sin(nt)}{n},
\end{gather*}
where $(X_n,n\in\mathbb{N})$ is a sequence of i.i.d. $N(0,1)$ random variables, converges uniformly on $[0,\pi]$ with probability $1$ to a continuous function, denoted $t \mapsto B_t$, which is a standard Brownian motion.
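As a quick numerical illustration (not part of the proof), one can sample the truncated sum $B^m_t$ on a time grid. This is a minimal sketch assuming NumPy; the function name `brownian_partial_sum` and the grid size are my own choices, not from the original text.

```python
import numpy as np

def brownian_partial_sum(m, t, rng):
    """Truncated series B^m_t = t/sqrt(pi) X_0 + sqrt(2/pi) sum_{n=1}^{2^m - 1} X_n sin(n t)/n."""
    n = np.arange(1, 2**m)                       # n = 1, ..., 2^m - 1
    X0 = rng.standard_normal()                   # coefficient of the drift term
    Xn = rng.standard_normal(len(n))             # i.i.d. N(0,1) coefficients
    t = np.atleast_1d(t).astype(float)
    # broadcast sin(n t)/n over the time grid and sum over n
    series = np.sqrt(2 / np.pi) * (Xn / n) @ np.sin(np.outer(n, t))
    return t / np.sqrt(np.pi) * X0 + series

rng = np.random.default_rng(0)
t = np.linspace(0.0, np.pi, 201)
path = brownian_partial_sum(10, t, rng)          # one approximate Brownian path on [0, pi]
```

Note that $B^m_0 = 0$ exactly, and at $t=\pi$ every sine term vanishes, so $B^m_\pi = \sqrt{\pi}\,X_0$ has variance $\pi$, consistent with $\operatorname{Var}(B_t) = t$.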
When we have proved this, it remains to show that
\begin{gather*}
B_t = \frac{t}{\sqrt{\pi}}X_0 + \sqrt{\frac{2}{\pi}}\sum_{n=1}^{+\infty}X_n \frac{\sin(nt)}{n}
\end{gather*}
is indeed a Brownian motion. A sentence in this proof says that "For $s,t \in \mathbb{R}_+$, $B_{s+t} - B_s$ is gaussian as a sum of independent gaussian random variables". Why is this true for a series?

MCrassus
1 Answer
Define, for fixed $s,t\in\mathbb R_+$, the random variable $$Y_N:=\frac{t}{\sqrt{\pi}}X_0+\sqrt{\frac 2{\pi}}\sum_{n=1}^NX_n\frac{\sin(n(s+t))-\sin(ns)}n.$$ For fixed $N$, the random variable $Y_N$ has a Gaussian distribution, as it is a linear combination of the coordinates of the Gaussian vector $(X_0,\dots,X_N)$.
By the results mentioned in the OP, the sequence $(Y_N)_{N\geqslant 1}$ converges almost surely, hence in distribution, to $B_{s+t}-B_s$, and a limit in distribution of Gaussian random variables is Gaussian.
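To spell out the last claim with characteristic functions (a standard argument, sketched here for completeness): if $Y_N\sim N(m_N,\sigma_N^2)$ and $Y_N\to Y$ in distribution, then
\begin{gather*}
\varphi_{Y_N}(u) = \mathbb{E}\!\left[e^{iuY_N}\right] = \exp\!\left(ium_N - \tfrac{1}{2}\sigma_N^2 u^2\right) \xrightarrow[N\to\infty]{} \varphi_Y(u) \quad \text{for every } u\in\mathbb{R}.
\end{gather*}
Convergence in distribution forces $\sigma_N^2 \to \sigma^2 < \infty$ (otherwise the sequence would fail to be tight) and then $m_N \to m$, so
\begin{gather*}
\varphi_Y(u) = \exp\!\left(ium - \tfrac{1}{2}\sigma^2 u^2\right),
\end{gather*}
which is the characteristic function of $N(m,\sigma^2)$ (possibly degenerate when $\sigma^2 = 0$). Hence $Y$ is Gaussian.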

Davide Giraudo