
Question:

Let's suppose $P$ is a probability measure and $X$ is a random variable taking values in $\mathbb{Z}$.

Let's suppose we have $|\sum\limits_{k\in \mathbb{Z}}P(X=k)e^{ikt}|=1$ for all $t\in \mathbb{R}$.

Prove that $X$ is almost surely constant.

My attempt:

If we consider the case where $X$ takes values in $\mathbb{N}$, we have $1=\left|\sum\limits_{k\in \mathbb{N}}P(X=k)e^{ikt}\right|\leqslant \sum\limits_{k\in \mathbb{N}}P(X=k) = 1$, so we have equality in the triangle inequality. But it is an infinite sum, so I'm not sure I can apply the result for the equality case, which I only know for finite sums.
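
One way to see that the equality case does carry over to an absolutely convergent series (a sketch, using only the assumption above): write $S=\sum\limits_{k\in \mathbb{Z}}P(X=k)e^{ikt}=|S|e^{i\theta}$; then $$|S|=e^{-i\theta}S=\sum_{k\in \mathbb{Z}}P(X=k)\cos(kt-\theta)\leqslant \sum_{k\in \mathbb{Z}}P(X=k)=1=|S|,$$ where the second equality holds because $e^{-i\theta}S$ is real, so it equals its real part. Equality throughout forces $\cos(kt-\theta)=1$, i.e. $e^{ikt}=e^{i\theta}$, for every $k$ with $P(X=k)>0$: the same alignment of phases as in the finite case.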

Could someone help me? (There is surely an easier way than what I tried.)

StubbornAtom
math
  • The conclusion should be that $\mathbb P(X=n)=1$ for some $n\in\mathbb Z$. For example, the rv that is equal to $1$ almost surely also satisfies your condition. – jlammy Jan 30 '21 at 15:40
  • I edited. Thank you. – math Jan 30 '21 at 16:17

2 Answers


Write $p_m:=P(X=m)$. The correct conclusion is: There must exist some integer $m$ such that $p_m=1$ and $p_k=0$ for all $k \ne m$. Indeed, otherwise there exist distinct integers $m,k$ such that $p_m p_k>0$. Fix $t$ so that $(m-k)t$ is not a multiple of $2\pi$. In that case $|p_me^{imt}+p_k e^{ikt}|=p_m+p_k-\epsilon$ for some positive $\epsilon$ by the condition for equality in the triangle inequality [1]. Therefore for $N>|m|,|k|$ we have $$|\sum_{n =-N}^N p_ne^{int}| \le p_m+p_k-\epsilon+ \sum_{n \in [-N,N]: \, \, n \ne m,k} p_n=-\epsilon+\sum_{n \in [-N,N]} p_n .$$ Now take a limit as $N \to \infty$ to obtain a contradiction.
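
To make the $\epsilon$ explicit, one can also expand the modulus directly (a short computation, without invoking [1]): $$\lvert p_me^{imt}+p_ke^{ikt}\rvert^2=p_m^2+p_k^2+2p_mp_k\cos\big((m-k)t\big)<(p_m+p_k)^2,$$ since $\cos\big((m-k)t\big)<1$ when $(m-k)t$ is not a multiple of $2\pi$; so $\epsilon=p_m+p_k-\lvert p_me^{imt}+p_ke^{ikt}\rvert>0$. In the displayed bound, the left-hand side tends to $\lvert\sum_{n\in\mathbb Z}p_ne^{int}\rvert=1$ and the right-hand side tends to $1-\epsilon$ as $N\to\infty$, giving $1\leqslant 1-\epsilon$, a contradiction.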

[1] Equality of triangle inequality in complex numbers

Yuval Peres
  • Thank you. Your answer is very instructive. I upvoted, but I couldn't accept both your answer and @jlammy's, so I had to choose ... – math Jan 30 '21 at 16:09

That sum you wrote down is called the characteristic function $\phi(t)=\mathbb E[e^{itX}]$.

Let $Y$ be an independent copy of $X$; then the characteristic function of $X-Y$ is $$\mathbb E[e^{itX}]\cdot\mathbb E[e^{-itY}]=\lvert\phi(t)\rvert^2=1.$$ The unique distribution with characteristic function identically $1$ is that of a rv which is almost surely $0$, so $X-Y=0$, i.e. $X=Y$, almost surely. Then, since $X=Y$ almost surely and $X$, $Y$ are iid, $$\mathbb P(X\leq x)=\mathbb P(X\leq x, Y\leq x)=\mathbb P(X\leq x)\,\mathbb P(Y\leq x)=\mathbb P(X\leq x)^2\implies\mathbb P(X\leq x)\in\{0,1\}.$$ This is enough to imply that there is some $a\in\mathbb Z$ such that $\mathbb P(X=a)=1$.
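
In the integer-valued case, the uniqueness claim can also be checked by hand (a sketch): taking real parts of $\mathbb E[e^{it(X-Y)}]=1$ gives $\mathbb E[1-\cos(t(X-Y))]=0$ for every $t$, and the integrand is nonnegative, so $\cos(t(X-Y))=1$ almost surely, i.e. $t(X-Y)\in 2\pi\mathbb Z$ almost surely. Taking $t=1$, the integer $X-Y$ lies in $2\pi\mathbb Z$; the only integer in $2\pi\mathbb Z$ is $0$, so $X-Y=0$ almost surely.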

jlammy
  • I love your proof but I don't really get why $\mathbb E[e^{itX}]\cdot\mathbb E[e^{-itY}]=\lvert\phi(t)\rvert^2$. – math Jan 30 '21 at 16:03
  • @math $\mathbb E[e^{-itY}]=\overline{\mathbb E[e^{itY}]}=\overline{\phi(t)}$, since $Y$ also has characteristic function $\phi$. – jlammy Jan 30 '21 at 16:04
  • Thank you! (Actually I had to prove it in a problem concerning characteristic functions.) – math Jan 30 '21 at 16:08