
The random walk on $\mathbb{R}^2$ is defined as an infinite sequence $\{x_i\}_{i=0}^{\infty}$ where $x_0 = (0,0)$ and each step $x_{i+1} - x_i$ is one of the vectors $\{(-1,0), (0,-1), (1,0), (0,1)\}$.

How can I bound the probability that the walker is within the box $[-k,k] \times [-k,k]$ after $n$ steps?

0x90
  • 1,641

1 Answer


I'm assuming those moves have equal probability. Let me write the $n$th step as $s_n$, so $x_n = \sum_{j=1}^n s_j$. Then $E[x_n] = 0$ and $E[\|x_n\|^2] = n$. If $x_n = (X_n, Y_n)$, then $X_n$ and $Y_n$ both have mean $0$ and their covariance matrix is $\pmatrix{n/2 & 0\cr 0 & n/2\cr}$. You could try Olkin and Pratt's multivariate version of Chebyshev's inequality (see the Wikipedia article on Chebyshev's inequality), but for your purposes it may be enough to combine a union bound with the one-dimensional Chebyshev bound $P(|X_n| > k) \le \operatorname{Var}(X_n)/k^2 = n/(2k^2)$: $$ P(|X_n| \le k\ \text{and}\ |Y_n| \le k) \ge 1 - P(|X_n| > k) - P(|Y_n| > k) \ge 1 - \frac{n}{2k^2} - \frac{n}{2k^2} = 1 - \frac{n}{k^2}.$$
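As a sanity check, here is a quick Monte Carlo sketch of this bound, assuming the four moves are equiprobable (the helper name `estimate_in_box` is mine, just for illustration):

```python
import random

def estimate_in_box(n, k, trials=100_000):
    """Monte Carlo estimate of P(|X_n| <= k and |Y_n| <= k)."""
    steps = [(-1, 0), (0, -1), (1, 0), (0, 1)]
    hits = 0
    for _ in range(trials):
        x = y = 0
        for _ in range(n):
            dx, dy = random.choice(steps)  # each move equally likely
            x += dx
            y += dy
        hits += (abs(x) <= k and abs(y) <= k)
    return hits / trials

n, k = 100, 15
bound = 1 - n / k**2          # Chebyshev-style lower bound, about 0.556
print(estimate_in_box(n, k))  # empirical probability, comfortably above the bound
```

The empirical probability should always sit at or above $1 - n/k^2$; the gap shows how loose Chebyshev is here, which is why a Chernoff-type bound can do better for large $k$.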

Robert Israel
  • 448,999
  • Thanks, will Chernoff give a better bound? And why is $E[\|x_n\|^2] = n$? Which norm is it? – 0x90 Nov 16 '12 at 07:36
  • Euclidean norm. It's basically the fact that the variance of a sum of independent random variables is the sum of the variances. – Robert Israel Nov 16 '12 at 08:08
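To illustrate the comment above, a small simulation (the helper name `mean_sq_norm` is mine) checks that the mean squared Euclidean norm after $n$ unit steps is indeed close to $n$, since the variances of the independent steps add:

```python
import random

def mean_sq_norm(n, trials=200_000):
    """Monte Carlo estimate of E[||x_n||^2] for the simple random walk."""
    steps = [(-1, 0), (0, -1), (1, 0), (0, 1)]
    total = 0.0
    for _ in range(trials):
        x = y = 0
        for _ in range(n):
            dx, dy = random.choice(steps)
            x += dx
            y += dy
        total += x * x + y * y  # squared Euclidean norm of x_n
    return total / trials      # should be close to n

print(mean_sq_norm(10))
```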