
There are $n$ switches in a dark room controlling a single light; all the switches must be on for the light to be on. Initially all the switches are off. A drunkard left in the room wakes up from a hangover and repeatedly picks a switch uniformly at random and flips it. What is the expected number of flips the drunkard takes to turn on the light?

I answered the $n=4$ case here, but the weird result of $\frac{64}3$ enticed me to generalise the problem, for which I have rewritten the setting. If $E_k$ is the expected number of flips needed when $k$ switches are on, the $E_k$ satisfy the following tridiagonal system: $$\begin{bmatrix}1&-1&0&\dots&0&0\\-\frac1n&1&\frac1n-1&\dots&0&0\\0&-\frac2n&1&\dots&0&0\\\vdots&\vdots&\vdots&\ddots&\vdots&\vdots\\0&0&0&\dots&1&-\frac2n\\0&0&0&\dots&\frac1n-1&1\end{bmatrix}\begin{bmatrix}E_0\\E_1\\E_2\\\vdots\\E_{n-2}\\E_{n-1}\end{bmatrix}=\begin{bmatrix}1\\1\\1\\\vdots\\1\\1\end{bmatrix}$$ The $(i,j)$ entry of the square matrix is $1$ if $i=j$, $-\frac jn$ if $i=j+1$, $\frac{i-1}n-1$ if $i=j-1$ and $0$ otherwise.

The answer to the problem proper is then $E_0$, the first few values of which are (starting from $n=1$) $$1,4,10,\frac{64}3,\frac{128}3,\frac{416}5,\frac{2416}{15},\frac{32768}{105},\frac{21248}{35},\frac{74752}{63}$$ After nosing around and actually solving the tridiagonal system using a dedicated algorithm, I found a formula for $E_0$ at any $n$: $$E_0(n)=2^{n-1}F(n-1)\text{ where }F(n)=\sum_{k=0}^n\frac1{\binom nk}\tag1$$

Now $F$ has been discussed at length on this site, including here, here and here, and is A048625/A048626 in the OEIS. Hence the generating function for $E_0$ may be obtained: $$\sum_{n=0}^\infty E_0(n)z^n=z\left(\frac{\log(1-2z)}{2(z-1)}\right)'$$ The appearance of $F(n)$ was a pleasant surprise for me, but I have no proof of $(1)$: everything above was numerical experimentation.
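As a sanity check (my own script, not part of the original post; the function names are arbitrary), the following solves the tridiagonal system exactly over the rationals and compares the result with the conjectured formula $(1)$:

```python
from fractions import Fraction
from math import comb

def expected_flips(n):
    """Solve the n x n system exactly. Row i (0-indexed) encodes
    E_i - (i/n) E_{i-1} - ((n-i)/n) E_{i+1} = 1, with E_n = 0."""
    A = [[Fraction(0)] * n for _ in range(n)]
    b = [Fraction(1)] * n
    for i in range(n):
        A[i][i] = Fraction(1)
        if i > 0:
            A[i][i - 1] = Fraction(-i, n)       # sub-diagonal: -i/n
        if i < n - 1:
            A[i][i + 1] = Fraction(i - n, n)    # super-diagonal: i/n - 1
    # Plain Gaussian elimination; the pivots stay nonzero for this matrix.
    for col in range(n):
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [Fraction(0)] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / A[r][r]
    return x[0]

def formula(n):
    """Conjectured closed form 2^(n-1) * F(n-1), F(m) = sum_k 1/C(m,k)."""
    F = sum(Fraction(1, comb(n - 1, k)) for k in range(n))
    return 2 ** (n - 1) * F

for n in range(1, 11):
    assert expected_flips(n) == formula(n)
print([formula(n) for n in range(1, 5)])  # first values: 1, 4, 10, 64/3
```

The agreement for $n\le10$ reproduces the sequence of values listed above.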

How can $(1)$ be proved?

I found $(1)$ by tracing the variables in the tridiagonal matrix algorithm applied to the above square matrix turned upside-down: at the end of the forward sweep (using the notation on Wikipedia) $d_n=2^{n-1}$ and $b_n=\frac1{F(n-1)}$, but trying to expand everything symbolically only produces a huge tree of fractions.
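For anyone who wants to reproduce that trace, here is a minimal sketch (mine; `thomas_flipped` is a name I made up) of the in-place Thomas sweep applied to the matrix with rows and columns reversed, so the unknowns are $E_{n-1},\dots,E_0$. For small $n$ the final values indeed come out as $d_n=2^{n-1}$ and $b_n=1/F(n-1)$:

```python
from fractions import Fraction
from math import comb

def thomas_flipped(n):
    """In-place Thomas sweep (Wikipedia's a, b, c, d notation) on the
    question's system with rows and columns reversed."""
    # Coefficients of the flipped tridiagonal system, as exact fractions.
    a = [None] + [Fraction(-(i + 1), n) for i in range(1, n)]  # sub-diagonal
    b = [Fraction(1) for _ in range(n)]                        # diagonal
    c = [Fraction(-(n - 1 - i), n) for i in range(n - 1)]      # super-diagonal
    d = [Fraction(1) for _ in range(n)]                        # right-hand side
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # Back substitution would recover every E_k; the last unknown is E_0.
    return b[-1], d[-1], d[-1] / b[-1]

for n in range(1, 9):
    bn, dn, E0 = thomas_flipped(n)
    F = sum(Fraction(1, comb(n - 1, k)) for k in range(n))
    assert dn == 2 ** (n - 1) and bn == 1 / F and E0 == 2 ** (n - 1) * F
```

This confirms the observed pattern numerically but, as noted, does not explain it.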

Parcly Taxel
  • 103,344

1 Answer


This interesting problem is equivalent to this one: compute the expected length of a random walk from one vertex of an $n$-dimensional hypercube to the diagonally opposite vertex.

And this is indeed mentioned in https://oeis.org/A003149:

2^n*A003149(n)/n! is the expected length of a random walk from one vertex of an (n+1)-dimensional hypercube to the diagonally opposite vertex (a walk which may include one or more passes through the starting point)

Here a solution (quite complicated) is given.

I prefer to copy from here and its answer:


Three lemmas:

Lemma 1: The average return time for any vertex is $2^n$

This is easy to see because all vertices are equivalent: the stationary distribution is uniform, so the mean return time is the reciprocal of $2^{-n}$.

Lemma 2: For any starting vertex, the average time to return to it or to reach its opposite is $2^{n-1}$

This is similar to the above: grouping each vertex with its opposite gives an equivalent walk on $2^{n-1}$ paired states, and Lemma 1 applied to that walk gives a mean return time of $2^{n-1}$.

Lemma 3: For any vertex, the probability that the walk reaches the opposite vertex before returning to the original is $1/S$, where $S=\sum_{k=0}^{n-1} \frac{1}{\binom{n-1}{k}}$

Consider each vertex as an $n$-bit binary tuple, and let its weight be the number of ones.

Let $p(m)$ be the probability that a random walk starting at a vertex with weight $m$ will arrive at $(1,1,1...1)$ before it reaches $(0,0 \cdots,0)$. Clearly $p(0)=0$ and $p(n)=1$. Furthermore $$p(k)=\frac{k}{n} p(k-1)+ \frac{n-k}{n}p(k+1) \tag 1$$ for $0<k<n$.

Let $q(k)=p(k+1)-p(k)$. Then the above equation can be rewritten as $$\frac{k}{n} q(k-1)=\frac{n-k}{n}q(k)\tag 2 $$ or $$q(k)=q(k-1)\frac{k}{n-k}.$$

So $q(1)=q(0)/(n-1)$, $q(2)=q(0) 2/((n-1)(n-2))$ etc. Therefore

$$q(k)=\frac{1}{\binom{n-1}{k}} q(0) \tag 3$$

Let $$S=S(n-1)=\sum_{k=0}^{n-1} \frac{1}{\binom{n-1}{k}} \tag 4$$

We have $1=p(n)-p(0)=q(0)+q(1)+\cdots +q(n-1)=q(0) S$. Therefore $q(0)=1/S$.

Now if a random walk starts at $(0,0,...,0)$, after one step it will be at a vertex of weight $1$. So the probability that it reaches $(1,1,...,1)$ before returning to $(0,0,...,0)$ is $p(1)=p(1)-p(0)=q(0)=1/S$.
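Lemma 3 can be checked directly for small $n$ by solving the boundary-value recurrence $p(k)=\frac kn p(k-1)+\frac{n-k}n p(k+1)$, $p(0)=0$, $p(n)=1$, with exact arithmetic (a sketch of mine, not part of the quoted answer; `hitting_probability` is an invented name):

```python
from fractions import Fraction
from math import comb

def hitting_probability(n):
    """Solve for the interior values p(1), ..., p(n-1) of the recurrence
    p(k) = (k/n) p(k-1) + ((n-k)/n) p(k+1), p(0)=0, p(n)=1, and return p(1)."""
    m = n - 1                                  # unknowns p(1), ..., p(n-1)
    A = [[Fraction(0)] * m for _ in range(m)]
    rhs = [Fraction(0)] * m
    for i in range(m):
        k = i + 1
        A[i][i] = Fraction(1)
        if i > 0:
            A[i][i - 1] = Fraction(-k, n)
        if i < m - 1:
            A[i][i + 1] = Fraction(k - n, n)
        else:
            rhs[i] = Fraction(n - k, n)        # boundary term from p(n) = 1
    # Forward elimination, then back substitution, all over the rationals.
    for col in range(m):
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for cc in range(col, m):
                A[r][cc] -= f * A[col][cc]
            rhs[r] -= f * rhs[col]
    p = [Fraction(0)] * m
    for r in range(m - 1, -1, -1):
        s = rhs[r] - sum(A[r][cc] * p[cc] for cc in range(r + 1, m))
        p[r] = s / A[r][r]
    return p[0]

for n in range(2, 9):
    S = sum(Fraction(1, comb(n - 1, k)) for k in range(n))
    assert hitting_probability(n) == 1 / S
```

The escape probability $p(1)$ agrees with $1/S$ for every $n$ tested.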


Now, let $T$ be the expected number of steps to reach $(1,1,... 1)$ from $(0,0 ...0)$.

By lemma 2 it takes an average of $2^{n-1}$ steps to either reach the opposite corner or return to the starting point. By lemma 3, the walk will return to the starting point a fraction $\frac{S-1}{S}$ of the time. So

$$T=2^{n-1}+\frac{S-1}{S}T \implies T=2^{n-1} S $$

where $S$ is defined in $(4)$. Since $S=S(n-1)=F(n-1)$, this is exactly $(1)$.
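The closed form $T=2^{n-1}S$ can also be compared against a direct simulation of the original drunkard process (a quick sketch of mine; the function name, seed and trial count are arbitrary choices):

```python
import random
from fractions import Fraction
from math import comb

def simulate(n, trials, rng):
    """Monte Carlo estimate of T: flip a uniformly random switch until all n
    are on, and average the number of flips over many independent runs."""
    total = 0
    for _ in range(trials):
        state = [False] * n
        on = 0
        steps = 0
        while on < n:
            i = rng.randrange(n)
            state[i] = not state[i]
            on += 1 if state[i] else -1
            steps += 1
        total += steps
    return total / trials

rng = random.Random(0)
n = 4
exact = 2 ** (n - 1) * sum(Fraction(1, comb(n - 1, k)) for k in range(n))
est = simulate(n, 100_000, rng)
print(est, float(exact))  # the estimate lands near 64/3 ≈ 21.33
```

With $10^5$ trials the standard error is well under $0.1$, so the empirical mean pins down $64/3$ for $n=4$ quite convincingly.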

leonbloy
  • 63,430