
Let $$ X=\begin{pmatrix} 0 & & & &\\ 1 & 0 & & &\\ & 1 & 0 & &\\ & & \ddots & \ddots &\\ & & & 1 & 0 \end{pmatrix},\quad D=\begin{pmatrix} 0 & 1 & & &\\ & 0 & 2 & &\\ & & \ddots & \ddots &\\ & & & 0 & n\\ & & & & 0 \end{pmatrix}\in\mathbb{R}^{(n+1)\times(n+1)} $$ and $$ \mathbf{e}_0=(1,0,\cdots,0)^\mathrm{T},\quad \mathbf{e}_n=(0,\cdots,0,1)^\mathrm{T}\in\mathbb{R}^{n+1}. $$ Prove that $$ (X-D)^n\mathbf{e}_0=\exp\left(-\frac{D^2}{2}\right)\mathbf{e}_n. $$

Background: this comes directly from two representations of the (probabilists') Hermite polynomials: $H_n(x)=(x-D)^n\cdot 1$ and $H_n(x)=e^{-D^2/2}\cdot x^n$, where $D$ is the differentiation operator. The equivalence of the two representations is not difficult, but it seems that techniques like generating functions are necessary, at least for me. I wonder whether the identity can be proved by linear algebra alone, without explicitly going into Hermite polynomials.
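For concreteness, here is a minimal numerical sanity check of the identity for small $n$ (a sketch assuming NumPy is available; `check_identity` is just an ad-hoc helper name):

```python
import numpy as np

def check_identity(n):
    # Matrices from the question: X has ones on the subdiagonal,
    # D has 1, 2, ..., n on the superdiagonal, both (n+1) x (n+1).
    X = np.diag(np.ones(n), k=-1)
    D = np.diag(np.arange(1, n + 1, dtype=float), k=1)
    e0 = np.eye(n + 1)[0]
    en = np.eye(n + 1)[n]

    # Left-hand side: (X - D)^n e_0.
    lhs = np.linalg.matrix_power(X - D, n) @ e0

    # Right-hand side: exp(-D^2/2) e_n.  D is nilpotent, so the
    # exponential series terminates; summing n terms is enough.
    A = -0.5 * (D @ D)
    expA, term = np.eye(n + 1), np.eye(n + 1)
    for m in range(1, n + 1):
        term = term @ A / m
        expA = expA + term
    rhs = expA @ en

    return np.allclose(lhs, rhs)

print(all(check_identity(n) for n in range(1, 10)))  # prints True
```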

Ziyuan
  • $X$ and $D$ commute, no? – Hamza Jun 27 '16 at 23:16
  • Not quite. We should roughly have something like $[X,D] = -1$. I suspect one can use something similar to what is used here, i.e. the Baker–Campbell–Hausdorff formula. – Winther Jun 27 '16 at 23:17
  • @Hamza $DX-XD=[I_n, 0; 0, -n]$. Of course, in the infinite-dimensional case the $-n$ does not appear, which gives $DX-XD=I$, consistent with Leibniz's product rule. – Ziyuan Jun 27 '16 at 23:20
  • @Winther Note that in the finite-dimensional case $\mathrm{Tr}([A,B])\equiv 0$. – Ziyuan Jun 27 '16 at 23:23

1 Answer


We first see that $X{\bf e}_k = {\bf e}_{k+1}$ and $D{\bf e}_k = k {\bf e}_{k-1}$, so $X$ acts as multiplication by $x$ and $D \sim \frac{d}{dx}$. The commutator is $[X,D] = \text{diag}(-1,-1,-1,\ldots,-1,n)$, so $[X,D] = -1$ (and higher-order commutators vanish) when acting on ${\bf e}_k$ for $k<n$. By Baker–Campbell–Hausdorff we therefore have

$$e^{t(X-D)}{\bf e}_0 = e^{tX}e^{-tD}e^{\frac{t^2}{2}[X,D]}{\bf e}_0$$
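Two facts are being used here. First, on the basis vectors,
$$XD\,{\bf e}_k = k\,{\bf e}_k,\qquad DX\,{\bf e}_k = (k+1)\,{\bf e}_k \ (k<n),\qquad DX\,{\bf e}_n = 0,$$
so $[X,D]=XD-DX$ acts as the scalar $-1$ on $\mathrm{span}({\bf e}_0,\ldots,{\bf e}_{n-1})$. Second, the special case of Baker–Campbell–Hausdorff being invoked is $e^{A+B}=e^{A}e^{B}e^{-\frac12[A,B]}$ whenever $[A,B]$ commutes with both $A$ and $B$; taking $A=tX$ and $B=-tD$ gives $e^{-\frac12[A,B]}=e^{\frac{t^2}{2}[X,D]}$.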

Expanding both sides in a power series and comparing the coefficient of the $t^n$ term gives us

$$\frac{(X-D)^n}{n!}{\bf e}_0 = \sum_{k+\ell+2m = n} \frac{X^k}{k!}\frac{(-D)^\ell}{\ell!}\frac{(-1)^m}{2^mm!}{\bf e}_0$$
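where the factors come from expanding $e^{tX}=\sum_k \frac{t^k X^k}{k!}$, $e^{-tD}=\sum_\ell \frac{(-t)^\ell D^\ell}{\ell!}$, and, using $[X,D]{\bf e}_0=-{\bf e}_0$, $e^{\frac{t^2}{2}[X,D]}{\bf e}_0 = e^{-\frac{t^2}{2}}{\bf e}_0 = \sum_m \frac{(-1)^m t^{2m}}{2^m m!}{\bf e}_0$.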

Since $D^\ell {\bf e}_0 = 0$ unless $\ell = 0$, the sum above runs over all $m$ satisfying $0\leq 2m \leq n$ with $k = n - 2m$, which gives us

$$(X-D)^n{\bf e}_0 = \sum_{0\leq 2m \leq n} \frac{n!}{(n-2m)!}\frac{(-1)^m}{2^mm!}{\bf e}_{n-2m}$$

where we have used $X^{n-2m}{\bf e}_0 = {\bf e}_{n-2m}$ to simplify. Since the right-hand side of the identity we want to prove is a function of $D$ acting on ${\bf e}_n$, we try to rewrite the expression above as a power series in $D$ acting on ${\bf e}_n$. This can be done by using $D^{2m}{\bf e}_n = \frac{n!}{(n-2m)!}{\bf e}_{n-2m}$ to get

$$(X-D)^n{\bf e}_0 = \sum_{0\leq 2m \leq n} \frac{1}{m!}\left(-\frac{D^2}{2}\right)^m {\bf e}_n$$
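(Explicitly: iterating $D{\bf e}_k = k\,{\bf e}_{k-1}$ gives $D^{2m}{\bf e}_n = n(n-1)\cdots(n-2m+1)\,{\bf e}_{n-2m} = \frac{n!}{(n-2m)!}{\bf e}_{n-2m}$, so each term $\frac{n!}{(n-2m)!}\frac{(-1)^m}{2^m m!}{\bf e}_{n-2m}$ above is exactly $\frac{1}{m!}\left(-\frac{D^2}{2}\right)^m{\bf e}_n$.)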

Finally, since $D^{2m}{\bf e}_n = 0$ for $2m>n$, we can extend the sum to all $m \geq 0$ and use the power-series definition of $e^{-\frac{D^2}{2}}$, namely $e^{-\frac{D^2}{2}} \equiv \sum_{m=0}^\infty \frac{1}{m!}\left(-\frac{D^2}{2}\right)^m$, to get the desired result

$$(X-D)^n{\bf e}_0 = e^{-\frac{D^2}{2}}{\bf e}_n$$
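As a cross-check against the Hermite-polynomial background in the question, the coordinates of $(X-D)^n{\bf e}_0$ in the basis ${\bf e}_k \leftrightarrow x^k$ should be the monomial coefficients of the probabilists' Hermite polynomial $He_n$; here is a short sketch using `numpy.polynomial.hermite_e` (again assuming NumPy):

```python
import numpy as np
from numpy.polynomial import hermite_e  # probabilists' Hermite polynomials He_n

n = 6
X = np.diag(np.ones(n), k=-1)                       # X e_k = e_{k+1}
D = np.diag(np.arange(1, n + 1, dtype=float), k=1)  # D e_k = k e_{k-1}
e0 = np.eye(n + 1)[0]

# Coordinates of (X - D)^n e_0 in the monomial basis e_k <-> x^k ...
lhs = np.linalg.matrix_power(X - D, n) @ e0

# ... compared with the coefficients of He_6(x) = x^6 - 15 x^4 + 45 x^2 - 15.
he_n = hermite_e.herme2poly([0.0] * n + [1.0])
print(np.allclose(lhs, he_n))  # prints True
```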

Winther
  • Thanks. I actually got a very similar equation (i.e., expanding $H_n(x)$ in terms of the $H_{n-2k}(x)$'s) when working in the Hermite-polynomial context. It looks like we need generating-function-style machinery anyway, just in different forms. Good to know about BCH. I want to wait for some time before accepting the answer, though. – Ziyuan Jun 28 '16 at 01:21