
What is the idea behind the so-called bicommutant in connection with multiplicity theory and von Neumann algebras (Conway's exposition in "A course in functional...")?

To me it just looks like a rather obscure construction that happens to yield something nice. But I find it hard to believe that there wasn't any intuition involved when this concept was invented.

user123124
  • 1,835
  • Bicommutants appear all over the place and they are an extremely natural thing to consider. A natural situation, for example, is that of double centralizers in the context of central simple algebras – Mariano Suárez-Álvarez May 14 '17 at 03:47
  • @MarianoSuárez-Álvarez well yes, it's an abstract algebraic thing, but I don't see how this is related to spectral theory of operators. – user123124 May 14 '17 at 06:35
  • Well, someone who was working on operators was familiar with the algebraic theory and voilà. – Mariano Suárez-Álvarez May 14 '17 at 06:36
  • @MarianoSuárez-Álvarez it is the voilà I am interested in! – user123124 May 14 '17 at 06:37
  • @user84647 Just to be clear, are you asking about intuition regarding the Bicommutant Theorem, or is your question more specific? – Jonas Dahlbæk May 14 '17 at 20:40
  • The voilà is in all likelihood: someone knew that doing X worked in context A and wanted to obtain a similar conclusion in context B, so he tried X' and it worked. That's how 99.76% of things work. It's quite unclear what else you are expecting, really. – Mariano Suárez-Álvarez May 14 '17 at 20:44
  • @MarianoSuárez-Álvarez right, but I wonder what the "X" would be in this case. Often there is a problem one wants to solve, one thinks about it in a certain way, and then voilà! My question is what the problem was and how they thought about it. – user123124 May 15 '17 at 04:56
  • @JonasDahlbæk it is about the definition. The commutant I can understand, since it should contain projections, say for having reducing subspaces. But I don't know what to think when I consider the bicommutant; I just draw a blank! – user123124 May 15 '17 at 05:08
  • The bicommutant of an algebra is, more or less, the largest algebra that is reduced by the same subspaces as the original one. That is essentially its definition and it is an immensely natural thing to consider. The von Neumann theorem thus says that something that is reduced by the same things that reduce the original algebra is in fact in the (weak or strong) closure of the original algebra and can therefore be approximated by things in it. As I wrote, it is quite unclear what you are after... – Mariano Suárez-Álvarez May 15 '17 at 08:00
  • @MarianoSuárez-Álvarez I think the points you bring up would make for a very useful answer. – Jonas Dahlbæk May 15 '17 at 22:00
  • But that is precisely the definition of the bicommutant! – Mariano Suárez-Álvarez May 15 '17 at 23:48
  • @MarianoSuárez-Álvarez I disagree, but maybe it is just due to my lack of understanding of the more algebraic notions! In my mind, the definition of the bicommutant has nothing to do with reducing subspaces. However, the intuition behind the bicommutant theorem, to me, becomes clear when you think in terms of reducing subspaces. If all reducing subspaces of a normal operator $T$ are also reducing subspaces of $S$, then it follows that $S=g(T)$ for some function $g$, as one might have intuited. That is, $S$ lies in the von Neumann algebra generated by $T$. – Jonas Dahlbæk May 16 '17 at 00:33
  • @JonasDahlbæk I agree, the definition is very abstract. I will, however, meditate on what Mariano wrote and see if I can follow his idea. – user123124 May 18 '17 at 09:23
  • Dear @Jeaj, could you please explain the aforementioned intuition for the commutant of an operator? In finite dimensional linear algebra, the commutant of a cyclic operator consists of the polynomials in it, and a direct sum decomposition of a vector space consists of invariant subspaces iff the projections are in the commutant. I don't know how to merge these results into a single intuition. – Arrow Apr 19 '19 at 21:25
  • @Arrow I was working on this 2 years ago and have not touched it since. I am by no means an expert on this and it would take me a lot of time to get into it again. Try asking some of the users who commented on and answered the question. – user123124 Apr 20 '19 at 08:34

1 Answer


Let $\mathcal{H}$ be a Hilbert space and let $M$ be an operator on $\mathcal{H}$. Write $\{M\}'$ for the commutant of $M$ and $\{M\}''$ for the bicommutant of $M$. As observed by MarianoSuárez-Álvarez in the comments, an operator $B$ is in the bicommutant of $M$ if and only if every reducing subspace of $M$ is also a reducing subspace of $B$. Intuitively, $B$ has a 'simpler' structure than $M$, since it has more reducing subspaces. In fact, if $M$ is normal, then the von Neumann algebra generated by $M$ (and therefore also the bicommutant of $M$) is given by $\{f(M) \, \vert \, f\in L^\infty(\sigma(M))\}$, i.e. the bicommutant consists of all functions of $M$. In probability theory, the analogous result is that a random variable $Y$ is measurable with respect to the $\sigma$-algebra generated by another random variable $X$ if and only if there is a measurable function $f$ such that $Y=f(X)$.

I think the situation is clearest when one simply considers a normal matrix $M$ in $\mathcal{H}=\mathbb{C}^n$. In that case, we have the

Spectral Theorem: Let $M$ be a normal operator in $\mathcal{H}=\mathbb{C}^n$. Then $\mathcal H$ has an orthogonal decomposition $\mathcal{H}=\mathcal{H}_1\oplus\ldots\oplus\mathcal{H}_N$ such that $M=\mathrm{diag}(\lambda_1 I_1,\ldots,\lambda_N I_N)$, where, for $1\leqslant j \leqslant N$, $I_j$ denotes the identity in $\mathcal{H}_j$ and $\lambda_1,\ldots,\lambda_N$ denote the distinct eigenvalues of $M$.
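This decomposition is easy to inspect numerically. A minimal NumPy sketch, where the matrix and its eigenvalues are my own illustrative choices, not taken from the answer:

```python
import numpy as np

# A Hermitian (hence normal) 3x3 matrix with a repeated eigenvalue:
# the 2x2 block [[2,1],[1,2]] has eigenvalues 1 and 3, so M has
# eigenvalue 3 with a two-dimensional eigenspace.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh returns eigenvalues in ascending order with an orthonormal
# eigenbasis, i.e. the orthogonal decomposition of the theorem.
w, V = np.linalg.eigh(M)
print(np.round(w, 6))                         # eigenvalues 1, 3, 3

# In the eigenbasis, M is the diagonal matrix diag(w): M = V diag(w) V*.
print(np.allclose(V @ np.diag(w) @ V.T, M))   # True
```

`np.linalg.eigh` covers the Hermitian case used here; a general normal matrix would need, e.g., a Schur decomposition instead.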

It is therefore straightforward to compute both the commutant and the bicommutant.

Proposition: We have $C\in \{M\}'$ if and only if $C=\mathrm{diag}(C_1,\ldots,C_N)$, where, for $1\leqslant j \leqslant N$, $C_j$ is a linear operator in $\mathcal{H}_j$.

Proof: If $CM=MC$, then we have, for all $j,k\in\{1,\ldots,N\}$, $$ \lambda_k C_{jk} = \lambda_j C_{jk}. $$ It follows that $C_{jk}=0$ unless $j=k$.
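A quick numerical sanity check of this proposition, with toy matrices of my own choosing (eigenvalue $1$ of multiplicity $2$, eigenvalue $2$ of multiplicity $1$):

```python
import numpy as np

# M already written in its spectral form diag(1, 1, 2).
M = np.diag([1.0, 1.0, 2.0])

# Block-diagonal C = diag(C_1, C_2): an arbitrary 2x2 block on the
# lambda = 1 eigenspace and a 1x1 block on the lambda = 2 eigenspace.
C = np.array([[1.0, 2.0, 0.0],
              [3.0, 4.0, 0.0],
              [0.0, 0.0, 5.0]])
print(np.allclose(C @ M, M @ C))  # True: block-diagonal implies C in {M}'

# A nonzero entry coupling the two eigenspaces breaks commutation,
# matching lambda_k C_{jk} = lambda_j C_{jk} with lambda_j != lambda_k.
D = C.copy()
D[0, 2] = 1.0
print(np.allclose(D @ M, M @ D))  # False
```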

Proposition: We have $B\in \{M\}''$ if and only if $B=\mathrm{diag}(\mu_1 I_1,\ldots, \mu_N I_N)$, where, for $1\leqslant j \leqslant N$, $\mu_j\in\mathbb{C}$.

Proof: Since $M\in \{M\}'$, we have $B\in\{M\}'$ and therefore $B=\mathrm{diag}(B_1,\ldots,B_N)$. Furthermore, if $BC=CB$ for $C\in\{M\}'$, then we have, for all $1\leqslant j \leqslant N$, $$ B_{j}C_{j} = C_j B_{j}. $$ But if $B_j$ commutes with every linear operator $C_j$ in $\mathcal{H}_j$, then $B_j=\mu_j I_j$ for some $\mu_j\in\mathbb{C}$; see the question A linear operator commuting with all such operators is a scalar multiple of the identity.

Corollary: $B\in\{M\}''$ if and only if there is a polynomial $p$ such that $B=p(M)$.

Proof: Let $p$ be the Lagrange polynomial such that $p(\lambda_j)=\mu_j$ for $1\leqslant j\leqslant N$; then $p(M)=\mathrm{diag}(p(\lambda_1)I_1,\ldots,p(\lambda_N)I_N)=B$.
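The corollary can also be checked numerically, again with illustrative numbers of my own (two distinct eigenvalues, so a degree-$1$ interpolating polynomial suffices):

```python
import numpy as np

# M normal with eigenvalue 1 (multiplicity 2) and eigenvalue 2.
M = np.diag([1.0, 1.0, 2.0])

# An element B of the bicommutant: a scalar mu_j on each eigenspace.
B = np.diag([7.0, 7.0, -3.0])

# Lagrange interpolation through the points (lambda_j, mu_j):
# p(1) = 7 and p(2) = -3.
lams = np.array([1.0, 2.0])
mus = np.array([7.0, -3.0])
c = np.polyfit(lams, mus, deg=1)  # p(x) = c[0]*x + c[1]

# Evaluate p at the matrix M: p(M) = c[0]*M + c[1]*I.
pM = c[0] * M + c[1] * np.eye(3)
print(np.allclose(pM, B))  # True: B = p(M)
```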

  • Why does the bicommutant consist of operators that have reducing subspaces of $M$ as their own? I don't see why this is equivalent to the definition. – Arrow Apr 20 '19 at 06:24
  • @Arrow It's been a while, so feel free to correct me if I'm wrong. I believe it goes like this: Recall that $P$ is a projection onto a reducing subspace $V$ of $M$ if and only if $PM=MP$. Suppose then that $A\in\{M\}''$ and $P$ is a projection onto a reducing subspace $V$ of $M$. Since $PM=MP$, we have $P\in\{M\}'$. It follows that $AP=PA$, i.e. $V$ is reducing for $A$. Conversely, suppose $AP=PA$ whenever $MP=PM$, and let $S\in\{M\}'$. $S$ is the limit in the strong operator topology of finite linear combinations of projections in $\{M\}'$. It follows after taking limits that $AS=SA$. – Jonas Dahlbæk May 05 '19 at 20:35
  • Dear Jonas, I don't understand why the assertion in your penultimate sentence holds. Why is each operator a limit of a finite combination of projections? I don't think this holds in the finite dimensional complex case (Jordan block). Or do you assume the operator is diagonalizable? Also, where can I find a reference for your assertion in the penultimate sentence of this comment? – Arrow May 11 '19 at 17:35
  • @Arrow You first decompose the operator into its self-adjoint and anti self-adjoint parts, $S = \tfrac{1}{2}(S + S^*) + \tfrac{1}{2}(S - S^*)$, then you apply spectral calculus to the two parts separately. – Jonas Dahlbæk May 21 '19 at 07:24
  • @Arrow, the von Neumann algebra generated by $T$ is isomorphic to $L^\infty(\sigma(T))$, and the isomorphism is given by $f\mapsto f(T)$. Thus, if $S$ lies in the von Neumann algebra generated by $T$, there is $g$ such that $S=g(T)$. – Jonas Dahlbæk May 21 '19 at 07:30