10

I am working through the exercises in Hall's "Lie Groups, Lie Algebras, and Representations" and can't complete exercise 11 of chapter 3. My aim is to demonstrate that there does not exist a vector space isomorphism $A$ between $\mathfrak{su}(2)$ and $\mathfrak{sl}(2, \mathbb{R})$ that also preserves the commutator.
$$[AX, AY] = A[X, Y]$$ To this end I computed the following commutation relations on bases for the two spaces.

For the $\mathfrak{su}(2)$ basis matrices $e_1, e_2, e_3$ it holds that $$[e_1, e_2] = 2e_3 \,\,\,\,\,\, [e_1, e_3] = -2e_2 \,\,\,\,\,\, [e_2, e_3] = 2e_1$$

For the $\mathfrak{sl}(2, \mathbb{R})$ basis matrices $f_1, f_2, f_3$ it holds that $$[f_1, f_2] = 2f_2 \,\,\,\,\,\, [f_1, f_3] = -2f_3 \,\,\,\,\,\, [f_2, f_3] = f_1$$
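These relations can be checked by direct computation. The sketch below uses one concrete choice of bases, which is an assumption on my part since conventions vary: $e_j = -i\sigma_j$ with $\sigma_j$ the Pauli matrices, and $f_1 = H$, $f_2 = E$, $f_3 = F$.

```python
# Verify the stated commutation relations for one concrete choice of bases.
# Assumption: e_j = -i * sigma_j (Pauli matrices); Hall's basis may differ
# by signs or scaling, but it satisfies the same relations.

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def brk(X, Y):
    """Matrix commutator [X, Y] = XY - YX for 2x2 matrices."""
    XY, YX = mul(X, Y), mul(Y, X)
    return [[XY[i][j] - YX[i][j] for j in range(2)] for i in range(2)]

def scale(c, X):
    return [[c * X[i][j] for j in range(2)] for i in range(2)]

# su(2) basis: e_j = -i * sigma_j
e1 = [[0, -1j], [-1j, 0]]
e2 = [[0, -1], [1, 0]]
e3 = [[-1j, 0], [0, 1j]]
# sl(2,R) basis: f1 = H, f2 = E, f3 = F
f1 = [[1, 0], [0, -1]]
f2 = [[0, 1], [0, 0]]
f3 = [[0, 0], [1, 0]]

assert brk(e1, e2) == scale(2, e3)
assert brk(e1, e3) == scale(-2, e2)
assert brk(e2, e3) == scale(2, e1)
assert brk(f1, f2) == scale(2, f2)
assert brk(f1, f3) == scale(-2, f3)
assert brk(f2, f3) == f1
print("all commutation relations verified")
```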

It is clear that the linear bijection $(e_1, e_2, e_3) \mapsto (f_1, f_2, f_3)$ would not preserve these relations, nor would any permutation of the target matrices. However, I need to show that no invertible matrix satisfies $$[AX, AY] = A[X, Y]$$ So I began to derive equations for the entries of $A$. They are ugly expressions in terms of the sub-determinants of $A$, and given them I can't see a way to conclude that $A$ cannot exist. Is there an easier way to finish the proof than deriving these equations?

Note: I have looked up solutions for this problem and the only technique I see hinted at is to consider Killing forms (which have not yet been covered in this book).

muaddib
  • 8,267
  • 1
    sl has an element whose ad is diagonalizable. Does su have one too? – Mariano Suárez-Álvarez Aug 08 '15 at 01:26
  • @MarianoSuárez-Alvarez I've fixed the above commutativity relations for $su(2)$ and from there determined: one of the basis elements of $sl(2)$ has a diagonalizable ad. All three of the basis elements of $su(2)$ have diagonalizable ad if we allow imaginary eigenvalues, and none of them are diagonalizable otherwise. I'm not sure how to use this observation though. – muaddib Aug 08 '15 at 15:37
  • An isomorphism $f:su(2)\to su(3)$ has to map a diagonalizable element to a diagonalizable element. – Mariano Suárez-Álvarez Aug 08 '15 at 16:01
  • @MarianoSuárez-Alvarez - Thanks for the proof technique. I ended up using that if there was an isomorphism from $\mathfrak{su}(2)$ to $\mathfrak{sl}(2, \mathbb{R})$ then diagonalizability in the adjoint representation is preserved. You referenced $\mathfrak{su}(3)$ above so I am unsure, is that the proposition you were hinting at? – muaddib Aug 08 '15 at 21:01
  • 2
    Generally speaking, it's hard to explicitly write down all of the conditions needed on a map for it to be an isomorphism and check whether or not they can all be satisfied. What the suggestions all have in common is that they instead proceed by showing that one of the Lie algebras has a property that the other one doesn't. Simple examples of this are having a particular dimension or being simple or semisimple, etc. – Qiaochu Yuan Aug 08 '15 at 22:50

6 Answers

6

Your approach works without problems, if you write the condition $[Ax,Ay]=A[x,y]$ for all $x,y$ in terms of the $9$ coefficients of the matrix $A$. The polynomial equations in these $9$ unknowns over $\mathbb{R}$ quickly yield $\det(A)=0$, a contradiction.

Another elementary argument is the following. $\mathfrak{sl}(2,\mathbb{R})$ has a $2$-dimensional subalgebra, e.g., $\mathfrak{a}=\langle f_1,f_2\rangle$, but $\mathfrak{su}(2)$ has no $2$-dimensional subalgebra. Hence they cannot be isomorphic.
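The closure of $\mathfrak{a} = \langle f_1, f_2 \rangle$ under the bracket is easy to confirm numerically with the standard matrices $f_1 = H$, $f_2 = E$ (a sanity check, not a proof; the helper names are mine):

```python
import random

def brk(X, Y):
    """Commutator [X, Y] = XY - YX for 2x2 matrices."""
    m = lambda A, B: [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    XY, YX = m(X, Y), m(Y, X)
    return [[XY[i][j] - YX[i][j] for j in range(2)] for i in range(2)]

f2 = [[0, 1], [0, 0]]  # E

random.seed(1)
for _ in range(100):
    a1, a2, b1, b2 = [random.uniform(-3, 3) for _ in range(4)]
    X = [[a1, a2], [0, -a1]]   # a1*f1 + a2*f2
    Y = [[b1, b2], [0, -b1]]   # b1*f1 + b2*f2
    C = brk(X, Y)
    c = 2 * (a1 * b2 - a2 * b1)  # expected coefficient of f2
    # [X, Y] lands back in the span of f1, f2 (in fact it is a multiple of f2)
    assert all(abs(C[i][j] - c * f2[i][j]) < 1e-9 for i in range(2) for j in range(2))
print("span(f1, f2) is closed under the bracket")
```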

Dietrich Burde
  • 130,978
  • (+1) Very helpful. I verified the second proof technique (much cleaner, and certainly uses concepts that have been covered). I'll try to finish using the original technique to see how it quickly leads to $det(A) = 0$. – muaddib Aug 08 '15 at 18:45
  • Could you please give an explicit explanation of your first argument? After messy computations, I don't see how this is true. The computation given in the answer below is obviously not correct. – GK1202 Feb 07 '21 at 11:02
  • @GK1202 I just wrote down the equations and did some easy substitutions of the variables (the affine part, i.e., several variables (of the matrix $A$) can be expressed by the other variables, so "linear" equations, if you want). Then I obtain some easy polynomial equations and $\det(A)=0$, a contradiction. – Dietrich Burde Feb 07 '21 at 11:23
  • @Dietrich Burde I agree, that's how I first thought to tackle this problem, but it's not obvious to me how "easy substitution of variables" works. Since the computation below is not right, you may want to write a correct one for the completeness of the answers. – GK1202 Feb 07 '21 at 11:30
  • @GK1202 I could append the file with the computations, but I think this is something which everyone should do themselves. This is the point of my answer. Either you try yourself a computation, or you accept an answer by theory, using different invariants of these two Lie algebras. – Dietrich Burde Feb 07 '21 at 14:01
  • @Dietrich Burde If people really know how to calculate it, then why does a wrong computation like the one below have that many votes? I definitely tried it myself; the result said $\det(A)>0$ always, which is ridiculous, but I don't see anything wrong in my calculations. That's the reason I came to ask for help. – GK1202 Feb 07 '21 at 20:20
  • @Dietrich Burde Hello, professor, I worked for a whole day checking my calculations again and again, but I still have no clue what's wrong. I posted a new thread on this and showed my work; if you have spare time and would like to help, please check https://math.stackexchange.com/questions/4016840/proof-verification-su2-is-not-isomorphic-to-sl2-r?noredirect=1#comment8292710_4016840 – GK1202 Feb 08 '21 at 07:39
5

This is a Q&A style answer not meant to be the final answer to the question. It fleshes out one of the techniques suggested by Dietrich Burde for future readers.

Another elementary argument is the following. $\mathfrak{sl}(2,\mathbb{R})$ has a $2$-dimensional subalgebra, e.g., $\mathfrak{a}=\langle f_1,f_2\rangle$, but $\mathfrak{su}(2)$ has no $2$-dimensional subalgebra. Hence they cannot be isomorphic.


$\mathfrak{sl}(2, \mathbb{R})$ has a two dimensional subalgebra.

Consider matrices of the form $\alpha_1 f_1 + \alpha_2 f_2$. Clearly these form a subspace of $\mathfrak{sl}(2)$. We need to show the commutation operation is closed on this subspace: $$[\alpha_1 f_1 + \alpha_2 f_2, \beta_1 f_1 + \beta_2 f_2] = 2(\alpha_1\beta_2 - \alpha_2\beta_1)f_2$$ which again lies in the span of $f_1$ and $f_2$, so this subspace is a subalgebra.

$\mathfrak{su}(2)$ does not have a two dimensional subalgebra.

Consider a two dimensional subspace with basis $g_1, g_2$. Then $$[\alpha_1 g_1 + \alpha_2 g_2, \beta_1 g_1 + \beta_2 g_2] = (\alpha_1\beta_2 - \alpha_2\beta_1)[g_1, g_2]$$ We must show that $g_1, g_2$ cannot be chosen such that $[g_1, g_2]$ is in the span of $g_1, g_2$. To this end let $g_1 = \sum_i a_i e_i, g_2 = \sum_i b_i e_i$. It can be shown through direct calculation that $$[g_1, g_2] = \begin{vmatrix} 2 e_1 & a_1 & b_1 \\ 2 e_2 & a_2 & b_2 \\ 2 e_3 & a_3 & b_3 \notag \end{vmatrix}$$ In other words, the commutator of $g_1$ and $g_2$ is twice their cross product. Since the cross product is nonzero (as $g_1, g_2$ are linearly independent) and perpendicular to both $g_1$ and $g_2$, it cannot lie in their span, and we are done.
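The identification of the bracket with twice the cross product can be confirmed numerically in coordinates with respect to the $e_i$ basis (the helper names below are mine):

```python
import random

# bracket on su(2) in coordinates, from [e1,e2]=2e3, [e1,e3]=-2e2, [e2,e3]=2e1
def brk_coords(a, b):
    return (2 * (a[1] * b[2] - a[2] * b[1]),
            -2 * (a[0] * b[2] - a[2] * b[0]),
            2 * (a[0] * b[1] - a[1] * b[0]))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

random.seed(2)
for _ in range(100):
    a = [random.uniform(-1, 1) for _ in range(3)]
    b = [random.uniform(-1, 1) for _ in range(3)]
    c = brk_coords(a, b)
    # [g1, g2] = 2 * (g1 x g2) ...
    assert all(abs(c[i] - 2 * cross(a, b)[i]) < 1e-12 for i in range(3))
    # ... and is therefore perpendicular to both g1 and g2
    assert abs(dot(c, a)) < 1e-12 and abs(dot(c, b)) < 1e-12
print("[g1, g2] = 2 (g1 x g2), perpendicular to both")
```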

muaddib
  • 8,267
4

(Warning: See comments below this post - this answer is incorrect currently. A completed answer has been posted at math.stackexchange.com/a/4032652/96384)

This is a Q&A style answer not meant to be the final answer to the question. It completes the original technique for future readers. Thanks to Dietrich Burde for the motivation to continue with it.

As above, suppose $A$ is an isomorphism from $\mathfrak{su}(2) \to \mathfrak{sl}(2, \mathbb{R})$. Then $$[AX, AY] = A[X, Y]$$

Let $A_i$ denote the column vectors of $A$. Then $$Ae_i = \sum_j A_{ij} f_j$$ We use $[Ae_1, Ae_2] = A[e_1, e_2]$ to obtain $$2\begin{vmatrix} A_{11} & A_{21} \\ A_{12} & A_{22} \end{vmatrix}f_2 + 2\begin{vmatrix} A_{11} & A_{21} \\ A_{13} & A_{23} \end{vmatrix}(-f_3) + \begin{vmatrix} A_{12} & A_{22} \\ A_{13} & A_{23} \end{vmatrix}f_1 = 2 (A_{31}f_1 + A_{32}f_2 + A_{33}f_3)$$ Combining these three implied equations with the cofactor expansion of the determinant: $$\begin{vmatrix} A_{11} & A_{21} & A_{31} \\ A_{12} & A_{22} & A_{32} \\ A_{13} & A_{23} & A_{33} \end{vmatrix} = A_{31}\begin{vmatrix} A_{12} & A_{22} \\ A_{13} & A_{23} \end{vmatrix} - A_{32}\begin{vmatrix} A_{11} & A_{21} \\ A_{13} & A_{23} \end{vmatrix} + A_{33}\begin{vmatrix} A_{11} & A_{21} \\ A_{12} & A_{22} \end{vmatrix} $$ we obtain: $$\det(A) = 2 A_{31}^2 + 2 A_{32}A_{33}$$ Using the other two commutativity relations we get: $$\det(A) = 2 A_{11}^2 + 2 A_{12} A_{13}$$ $$\det(A) = 2 A_{21}^2 + 2 A_{22} A_{23}$$
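The first of these identities can be spot-checked numerically: choose the first two rows of $A$ freely, let the single bracket condition $[Ae_1, Ae_2] = 2Ae_3$ determine the third row by matching $f_1$-, $f_2$-, and $f_3$-coefficients, and compare $\det(A)$ with $2A_{31}^2 + 2A_{32}A_{33}$. A minimal sketch (the helper names are mine):

```python
import random

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

random.seed(0)
for _ in range(100):
    # first two rows of A chosen freely ...
    A11, A12, A13, A21, A22, A23 = [random.uniform(-2, 2) for _ in range(6)]
    # ... third row forced by matching coefficients in [A e1, A e2] = 2 A e3:
    A31 = (A12 * A23 - A13 * A22) / 2   # f1-coefficient
    A32 = A11 * A22 - A12 * A21         # f2-coefficient
    A33 = A13 * A21 - A11 * A23         # f3-coefficient
    A = [[A11, A12, A13], [A21, A22, A23], [A31, A32, A33]]
    assert abs(det3(A) - (2 * A31**2 + 2 * A32 * A33)) < 1e-9
print("det(A) = 2*A31^2 + 2*A32*A33 holds on all samples")
```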

muaddib
  • 8,267
  • 4
    Very nice. This shows the claim in the most direct way. I think it is important to see this way, although there are other, more conceptual ways. Over fields of prime characteristic, many "clever" arguments quickly go wrong, but this approach always works (and can be done by a computer). – Dietrich Burde Aug 09 '15 at 10:47
  • This computation seems wrong to me. The cofactor expansion is obviously not correct. – GK1202 Feb 07 '21 at 10:47
  • What is the obvious mistake then? – muaddib Feb 08 '21 at 13:44
  • 1
    @GK1202 is right, the second and third terms in the cofactor expansion here are wrong. (I think there's also one wrong 2 in the line before, but that's insubstantial.) What you would actually get there is $\det(A) = 2A_{31}^2+2A_{32}A_{33}$. Besides, nowhere do you use anything about the real numbers, but there is an isomorphism e.g. over the complex numbers, so this approach is at least incomplete. – Torsten Schoeneberg Feb 08 '21 at 20:32
  • 1
    I have fixed the wrong $2$, but the greater problem (see previous comment) persists, so I removed my upvote. Happy to vote up again when the proof is fixed. – Torsten Schoeneberg Feb 09 '21 at 17:40
  • Excellent - thanks for the thoughts. – muaddib Feb 09 '21 at 21:07
  • 1
    I have completed a proof with this approach in https://math.stackexchange.com/a/4032652/96384. It needs arithmetic of the base field in a not entirely trivial way. – Torsten Schoeneberg Feb 20 '21 at 17:10
4

Just to add another method to this nice collection of answers:

One sees easily that $\mathfrak{sl}(2, \mathbb R)$ has many ad-nilpotent elements besides $0$ (e.g. in your presentation $f_1$ and $f_2$), whereas one can show that $\mathfrak{su}_2$ has no ad-nilpotent element $\neq 0$.

To see this latter fact, one could compute the eigenvalues of $ad$ of a general element, or use the following shortcut: We know that $\mathfrak{su}_2$ has a standard representation on $\mathbb C^2$ which identifies it with the matrices

$$\pmatrix{ai&b+ci\\-b+ci&-ai}$$

($a,b,c \in \mathbb R$); now if some element were $ad$-nilpotent, it would have to act nilpotently in any representation, i.e. the above matrix would also need to be nilpotent, in particular have vanishing determinant. But its determinant is $a^2+b^2+c^2$, which is $\neq 0$ unless $a=b=c=0$.
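The determinant claim is easy to confirm numerically with randomly chosen $a, b, c$ (a sanity check, not a proof):

```python
import random

random.seed(3)
for _ in range(100):
    a, b, c = [random.uniform(-2, 2) for _ in range(3)]
    # the general element of su(2) in its standard representation
    M = [[a * 1j, b + c * 1j], [-b + c * 1j, -a * 1j]]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    # det is real and equals a^2 + b^2 + c^2, so it vanishes only at a=b=c=0;
    # a traceless 2x2 matrix is nilpotent iff its determinant is 0 (Cayley-Hamilton),
    # so no nonzero element of su(2) acts nilpotently here
    assert abs(det - (a * a + b * b + c * c)) < 1e-9
print("det = a^2 + b^2 + c^2 verified")
```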


The nice thing about this method is that it is a special case of a general fact I have found useful:

Among the semisimple real Lie algebras, the ones of the compact forms (like $\mathfrak{su}_n$) are precisely the ones which have no non-trivial nilpotent elements.

1

This is a Q&A style answer not meant to be the final answer to the question. It fleshes out one of the techniques suggested by Mariano Suárez-Alvarez for future readers.

An isomorphism $f:\mathfrak{su}(2) \to \mathfrak{su}(3)$ has to map a diagonalizable element to a diagonalizable element.

It isn't quite the same technique, but inspired by it. Instead I will use that if an isomorphism existed between $\mathfrak{su}(2)$ and $\mathfrak{sl}(2, \mathbb{R})$ then the induced homomorphism on their adjoint representations would have to preserve diagonalizability of matrices. This leads to a contradiction.

The following proposition is inspired by Lie algebra homomorphisms preserve Jordan form:

Suppose the Lie algebras $\mathfrak{g}, \mathfrak{h}$ are isomorphic. Denote the isomorphism as $\phi : \mathfrak{g} \to \mathfrak{h}$. Then for all diagonalizable $ad_X \in ad_\mathfrak{g}$, $\phi^*(ad_X) \in ad_\mathfrak{h}$ is diagonalizable (where $\phi^*$ is the induced homomorphism between the adjoint representations). In particular, if $\lambda_i$, $Y_i$ is an eigenvalue, eigenvector pair of $ad_X$, then $\lambda_i$, $\phi(Y_i)$ is an eigenvalue, eigenvector pair of $ad_{\phi(X)}$.

Suppose that $ad_X$ is diagonalizable with eigenvalues $\lambda_i$ and eigenvectors $Y_i$. Then $$ad_X(Y_i) = \lambda_i Y_i$$ We want to show that $\phi(Y_i)$ is an eigenvector of $\phi^*(ad_X)$.

\begin{eqnarray*} \phi^*(ad_X)(\phi(Y_i)) &=& ad_{\phi(X)}(\phi(Y_i)) \\ &=& [\phi(X), \phi(Y_i)] \\ &=& \phi([X, Y_i]) \\ &=& \phi(ad_X(Y_i)) \\ &=& \lambda_i\phi(Y_i) \\ \end{eqnarray*}

Now using the commutativity relations stated in the problem we can calculate the adjoint representation of $\mathfrak{su}(2)$: $$ ad_{e_1} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & -2 \\ 0 & 2 & 0 \end{bmatrix} \,\,\,\,\, ad_{e_2} = \begin{bmatrix} 0 & 0 & 2 \\ 0 & 0 & 0 \\ -2 & 0 & 0 \end{bmatrix}\,\,\,\,\, ad_{e_3} = \begin{bmatrix} 0 & -2 & 0 \\ 2 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$$

For $\mathfrak{sl}(2, \mathbb{R})$ we find: $$ ad_{f_1} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & -2 \end{bmatrix} \,\,\,\,\, ad_{f_2} = \begin{bmatrix} 0 & 0 & 1 \\ -2 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}\,\,\,\,\, ad_{f_3} = \begin{bmatrix} 0 & -1 & 0 \\ 0 & 0 & 0 \\ 2 & 0 & 0 \end{bmatrix}$$

Suppose $\phi$ is an isomorphism between $\mathfrak{sl}(2, \mathbb{R})$ and $\mathfrak{su}(2)$ and that $$\phi(f_1) = a_1 e_1 + a_2 e_2 + a_3 e_3$$ Now any real linear combination of the matrices $ad_{e_i}$ is skew-symmetric, and a real skew-symmetric matrix has purely imaginary eigenvalues. On the other hand the matrix $ad_{f_1}$ has eigenvalues $0, -2, 2$. Consider the eigenvalue, eigenvector pair $-2, v$ of $ad_{f_1}$. There is no way that $\phi(v)$ can be an eigenvector of $ad_{\phi(f_1)}$ with eigenvalue $-2$, so we have a contradiction.
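The spectral facts used in this argument can be spot-checked with the adjoint matrices displayed above (a numerical sanity check, assuming NumPy is available):

```python
import numpy as np

# adjoint matrices as displayed above
ad_e1 = np.array([[0., 0, 0], [0, 0, -2], [0, 2, 0]])
ad_e2 = np.array([[0., 0, 2], [0, 0, 0], [-2, 0, 0]])
ad_e3 = np.array([[0., -2, 0], [2, 0, 0], [0, 0, 0]])
ad_f1 = np.array([[0., 0, 0], [0, 2, 0], [0, 0, -2]])

rng = np.random.default_rng(0)
for _ in range(100):
    a = rng.uniform(-1, 1, size=3)
    X = a[0] * ad_e1 + a[1] * ad_e2 + a[2] * ad_e3
    assert np.allclose(X, -X.T)                       # skew-symmetric
    assert np.allclose(np.linalg.eigvals(X).real, 0)  # purely imaginary spectrum

# ad_f1, by contrast, has the real eigenvalues 0, -2, 2
print(sorted(np.linalg.eigvals(ad_f1).real.tolist()))
```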

muaddib
  • 8,267
1

My proposed proof uses the fact that a Hermitian matrix can be diagonalized by a unitary matrix to show that there is no isomorphism preserving the Lie bracket. (Here I take $su(2)$ in the physics convention, as traceless Hermitian matrices, so that its elements are Hermitian.)

Let $X, Y, Z \in sl(2,\Bbb R)$ be a basis such that the usual brackets hold:

$[X,Y] = Z\,,\,[X,Z] = -2X\,,\,[Y,Z] = 2Y$

The isomorphism $\phi:sl(2,\Bbb R) \to su(2)$ must preserve the brackets; let us use the last one:

$\phi(2Y) = \phi([Y,Z]) = [\phi(Y),\phi(Z)]$

$A\equiv \phi(Y)\in su(2)\,,\,B\equiv\phi(Z)\in su(2)$

$[A,B] = 2A$

Since $A$ is Hermitian we have

$A = W^*\Lambda W\,,\,W^*W=I$, where $\Lambda$ is a diagonal matrix

Substituting into $[A,B] = 2A$, an easy calculation shows that

$[\Lambda,H] = 2\Lambda$

where $H\equiv WBW^* \in su(2)\,,\,\Lambda \in su(2)$

So we have that:

$\Lambda = \left[ \begin{array}{cc} s&0\\ 0&-s \end{array} \right]\,,\,H = \left[ \begin{array}{cc} r&\gamma^*\\ \gamma&-r \end{array} \right]\,,\,r,s\in\Bbb R$ and $\gamma \in \Bbb C $

and substituting into $[\Lambda,H] = 2\Lambda$ it is easy to see that the matrix equation is satisfied only if $s = 0$: the diagonal entries of $[\Lambda,H]$ vanish, while the diagonal entries of $2\Lambda$ are $\pm 2s$.

which would imply $\phi(Y) = A = W^*\Lambda W = 0$ and then $Y = 0$ (because $\phi$ is an isomorphism), which is absurd; so the brackets cannot be preserved.
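The last step can be confirmed numerically: for any choice of $s, r, \gamma$, the diagonal of $[\Lambda, H]$ vanishes, while the diagonal of $2\Lambda$ is $(2s, -2s)$, so the equation forces $s = 0$. A minimal sketch (the helper names are mine):

```python
import random

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

random.seed(4)
for _ in range(100):
    s, r = random.uniform(-2, 2), random.uniform(-2, 2)
    g = complex(random.uniform(-2, 2), random.uniform(-2, 2))
    L = [[s, 0], [0, -s]]                      # Lambda = diag(s, -s)
    H = [[r, g.conjugate()], [g, -r]]          # traceless Hermitian
    LH, HL = mul(L, H), mul(H, L)
    C = [[LH[i][j] - HL[i][j] for j in range(2)] for i in range(2)]
    # the diagonal of [Lambda, H] is always zero, so [Lambda, H] = 2*Lambda needs s = 0
    assert abs(C[0][0]) < 1e-12 and abs(C[1][1]) < 1e-12
print("diagonal of [Lambda, H] is always zero")
```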

Andrea
  • 147