
For given $x, y \in \mathfrak{g}$ with $[x,y] \neq 0$ and $[x,[x,y]] \neq 0$, does there exist a nonzero $z \in \mathfrak{g}$ such that

$$\operatorname{ad}^2_xy = \operatorname{ad}_zy$$

For simplicity, let's consider the case $\mathfrak{g} = \operatorname{End}(V)$ with the commutator bracket $\operatorname{ad}_x y = xy - yx$.

I suspect the answer is no, but I can't think of a "proper" reason other than computing the LHS and seeing terms like $xyx$ that a priori seem to disallow writing the result in the form $zy - yz$.
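As a quick sanity check of that expansion (a numpy sketch I'm adding; the helper name `ad` is mine, not from the thread):

```python
import numpy as np

def ad(a, b):
    """The commutator ad_a(b) = ab - ba in End(V)."""
    return a @ b - b @ a

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 4))
y = rng.standard_normal((4, 4))

# Expanding [x, [x, y]] produces the cross term -2xyx, which has no
# counterpart in [z, y] = zy - yz for any single z.
lhs = ad(x, ad(x, y))
expanded = x @ x @ y - 2 * (x @ y @ x) + y @ x @ x
assert np.allclose(lhs, expanded)
```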

Edit: as pointed out by user1551, there are certain matrices for which this will never work, but it is not a closed case: when and why does this happen? Also note that if we work with the Lie algebra of vectors in $\mathbb{R}^3$ with the cross product, my question becomes: when can the BAC-CAB triple product identity be expressed as a cross product? I.e., solving:

$$x(x \cdot y) - y\|x\|^2 = z \times y$$

Thus, a partial result for the $\mathfrak{s}\mathfrak{o}(3)$ case: if $x \perp y$, the equation reads:

$$-\|x\|^2 y = z \times y$$

This can have no nontrivial solution: $z \times y$ is always orthogonal to $y$, so geometrically we are asking a nonzero scalar multiple of $y$ to be orthogonal to $y$ itself. This is what happens in user1551's example too, as the matrices chosen are orthogonal under the trace pairing, which plays the role of the Killing form for the Lie algebra $End(V)$. Thus, to improve my question: suppose that $x \not\perp y$, or that the Killing form is degenerate so that we cannot define such notions of orthogonality.
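Numerically, the obstruction in the orthogonal case looks like this (a numpy sketch with my own choice of a pair $x \perp y$):

```python
import numpy as np

# A concrete orthogonal pair: x . y = 0, both nonzero.
x = np.array([1.0, 2.0, 0.0])
y = np.array([-2.0, 1.0, 3.0])
assert np.isclose(x @ y, 0.0)

# With x perpendicular to y, the left-hand side reduces to -|x|^2 y ...
lhs = x * (x @ y) - y * (x @ x)
assert np.allclose(lhs, -(x @ x) * y)

# ... which is NOT orthogonal to y, while z x y always is.
z = np.array([0.3, -1.1, 0.7])          # an arbitrary candidate z
assert np.isclose(np.cross(z, y) @ y, 0.0)
assert not np.isclose(lhs @ y, 0.0)     # lhs . y = -|x|^2 |y|^2 = -70 here
```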

Edit 2: see the comments for a very slick proof by user1551 that the answer to this problem is negative for $\mathfrak{g}=\mathfrak{s}\mathfrak{o}(3)$.

  • $x$ and $y$ are fixed and don't commute. I would like to know if a nonzero $z$ can exist. – Theo Diamantakis Aug 14 '23 at 15:35
  • Then it should be easy to find some $x$ and $y$ such that the equation is insolvable. E.g. when $x=\pmatrix{0&1\\1&0}$ and $y=\pmatrix{1&0\\0&0}$, we have $\operatorname{ad}_x^2y=x^2y-2xyx+yx^2=\pmatrix{2&0\\0&-2}$. However, since $y$ is a diagonal matrix, $\operatorname{ad}_zy$ must be hollow. Hence the equation is not solvable when the characteristic of the ground field is not $2$. – user1551 Aug 14 '23 at 15:41
  • Very good. I am wondering then if there is a Lie algebraic property that controls when this does or does not happen. Also, by hollow you mean zero matrix right? edit: no never mind I see what you mean. – Theo Diamantakis Aug 14 '23 at 15:44
  • Not necessarily zero. A hollow matrix means a matrix with a zero diagonal. – user1551 Aug 14 '23 at 15:45
  • The question is non-obvious (to me) whatever is known and unknown, so in a sense it doesn't matter in terms of the essence of the problem. If all 3 variables were known, you would still have to work hard and see if it makes sense. And if $z$ were known, it still reduces to finding out if it is possible to break up $\operatorname{ad}$ this way. Regardless, I added the details if it helps. – Theo Diamantakis Aug 14 '23 at 19:11
  • A tangential remark: for $x(x \cdot y) - y\|x\|^2 = z \times y$ to be solvable in $z\in\mathbb R^3$, we must have $(x \cdot y)^2-\|y\|^2\|x\|^2=\left(x(x \cdot y) - y\|x\|^2 - z \times y\right)\cdot y = 0$. Hence the equation is solvable only if $x$ and $y$ are linearly dependent. When this is the case, $x(x \cdot y) - y\|x\|^2 = 0$, so the general solution is $z \in \mathbb{R}y$ (so that $z \times y = 0$). – user1551 Aug 14 '23 at 20:11
  • I see. And of course, linear dependence would force commutativity of $x,y$, correct? So in the case $\mathfrak{g} = \mathfrak{s}\mathfrak{o}(3)$ the answer is a definite no, as a consequence of, I suppose, geometry and Cauchy–Schwarz? Very interesting, but also seems very specialised to this Lie algebra in particular. But I think this supports the conclusion of no in general. – Theo Diamantakis Aug 14 '23 at 20:19
  • An extreme example: the Jacobi Lie bracket on one-dimensional vector fields on $\mathbb{R}$. Computing the equation with $X(x)\frac{\partial}{\partial x}$ and $Y(x)\frac{\partial}{\partial x}$ and trying to match derivatives of $Y$ leads to the constraint $X^2 Y'' = 0$, so either $X=0$ or $Y$ is affine, which forces the second bracket to be zero. I am convinced the problem has no nontrivial solution in general, because the LHS has "two derivations" of $y$ whilst the RHS has only one. Forcing a zero somewhere is the only way to reconcile them. – Theo Diamantakis Aug 14 '23 at 20:51
  • If $x$ (hence $ad_x$) is given and you want to find $ad_z$ with $ad_x \circ ad_x = ad_z$, at least on some given $y$ (which is how I read the question), would that not be finding the (composition) "square" instead of the (composition) "square root" (which is e.g. the $g$ in this question: https://math.stackexchange.com/q/1118/96384) ? – Torsten Schoeneberg Aug 14 '23 at 21:22
  • Yes, I suppose it's a square in the sense of the "Freshman's $\operatorname{ad}$", satisfying $\operatorname{ad}_x \circ \operatorname{ad}_x = \operatorname{ad}_{x^2}$ (obviously not actually true); we're squaring $x$. I was imagining it as defining $\operatorname{ad}_x = L_x$ and turning $L_x^2 = L_z$, but I suppose that's actually squaring. – Theo Diamantakis Aug 14 '23 at 21:38
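user1551's 2×2 counterexample from the comments can be checked mechanically: $z \mapsto \operatorname{ad}_z y$ is linear in $z$, so solvability of $\operatorname{ad}_z y = \operatorname{ad}_x^2 y$ is a linear-algebra question. A numpy sketch (the vectorization framing is my own, not from the thread):

```python
import numpy as np

x = np.array([[0.0, 1.0], [1.0, 0.0]])
y = np.array([[1.0, 0.0], [0.0, 0.0]])

target = x @ x @ y - 2 * (x @ y @ x) + y @ x @ x   # ad_x^2(y)
assert np.allclose(target, np.diag([2.0, -2.0]))

# Matrix of the linear map z -> zy - yz acting on vectorized 2x2 matrices.
A = np.column_stack([
    (e.reshape(2, 2) @ y - y @ e.reshape(2, 2)).ravel()
    for e in np.eye(4)
])

# The least-squares residual stays far from zero, so ad_z(y) = ad_x^2(y)
# has no solution: every ad_z(y) is hollow, but the target is not.
z_best, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)
residual = np.linalg.norm(A @ z_best - target.ravel())
assert residual > 1.0
```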

1 Answer


All nontrivial split semisimple Lie algebras (and hence all Lie algebras that have such an algebra as a quotient) contain counterexamples.

In $\mathfrak g:= \mathfrak{gl}_n(k) \simeq (M_n(k), [\cdot,\cdot])$ for $n\ge 2$, take $x=\pmatrix{0&\dots &1\\\vdots&&\vdots\\ 0&\dots&0}$, $y=\pmatrix{0&\dots &0\\\vdots&&\vdots\\ 1&\dots&0}$. Then $ad_x(ad_x(y))=-2x$, but there is no $z\in \mathfrak g$ with $ad_z(y) =-2x$: since $y^2=0$, every $z$ satisfies $\operatorname{tr}(ad_z(y)\,y)=\operatorname{tr}(zy^2)-\operatorname{tr}(yzy)=0$, whereas $\operatorname{tr}(-2xy)=-2\neq 0$ (in characteristic $\neq 2$).
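This is straightforward to verify numerically; the trace obstruction spelled out in the comments of the sketch below is the reasoning behind the claim (numpy, with $n = 3$):

```python
import numpy as np

def ad(a, b):
    return a @ b - b @ a

n = 3
x = np.zeros((n, n)); x[0, n - 1] = 1.0   # the matrix unit E_{1n}
y = np.zeros((n, n)); y[n - 1, 0] = 1.0   # the matrix unit E_{n1}

c = ad(x, ad(x, y))
assert np.allclose(c, -2 * x)             # ad_x(ad_x(y)) = -2x

# Obstruction: y^2 = 0 forces tr(ad_z(y) y) = tr(z y^2) - tr(y z y) = 0
# for every z, yet tr(c y) = -2 != 0. Hence no z satisfies ad_z(y) = c.
assert np.allclose(y @ y, 0)
rng = np.random.default_rng(1)
for _ in range(5):
    z = rng.standard_normal((n, n))
    assert np.isclose(np.trace(ad(z, y) @ y), 0.0)
assert np.isclose(np.trace(c @ y), -2.0)
```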

In fact, if $\mathfrak g$ is any split semisimple Lie algebra with a chosen Cartan subalgebra, corresponding roots $\alpha \in \Phi$ and root spaces $\mathfrak g_{\alpha}$, then for any given root $\alpha$ and basis vectors $e_{\pm\alpha}$ of $\mathfrak g_{\pm\alpha}$, take $x=e_\alpha$ and $y=e_{-\alpha}$. Then $ad_x(ad_x(y))$ is a nontrivial element of $\mathfrak g_\alpha$, but there is no element $z$ for which the $\alpha$-component of $ad_z(y)$ is nonzero: $[\mathfrak g_\beta, \mathfrak g_{-\alpha}] \subseteq \mathfrak g_{\beta - \alpha}$, and $\beta = 2\alpha$ is never a root in a reduced root system.

One can probably generalize this even further to non-split cases. Note also that the two elements are not orthogonal with respect to the Killing form.

(And of course, there are also always non-trivial cases where solutions exist. In fact, if $[x,y] = \lambda y$ for a nonzero $\lambda \in k$, then $ad_z(y) = ad_x(ad_x(y))$ for $z=\lambda x$.)
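The positive case can be checked the same way (a numpy sketch with my own choice $x = \operatorname{diag}(1,0)$, $y = E_{12}$, $\lambda = 1$):

```python
import numpy as np

def ad(a, b):
    return a @ b - b @ a

# A pair with [x, y] = lam * y, here lam = 1:
x = np.diag([1.0, 0.0])
y = np.array([[0.0, 1.0], [0.0, 0.0]])
lam = 1.0
assert np.allclose(ad(x, y), lam * y)

# Then z = lam * x solves ad_z(y) = ad_x(ad_x(y)); both sides equal lam^2 * y.
z = lam * x
assert np.allclose(ad(z, y), ad(x, ad(x, y)))
assert np.allclose(ad(x, ad(x, y)), lam**2 * y)
```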

  • Thank you, I know a little about root systems but not how to manipulate them. Can you explain why $\operatorname{ad}_{e_\alpha} \operatorname{ad}_{e_\beta}$ would have nonzero $\alpha$ component? – Theo Diamantakis Aug 15 '23 at 07:39
  • I never claim that. What I do claim is that $ad_{e_\alpha} ad_{e_\alpha} $ (same $\alpha$!) maps a generator of $\mathfrak g_{-\alpha}$ to a generator of $\mathfrak g_\alpha$. That is rather basic Lie theory, proven e.g. with $\mathfrak{sl}_2$-triples. – Torsten Schoeneberg Aug 16 '23 at 03:46
  • Ok, I think I understand after reading a bit. Maybe this is trivial, but if $[\mathfrak{g}_\alpha, \mathfrak{g}_\beta] = \mathfrak{g}_{\alpha + \beta}$ for two root spaces, why is there no $\beta = 2\alpha$, which would make a good choice of $z$ to go from $\mathfrak{g}_{-\alpha}$ to $\mathfrak{g}_{\alpha}$? – Theo Diamantakis Aug 16 '23 at 08:42
  • As in, I think your argument goes that the two $\operatorname{ad}$ applications shunt you from the $-\alpha$ space to the $\alpha$ space, and that it's impossible to "do it in one". Is that correct? But that depends on there being no $e_{2\alpha}$-like element. – Theo Diamantakis Aug 16 '23 at 08:50
  • Ah, it is a property of the root system that the only scalar multiple of a root that is again a root is its negative. – Theo Diamantakis Aug 16 '23 at 08:59
  • @TheoDiamantakis note that even if there were such a $\beta = 2\alpha$ (which, as you note, there isn't for a reduced root system) that would only shunt the problem to the existence of a $2\beta$ and so on. Over finite dimensions that would already be a problem. – Callum Aug 17 '23 at 21:32
  • I think this argument extends neatly to all non-compact examples using a restricted root system. That introduces the possibility of roots $\alpha$ where $2\alpha$ is a root, but even then $4\alpha$ is not a root, so you can choose $e_{2\alpha}$. – Callum Aug 17 '23 at 21:41
  • @Callum: Agreed. I have been thinking about compact or general anisotropic ones without much success. Any ideas there? – Torsten Schoeneberg Aug 17 '23 at 22:43
  • I think this might work: a compact semisimple Lie algebra contains some elements of the form $e_\alpha + e_{-\alpha}$, where $e_\alpha, e_{-\alpha}$ are root vectors for a CSA of the complexified Lie algebra containing a given maximal torus in the compact one. Then $\operatorname{ad}(e_\alpha + e_{-\alpha})$ preserves the CSA and the torus, and has image in the span of $ih_\alpha$. But a CSA is self-normalising and abelian, so there is no element $x$ for which $\operatorname{ad}(x)$ preserves the torus but doesn't send it to $0$. – Callum Aug 17 '23 at 23:24