I started studying Daniel Huybrechts' book Complex Geometry: An Introduction. I tried studying backwards as much as possible, but I have been stuck on the concepts of almost complex structures and complexification. I have studied several books and articles on the matter, including ones by Keith Conrad, Jordan Bell, Gregory W. Moore, Steven Roman, Suetin, Kostrikin and Mainin, and Gauthier.
I have several questions on the concepts of almost complex structures and complexification. Here is one:
Some context: The question below is related to a question I posted previously.
Motivation for the question below: From this question or Wikipedia, we get that each almost complex structure $J \in Aut_{\mathbb R}(\mathbb R^2)$ (defined by $J^2 = -id_{\mathbb R^2}$, i.e. $J$ is anti-involutive) is represented, under the group isomorphism $Aut_{\mathbb R}(\mathbb R^2) \cong GL(2,\mathbb R)$, by a matrix of the form
$$J \cong \left[ \begin{array}{cc} a & b \\ \frac{-1-a^2}{b} & -a \end{array} \right], a,b \in \mathbb R, b \ne 0$$
This is because we can write $J \in (End_{\mathbb R}(\mathbb R))^{2 \times 2}$ as
$$J = \left[ \begin{array}{cc} \hat a & \hat b \\ (-1-\hat a^2)(\hat b)^{-1} & -\hat a \end{array} \right], \hat a,\hat b \in End_{\mathbb R}(\mathbb R), \hat b \in Aut_{\mathbb R}(\mathbb R),$$
where $\hat a := a \ id_{\mathbb R}$ is the unique element of $End_{\mathbb R}(\mathbb R)$ that corresponds to $a$ under the $\mathbb R$-vector space isomorphism $End_{\mathbb R}(\mathbb R) \cong \mathbb R$, namely scalar multiplication by $a$; likewise for $\hat b$.
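(As a sanity check, not part of the argument: a minimal symbolic verification, assuming SymPy is available, that every matrix of this form squares to $-I_2$ and has determinant $1$.)

```python
import sympy as sp

a, b = sp.symbols('a b', real=True)

# The proposed parametrization of an almost complex structure on R^2 (b != 0).
J = sp.Matrix([[a, b],
               [(-1 - a**2) / b, -a]])

print(J * J)    # expected: Matrix([[-1, 0], [0, -1]]), i.e. J^2 = -id for all a and nonzero b
print(J.det())  # expected: 1
```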
Similarly, each $\sigma \in Aut_{\mathbb R}(\mathbb R^2)$ that is involutive (i.e. $\sigma^2 = id_{\mathbb R^2}$) and anti-commutes with the canonical almost complex structure $I$ (i.e. $\sigma \circ I = - I \circ \sigma$, where $I$ is the case $a=0$, $b=-1$ above) can be represented, again under $Aut_{\mathbb R}(\mathbb R^2) \cong GL(2,\mathbb R)$, by a matrix of the form
$$\sigma \cong \left[ \begin{array}{cc} \cos(2t) & \sin(2t) \\ \sin(2t) & -\cos(2t) \end{array} \right], t \in [0,\pi)$$
This is because we can write $\sigma \in (End_{\mathbb R}(\mathbb R))^{2 \times 2}$ as
$$\sigma = \left[ \begin{array}{cc} \widehat{\cos(2t)} & \widehat{\sin(2t)} \\ \widehat{\sin(2t)} & -\widehat{\cos(2t)} \end{array} \right], t \in [0,\pi),$$
where $\widehat{\cos(2t)} := \cos(2t) \ id_{\mathbb R}$ and $\widehat{\sin(2t)} := \sin(2t) \ id_{\mathbb R}$, as with $\hat a$ and $\hat b$ above.
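(Again only a sanity check, assuming SymPy: each such matrix is involutive and anti-commutes with the canonical $I$, i.e. the case $a=0$, $b=-1$ above.)

```python
import sympy as sp

t = sp.symbols('t', real=True)

# Canonical almost complex structure on R^2 (the case a = 0, b = -1 above).
I2 = sp.Matrix([[0, -1],
                [1,  0]])

# Proposed parametrization of the conjugations on R^2.
sigma = sp.Matrix([[sp.cos(2*t),  sp.sin(2*t)],
                   [sp.sin(2*t), -sp.cos(2*t)]])

print((sigma * sigma).applyfunc(sp.simplify))   # expected: identity matrix, i.e. sigma^2 = id
print(sigma * I2 + I2 * sigma)                  # expected: zero matrix, i.e. sigma anti-commutes with I
```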
Question:
Let $V$ be an $\mathbb R$-vector space, possibly infinite-dimensional. Under the $\mathbb R$-vector space isomorphism $(End_{\mathbb R}(V))^{2 \times 2} \cong End_{\mathbb R}(V^2)$, what are the formulas, as $2 \times 2$ block matrices with entries in $End_{\mathbb R}(V)$, for the almost complex structures $J$ on $V^2$ and for the conjugations $\sigma$ on $V^2$ with respect to the canonical $I(v,w):=(-w,v)$ (i.e. $\sigma^2 = id_{V^2}$ and $\sigma \circ I = - I \circ \sigma$)?
Some notes:
- For example, for canonical $I(v,w):=(-w,v)$, we get $$I = \left[ \begin{array}{cc} \hat 0_V & -id_V\\ id_V & \hat 0_V \end{array} \right],$$
where $0_V$ is the zero vector of $V$ and $\hat 0_V$ is the zero element of $End_{\mathbb R}(V)$, i.e. the constant map on $V$ with value $0_V$.
- Guess as to what the answer is not: $$J = \left[ \begin{array}{cc} a \ id_V & b \ id_V \\ \frac{-1-a^2}{b} \ id_V & -a \ id_V\end{array} \right], a,b \in \mathbb R, b \ne 0$$ $$ \sigma =\left[ \begin{array}{cc} \cos(2t) \ id_V& \sin(2t) \ id_V \\ \sin(2t) \ id_V & -\cos(2t) \ id_V \end{array} \right], t \in [0,\pi)$$
I mean that for $J, \sigma \in (End_{\mathbb R}(V))^{2 \times 2}$, I guess the entries are not necessarily going to be 'multiplication by a scalar', i.e. 'a scalar multiple of the identity' (see here or here for the fact that an operator for which every non-zero vector is an eigenvector is a scalar multiple of the identity; see here or here for the fact that an operator commuting with every linear operator is a scalar multiple of the identity, which I guess is related to Schur's Lemma; see here or here for operators commuting with particular matrices). The first sketch after this list gives a $J$ whose entries are not of this form.
2.1. I think such entries are called 'homotheties'.
- What I tried for $J$:
3.1. Solving $\left[ \begin{array}{cc} a & b \\ c & d \end{array} \right]\left[ \begin{array}{cc} a & b \\ c & d \end{array} \right] = \left[ \begin{array}{cc} -id_V & \hat 0_V \\ \hat 0_V & -id_V \end{array} \right]$, $a,b,c,d \in End_{\mathbb R}(V)$, appears to just give me $a^2+bc=-id_V$, $ab+bd=\hat 0_V$, $ca+dc=\hat 0_V$, $cb+d^2=-id_V$ (these are checked numerically in the first sketch after this list). I'm stuck at this point. Also, I'm not sure whether any two of $a,b,c,d$ commute.
3.1.1. Well, I guess $a$ and $d$ are 'anti-similar' (a term I just made up to mean that $a$ and $-d$ are similar) if $b$ or $c$ is invertible.
3.1.2. Also, in $id_V+a^2+bc=\hat 0_V$, we could have $b=\hat 0_V$ or $c=\hat 0_V$, in which case $a^2 = -id_V$, i.e. $a$ is itself an almost complex structure on $V$, and thus $V$ is even-dimensional or infinite-dimensional.
3.1.3. Then there's the Sylvester equation, according to Omnomnomnom's answer here.
3.1.4. I think the centre of $End_{\mathbb R}(V)$, which I think equals the set of homotheties by Schur's Lemma, can be parametrised as done by luis in this answer, assuming it makes sense to talk about $e^x$ (e.g. via the matrix exponential or the exponential map) for any $x \in End_{\mathbb R}(V)$, or at least for any $x$ in the centre of $End_{\mathbb R}(V)$. I guess that luis' answer applies if we can somehow show that certain exponentials commute even when the underlying maps do not commute.
3.2. Also, I'm not sure whether it's helpful to use the fact that any almost complex structure $J$ on $V^2$ is similar to the canonical almost complex structure $I$ on $V^2$ (actually, I'm not sure whether this is true for infinite-dimensional $V$).
- What I tried for $\sigma$:
4.1. Solving $\left[ \begin{array}{cc} a & b \\ c & d \end{array} \right]\left[ \begin{array}{cc} \hat 0_V & -id_V\\ id_V & \hat 0_V \end{array} \right] = - \left[ \begin{array}{cc} \hat 0_V & -id_V\\ id_V & \hat 0_V \end{array} \right] \left[ \begin{array}{cc} a & b \\ c & d \end{array} \right]$ for $a,b,c,d \in End_{\mathbb R}(V)$ appears to give me $c=b$, $d=-a$.
4.2. Solving $\left[ \begin{array}{cc} a & b \\ b & -a \end{array} \right]\left[ \begin{array}{cc} a & b \\ b & -a \end{array} \right] = \left[ \begin{array}{cc} id_V & \hat 0_V \\ \hat 0_V & id_V \end{array} \right]$ appears to give me $a^2+b^2=id_V$ and $ab=ba$ (checked numerically in the second sketch after this list). I'm stuck at this point, unless $(a,b)=(\cos(2t) \ id_V, \sin(2t) \ id_V)$ for some $t \in [0,\pi)$, but then I do not know how to justify that $a$ and $b$ must be multiplication by a scalar (maybe one of the links in bullet 2 could be helpful).
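To illustrate bullets 2 and 3.1 concretely, here is a minimal numeric sketch, assuming NumPy and taking $V = \mathbb R^2$ (so $V^2 = \mathbb R^4$) purely for illustration: conjugating the canonical $I$ by a random invertible map gives an almost complex structure $J$ whose blocks $a,b,c,d$ satisfy the four equations of 3.1 but are, in general, not scalar multiples of $id_V$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2                                    # dim V; V = R^n purely for illustration

# Canonical almost complex structure I on V^2 in block form [[0, -id], [id, 0]].
Z, Id = np.zeros((n, n)), np.eye(n)
I_can = np.block([[Z, -Id], [Id, Z]])

# A "random" almost complex structure J = S I S^{-1} on V^2 (any invertible S works).
S = rng.normal(size=(2*n, 2*n))          # invertible with probability 1
J = S @ I_can @ np.linalg.inv(S)

a, b = J[:n, :n], J[:n, n:]              # the four n x n blocks of J
c, d = J[n:, :n], J[n:, n:]

print(np.allclose(J @ J, -np.eye(2*n)))  # True: J^2 = -id
print(np.allclose(a @ a + b @ c, -Id))   # True: a^2 + bc = -id
print(np.allclose(a @ b + b @ d, Z))     # True: ab + bd = 0
print(np.allclose(c @ a + d @ c, Z))     # True: ca + dc = 0
print(np.allclose(c @ b + d @ d, -Id))   # True: cb + d^2 = -id
print(np.allclose(a, a[0, 0] * Id))      # typically False: the block a is not a homothety
```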
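Similarly for bullets 4.1 and 4.2, a minimal numeric sketch (again assuming NumPy and $V = \mathbb R^2$): the map $\sigma_0 = id_V \oplus (-id_V)$ is an involution that anti-commutes with the canonical $I$, and conjugating it by any invertible map that commutes with $I$ (block form $\left[ \begin{array}{cc} p & -q \\ q & p \end{array} \right]$) gives further examples; their blocks do satisfy $c=b$, $d=-a$, $a^2+b^2=id_V$, $ab=ba$, without being homotheties.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2
Z, Id = np.zeros((n, n)), np.eye(n)
I_can = np.block([[Z, -Id], [Id, Z]])

# A basic conjugation w.r.t. I_can: sigma0 = [[id, 0], [0, -id]]
# (complex conjugation under the identification (v, w) <-> v + iw).
sigma0 = np.block([[Id, Z], [Z, -Id]])

# Conjugate sigma0 by a random T commuting with I_can; the result is again an
# involution that anti-commutes with I_can.
p, q = rng.normal(size=(n, n)), rng.normal(size=(n, n))
T = np.block([[p, -q], [q, p]])          # commutes with I_can; invertible with probability 1
sigma = T @ sigma0 @ np.linalg.inv(T)

a, b = sigma[:n, :n], sigma[:n, n:]
c, d = sigma[n:, :n], sigma[n:, n:]

print(np.allclose(sigma @ sigma, np.eye(2*n)))     # True: sigma^2 = id
print(np.allclose(sigma @ I_can, -I_can @ sigma))  # True: sigma anti-commutes with I
print(np.allclose(c, b), np.allclose(d, -a))       # True True: the relations of 4.1
print(np.allclose(a @ a + b @ b, Id))              # True: a^2 + b^2 = id (4.2)
print(np.allclose(a @ b, b @ a))                   # True: ab = ba (4.2)
print(np.allclose(a, a[0, 0] * Id))                # typically False: a is not a homothety
```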
I think every almost complex structure $J$ (on a finite-dimensional space) has $\det J = 1$ (really $1$ and not $\pm1$). For the conjugations $\sigma$ (at least with respect to the canonical $I$; I didn't yet try for an arbitrary $J$), in the $\mathbb R^2$ case above we have $\det \sigma = -1$, but for general finite-dimensional $V$ the $+1$ and $-1$ eigenspaces of $\sigma$ are interchanged by $I$, so they have equal dimension and $\det \sigma = (-1)^{\dim V}$. In any case, I'm not sure how this would help us find $a,b,c,d$. There's probably some rule about determinants of block matrices that's relevant, but I wasn't able to find one on Stack Exchange or in The Matrix Cookbook.
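A quick numeric check of these sign claims, reusing the constructions from the sketches above (again only a sketch, assuming NumPy and $V = \mathbb R^2$). The last line tests the block-determinant rule that I believe is relevant here: if the blocks $C$ and $D$ commute, then $\det \left[ \begin{array}{cc} A & B \\ C & D \end{array} \right] = \det(AD - BC)$; for $\sigma = \left[ \begin{array}{cc} a & b \\ b & -a \end{array} \right]$ the lower blocks $b$ and $-a$ commute because $ab = ba$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2
Z, Id = np.zeros((n, n)), np.eye(n)
I_can = np.block([[Z, -Id], [Id, Z]])

# Random almost complex structure and random conjugation on R^4, as in the sketches above.
S = rng.normal(size=(2*n, 2*n))
J = S @ I_can @ np.linalg.inv(S)

p, q = rng.normal(size=(n, n)), rng.normal(size=(n, n))
T = np.block([[p, -q], [q, p]])
sigma = T @ np.block([[Id, Z], [Z, -Id]]) @ np.linalg.inv(T)
a, b = sigma[:n, :n], sigma[:n, n:]

print(np.isclose(np.linalg.det(J), 1.0))               # True: det J = +1
print(np.isclose(np.linalg.det(sigma), (-1.0) ** n))   # True: det sigma = (-1)^(dim V), here +1
print(np.isclose(np.linalg.det(sigma),
                 np.linalg.det(a @ (-a) - b @ b)))     # True: det(AD - BC) with A=a, B=b, C=b, D=-a
```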
Note: For both $J$ and $\sigma$, I first tried with $V = \mathbb R^{n}$, $n \ge 2$, and I got pretty much the same as what I described above for arbitrary $V$.
For $V=0$, we have $J=\sigma=id_{V^2}=\hat 0_{V^2} = \hat 0_{V} \oplus \hat 0_{V} = id_{V} \oplus id_{V}$.
The $J$ part has been asked about here, at least for $V = \mathbb R^n$ (the asker there, Sandro Vitenti, has also commented on an answer to one of the questions linked above).
For bullets 3.1 and 4.2, perhaps there is no real way to simplify further, so that there is no 'closed-form' or 'constructive' formula, but I kind of have a feeling there is one. At the very least, I'm hoping for a way to describe $J$ by 2 parameters instead of 4 and $\sigma$ by 1 parameter instead of 2.