
I started studying Daniel Huybrechts's book, Complex Geometry: An Introduction. I tried studying backwards as much as possible, but I have been stuck on the concepts of almost complex structures and complexification. I have studied several books and articles on the matter, including ones by Keith Conrad, Jordan Bell, Gregory W. Moore, Steven Roman, Suetin, Kostrikin and Mainin, and Gauthier.

I have several questions on the concepts of almost complex structures and complexification. Here are some:

Assumptions and notations: Let $V$ be a $\mathbb C$-vector space, and let $V_{\mathbb R}$ be the realification of $V$. For any almost complex structure $I$ on $V_{\mathbb R}$, denote by $(V_{\mathbb R},I)$ the unique $\mathbb C$-vector space whose complex structure is given by $(a+bi) \cdot v := av + bI(v)$. Let $i^{\sharp}$ be the unique almost complex structure on $V_{\mathbb R}$ such that $V=(V_{\mathbb R},i^{\sharp})$, and let $\hat i: V_{\mathbb R}^2 \to V_{\mathbb R}^2$, $\hat i := i^{\sharp} \oplus i^{\sharp}$.

  • Let $W$ be an $\mathbb R$-vector space. Let $W^{\mathbb C}$ denote the complexification of $W$ given by $W^{\mathbb C} := (W^2,J)$, where $J$ is the canonical almost complex structure on $W^2$ given by $J(v,w):=(-w,v)$. Let $\chi: W^2 \to W^2$, $\chi(v,w):=(v,-w)$. (A small numerical sketch of these maps appears after this list.)

  • For any map $f: V_{\mathbb R} \to V_{\mathbb R}$ and for any almost complex structure $I$ on $V_{\mathbb R}$, denote by $f^I$ the unique map $f^I: (V_{\mathbb R}, I) \to (V_{\mathbb R}, I)$ such that $(f^I)_{\mathbb R} = f$. With this notation, the conditions '$f$ is $\mathbb C$-linear with respect to $I$' and '$f$ is $\mathbb C$-anti-linear with respect to $I$' are shortened to, respectively, '$f^I$ is $\mathbb C$-linear' and '$f^I$ is $\mathbb C$-anti-linear'. (see notation and definitions here, in particular the bullet below 'Definition 4')

  • The complexification, under $J$, of any $g \in \operatorname{End}_{\mathbb R}(W)$ is $g^{\mathbb C} := (g \oplus g)^J$, i.e. the unique $\mathbb C$-linear map on $W^{\mathbb C}$ such that $(g^{\mathbb C})_{\mathbb R} = g \oplus g$.

  • Let $H$ be an almost complex structure on $V_{\mathbb R}^2$
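
For concreteness (this is not essential to the questions), here is a minimal numerical sketch of the maps above, assuming $V = \mathbb C^n$ with $n = 2$, so that $V_{\mathbb R} \cong \mathbb R^{2n}$; the coordinate convention below is just one possible choice.

```python
import numpy as np

n = 2                                   # assume V = C^n, so V_R ≅ R^(2n)
dim = 2 * n

# i_sharp: multiplication by i on V, written as a real (2n x 2n) matrix.
# Coordinates on V_R are (x_1, y_1, ..., x_n, y_n) with z_k = x_k + i*y_k.
i_sharp = np.kron(np.eye(n), np.array([[0.0, -1.0], [1.0, 0.0]]))

Z, I = np.zeros((dim, dim)), np.eye(dim)
J = np.block([[Z, -I], [I, Z]])                 # J(v, w) = (-w, v) on V_R^2
chi = np.block([[I, Z], [Z, -I]])               # chi(v, w) = (v, -w)
i_hat = np.block([[i_sharp, Z], [Z, i_sharp]])  # i_hat = i_sharp ⊕ i_sharp

# J and i_hat are almost complex structures; chi is only an involution.
assert np.allclose(J @ J, -np.eye(2 * dim))
assert np.allclose(i_hat @ i_hat, -np.eye(2 * dim))
assert np.allclose(chi @ chi, np.eye(2 * dim))

# (i_hat)^J is C-linear and chi^J is C-anti-linear with respect to J:
assert np.allclose(i_hat @ J, J @ i_hat)        # commutes with J
assert np.allclose(chi @ J, -J @ chi)           # anti-commutes with J

# For any g in End_R(V_R), the complexification g^C = (g ⊕ g)^J is C-linear.
g = np.random.default_rng(0).standard_normal((dim, dim))
g_c = np.block([[g, Z], [Z, g]])
assert np.allclose(g_c @ J, J @ g_c)
```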

Questions:

  1. Are there $\mathbb R$-subspaces $U_1,U_2$ of $V_{\mathbb R}^2$ that satisfy the following conditions?

    • Condition 1.1. $U_1 \cong U_2$
    • Condition 1.2. Internally, $V_{\mathbb R}^2 = U_1 \bigoplus U_2$
    • For Conditions 1.3 and 1.4 below: Let $j=1,2$. Denote the restriction of $H$ to $U_j$ by $H|_{U_j}:U_j \to V_{\mathbb R}^2$.
    • Condition 1.3. $image(H|_{U_j}) \subseteq U_j$, i.e. $H(U_j) \subseteq U_j$
    • For Condition 1.4 below: By Condition 1.3, we can define the corestriction $\tilde{H|_{U_j}}: U_j \to U_j$ of $H|_{U_j}$.
    • Condition 1.4. $\tilde{H|_{U_j}}$ is an almost complex structure on $U_j$.
  2. Whenever subspaces $U_1$ and $U_2$ as above exist, are they necessarily the eigenspaces (for a pair of eigenvalues) of some map that is $\mathbb C$-linear with respect to $H$?

  3. (Additional question based on Observation 10.1 below) Actually, whenever subspaces $U_1$ and $U_2$ that satisfy Conditions 1.1-1.3 exist, do they satisfy Condition 1.4?

Observations for $W=V_{\mathbb R}$ that led to the questions above:

I refer to Suetin, Kostrikin and Mainin (Part I, 12.13) and to Daniel Huybrechts, Complex Geometry: An Introduction (Section 1.2). (A numerical sanity check of several of the observations below appears after this list.)

  1. $\hat i$ is an almost complex structure on $V_{\mathbb R}^2$.

  2. $(\hat i)^J$ is $\mathbb C$-linear.

  3. For $H=J$, we can take $U_1=V^{1,0}=\{(v,-iv)\mid v \in V_{\mathbb R}\}$ and $U_2=V^{0,1}=\{(v,iv)\mid v \in V_{\mathbb R}\}$ (where $iv$ means $i^{\sharp}(v)$), which are the eigenspaces for the eigenvalues $+i$ and $-i$, respectively, of the map $(\hat i)^J$, i.e. of the complexification $(i^{\sharp})^{\mathbb C} = (i^{\sharp} \oplus i^{\sharp})^J$.

  4. By observation 1, we can consider $H=\hat i$.

  5. For $H=\hat i$, we can once again take $U_1=V^{1,0}$ and $U_2=V^{0,1}$, which are the eigenspaces for the eigenvalues $+i$ and $-i$, respectively, of the map $J^{\hat i}$.

  6. Even though $\chi^J$ is $\mathbb C$-anti-linear and $\chi$ is not an almost complex structure, we still have that $\chi^{\hat i}$ is $\mathbb C$-linear.

  7. By observation 6, $\chi^{\hat i}$ has eigenvalues.

  8. For $H=\hat i$, we can instead take $U_1=V_{\mathbb R} \times 0$ and $U_2=0 \times V_{\mathbb R}$, which are the eigenspaces for the eigenvalues $+1$ and $-1$, respectively, of the map $\chi^{\hat i}$.

  9. $\hat i$ restricts to almost complex structures on $V^{1,0}$, $V^{0,1}$, $V_{\mathbb R} \times 0$ and $0 \times V_{\mathbb R}$.

  10. $J$ restricts to almost complex structures on $V^{1,0}$ and $V^{0,1}$, but not on $V_{\mathbb R} \times 0$ or $0 \times V_{\mathbb R}$.

    • 10.1. Actually, $J$ does not even restrict to maps on $V_{\mathbb R} \times 0$ or $0 \times V_{\mathbb R}$.
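
For a quick numerical sanity check of Observations 3, 5, 6-8 and 10.1, one can continue the sketch from the assumptions above (same assumed $V = \mathbb C^n$ and the matrices `i_sharp`, `J`, `chi`, `i_hat` defined there):

```python
# Continuing the sketch above (reusing dim, i_sharp, J, chi, i_hat).
rng = np.random.default_rng(1)
v = rng.standard_normal(dim)
zero = np.zeros(dim)

# Observations 3 and 5: on V^{1,0} = {(v, -i_sharp v)} the maps i_hat and J
# agree, so (i_hat)^J and J^{i_hat} both have eigenvalue +i there; on
# V^{0,1} = {(v, +i_sharp v)} they are negatives of each other (eigenvalue -i).
x10 = np.concatenate([v, -i_sharp @ v])
x01 = np.concatenate([v, i_sharp @ v])
assert np.allclose(i_hat @ x10, J @ x10)
assert np.allclose(i_hat @ x01, -J @ x01)

# Observations 6-8: chi commutes with i_hat (so chi^{i_hat} is C-linear),
# fixes V_R x 0 (eigenvalue +1) and negates 0 x V_R (eigenvalue -1).
assert np.allclose(chi @ i_hat, i_hat @ chi)
assert np.allclose(chi @ np.concatenate([v, zero]), np.concatenate([v, zero]))
assert np.allclose(chi @ np.concatenate([zero, v]), -np.concatenate([zero, v]))

# Observation 10.1: J does not even map V_R x 0 into itself.
assert np.allclose(J @ np.concatenate([v, zero]), np.concatenate([zero, v]))
```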
BCLC
  • What does the word 'almost' do here? – Berci Feb 01 '20 at 09:31
  • The answer to question 3 is yes: $H^2v=-v$ still holds for all vectors. The observations are about two (commuting) complex structures, while your questions only concern $H$ without any connection to $i^{\sharp}$, if I understand correctly. – Berci Feb 01 '20 at 11:56
  • @Berci The purpose of the 'almost': Let $W$ be a real vector space (not odd-dimensional, if it is finite-dimensional at all). Let $W$ have scalar multiplication map $s_W$. The purpose of the 'almost' is to distinguish an automorphism $I: W \to W$ from the scalar multiplication map $s_W^I(a+bi,w) := s_W(a,w)+s_W(b,I(w))$ it induces. The automorphism $I$ is the almost complex structure, while the scalar multiplication map $s_W^I$ is the complex structure. I mentioned this in another question but decided not to mention it here since I was afraid people would not read this if it were too long. Thanks. – BCLC Feb 03 '20 at 03:25
  • @Berci I'll get to your other comment in a while. – BCLC Feb 03 '20 at 03:26
  • @Berci On your other comment, thanks! That was weird. It was obviously true. – BCLC Feb 03 '20 at 11:48
  • Please, try to focus and narrow down your questions in the future. – Aloizio Macedo Feb 14 '20 at 15:38
  • @AloizioMacedo I just figured it's like this other question, also answered by levap? Or not really? – BCLC Nov 23 '20 at 13:38

2 Answers


First, let me say that your choice of notation is quite non-standard and makes it almost impossible to understand what you are asking.

Let me try and rephrase your question (as far as I understand it). I'll assume that the vector spaces involved are finite dimensional. You start with a complex vector space $V$ with $\dim_{\mathbb{C}} V = n$ and choose an arbitrary complex structure $H$ on $W = ((V_{\mathbb{R}})^{\mathbb{C}})_{\mathbb{R}}$. The vector space $(V_{\mathbb{R}})^{\mathbb{C}}$ (together with the standard complex structure which you denote by $J$) is a complex vector space of dimension $$\dim_{\mathbb{C}} V_{\mathbb{R}}^{\mathbb{C}} = \dim_{\mathbb{R}} V_{\mathbb{R}} = 2 \dim_{\mathbb{C}} V = 2n$$ and so $W$ is a real vector space of dimension $4n$. The vector space $(W,H)$ is a complex vector space of dimension $2n$ and so clearly we can find two complex subspaces $U_1,U_2$ of $(W,H)$ of dimension $n$ such that $W = U_1 \oplus U_2$. The fact that they are complex (with respect to $H$) implies that $H(U_i) \subseteq U_i$ and that the restriction of $H$ to $U_i$ is a complex structure. Since they have the same dimension, the subspaces $(U_1,H)$ and $(U_2,H)$ are $\mathbb{C}$-isomorphic and in particular $\mathbb{R}$-isomorphic.

So the answer to your first question is yes. You can definitely describe them (in infinitely many ways) as the eigenspaces of a $\mathbb{C}$-linear map of $(W,H)$ (for example, take the projection onto one of the factors along the other), so the answer to your second question is yes, and also to the third one.

The point is that your question, after you manage to unravel all the details, has nothing to do with complexification. At least in the finite dimensional case, you ask whether a complex vector space of even (complex) dimension can be written as a direct sum of two isomorphic complex subspaces.
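
For concreteness, here is a minimal numerical sketch of this construction, assuming $n = 2$ and taking, as one possible way to produce an 'arbitrary' $H$, the conjugate of the standard complex structure by a random invertible real matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2                                          # dim_C V = n, so dim_R W = 4n
J0 = np.block([[np.zeros((2 * n, 2 * n)), -np.eye(2 * n)],
               [np.eye(2 * n), np.zeros((2 * n, 2 * n))]])

# An arbitrary complex structure H on W = R^(4n): conjugate J0 by a random
# invertible real matrix P (one possible choice, purely for illustration).
P = rng.standard_normal((4 * n, 4 * n))
H = P @ J0 @ np.linalg.inv(P)
assert np.allclose(H @ H, -np.eye(4 * n))

# {P e_k : k < 2n} is an H-complex basis of (W, H); split it in half.
basis = [P[:, k] for k in range(2 * n)]
U1 = np.column_stack([w for b in basis[:n] for w in (b, H @ b)])  # R-basis of U_1
U2 = np.column_stack([w for b in basis[n:] for w in (b, H @ b)])  # R-basis of U_2

# Both U_i are H-invariant, and the projection onto U_1 along U_2 is a
# C-linear (w.r.t. H) idempotent, hence has eigenvalues 1 and 0.
B = np.column_stack([U1, U2])                  # R-basis of W adapted to U_1 ⊕ U_2
proj = B @ np.diag([1.0] * (2 * n) + [0.0] * (2 * n)) @ np.linalg.inv(B)
assert np.allclose(proj @ H, H @ proj)
assert np.allclose(proj @ proj, proj)
```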

levap
  • Re notation and difficulty (or even impossibility) of understanding: I mentioned more of the notation in another question but decided not to mention it here since I was afraid people would not read this if it were too long. If this were a paper or book I were writing, then I would be absolutely clear. I believe I understood the risk I was taking. Thanks. I'll get to your answer in a while. – BCLC Feb 03 '20 at 03:26
  • Update on notation: I no longer mention the difference between almost complex and complex (re Berci's comment), but I added more about my notation $\text{some map}^{\text{some almost complex structure}}$ – BCLC Feb 03 '20 at 12:02
  • Okay, I got around to analysing your answer, and I believe I understand it assuming $W=V_{\mathbb R}^2$ and assuming some of the $U_i$'s are actually meant to be $(U_i)_{\mathbb R}$. You didn't quite answer for case of infinite-dimensional, but I think I know how to do this under axiom of choice assuming I understand correctly what it means for infinite dimensional vector spaces to have 'equal dimension'. I'm going to post an answer, but I'll still accept your answer. Thanks, levap! – BCLC Feb 03 '20 at 12:07
  • Sure. If $V$ is infinite dimensional, then $W$ will also be infinite dimensional and so $(W,H)$ will also be a complex infinite-dimensional vector space, and then using the axiom of choice you can just choose a basis and split it into two subsets having the same cardinality (see https://math.stackexchange.com/questions/2281940/splitting-infinite-sets-into-disjoint-sets-of-equal-cardinality) and use the two parts to define $U_1$ and $U_2$. Then everything above continues to hold. – levap Feb 03 '20 at 12:35
  • Thanks! Same as in here right? I posted an answer just now. – BCLC Feb 03 '20 at 12:51
  • 'First, let me say that your choice of notation is quite non-standard and makes it almost impossible to understand what you are asking.' --> But the notation kinda grows on you right? – BCLC Nov 23 '20 at 13:39

Based on levap's answer, I'm going to answer Question 1 in a way that also covers infinite-dimensional vector spaces, and answer Question 2 by writing out the details of levap's suggestion to use a projection. (The answer to Question 3 was actually pretty obviously yes.)

For Question 2:

Let $j,k \in \{1,2\}$ with $j \ne k$. Let $A = (V_{\mathbb R}^2, H) = A_1 \bigoplus A_2 = \bigoplus_{j=1}^{2} A_j$. Let $\pi_j: A \to A_j$ be the projection given by $\pi_j(w_1 + w_2):=w_j$. (A numerical check of steps 1-4 below appears after this list.)

  1. Prove $\pi_j$ preserves addition: Let $v_1, w_1 \in A_1$ and $v_2, w_2 \in A_2$. Then $\pi_j((v_1+v_2) + (w_1+w_2)) = \pi_j((v_1+w_1)+(v_2+w_2)) = v_j+w_j = \pi_j(v_1+v_2)+\pi_j(w_1+w_2)$.

  2. Prove $\pi_j$ preserves real scalar multiplication: $\pi_j(r(w_1+w_2)) = \pi_j(rw_1+rw_2) = rw_j = r\,\pi_j(w_1+w_2)$.

  3. Prove $\pi_j$ preserves scalar multiplication by $i$, i.e. commutes with $H$: $\pi_j(i(w_1+w_2)) = \pi_j(H(w_1+w_2)) = \pi_j(H(w_1)+H(w_2))$. Now $H(w_j) \in A_j$ whenever $w_j \in A_j$ (and conversely, since $H^{-1}=-H$ also maps $A_j$ into $A_j$), so $\pi_j(H(w_1)+H(w_2)) = H(w_j)$. Finally, $i\,\pi_j(w_1+w_2)=H(\pi_j(w_1+w_2))=H(w_j)$.

  4. Prove $\pi_j$ has exactly 2 eigenvalues $\lambda_j$, where $\lambda_j$ has eigenspace $A_j$: Let $v \in A \setminus \{0_A\}$ with $\pi_j(v)=\lambda v$. Because $\pi_j$ is idempotent, $\lambda v = \pi_j(v) = \pi_j(\pi_j(v)) = \lambda^2 v$, so $\lambda^2 = \lambda$ and hence $\lambda \in \{0,1\}$. The eigenvalue $\lambda_j = 1$ has eigenspace $A_j$ and the eigenvalue $\lambda_k=0$ has eigenspace $A_k$.
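
A quick numerical sanity check of steps 1-4, reusing `n` and the matrices `H`, `U1`, `U2`, `proj` from the sketch under levap's answer (so $\pi_1$ corresponds to `proj` and $\pi_2$ to `id - proj`):

```python
# Reusing n, H, U1, U2, proj (= pi_1) from the sketch under levap's answer.
pi_1, pi_2 = proj, np.eye(4 * n) - proj

# Steps 1-3: the projections are R-linear (they are real matrices) and
# commute with H, i.e. they are C-linear with respect to H.
assert np.allclose(pi_1 @ H, H @ pi_1) and np.allclose(pi_2 @ H, H @ pi_2)

# Step 4: the eigenvalues of pi_1 are exactly 0 and 1; the 1-eigenspace is
# A_1 = span(U1) and the 0-eigenspace is A_2 = span(U2).
eigvals = np.linalg.eigvals(pi_1)
assert np.allclose(sorted(eigvals.real), [0.0] * (2 * n) + [1.0] * (2 * n))
assert np.allclose(pi_1 @ U1, U1)       # pi_1 fixes A_1   (eigenvalue 1)
assert np.allclose(pi_1 @ U2, 0 * U2)   # pi_1 kills A_2   (eigenvalue 0)
```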

For Question 1:

My understanding is that a $K$-vector space $A$ and an $L$-vector space $B$, where $A$ and $B$ may be infinite-dimensional, have the same 'dimension' if there exists a bijection between any $K$-basis of $A$ and any $L$-basis of $B$. (I think $K$ and $L$ can be any fields, with neither required to be a field extension or subfield or embedded subfield of the other, but anyway $\mathbb R$ is an embedded subfield of $\mathbb C$.)

Thus, we extend the statement that $\dim_{\mathbb R} V_{\mathbb R}\ (=n) = \dim_{\mathbb C} (V_{\mathbb R}^2,H)\ (=\dim_{\mathbb C} (V_{\mathbb R}^2,J))$ to the statement that there exists a bijection between any $\mathbb R$-basis of $V_{\mathbb R}$ and any $\mathbb C$-basis of $(V_{\mathbb R}^2,H)$. I guess we don't need the axiom of choice for that statement itself (it could even hold vacuously), but we will use the axiom of choice as follows, as done here:

  1. By the axiom of choice, let $V_{\mathbb R}$ have a basis $\{e_a\}_{a \in I}$.

  2. Once again by the axiom of choice, there exists a decomposition $I = I_1 \cup I_2$ with $I_1 \cap I_2 = \emptyset$ and a bijection $\varphi: I_1 \to I_2$.

  3. By (2), $V_{\mathbb R} = S_1 \bigoplus S_2$ with $S_j = \mathbb R\text{-span}(\{e_a\}_{a \in I_j})$. Actually, $S_j$ has $\mathbb R$-basis $\{e_a\}_{a \in I_j}$.

  4. By the axiom of choice (for the third and, I think, last time), let $(V_{\mathbb R}^2,H)$ have a basis $\{f_m\}_{m \in M}$.

  5. By (1) and (4), the equal-dimension statement is equivalent to the existence of a bijection $\gamma: \{e_a\}_{a \in I} \to \{f_m\}_{m \in M}$, and hence of a bijection $\eta: I \to M$. Then we may write $\gamma: \{e_a\}_{a \in I} \to \{f_{\eta(a)}\}_{a \in I}$.

  6. Check that $A_j$, defined as the subset of $(V_{\mathbb R}^2,H)$ that is $\mathbb C$-spanned by $\{f_{\eta(a)}\}_{a \in I_j}$, is a $\mathbb C$-subspace (with respect to $H$, of course) with $\mathbb C$-basis $\{f_{\eta(a)}\}_{a \in I_j}$.

  7. By (3), (5) and (6), we can decompose $(V_{\mathbb R}^2,H)$ as a literal internal direct sum of $\mathbb C$-subspaces (with respect to $H$, of course): $(V_{\mathbb R}^2,H) = A_1 \bigoplus A_2$ with $A_j$ having $\mathbb C$-basis $\{f_{\eta(a)}\}_{a \in I_j}$.

  8. Finally, choose $U_j = (A_j)_{\mathbb R}$. We can do this by (7) and by the fact that '$H(U_j) \subseteq U_j$' is equivalent to '$A_j$ is a $\mathbb C$-subspace (with respect to $H$, of course) of $(V_{\mathbb R}^2,H)$'.

BCLC