
I was asking myself the following question: when does an element of $\mathrm{SL}_n(\mathbb{R})$ preserve some scalar product? A simple necessary condition is that its complex eigenvalues all have modulus $1$. I think it must be diagonalizable over $\mathbb{C}$ too.

Can one find a sufficient condition?

Selim Ghazouani

2 Answers

1

The necessary condition you stated is also sufficient. Indeed, suppose $A\in SL_n({\mathbb R})$ is diagonalizable over $\mathbb C$ and all its eigenvalues have modulus $1$.

We need the following:

Kernel invariance property. Let $L/K$ be an extension of fields, and let $B$ be an $n\times n$ matrix with coefficients in $K$. Then ${\sf Ker}_{K^n}(B)$ has a basis, consisting of vectors in $K^n$, that is also a basis of ${\sf Ker}_{L^n}(B)$ (in particular, the two kernels have the same dimension).

This invariance property follows from Gaussian elimination (reduction to row echelon form): the computations are identical over $K$ and over $L$, and since the method only involves field operations (additions, subtractions, multiplications and divisions by pivots), if the coefficients of $B$ lie in $K$, all the computations stay in $K$.
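To see the kernel invariance property concretely, here is a small sketch of my own (assuming Python with sympy; not part of the original argument): the kernel basis that row reduction produces for a rational matrix has rational entries, and the same vectors span the kernel over $\mathbb R$ or $\mathbb C$.

```python
# Sketch of the kernel invariance property with K = Q (hypothetical
# illustration): sympy computes the nullspace by row reduction, so
# the basis it returns stays in Q.
from sympy import Matrix

B = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])    # coefficients in K = Q

basis = B.nullspace()      # row reduction over Q
print(basis)               # [Matrix([[-1], [-1], [1]])] -- rational entries

# The same vectors form a basis of Ker(B) over R or C: B annihilates
# them, and the rank (hence the kernel dimension) is field-independent.
print(all(B * v == Matrix([0, 0, 0]) for v in basis))   # True
```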

Denote the real eigenvalues of $A$ by $\lambda_1,\lambda_2,\ldots,\lambda_r$ and its non-real eigenvalues by $\alpha_1\pm i\beta_1,\alpha_2\pm i\beta_2,\ldots,\alpha_s\pm i\beta_s$ (with $\beta_j\neq 0$), so that $r+2s=n$. Since all eigenvalues have modulus $1$, each $\lambda_j$ is $\pm 1$ and each $\alpha_j^2+\beta_j^2$ equals $1$.

Since $A$ is diagonalizable over $\mathbb C$, there is a basis $(b_1,b_2,\ldots ,b_n)$ of ${\mathbb C}^n$ with $Ab_{2j-1}=(\alpha_j-i\beta_j)b_{2j-1}$ and $Ab_{2j}=(\alpha_j+i\beta_j)b_{2j}$ for $1\leq j\leq s$, and $Ab_k=\lambda_{k-2s}b_k$ for $2s < k \leq n$. For every $j$ we can write $b_{2j}=x_j+iy_j$ where $x_j$ and $y_j$ have real coordinates. As $A$ has real coefficients, we see that for every $j$ we have

$$ Ax_j=\alpha_j x_j-\beta_j y_j, \qquad Ay_j=\beta_j x_j+\alpha_j y_j. \tag{1} $$
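For instance (a small worked check I am adding), if $A=\begin{pmatrix}\alpha&\beta\\-\beta&\alpha\end{pmatrix}$ with $\alpha^2+\beta^2=1$, then $b=(1,i)^T$ is an eigenvector for $\alpha+i\beta$, so $x=(1,0)^T$ and $y=(0,1)^T$, and $(1)$ simply reads off the columns of $A$.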

By the kernel invariance property (applied to $B=A-\lambda I$ for each real eigenvalue $\lambda$), we can assume without loss of generality that the $b_k$ ($2s < k\leq n$) have real coordinates.

Let $\alpha+i\beta$ be a complex eigenvalue of $A$. We shall denote by $K_{\mathbb R}$ (resp. $K_{\mathbb C}$) the kernel of $(A-\alpha I)^2+\beta^2 I$ in ${\mathbb R}^n$ (resp. ${\mathbb C}^n$). By construction, $K_{\mathbb R}$ and $K_{\mathbb C}$ are invariant under $A$. Note that for $x\in K_{\mathbb R}$ we have $A^2x=2\alpha Ax-x$, since $\alpha^2+\beta^2=1$.

Let $W$ be a subspace of $K_{\mathbb R}$ satisfying $W \cap AW=\lbrace 0 \rbrace$, and let $V=W \oplus AW$; then $V$ is also invariant under $A$. Suppose that $V \neq K_{\mathbb R}$; then there is a vector $x$ in $K_{\mathbb R}$ but not in $V$. Let $V'={\sf span}(x)\oplus V$. I claim that $Ax\not\in V'$. Otherwise, we would have some $\gamma\in{\mathbb R}$ such that $t_1=Ax-\gamma x \in V$. Then $t_2=At_1 \in V$, and $t_2=A^2x-\gamma Ax=(2\alpha-\gamma)Ax-x$. Hence $t_3=t_2+(\gamma-2\alpha)t_1 \in V$, and a short computation gives $t_3=-((\gamma-\alpha)^2+\beta^2)x$. That would imply $x\in V$, a contradiction. All this shows that $W'=W\oplus {\sf span}(x)$ still satisfies $W' \cap AW'=\lbrace 0 \rbrace$: indeed, if $w+cx=Aw'+dAx$ with $w,w'\in W$, then $d=0$ because $Ax\not\in V'$, then $c=0$ because $x\not\in V$, and then $w=Aw'\in W\cap AW=\lbrace 0 \rbrace$. So if we take a $W$ of maximal dimension with this property, we must have $K_{\mathbb R}=W \oplus AW$.

Let $(x^1,x^2,\ldots ,x^t)$ be a basis (in ${\mathbb R}^n$) of $W$, and put $y^i=\frac{\alpha x^i-Ax^i}{\beta}$. Since $Ax^i=\alpha x^i-\beta y^i$, the $2t$ vectors $(x^1,\ldots,x^t,y^1,\ldots,y^t)$ span $W+AW=K_{\mathbb R}$, hence form a basis of $K_{\mathbb R}$, and $(x^1\pm i y^1,x^2\pm i y^2, \ldots, x^t\pm i y^t)$ forms a basis of $K_{\mathbb C}$ (by the kernel invariance property, $\dim_{\mathbb C}K_{\mathbb C}=\dim_{\mathbb R}K_{\mathbb R}=2t$). So, in (1) we may assume without loss of generality that the family $(x_j,y_j)_{1\leq j \leq s}$ is linearly independent.

Taken together, the $b_k,x_j,y_j$ form a basis of ${\mathbb R}^n$ (this follows from the fact that the $b_k$ and the $x_j\pm iy_j$ form a basis of ${\mathbb C}^n$). The scalar product for which this basis is orthonormal is then preserved by $A$: in this basis, $A$ acts by $\pm 1$ on each $b_k$ and by the rotation matrix $\begin{pmatrix}\alpha_j&\beta_j\\-\beta_j&\alpha_j\end{pmatrix}$ on each plane ${\sf span}(x_j,y_j)$, and these are orthogonal transformations since $\alpha_j^2+\beta_j^2=1$.
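As a numerical sanity check, here is a sketch of my own (assuming numpy; the matrix `A` below is a hypothetical example, not from the answer): build the basis $b_k,x_j,y_j$ from the complex eigenvectors of a concrete $A$, then verify that the Gram matrix making that basis orthonormal is preserved.

```python
# Numerical sketch (not part of the answer): build the real basis
# b_k, x_j, y_j from the complex eigenvectors of a sample A and
# check that A preserves the scalar product <u, v> = u^T G v,
# where G makes that basis orthonormal.
import numpy as np

rng = np.random.default_rng(0)

# Sample A in SL_3(R): a rotation block plus the eigenvalue 1,
# conjugated by a random invertible matrix, so A is diagonalizable
# over C with all eigenvalues of modulus 1.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
Q = rng.standard_normal((3, 3))
A = Q @ R @ np.linalg.inv(Q)

# For each conjugate pair alpha +- i*beta keep x = Re(b), y = Im(b)
# of one eigenvector b; for a real eigenvalue keep the eigenvector.
w, V = np.linalg.eig(A)
cols, used = [], set()
for j, lam in enumerate(w):
    if j in used:
        continue
    used.add(j)
    if abs(lam.imag) > 1e-10:
        k = next(k for k in range(len(w)) if k not in used
                 and np.isclose(w[k], np.conj(lam)))
        used.add(k)
        cols += [V[:, j].real, V[:, j].imag]
    else:
        cols.append(V[:, j].real)
P = np.column_stack(cols)

# Orthonormality of the columns of P means P^T G P = I, i.e.
# G = (P P^T)^{-1}; invariance A^T G A = G is the claim above.
G = np.linalg.inv(P @ P.T)
print(np.allclose(A.T @ G @ A, G))   # True
```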

Ewan Delanoy
  • Actually I don't really understand how you find the $f_j$ and $g_j$. Precisely, why do their coefficients belong to $\mathbb{R}$, since the diagonalization is over $\mathbb{C}$? – Selim Ghazouani Dec 11 '13 at 14:34
  • @SelimGhazouani Please see my updated version. – Ewan Delanoy Dec 11 '13 at 16:00
  • Ok, I understand how you want to proceed, but now we should check that the $e_i,x_j,y_j$ actually form a basis of $\mathbb{R}^n$. I think the computation you used to prove that $x_j$ and $y_j$ are linearly independent should work, but it doesn't seem obvious. – Selim Ghazouani Dec 11 '13 at 16:32
  • @SelimGhazouani This follows from the fact that the $e_j$ and the $x_j+iy_j$ form a basis (of eigenvectors) over $\mathbb C$. – Ewan Delanoy Dec 11 '13 at 16:49
  • I'm not sure. Here you are using the fact that complex eigenvalues come in conjugate pairs, and for each pair of conjugates you use only one eigenvector and take its real and imaginary parts. Do you see what I mean? – Selim Ghazouani Dec 11 '13 at 17:16
  • @SelimGhazouani Please see my updated version. – Ewan Delanoy Dec 12 '13 at 05:44
0

Suppose $A$ is diagonalisable over $\mathbb C$ and all its eigenvalues have unit moduli. Then $A=PDP^{-1}$ for some invertible matrix $P$ and some unitary diagonal matrix $D$. Let $P=USV^\ast$ be a singular value decomposition and let $G = US^{-2}U^\ast$. Then $G$ is positive definite and $$ A^\ast GA = (US^{-1}V^\ast D^\ast VSU^\ast)(US^{-2}U^\ast)(USV^\ast D VS^{-1}U^\ast) = G. $$ Hence $A$ preserves the inner product induced by $G$. (If a real scalar product is wanted: since $A$ is real, taking real parts in $A^\ast GA=G$ gives $A^T\,\mathrm{Re}(G)\,A=\mathrm{Re}(G)$, and $\mathrm{Re}(G)$ is symmetric positive definite.)
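For what it's worth, a quick numerical check of this construction (my own sketch, assuming numpy; the sample matrix is hypothetical, not from the answer):

```python
# Numerical check of the SVD construction (not part of the answer).
import numpy as np

rng = np.random.default_rng(1)

# Sample real A, diagonalizable over C with unit-modulus eigenvalues.
theta = 1.1
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
Q = rng.standard_normal((3, 3))
A = Q @ R @ np.linalg.inv(Q)

# A = P D P^{-1}: P is the (complex) eigenvector matrix.
_, P = np.linalg.eig(A)

# SVD of P and the positive definite matrix G = U S^{-2} U^*.
U, S, Vh = np.linalg.svd(P)
G = U @ np.diag(S**-2.0) @ U.conj().T

print(np.allclose(A.conj().T @ G @ A, G))   # A* G A = G: True

# A is real, so the real part of G gives a real scalar product:
Gr = G.real
print(np.allclose(A.T @ Gr @ A, Gr))        # True
```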

user1551