1

If $S$ is a real skew-symmetric matrix, prove that $\det(I+S) \ge 1+\det(S)$. Equality holds if and only if $n\le 2$, or $n\ge 3$ and $S=O$.

Actually I only have trouble in proving that all the eigenvalues of $S$ are $0$ if and only if $S=O$.

Ross Ren
  • 251

2 Answers

3

I hope this helps:

Since a skew-symmetric matrix is normal, i.e. it commutes with its adjoint, by complexifying the vector space we get a canonical form for such operators. For example, for $f$ such that $f = -f^{*}$ (as for a skew-symmetric matrix), there exists an orthogonal $P$ (so $P^{-1}= P^t$) such that $$A = P^{-1}S P = \begin{pmatrix}\textbf{0} & 0 \\ 0 & \Box\end{pmatrix}$$

where the first $\mathbf 0$ consists of $k$ zeros corresponding to the eigenvectors of $f$ with eigenvalue $0$, and the other part consists of $j$ blocks (the indices are not very important here) $\Box_\mu = \begin{pmatrix}0 & \mu \\ -\mu & 0 \end{pmatrix}$ corresponding to the purely imaginary eigenvalues $\pm i\mu$ of $f$ (here $\mu \in \mathbb{R}$, and this is very important).
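To make the canonical form concrete, here is a small sketch in Python (the values $k=2$ and $\mu = (1.5, 0.25)$ are arbitrary, and the helper name `canonical_form` is my own):

```python
# Build the canonical form A = diag(0, ..., 0, Box_{mu_1}, ..., Box_{mu_j})
# with k zero diagonal entries followed by 2x2 blocks [[0, mu], [-mu, 0]],
# then verify that A is skew-symmetric (A^T = -A).

def canonical_form(k, mus):
    n = k + 2 * len(mus)
    A = [[0.0] * n for _ in range(n)]
    for b, mu in enumerate(mus):
        r = k + 2 * b          # top-left corner of the b-th block
        A[r][r + 1] = mu
        A[r + 1][r] = -mu
    return A

A = canonical_form(2, [1.5, 0.25])
n = len(A)
assert all(A[i][j] == -A[j][i] for i in range(n) for j in range(n))
print("A is skew-symmetric, size", n)
```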

That being said, since $\det(P)^2 = 1$ ($P$ being orthogonal), playing with the Binet formula (multiplicativity of the determinant):

$$\text{det}(Id+S) = 1\cdot \text{det}(Id+S) = \text{det}(P)^2 \text{det}(Id+S) = \text{det}(P^{-1})\text{det}(Id+S)\text{det}(P) = \text{det}(P^{-1}(Id+S)P) = \text{det}(P^{-1}Id\, P + P^{-1}SP) = \text{det}(Id+ A)$$

It should be clear that $Id+A = \begin{pmatrix}Id_k & 0 \\ 0 & \Box\end{pmatrix}$, where the blocks have now turned into $B_\mu = \begin{pmatrix}1 & \mu \\ -\mu &1 \end{pmatrix}$.

And since the matrix is block diagonal, we have $\text{det}(Id+A) = 1^k \prod\limits_{i=1}^j \text{det}(B_{\mu_i})$.

Noting that $\text{det}( B_{\mu_i}) = 1 + \mu_i^2$, we end up with $\text{det}(Id+A) = \prod\limits_{i=1}^j (1+\mu_i^2)$.
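As a quick numerical check of $\text{det}(Id+A) = \prod_i(1+\mu_i^2)$, here is a sketch with a naive Laplace-expansion determinant (fine for small matrices; the sample values $k=1$, $\mu=(2,\,0.5)$ are arbitrary):

```python
from math import isclose

def det(M):
    # Laplace expansion along the first row -- O(n!) but fine for small n.
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for c in range(n):
        minor = [row[:c] + row[c + 1:] for row in M[1:]]
        total += (-1) ** c * M[0][c] * det(minor)
    return total

k, mus = 1, [2.0, 0.5]
n = k + 2 * len(mus)
# I + A: identity plus the block-diagonal canonical form.
IA = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
for b, mu in enumerate(mus):
    r = k + 2 * b
    IA[r][r + 1] = mu
    IA[r + 1][r] = -mu

expected = 1.0
for mu in mus:
    expected *= 1 + mu * mu

assert isclose(det(IA), expected)   # (1 + 4)(1 + 0.25) = 6.25
print(det(IA), expected)
```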

It should be clear now that

$$\prod\limits_{i=1}^j (1+\mu_i^2) \geq 1+ \prod\limits_{i=1}^j \mu_i^2 \geq 1 + 0^k\prod\limits_{i=1}^j \text{det}(\Box_{\mu_i}) = 1 + \text{det}(A) = 1 + \text{det}(P^{-1}SP) = 1 + \text{det}(P)^2\text{det}(S) = 1+\text{det}(S)$$

(Here $0^k$ is read as $1$ when $k=0$; it accounts for the $k$ zero rows of $A$. If $k\ge 1$ then $\text{det}(S)=\text{det}(A)=0$ and the second inequality just says $1+\prod_i\mu_i^2 \ge 1$; if $k=0$ it is an equality, since $\text{det}(\Box_{\mu_i})=\mu_i^2$.)

Edit: If this is right, you can easily find the special cases for $n$ by yourself.

$\textbf{Addendum:}$ Rereading this proof, it seems to me that equality holds if and only if $\prod\limits_{i=1}^j (1+\mu_i^2) = 1+ \prod\limits_{i=1}^j \mu_i^2$. If you are familiar with the first product, you should know (or you can prove) that $\prod\limits_{i=1}^j (1+\mu_i^2) = \sum\limits_{J \subseteq[j]}\prod\limits_{i \in J} \mu_i^2$. From this expansion you see that equality forces every other subset term to vanish; since we are dealing with real numbers, already the singleton terms give $\mu_i^2 = 0$ for every $i$ (for $j \ge 2$; when $j\le 1$ the displayed identity is automatically an equality), i.e. $\mu_i = 0$ for all $i$. But if we remember what the $\mu_i$ were, we get $S = O$, since $O$ is the only matrix similar to the zero matrix.
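The subset expansion $\prod_{i=1}^j(1+x_i)=\sum_{J\subseteq[j]}\prod_{i\in J}x_i$ can be checked numerically, as in this sketch (the values $x_i$, standing in for the $\mu_i^2$, are arbitrary):

```python
from itertools import combinations
from math import isclose, prod

xs = [0.3, 2.0, 1.7]   # stand-ins for the mu_i^2
j = len(xs)

lhs = prod(1 + x for x in xs)
# Sum over all subsets J of {0, ..., j-1} of the product of the chosen x_i.
rhs = sum(prod(xs[i] for i in J)
          for r in range(j + 1)
          for J in combinations(range(j), r))
# The empty subset contributes the leading 1, singletons contribute each x_i;
# so equality with 1 + prod(xs) forces every x_i to be 0 once j >= 2.
assert isclose(lhs, rhs)
print(lhs, rhs)
```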

jacopoburelli
  • 5,564
  • 3
  • 12
  • 32
  • Thanks a lot for your help. Actually your last formulas are enough to cover the proof, and I've already solved that. The problem is: if the equality holds and $n\ge 2$, then all the $\mu_i=0$, which is as far as I have gotten. Could you help me with the last step, that $S=O$? Your assistance will be highly appreciated. – Ross Ren May 20 '21 at 10:08
  • @RossRen I'm sorry, could you rephrase the question? I don't understand what you are trying to prove. – jacopoburelli May 20 '21 at 10:10
  • OK. Using $AX=\lambda X$ and your last formulas, I proved the main proposition. I only have trouble with the special case when equality is achieved. I mean, how do I prove $S=O$ when I only know that all the eigenvalues are zero? – Ross Ren May 20 '21 at 10:16
  • @RossRen Look at the Addendum and tell me what you think – jacopoburelli May 20 '21 at 10:37
  • I've got your point. Thanks a lot. And sorry again for my ambiguous problem. I only knew that using a congruence transformation we can get $\operatorname{diag}(S,\dots,S,0,\dots,0)$, where $S$ denotes $\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$; can you use only this to solve the problem, so that I could finish the proof before learning about orthogonal matrices? – Ross Ren May 20 '21 at 10:54
  • To sum up, we may not use similarity. – Ross Ren May 20 '21 at 11:02
2

Actually I only have trouble in proving that all the eigenvalues of $S$ are $0$ if and only if $S=O$.

Since the OP is stuck on this point and does not want to use orthogonality, here's a different finish.

It is immediate that $S=\mathbf 0\implies $ all eigenvalues of $S$ are 0.

We need to prove the other direction, i.e. that having all eigenvalues (considered in $\mathbb C$) of a real skew-symmetric $S$ equal to $0$ $\implies S=\mathbf 0$.

$\text{trace}\big(S^2\big) = \sum_{k=1}^n \lambda_k^2 = \sum_{k=1}^n 0^2=0$,
but $S^2=-S^TS\implies 0= \text{trace}\big(S^TS\big)=\Big\Vert S\Big \Vert_F^2 \implies S=\mathbf 0$.
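A quick numerical check of this trace identity (a sketch; the $3\times 3$ skew-symmetric test matrix is arbitrary):

```python
# For skew-symmetric S we have S^2 = -S^T S, hence
# trace(S^2) = -trace(S^T S) = -||S||_F^2,
# so trace(S^2) = 0 forces every entry of S to vanish.

S = [[ 0.0,  3.0, -1.0],
     [-3.0,  0.0,  4.0],
     [ 1.0, -4.0,  0.0]]
n = len(S)

trace_S2 = sum(S[i][k] * S[k][i] for i in range(n) for k in range(n))
frob_sq  = sum(S[i][k] ** 2      for i in range(n) for k in range(n))

assert trace_S2 == -frob_sq   # exact here: all products are exact in floats
print(trace_S2, frob_sq)
```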

user8675309
  • 10,034