
Let $A \in \mathbb{R}^{n \times n}$ be a positive definite matrix and consider its skew-symmetric part $$S=\frac{A-A^T}{2}.$$ What results can you use to localise the eigenvalues of $S$ in the complex plane?


I know that $S$ is skew-symmetric, hence it has only purely imaginary eigenvalues. But in this way I'm not using the positive definiteness of $A$. I was thinking about the Rayleigh quotient: let $u$ be a unit eigenvector associated to $\lambda \in \sigma(S)$.

$$\lambda = u^T S u = \frac{1}{2}\left(u^T A u - u^T A^T u\right),$$

but this is $0$, because the scalar $u^T A u$ equals its own transpose $u^T A^T u$.
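For what it's worth, here is a quick numerical check (a numpy sketch; the matrix below is just a small example I made up) showing that the real quadratic form of $S$ vanishes identically even though $S$ has nonzero purely imaginary eigenvalues:

```python
import numpy as np

# A small made-up example: A is positive definite in the sense x^T A x > 0
# (its symmetric part is 2I), but it is not symmetric.
A = np.array([[2.0, 1.0],
              [-1.0, 2.0]])
S = (A - A.T) / 2                 # skew-symmetric part: [[0, 1], [-1, 0]]

# The real quadratic form of S vanishes for every real vector x ...
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
print(x @ S @ x)                  # ~0, up to rounding error

# ... yet S has nonzero, purely imaginary eigenvalues ±i.
print(np.linalg.eigvals(S))
```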

How should I proceed?

andereBen
  • Hint: https://math.stackexchange.com/a/1964244/339790 – Rodrigo de Azevedo Jul 26 '20 at 15:25
  • If positive definiteness means only $x^TAx\geq 0$ (...), then one way is to use the fact that $iS$ is Hermitian. – user10354138 Jul 26 '20 at 15:25
  • What exactly do you mean by "localize the eigenvalues"? – Ben Grossmann Jul 26 '20 at 15:27
  • @BenGrossmann I mean: "tell where they are located in the complex plane". It is taken from an exam text – andereBen Jul 26 '20 at 15:28
  • @user10354138 Yes, but I can't find a way to infer something about the imaginary part of the eigenvalues. – andereBen Jul 26 '20 at 15:29
  • @RodrigodeAzevedo the matrix $A$ need not be symmetric – andereBen Jul 26 '20 at 15:30
  • @andereBen It would be helpful if you could include the full exam question, so that we could have some context. "What results can we use" depends on what we know about $A$. As Rodrigo indicates, if all we know is that $A$ is positive definite then we can't say anything at all. – Ben Grossmann Jul 26 '20 at 15:30
  • No, you infer the real part of eigenvalues of $iS$, which therefore tells you about the imaginary part of eigenvalues of $S$. – user10354138 Jul 26 '20 at 15:30
  • @BenGrossmann the text is exactly as quoted, so $A$ is not assumed to be symmetric. – andereBen Jul 26 '20 at 15:31
  • @andereBen Technically, it is the quadratic form that is positive definite, not the matrix, and the skew-symmetric part of the matrix contributes zero to the quadratic form. – Rodrigo de Azevedo Jul 26 '20 at 15:32
  • @andereBen I understand that the matrix might not be symmetric. I don't understand what leads you to believe otherwise – Ben Grossmann Jul 26 '20 at 15:33
  • @user10354138 could you expand your hint? I can't figure out how to use that $A$ has all positive eigenvalues – andereBen Jul 26 '20 at 15:33
  • @BenGrossmann I'm sorry, I misread your comment. I think that the way is the one user10354138 pointed out, but I can't go further – andereBen Jul 26 '20 at 15:35
  • @andereBen As I said, it would be helpful if you could include the full exam question, i.e. the entirety of the exam question. As the question stands, it is not clear what you want or why you want it. For example, you indicate in the comments that we are supposed to somehow use the fact that $A$ is positive definite, but there is nothing from the question itself that implies that we can (or should) make use of this fact. – Ben Grossmann Jul 26 '20 at 15:38
  • @BenGrossmann This is the full exam question. I just thought I should use it because it was the only thing I know about $A$. – andereBen Jul 26 '20 at 15:41
  • @andereBen Sorry for my insistence, then. It is strange to see such an open-ended question on a math (or applied math, or engineering) exam. – Ben Grossmann Jul 26 '20 at 15:42
  • @BenGrossmann No worries, I'm as puzzled as you, honestly :-) – andereBen Jul 26 '20 at 15:45

1 Answer


The fact that $A$ is positive definite gives us no information about $S$. In particular: if $B$ were an arbitrary matrix, we could select a $k>0$ sufficiently large to ensure that $A = B + kI$ is positive definite, and the skew-symmetric parts $\frac 12(A - A^T)$ and $\frac 12 (B - B^T)$ would be the same.
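A quick numerical illustration of this point (a sketch, with a made-up matrix $B$ and shift $k$):

```python
import numpy as np

# B is an arbitrary (made-up) matrix; A = B + k*I is positive definite for
# k large enough, yet A and B have exactly the same skew-symmetric part.
rng = np.random.default_rng(1)
n = 5
B = rng.standard_normal((n, n))

# Pick k so that the symmetric part of A = B + k*I has smallest eigenvalue 1.
k = 1 - np.linalg.eigvalsh((B + B.T) / 2).min()
A = B + k * np.eye(n)

print(np.linalg.eigvalsh((A + A.T) / 2).min())     # > 0: A is positive definite
print(np.allclose((A - A.T) / 2, (B - B.T) / 2))   # True: same skew-symmetric part
```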

So with that said, the question reduces to finding the eigenvalues of a skew-symmetric matrix. I will stick to the case where $n = 2m$ is even for ease of discussion; the situation is essentially the same for the odd case, but $S$ is guaranteed to have at least one "extra" $0$ eigenvalue.

As suggested in the comments, we can use the fact that $S$ is skew-symmetric, which means that $iS$ is Hermitian. Because the eigenvalues of $S$ have the form $\pm \lambda_1 i, \pm \lambda_2 i, \dots , \pm \lambda_m i$ with $\lambda_1 \geq \cdots \geq \lambda_m \geq 0$, the Hermitian matrix $iS$ has eigenvalues $\pm \lambda_1,\dots,\pm \lambda_m$. Thus, we can use the Rayleigh quotient to find these eigenvalues. In particular, we have $$ \lambda_1 = \max_{x \in \Bbb C^n, \|x\| = 1} x^* (iS)x, $$ where $x^*$ denotes the conjugate-transpose of $x$. Once we have found the vector $x_1$ for which $Sx_1 = \lambda_1 i x_1$, it immediately follows that $$ S \bar x_1 = \overline{S x_1} = \overline{\lambda_1 i x_1} = -\lambda_1 i \bar x_1, $$ so we get two eigenvalues for the price of one. We can proceed iteratively in several ways from there. For instance, we have $$ \lambda_2 = \max_{x \in \Bbb C^n, \|x\| = 1,\ x \perp x_1,\bar x_1} x^*(iS)x, $$ and so forth.

Equivalently, we can work with real numbers: we have $$ \lambda_1 = \max_{x \in \Bbb R^{2n}, \|x\| = 1}x^T \pmatrix{0 & -S\\ S & 0} x. $$

There are other methods as well. For instance, we could use $S^2$ instead of $iS$: the matrix $S^2$ is symmetric and negative semidefinite, and its eigenvalues have the form $-\lambda_1^2,\dots,-\lambda_m^2$, each with multiplicity two. Also, instead of using the Rayleigh quotient, we could apply other algorithms such as the Jacobi algorithm or the QR algorithm.
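If it helps, here is a small numpy sketch (not part of the argument above; the matrix $A$ is just a random example) comparing these routes: the Hermitian matrix $iS$, the real symmetric block matrix, and $S^2$.

```python
import numpy as np

# A random example with n = 2m = 4; only the skew-symmetric part S matters here.
rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
S = (A - A.T) / 2

# Route 1: iS is Hermitian with real eigenvalues ±λ_1, ..., ±λ_m.
print(np.sort(np.linalg.eigvalsh(1j * S)))

# Route 2: the real symmetric block matrix [[0, -S], [S, 0]] has the same
# ±λ_j values (each appearing twice).
M = np.block([[np.zeros((n, n)), -S],
              [S, np.zeros((n, n))]])
print(np.sort(np.linalg.eigvalsh(M)))

# Route 3: S^2 is symmetric negative semidefinite with eigenvalues -λ_j^2,
# each of multiplicity two.
print(np.sort(np.linalg.eigvalsh(S @ S)))
```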

Ben Grossmann