
In some books/notes, the eigendecomposition of a positive definite matrix $\bf A$ is written as

\begin{align*} {\bf A} = {\bf P}^{\bf T}{\bf \Lambda}{\bf P} \Longrightarrow {\bf \Lambda} = {\bf P}{\bf A}{\bf P}^{\bf T} \end{align*} where $\bf \Lambda$ is a diagonal matrix whose diagonal elements are the eigenvalues of ${\bf A}$, while in some other books/notes, the reversed:

\begin{align*} {\bf A} = {\bf P}{\bf \Lambda}{\bf P}^{\bf T} \Longrightarrow {\bf \Lambda} = {\bf P}^{\bf T}{\bf A}{\bf P} \end{align*} is used. I tried both formulas using a symmetric matrix, for example,

\begin{align*} {\bf A} = \left[\begin{array}{cc} 3 & 2 \\ 2 & 0 \end{array}\right] \end{align*} with eigenvalues $\lambda_{1,2}=-1,4$ (which means it is not positive definite) and eigenvectors $(1,-2)^{\bf T}$ and $(1,1/2)^{\bf T}$, respectively, and found out the formulas are not equivalent since (using normalised eigenvectors)

\begin{align*} {\bf P}{\bf A}{\bf P}^{\bf T} &= \frac{1}{\sqrt{5}}\left[\begin{array}{cc} 1 & 2 \\ -2 & 1 \end{array}\right] \left[\begin{array}{cc} 3 & 2 \\ 2 & 0 \end{array}\right] \frac{1}{\sqrt{5}} \left[\begin{array}{cc} 1 & -2 \\ 2 & 1 \end{array}\right] = \left[\begin{array}{cc} 11/5 & -12/5 \\ -12/5 & 4/5 \end{array}\right] \\ {\bf P}^{\bf T}{\bf A}{\bf P} &= \frac{1}{\sqrt{5}} \left[\begin{array}{cc} 1 & -2 \\ 2 & 1 \end{array}\right] \left[\begin{array}{cc} 3 & 2 \\ 2 & 0 \end{array}\right] \frac{1}{\sqrt{5}}\left[\begin{array}{cc} 1 & 2 \\ -2 & 1 \end{array}\right] = \left[\begin{array}{cc} -1 & 0 \\ 0 & 4 \end{array}\right] \end{align*}
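(The two products above can be reproduced numerically; this is a minimal numpy check, with the eigenvectors of the example placed as the columns of $P$.)

```python
import numpy as np

# The symmetric (but not positive definite) matrix from the example above.
A = np.array([[3.0, 2.0],
              [2.0, 0.0]])

# Normalised eigenvectors of A as the COLUMNS of P:
# (1, -2)/sqrt(5) for lambda = -1 and (2, 1)/sqrt(5) for lambda = 4.
P = np.array([[ 1.0, 2.0],
              [-2.0, 1.0]]) / np.sqrt(5)

# With eigenvectors as columns, P^T A P is diagonal ...
print(P.T @ A @ P)   # ~ diag(-1, 4)

# ... while P A P^T is in general not diagonal.
print(P @ A @ P.T)   # [[11/5, -12/5], [-12/5, 4/5]]
```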

When I took a positive definite matrix, \begin{align*} {\bf A} = \left[\begin{array}{cc} 3 & 1 \\ 1 & 3 \end{array}\right] \end{align*} with eigenvalues $\lambda_{1,2}=2,4$ and eigenvectors $(1,-1)^{\bf T}$ and $(1,1)^{\bf T}$, respectively, the formulas are also not equivalent since (again using normalised eigenvectors)

\begin{align*} {\bf P}{\bf A}{\bf P}^{\bf T} &= \frac{1}{\sqrt{2}}\left[\begin{array}{cc} 1 & 1 \\ -1 & 1 \end{array}\right] \left[\begin{array}{cc} 3 & 1 \\ 1 & 3 \end{array}\right] \frac{1}{\sqrt{2}} \left[\begin{array}{cc} 1 & -1 \\ 1 & 1 \end{array}\right] = \left[\begin{array}{cc} 4 & 0 \\ 0 & 2 \end{array}\right] \\ {\bf P}^{\bf T}{\bf A}{\bf P} &= \frac{1}{\sqrt{2}} \left[\begin{array}{cc} 1 & -1 \\ 1 & 1 \end{array}\right] \left[\begin{array}{cc} 3 & 1 \\ 1 & 3 \end{array}\right] \frac{1}{\sqrt{2}}\left[\begin{array}{cc} 1 & 1 \\ -1 & 1 \end{array}\right] = \left[\begin{array}{cc} 2 & 0 \\ 0 & 4 \end{array}\right] \end{align*}
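(Checking this example numerically as well, again with the eigenvectors as the columns of $P$; note that for this particular $A$ the rows of $P$ also happen to be eigenvectors, which is why both products come out diagonal here, just with the eigenvalues in opposite order.)

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Eigenvectors as columns: (1,-1)/sqrt(2) for lambda = 2, (1,1)/sqrt(2) for lambda = 4.
P = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]]) / np.sqrt(2)

print(P.T @ A @ P)  # diag(2, 4): order matches the column order of P
print(P @ A @ P.T)  # diag(4, 2): the rows of P are eigenvectors too for this A,
                    # so this is also diagonal, but in the opposite order
```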

When I took another positive definite matrix, \begin{align*} {\bf A} = \left[\begin{array}{ccc} 2 & -1 & 0 \\ -1 & 2 & 0 \\ 0 & 0 & 2 \\ \end{array}\right] \end{align*} with eigenvalues $\lambda_{1,2,3}=1,2,3$ and eigenvectors $(1,1,0)^{\bf T}$, $(0,0,1)^{\bf T}$ and $(1,-1,0)^{\bf T}$, respectively, the outputs again differ, since

\begin{align*} {\bf P}{\bf A}{\bf P}^{\bf T} &= \left[\begin{array}{ccc} 1/\sqrt{2} & 0 & 1/\sqrt{2} \\ 1/\sqrt{2} & 0 & -1/\sqrt{2} \\ 0 & 1 & 0 \\ \end{array}\right] \left[\begin{array}{ccc} 2 & -1 & 0 \\ -1 & 2 & 0 \\ 0 & 0 & 2 \\ \end{array}\right] \left[\begin{array}{ccc} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 0 & 0 & 1 \\ 1/\sqrt{2} & -1/\sqrt{2} & 0 \\ \end{array}\right] = \left[\begin{array}{ccc} 2 & 0 & -1/\sqrt{2} \\ 0 & 2 & -1/\sqrt{2} \\ -1/\sqrt{2} & -1/\sqrt{2} & 2 \\ \end{array}\right] \end{align*}

\begin{align*} {\bf P}^{\bf T}{\bf A}{\bf P} &= \left[\begin{array}{ccc} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 0 & 0 & 1 \\ 1/\sqrt{2} & -1/\sqrt{2} & 0 \\ \end{array}\right] \left[\begin{array}{ccc} 2 & -1 & 0 \\ -1 & 2 & 0 \\ 0 & 0 & 2 \\ \end{array}\right] \left[\begin{array}{ccc} 1/\sqrt{2} & 0 & 1/\sqrt{2} \\ 1/\sqrt{2} & 0 & -1/\sqrt{2} \\ 0 & 1 & 0 \\ \end{array}\right] = \left[\begin{array}{ccc} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \\ \end{array}\right] \end{align*}
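(For what it's worth, numpy's `eigh` follows the second convention: it returns the eigenvectors as the columns of $P$, so that ${\bf \Lambda} = {\bf P}^{\bf T}{\bf A}{\bf P}$ holds directly for this example.)

```python
import numpy as np

A = np.array([[ 2.0, -1.0, 0.0],
              [-1.0,  2.0, 0.0],
              [ 0.0,  0.0, 2.0]])

# np.linalg.eigh returns eigenvalues in ascending order and the
# corresponding eigenvectors as the COLUMNS of P.
eigvals, P = np.linalg.eigh(A)
print(eigvals)       # [1. 2. 3.]
print(P.T @ A @ P)   # ~ diag(1, 2, 3)
print(P @ P.T)       # ~ identity, since P is orthogonal
```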

So it seems I can conclude that ${\bf A} = {\bf P}{\bf \Lambda}{\bf P}^{\bf T} \Longrightarrow {\bf \Lambda} = {\bf P}^{\bf T}{\bf A}{\bf P}$ is correct, and that it is not equivalent to ${\bf A} = {\bf P}^{\bf T}{\bf \Lambda}{\bf P} \Longrightarrow {\bf \Lambda} = {\bf P}{\bf A}{\bf P}^{\bf T}$. Am I missing something that has caused me to get different answers for the two formulas?

mohd
  • The formulas are effectively not equivalent. The other formula is correct if, by convention, we assume that the eigenvectors are the rows of $P$ and not the columns. Personally, I only knew and used the $A = P \Lambda P^T$ formula, assuming that the eigenvectors are the columns of $P$. – Damien May 27 '20 at 15:19
  • And likewise, I have only ever known the second formulation! IMO the second formulation should be used, as it is then compatible with the usual presentation of the Schur decomposition $A = UTU^*$. For normal (including symmetric) matrices, this reduces to the eigendecomposition of $A$ in the second formulation. – whpowell96 May 27 '20 at 15:23
  • I hope you realize that the matrix $P$ in $PAP^T$ is not the same matrix as the $P$ in $P^TAP$. As other comments have noted, in one the eigenvectors are rows, in the other they're columns. Use a different name for one of them if that's confusing you. – amd May 27 '20 at 20:05
  • Thanks very much. It's silly that I didn't realise that the eigenvectors can be written as rows or as columns. Thanks very much! – mohd May 28 '20 at 01:18
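(To make the comments' point concrete: the two conventions describe the same decomposition, related by transposing $P$. A minimal sketch, reusing the $2\times 2$ positive definite example from the question, with $Q = P^T$ playing the role of the "eigenvectors as rows" matrix.)

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Convention 1: eigenvectors as COLUMNS of P, so A = P Lambda P^T.
P = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]]) / np.sqrt(2)

# Convention 2: eigenvectors as ROWS, i.e. Q = P^T, so A = Q^T Lambda Q.
Q = P.T

Lam = np.diag([2.0, 4.0])
print(np.allclose(A, P @ Lam @ P.T))   # True
print(np.allclose(A, Q.T @ Lam @ Q))   # True: the very same decomposition
print(np.allclose(Lam, Q @ A @ Q.T))   # True: matches the first formula
```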

0 Answers