5

Let $$A = \left(\begin{array}{ccc} 1&2&3\\2&3&4\\3&4&5 \end{array}\right).$$

I need to find an invertible matrix $P$ such that $P^tAP$ is a diagonal matrix whose main diagonal may contain only entries from the set $\{ 1,-1,0 \}$.

I'd be glad if you could explain to me how to solve this. I haven't found the right theorem/algorithm.

Thanks.

Jack D'Aurizio
  • 353,855
jmiller
  • 637

3 Answers

10

This trick is due to Hermite. It is especially useful when you have a symmetric matrix of integers. First, we write a certain function in three variables, $$ f(x,y,z) = x^2 + 3 y^2 + 5 z^2 + 8 yz+ 6 zx +4xy, $$ because this is exactly the result of calculating $v^t A v,$ with $$ v = \left( \begin{array}{c} x \\ y \\ z \end{array} \right) $$

In order to clear the terms with $x,$ we write $$ (x + 2 y + 3 z)^2 = x^2 + 4 y^2 + 9 z^2 + 12 yz+ 6 zx +4xy. $$ So far, $$ f(x,y,z) - (x + 2 y + 3 z)^2 = -y^2 - 4 z^2 - 4 y z. $$ Next we clear all $y,$ $$ (y + 2 z)^2 = y^2 + 4 z^2 + 4 y z, $$ and $$ f(x,y,z) - (x + 2 y + 3 z)^2 + (y + 2 z)^2 = 0, $$ $$ \color{red}{ f(x,y,z) = (x + 2 y + 3 z)^2 - (y + 2 z)^2 }. $$
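As a quick sanity check of this identity, here is a minimal sympy sketch (the variable names are just illustrative) that expands both sides and compares them:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
A = sp.Matrix([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
v = sp.Matrix([x, y, z])

f = sp.expand((v.T * A * v)[0])                      # x^2 + 3y^2 + 5z^2 + 4xy + 6zx + 8yz
g = sp.expand((x + 2*y + 3*z)**2 - (y + 2*z)**2)     # the completed-square form

print(sp.simplify(f - g) == 0)                       # True: the two quadratic forms agree
```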

The matrix multiplication that this shows is $$ \left( \begin{array}{ccc} 1 & 0 & 0 \\ 2 & 1 & 0 \\ 3 & 2 & 0 \end{array} \right) \left( \begin{array}{ccc} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 0 \end{array} \right) \left( \begin{array}{ccc} 1 & 2 & 3 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \end{array} \right) = \left( \begin{array}{ccc} 1 & 2 & 3 \\ 2 & 3 & 4 \\ 3 & 4 & 5 \end{array} \right) $$ That is actually the right way to do it.

However, I see that someone asked for invertible $P,$ despite non-full rank. Also can be done, and easily: $$ \left( \begin{array}{ccc} 1 & 0 & 0 \\ 2 & 1 & 0 \\ 3 & 2 & 1 \end{array} \right) \left( \begin{array}{ccc} 1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 0 \end{array} \right) \left( \begin{array}{ccc} 1 & 2 & 3 \\ 0 & 1 & 2 \\ 0 & 0 & 1 \end{array} \right) = \left( \begin{array}{ccc} 1 & 2 & 3 \\ 2 & 3 & 4 \\ 3 & 4 & 5 \end{array} \right) $$ The effect of this is to add $0$ to the function in the shape of $0z^2.$
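A short numerical check of this factorization, as a numpy sketch (the names `E` and `D` are just my labels for the displayed factors):

```python
import numpy as np

A = np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
E = np.array([[1, 2, 3], [0, 1, 2], [0, 0, 1]])   # invertible upper-triangular factor
D = np.diag([1, -1, 0])                           # diagonal with entries from {1, -1, 0}

print(np.array_equal(E.T @ D @ E, A))             # True: E^t D E = A
```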

ADDED: Looking at the question again, we do need the extra $1$ in the lower right, because the matrix $P$ requested is actually the inverse of the one I display above. Life goes on.
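So, concretely, one can take $P = E^{-1}$ for the invertible upper-triangular factor $E$ displayed above; a small numpy sketch of that check (again just an illustration):

```python
import numpy as np

A = np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
E = np.array([[1, 2, 3], [0, 1, 2], [0, 0, 1]])   # satisfies E^t D E = A with D = diag(1, -1, 0)

P = np.linalg.inv(E)               # P = E^{-1}, again upper triangular with unit diagonal
print(np.round(P.T @ A @ P, 10))   # diag(1, -1, 0): P^t A P is the requested diagonal matrix
```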

Will Jagy
  • 139,541
  • 1
    That's really nice, I've never heard about that before. Still, for the sake of a more general result, one could at least mention the spectral theorem. A bigger picture certainly doesn't harm... – user190080 Jul 23 '15 at 19:37
2

I never remember exactly how this goes, so let's think about it.

The characteristic polynomial of $A$ is $\lambda(\lambda^2-9\lambda-6)$, so $A$ has one positive eigenvalue $\lambda_1=\frac{9+\sqrt{105}}{2}$, one negative eigenvalue $\lambda_2=\frac{9-\sqrt{105}}{2}$, and the eigenvalue $0$. Hence there exist vectors $v_1,v_2,v_3$ such that $$ Av_1=\lambda_1v_1,\ \ Av_2=\lambda_2v_2,\ \ Av_3=0. $$ Also, the fact that $A$ is symmetric guarantees that $$ v_1^tv_2=v_1^tv_3=v_2^tv_3=0. $$ If we take $P=\begin{bmatrix}v_1&v_2&v_3\end{bmatrix}$, then $$ AP=\begin{bmatrix}Av_1&Av_2&Av_3\end{bmatrix}=\begin{bmatrix}\lambda_1v_1&\lambda_2v_2&0\end{bmatrix} $$ and $$ P^tAP=\begin{bmatrix}v_1^t\\ v_2^t\\ v_3^t\end{bmatrix}\,\begin{bmatrix}\lambda_1v_1&\lambda_2v_2&0\end{bmatrix} =\begin{bmatrix} \lambda_1v_1^tv_1&\lambda_2v_1^tv_2&0\\ \lambda_1v_2^tv_1&\lambda_2v_2^tv_2&0\\ \lambda_1v_3^tv_1&\lambda_2v_3^tv_2&0 \end{bmatrix} =\begin{bmatrix} \lambda_1v_1^tv_1&0&0\\ 0&\lambda_2v_2^tv_2&0\\ 0&0&0 \end{bmatrix}. $$ So if we scale the eigenvectors so that $v_1^tv_1=1/\lambda_1$ and $v_2^tv_2=1/|\lambda_2|$ (i.e. divide unit eigenvectors by $\sqrt{|\lambda_i|}$), then $P=\begin{bmatrix}v_1&v_2&v_3\end{bmatrix}$ is the matrix you are looking for.
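For completeness, a small numpy sketch of this construction (my own illustration; numerically the zero eigenvalue only comes out as approximately zero):

```python
import numpy as np

A = np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5]], dtype=float)

# Symmetric eigendecomposition: eigenvalues in ascending order, orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(A)

# Rescale each eigenvector with nonzero eigenvalue by 1/sqrt(|lambda|),
# so the corresponding diagonal entry of P^t A P becomes +1 or -1.
cols = []
for lam, v in zip(eigvals, eigvecs.T):
    if abs(lam) > 1e-9:
        cols.append(v / np.sqrt(abs(lam)))
    else:
        cols.append(v)              # zero eigenvalue: any nonzero kernel vector works

P = np.column_stack(cols)
print(np.round(P.T @ A @ P, 6))     # diagonal with entries from {1, -1, 0} (in eigh's ascending order)
```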

Martin Argerami
  • 205,756
-1

Hint

Please follow the procedure outlined, with an example, here: http://www.sosmath.com/matrix/diagonal/diagonal.html

Update your question with the results of your work and post a comment to this response. I will be happy to check your work for you.

gt6989b
  • 54,422