
I want to prove the following: every $n \times n$ symmetric matrix whose off-diagonal entries all equal $1/(n-1)$, where $n$ is the size of the matrix, and whose diagonal entries are $0$, has a characteristic polynomial with a root at $x=1$. In other words, every such matrix has an eigenvalue of $1$.

For example, Matrix 1:

$$\begin{pmatrix} 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & 0 & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} & 0 \end{pmatrix}$$

has characteristic polynomial $f(x)=-x^3+\frac{3x}{4}+\frac{1}{4}$, which has a root at $x=1$.

Matrix 2:

$$\begin{pmatrix} 0 & \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & 0 & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & 0 & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} & 0 \end{pmatrix}$$

has characteristic polynomial $f(x)=x^4-\frac{2x^2}{3}-\frac{8x}{27}-\frac{1}{27}$, which also has a root at $x=1$.

Matrix 3:

$$\begin{pmatrix} 0 & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{4} & \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & 0 & \frac{1}{4} \\ \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & 0 \end{pmatrix}$$

has characteristic polynomial $f(x)=\frac{-1024x^5+640x^3+320x^2+60x+4}{1024}$, which also has a root at $x=1$.

I want to show that this is true for any such $n \times n$ matrix, i.e. for all $n$.

Looking for some tips and tricks on how to approach this.
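
As a numerical sanity check (not a proof), the pattern is easy to verify with numpy; `build_matrix` is just a helper name I use for this sketch:

```python
import numpy as np

def build_matrix(n):
    """Return the n x n matrix with 0 on the diagonal and 1/(n-1) everywhere else."""
    A = np.full((n, n), 1.0 / (n - 1))
    np.fill_diagonal(A, 0.0)
    return A

for n in (3, 4, 5, 10):
    eigenvalues = np.linalg.eigvalsh(build_matrix(n))  # eigvalsh: the matrix is symmetric
    print(n, np.round(eigenvalues, 6))
    # 1 appears among the eigenvalues for every n tested
    assert np.any(np.isclose(eigenvalues, 1.0))
```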

holistic
  • 1,039
  • For example, the matrix with halves is actually $1/2 J - 1/2 I$, where $J$ is the matrix with all entries $1$, and $I$ is the $3 \times 3$ identity matrix. Once you realize what the eigenvalues of $J$ are (it's obvious because it's a rank one matrix) then you can use that post to get your conclusion, which is true. – Sarvesh Ravichandran Iyer Mar 20 '22 at 10:33

3 Answers

5

The vector with all entries equal to $1$ is an eigenvector with eigenvalue $1$.

  • Thanks. I can't completely follow your reasoning here. How do I show that this is true for every n by n matrix? – holistic Mar 16 '22 at 14:03
  • 1
    Because when you multiply your $n \times n$ matrix by the vector with all entries equal to $1$, each component of the resulting vector is $\frac{1}{n-1} + \dots + \frac{1}{n-1}$, which has $n-1$ terms, and hence equals $1$. – Diego Artacho Mar 16 '22 at 14:05
  • Ah I understand now, thank you! – holistic Mar 16 '22 at 14:16
2

Let $A_n$ be the $n\times n$ matrix such that the off-diagonal entries are $1/(n-1)$ and the diagonal entries are $0$. Then $A_n$ can be written as

$$A_n=\dfrac{1}{n-1}(\mathbf{1}_n\mathbf{1}_n^T-I_{n})$$

where $\mathbf{1}_n$ is the vector of ones of dimension $n$; indeed, $\mathbf{1}_n\mathbf{1}_n^T$ is the all-ones matrix, so subtracting $I_n$ zeroes the diagonal and leaves $1$ in every off-diagonal entry. This exhibits $A_n$ as a rank-one perturbation (by $\frac{1}{n-1}\mathbf{1}_n\mathbf{1}_n^T$) of the full-rank matrix $-\frac{1}{n-1}I_n$. Let $S\in\mathbb{R}^{n\times(n-1)}$ and $\tilde S\in\mathbb{R}^{n\times 1}$ be such that $\mathbf{1}_n^TS=0$, $S^TS=I$, $\tilde{S}^T\tilde{S}=1$, and $S^T\tilde{S}=0$.

We can see that the matrix $P=\begin{bmatrix} S & \tilde S \end{bmatrix}$ is invertible and orthogonal; i.e. $P^{-1}=P^T$.

Therefore, we can perform a basis change on $A_n$. This shows that the matrix $A_n$ is similar to $$P^TA_nP=\dfrac{1}{n-1}\begin{bmatrix} -I_{n-1} & 0\\0 & \tilde S^T(\mathbf{1}_n\mathbf{1}_n^T)\tilde S-1 \end{bmatrix}.$$
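
Spelling out the computation: $P^TA_nP=\frac{1}{n-1}\left(P^T\mathbf{1}_n\mathbf{1}_n^TP-P^TP\right)=\frac{1}{n-1}\left(P^T\mathbf{1}_n\mathbf{1}_n^TP-I_n\right)$, and since $S^T\mathbf{1}_n=0$, the only nonzero entry of $P^T\mathbf{1}_n\mathbf{1}_n^TP$ is the bottom-right scalar $\tilde S^T\mathbf{1}_n\mathbf{1}_n^T\tilde S$, which gives the block form above.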

Noting that $\tilde{S}=\mathbf{1}_n/\sqrt{n}$ is a valid choice of $\tilde S$ (it has unit norm and satisfies $S^T\tilde S=0$ because $\mathbf{1}_n^TS=0$), we get that $\tilde S^T(\mathbf{1}_n\mathbf{1}_n^T)\tilde S-1=n-1.$

This yields that $A_n$ is similar to $$\dfrac{1}{n-1}\begin{bmatrix} -I_{n-1} & 0\\0 & n-1 \end{bmatrix},$$

or, equivalently, to $$\begin{bmatrix} -\dfrac{1}{n-1}I_{n-1} & 0\\0 & 1 \end{bmatrix}.$$

This means that the matrix $A_n$ has one eigenvalue at 1 with multiplicity 1 and one eigenvalue at $-1/(n-1)$ with multiplicity $n-1$.
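
As a quick consistency check, the trace of $A_n$ is $0$, and the eigenvalues above indeed sum to $1+(n-1)\cdot\left(-\frac{1}{n-1}\right)=0$.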


Edit. Example in the case $n=3$. Then, we have that

$$A_3=\dfrac{1}{2}(\mathbf{1}_3\mathbf{1}_3^T-I_{3}).$$

In this case we can compute $S$ and $\tilde S$ as

$$S=\begin{bmatrix}\dfrac{-\sqrt{3}}{3} & \dfrac{-\sqrt{3}}{3} \\ \dfrac{\sqrt{3}}{6}+\dfrac{1}{2} & \dfrac{\sqrt{3}}{6}-\dfrac{1}{2} \\ \dfrac{\sqrt{3}}{6}-\dfrac{1}{2} & \dfrac{\sqrt{3}}{6}+\dfrac{1}{2} \end{bmatrix},\ \tilde S=\dfrac{\sqrt{3}}{3}\begin{bmatrix}1\\1\\1 \end{bmatrix}.$$

It is quite tedious to do by hand, but there are numerical methods out there that can do that for you.
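
For instance, here is one possible numerical sketch with numpy (just one convenient route, and the variable names are only for this illustration): the first column of a suitable QR factor recovers $\tilde S$ up to sign, and the remaining columns give a valid $S$.

```python
import numpy as np

n = 3
ones = np.ones((n, 1))

# QR of a full-rank matrix whose first column is the all-ones vector:
# the first column of Q spans the same line as 1 (so it is a valid S~ up to sign),
# and the remaining columns are an orthonormal basis of the orthogonal complement of 1.
Q, _ = np.linalg.qr(np.hstack([ones, np.eye(n)[:, :n - 1]]))
S_tilde = Q[:, :1]
S = Q[:, 1:]

A = (np.ones((n, n)) - np.eye(n)) / (n - 1)  # A_n from the decomposition above
P = np.hstack([S, S_tilde])
print(np.round(P.T @ A @ P, 6))  # diagonal: -1/(n-1), ..., -1/(n-1), 1
```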

We then obtain that

$$P^TA_3P=\begin{bmatrix} -\dfrac{1}{2}I_{2} & 0\\0 & 1\end{bmatrix}.$$

KBS
  • 7,114
  • Thanks! That one is a bit more complicated, could you illustrate this with the example 3 x 3 matrix maybe (not a mathematician, sorry)? Would really be helpful in understanding your proof better. – holistic Mar 16 '22 at 14:29
  • 1
    @holistic The proof is based on the fact that the eigenvalues are invariant when we change basis. So, the idea is just to find a basis where the matrix is diagonal or triangular. In such cases, the eigenvalues will be the diagonal entries. – KBS Mar 16 '22 at 14:35
  • Thanks for elaborating. This makes it more clear now! – holistic Mar 16 '22 at 14:37
  • I have a question about the proof: Is it possible to also conclude that 1 is the spectral radius of the matrix? – holistic Mar 16 '22 at 14:54
  • 1
    @holistic The spectral radius of the matrix is the maximum of the absolute values of its eigenvalues. So, yes, 1 is the spectral radius of the matrix $A_n$ for all $n\ge 2$. – KBS Mar 16 '22 at 14:55
1

Note the following lemma:

If the sum of every row of an $m$ by $m$ matrix $A$ is $k$, then $k$ is an eigenvalue of $A$.

Proof: Observe that $A[1]_{m\times 1}=k[1]_{m\times 1}$, where $[1]_{m\times 1}$ denotes the $m\times 1$ column vector all of whose entries equal $1$. QED.

In your case, every row has $n-1$ off-diagonal entries equal to $\frac{1}{n-1}$ and a $0$ on the diagonal, so every row sums to $k=1$, and the lemma applies.
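
For instance, in Matrix 1 above every row sums to $\frac{1}{2}+\frac{1}{2}=1$, and in Matrix 2 every row sums to $\frac{1}{3}+\frac{1}{3}+\frac{1}{3}=1$.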

Koro
  • 11,402