
I came across this question in my research. If I have a $p \times p$ matrix $X$ defined in terms of a constant $c$ by

$X_{p\times p} = I_{p\times p} + c\mathbf{1}_{p\times p} - c\,\text{diag}(\mathbf{1})$,

how do I analytically compute the eigenvalues of this special matrix? In words, $X$ has ones on the diagonal and the constant $c$ in every off-diagonal entry.

I did some experiments on Wolfram Alpha with $p = 2, 3, 4$, and it looks like I always get $\lambda = 1 - c$ (repeated $p-1$ times) and $\lambda = 1 + (p-1)c$. So I'd suspect there is a straightforward analytical formula I can use to derive the eigenvalues. Could anyone shed some light?
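For what it's worth, here is a small NumPy sketch of the same experiment (the value of $c$ below is an arbitrary choice, purely for illustration):

```python
import numpy as np

c = 0.3  # arbitrary off-diagonal constant, for illustration only
for p in (2, 3, 4):
    # ones on the diagonal, the constant c everywhere off the diagonal
    X = np.eye(p) + c * (np.ones((p, p)) - np.eye(p))
    print(p, np.linalg.eigvalsh(X))  # eigenvalues in ascending order
    # each run shows 1 - c repeated p - 1 times and 1 + (p - 1) * c once
```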

My Google search (https://stats.stackexchange.com/questions/13368/off-diagonal-range-for-guaranteed-positive-definiteness) turned up a more general version of the question, but the answer there doesn't explain how to derive the eigenvalues analytically. There, the question concerns random values $c_{i,j}$, not a constant $c$.

1 Answer


You can just write down the eigenvectors.

For instance, the vector $(1,1,\ldots, 1)$ is transformed into $1 + cp$ times itself, so $1 + cp$ is an eigenvalue.

Similarly, the vector $v_j = e_1 - e_j$ ($j = 2, \ldots, p$) is an eigenvector of eigenvalue 1.
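To see this, note that $\mathbf{1}_{p\times p} v$ is the vector each of whose entries is the sum of the entries of $v$, and the entries of $e_1 - e_j$ sum to zero, so

$$ X(e_1 - e_j) = (e_1 - e_j) + c\,\mathbf{1}_{p\times p}(e_1 - e_j) = (e_1 - e_j) + \mathbf{0} = e_1 - e_j. $$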

So the eigenvalues are:

  1. $\lambda = 1$, with multiplicity $p-1$
  2. $\lambda = 1 + cp$, with multiplicity 1.

WAIT! You changed the question while I was writing the answer, from $$ X_{p\times p} = I_{p\times p} + c\mathbf{1}_{p\times p} $$ to $$ X_{p\times p} = I_{p\times p} + c\mathbf{1}_{p\times p} - c \text{diag}(\mathbf{1}) $$

So my answer is to a slightly different question; I leave it to you to adjust things to make it work for your modified question.

Answer to modified question: Your matrix can be rewritten as

$$ M = (1 - c) I + c\,\mathbf{1}_{p \times p} $$ I'm going to find the eigenvalues by inspection, by just looking at the matrix and trying to guess some eigenvectors. In the first place, all row-sums are the same, so the vector of all 1s is an eigenvector. The associated eigenvalue is the row-sum, which is $1 + (p-1)c$. (That happens to correspond to what you claimed in your problem, but I didn't use your claim -- indeed, I didn't trust it after you changed the problem on me! -- to find the eigenvector.)
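Explicitly, writing $\mathbf{1}$ for the all-ones column vector (so that $\mathbf{1}_{p\times p}\mathbf{1} = p\,\mathbf{1}$),

$$ M\mathbf{1} = (1-c)\mathbf{1} + c\,\mathbf{1}_{p\times p}\mathbf{1} = (1-c)\mathbf{1} + cp\,\mathbf{1} = \bigl(1 + (p-1)c\bigr)\mathbf{1}. $$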

What about the other eigenvectors/values? Well, your matrix is symmetric, so eigenspaces are orthogonal, so all other eigenvectors are perpendicular to $(1, 1, \ldots, 1)$. The first vector that jumps to mind is $(1, -1, 0, 0, \ldots, 0) = e_1 - e_2$. When we multiply this by $M$, we get $(1-c)(e_1 - e_2)$, which shows that it's an eigenvector for $1-c$; it's immediately clear, having done the multiplication by hand (I'm not going to write it out here), that $e_1 - e_3$, $e_1 - e_4$, etc., are all also eigenvectors for $1-c$. Since there are $p-1$ of these, we're done.
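The key fact behind that multiplication is that $\mathbf{1}_{p\times p}$ sends any vector whose entries sum to zero to the zero vector, so for each $j = 2, \ldots, p$,

$$ M(e_1 - e_j) = (1-c)(e_1 - e_j) + c\,\mathbf{1}_{p\times p}(e_1 - e_j) = (1-c)(e_1 - e_j). $$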

So my revised answer is:

  1. $\lambda = 1 + c(p-1)$, with multiplicity 1.
  2. $\lambda = 1-c$, with multiplicity $p-1$
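If you want a numerical sanity check of these two eigenvalues and their multiplicities, here is a minimal NumPy sketch (the values of $p$ and $c$ are arbitrary):

```python
import numpy as np

p, c = 5, 0.3  # arbitrary illustrative values
M = (1 - c) * np.eye(p) + c * np.ones((p, p))

eigenvalues = np.linalg.eigvalsh(M)          # ascending order
expected = np.append(np.full(p - 1, 1 - c),  # 1 - c, multiplicity p - 1
                     1 + c * (p - 1))        # 1 + c(p - 1), multiplicity 1
print(np.allclose(eigenvalues, np.sort(expected)))  # True
```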

One more alternative: you could take

$$ h(x) = \det (M - xI) $$ and solve the equation $h(x) = 0$.

Since $M = (1-c) I + Q$, where $Q = c\,\mathbf{1}_{p\times p}$ is singular (indeed, has rank 1), it's clear that $h(1-c) = 0$; indeed, $x = 1-c$ is a root of order $p-1$, because $M - (1-c)I = Q$ has a null space of dimension $p-1$. So $$ h(x) = \bigl((1-c) - x\bigr)^{p-1} (ax + b) $$ for some unknown $a$ and $b$. The "rows sum to a constant" trick tells you that for $x_0 = 1 + c(p-1)$, we have $h(x_0) = 0$ as well, which gives us all the roots of $h$.
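For a concrete check of that factorization, here is a small SymPy sketch (the size $p$ is fixed to a small value just so the symbolic determinant is cheap to compute):

```python
import sympy as sp

p = 4                                   # small fixed size for a symbolic check
c, x = sp.symbols('c x')
M = (1 - c) * sp.eye(p) + c * sp.ones(p, p)

h = (M - x * sp.eye(p)).det()           # characteristic polynomial det(M - xI)
claimed = ((1 - c) - x) ** (p - 1) * ((1 + (p - 1) * c) - x)
print(sp.expand(h - claimed))           # 0, so h factors as claimed
```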

John Hughes
  • Hi, I changed the question to a simpler form (I didn't realize the mistake in the original question). The matrix has ones on the diagonal and the constant $c$ on the off-diagonal entries. I don't understand in detail what you mean by "transformed," but what I think I get is that you want to derive the eigenvectors assuming you already know the eigenvalues. However, I'm interested in how to get the eigenvalues without assuming I already know the answer. – Ben Jackson Feb 16 '15 at 20:51
  • By "transform" I mean the linear map $v \mapsto Mv$ defined by the given matrix. You're mistaken that I found the eigenvectors knowing the eigenvalues: I went the other direction. I saw obvious eigenvectors, and from these derived the eigenvalues. But I've also given an alternative analysis that may make you happier. – John Hughes Feb 16 '15 at 21:48
  • Respect. Thank you, sir, for the explanation. One small follow-up question: I tried different combinations of eigenvectors that go with the eigenvalue $\lambda = 1 - c$ but wasn't successful. For this type of problem, what's the best practice for finding such eigenvectors (in this case, $e_i - e_{i+1}$)? – Ben Jackson Feb 17 '15 at 20:14
  • Well...the fact that eigenvectors for distinct eigenvalues of a symmetric matrix have to be perpendicular told me that I needed to look for things orthogonal to $(1,1,1,\ldots, 1)$, so putting a "1" and a "$-1$" in two separate entries seemed like a good bet. I also looked at the parts of $M$: a scalar multiple of the identity, and a constant matrix. Since ANYTHING is an eigenvector of the "identity part", all I had to do was find a vector that the "constant matrix" sent to zero...and once again, one with a $+1$ and a $-1$ seemed like an obvious choice. Honestly, though? Experience. Sorry. :( – John Hughes Feb 17 '15 at 21:14