4

Let $M$ be the $n \times n$ real symmetric matrix with entries $m_{ij}$ given by $$ m_{ij} = r^{|i-j|} - r^{2(n+1) - (i+j)}, $$ for some $r \in [0, 1]$.

Numerically, it can be verified that this matrix is positive definite; however, I have struggled to show this by hand. I am mainly interested in computing the eigenvalues of this matrix. I know that the trace of this matrix has the form $$ n - \frac{r^2\left(1 - r^{2n}\right)}{1 - r^2}, $$ but so far this is really the most information I have been able to figure out about this matrix.
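
For concreteness, here is a small NumPy sketch of the numerical check (the helper name `build_M` is just illustrative): it builds $M$ for given $n$ and $r$, tests positive definiteness via the smallest eigenvalue, and confirms the trace formula above.

```python
import numpy as np

def build_M(n, r):
    """m_ij = r^|i-j| - r^(2(n+1)-(i+j)) with 1-based indices i, j = 1, ..., n."""
    k = np.arange(1, n + 1)
    I, J = np.meshgrid(k, k, indexing="ij")
    return r ** np.abs(I - J) - r ** (2 * (n + 1) - (I + J))

n, r = 10, 0.8
M = build_M(n, r)
print(np.linalg.eigvalsh(M).min() > 0)      # positive definite for 0 < r < 1
print(np.isclose(np.trace(M),
                 n - r**2 * (1 - r**(2 * n)) / (1 - r**2)))   # trace formula above
```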

This matrix arises in the study of Markov chains.

Drew Brady
  • 3,399
  • 1
  • Could you tell us what the origin of this matrix is, and why you are interested in the fact that it is positive definite? – Jean Marie Feb 20 '23 at 22:11
  • This is a rank-one update of the inverse of a symmetric tridiagonal matrix. It isn't difficult to show that the matrix is positive definite, but I don't think there is any closed-form formula for the eigenvalues. – user1551 Feb 20 '23 at 22:26
  • @user1551 Can you expand on this comment? What is the rank-one update, what is the underlying symmetric tridiagonal matrix that is being inverted? – Drew Brady Feb 20 '23 at 23:51

2 Answers

5

We are going to use a classical trick: the introduction of a specific inner product.

First of all, one can write:

$$m_{ij} = r^{|(n+1-i)-(n+1-j)|} - r^{(n+1-i)+(n+1-j)}$$

Setting $i'=n+1-i$, $j'=n+1-j$, and dropping the primes, we can consider that the general entry is now (see the remark below):

$$m_{ij} = r^{|i-j|} - r^{i+j}\tag{1}$$

In fact, for normalization purposes, it will be simpler to work with:

$$m_{ij}=\frac{\pi}{4}( r^{|i-j|} - r^{i+j}) \tag{1'}$$

which doesn't change the signs of the eigenvalues, since we have only multiplied by the positive constant $\pi/4$.

Now, the important fact: (1') can be written like this:

$$m_{ij}=\int_0^\infty \sin(pix)\sin(pjx)\underbrace{\frac{1}{1+x^2}}_{w(x)}dx\tag{2}$$

Indeed, using the product-to-sum identity $\sin a \, \sin b = \tfrac{1}{2}\left[\cos(a-b)-\cos(a+b)\right]$, (2) can be transformed into:

$$\tfrac{1}{2}\int_0^\infty\frac{\cos(p(i-j)x)}{1+x^2}dx-\tfrac{1}{2}\int_0^\infty\frac{\cos(p(i+j)x)}{1+x^2}dx $$

which is equal to (1') if one takes $p=-\ln r$, as a consequence of the following result:

$$\int_0^\infty\frac{\cos(u x)}{1+x^2}dx=\frac{\pi}{2}e^{-|u|}$$

(which can be seen as a consequence of the fact that $\frac{1}{1+x^2} \leftrightarrow e^{-|u|}$ is a Fourier Transform pair).
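
This identity is also easy to check numerically; here is a small sketch using `scipy.integrate.quad` with its `weight="cos"` option, which is designed for such oscillatory integrands on $[0,\infty)$:

```python
import numpy as np
from scipy.integrate import quad

# Check ∫_0^∞ cos(u x)/(1+x²) dx = (π/2) e^{-|u|} via quad's cosine-weight option
for u in (0.3, 1.0, 2.5):
    val, _ = quad(lambda x: 1.0 / (1.0 + x**2), 0, np.inf, weight="cos", wvar=u)
    print(f"u = {u}: quad = {val:.6f},  (pi/2) e^(-|u|) = {0.5 * np.pi * np.exp(-abs(u)):.6f}")
```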

What is the interest of (2)? The fact that $m_{ij}$ can be written in the form of an inner product:

$$m_{ij} = \langle \sin(pix), \sin(pjx) \rangle$$

with an inner product defined in this way:

$$\langle f,g\rangle \ = \ \int_0^{\infty}f(x)\,g(x)\,w(x)\,dx, \ \text{where} \ w \ \text{is a weight function.}$$

Therefore, $M$ is the Gram matrix, for this inner product, of the functions $f_k(x)=\sin(pkx)$, $k=1,\dots,n$. These functions being linearly independent (their frequencies $pk$ are distinct, since $p=-\ln r>0$ for $0<r<1$), $M$ is positive definite, not only positive semidefinite.
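
One can illustrate this Gram-matrix structure numerically by discretizing the inner product on a truncated grid (a rough sketch; the truncation at $x_{\max}$ introduces an error of order $1/x_{\max}$ in each entry):

```python
import numpy as np

n, r = 6, 0.6
p = -np.log(r)

# Discretize <f, g> = ∫_0^∞ f(x) g(x) w(x) dx, with w(x) = 1/(1+x²), on [0, x_max]
x = np.linspace(0.0, 300.0, 600_001)
dx = x[1] - x[0]
w = 1.0 / (1.0 + x**2)

F = np.array([np.sin(p * k * x) for k in range(1, n + 1)])   # row k samples f_k(x)
G = (F * w) @ F.T * dx                                       # discretized Gram matrix

k = np.arange(1, n + 1)
target = np.pi / 4 * (r ** np.abs(np.subtract.outer(k, k)) - r ** np.add.outer(k, k))
print(np.abs(G - target).max())         # small (truncation + discretization error)
print(np.linalg.eigvalsh(G).min() > 0)  # G = F diag(w dx) Fᵀ, a Gram matrix: True
```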

Remark: Why is positive definiteness preserved by the change of indices we made at the beginning? This is due to the following relationship between the matrix $M$ with entries (1) considered above (positive definite, as just shown) and the matrix $M'$ of the question:

$$M'=JMJ=JMJ^{-1}, \tag{3}$$

where $J$ is the matrix analogous to the identity matrix $I$, but with its "1" entries on the antidiagonal (in other words, the matrix with $J_{i,j}=\delta(i+j-(n+1))$, where $\delta$ is the Kronecker symbol); note that $J^2=I$, so $J=J^{-1}$. Explanation: left- (resp. right-) multiplying a matrix by $J$ reverses its rows (resp. columns).

As a consequence of (3), $M$ and $M'$ have the same eigenvalues, all of them $>0$; therefore $M'$ inherits its positive definiteness from the positive definiteness of $M$.
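
Relation (3) and the equality of the spectra are easy to check numerically; here is a small NumPy sketch (with $n$ and $r$ chosen arbitrarily):

```python
import numpy as np

n, r = 7, 0.5
k = np.arange(1, n + 1)
I, J = np.meshgrid(k, k, indexing="ij")

M  = r ** np.abs(I - J) - r ** (I + J)                    # entries (1)
Mp = r ** np.abs(I - J) - r ** (2 * (n + 1) - (I + J))    # M', the matrix of the question

Jrev = np.fliplr(np.eye(n))                               # the antidiagonal matrix J
print(np.allclose(Mp, Jrev @ M @ Jrev))                   # relation (3): M' = J M J
print(np.allclose(np.linalg.eigvalsh(M),
                  np.linalg.eigvalsh(Mp)))                # same (positive) spectrum
```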

Jean Marie
  • 81,803
  • Should there be a negative sign in front of the first integral in the display after equation (2)? – Drew Brady Feb 21 '23 at 00:07
  • @Drew Brady: you are perfectly right. I have fixed the error, which came from the use of $\cos$ functions instead of $\sin$ functions. – Jean Marie Feb 21 '23 at 07:31
  • Function $w(x)=\frac{1}{1+x^2}$ is very common as a pdf (Cauchy distribution with normalisation by $1/\pi$) but there are very few references to it as a weight function restricted to $[0,+\infty)$. Here is one. – Jean Marie Feb 21 '23 at 11:03
  • For a proof of the Fourier pair see an interesting one here – Jean Marie Feb 22 '23 at 21:57
4

When $r=0$, $M=I$ is positive definite. When $r=1$, $M=0$ is only positive semidefinite, not positive definite. Now suppose that $0<r<1$. We write $$ M=A-xx^T, $$ where $a_{ij}=r^{|i-j|}$ and $x=(r^n,r^{n-1},\ldots,r)^T$. One may directly verify that $$ A^{-1}=\frac{1}{s^2-1}\pmatrix{s^2&-s\\ -s&s^2+1&\ddots\\ &\ddots&\ddots&\ddots\\ &&\ddots&s^2+1&-s\\ &&&-s&s^2} $$ where $s=r^{-1}>1$. Then $A^{-1}$ is irreducibly diagonally dominant. Hence $A^{-1}$ and $A$ are positive definite. Consequently, $\det(M)=\det(A-xx^T)=\det(A)(1-x^TA^{-1}x)$ has the same sign as $1-x^TA^{-1}x$. Note that the last column of $A$ is $(r^{n-1},r^{n-2},\ldots,1)^T$, which is precisely $r^{-1}x$; hence $A^{-1}x=re_n$. Therefore $$ 1-x^TA^{-1}x=1-x^T(re_n)=1-rx_n=1-r^2>0. $$ Hence $M$ has a positive determinant, and so do all trailing principal submatrices of $M$, because they have the same form as $M$. Thus $M$ is positive definite, by Sylvester's criterion (applied after reversing the order of the indices, so that the trailing principal submatrices of $M$ become the leading ones).
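
A short numerical check of the three ingredients (the decomposition $M=A-xx^T$, the stated formula for $A^{-1}$, and the value of $1-x^TA^{-1}x$), with arbitrarily chosen $n$ and $r$:

```python
import numpy as np

n, r = 6, 0.7
s = 1.0 / r
k = np.arange(1, n + 1)
I, J = np.meshgrid(k, k, indexing="ij")

M = r ** np.abs(I - J) - r ** (2 * (n + 1) - (I + J))
A = r ** np.abs(I - J)                    # the matrix with a_ij = r^|i-j|
x = r ** (n + 1 - k)                      # x = (r^n, r^(n-1), ..., r)^T

print(np.allclose(M, A - np.outer(x, x)))             # M = A - x x^T

# the claimed tridiagonal formula for A^{-1}
Ainv = (np.diag(np.full(n, s**2 + 1.0))
        + np.diag(np.full(n - 1, -s), 1)
        + np.diag(np.full(n - 1, -s), -1))
Ainv[0, 0] = Ainv[-1, -1] = s**2
Ainv /= s**2 - 1.0
print(np.allclose(Ainv, np.linalg.inv(A)))            # matches the displayed inverse

print(np.isclose(1.0 - x @ Ainv @ x, 1.0 - r**2))     # the key scalar equals 1 - r^2
```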

user1551
  • 139,064
  • A very interesting answer from you, once more. For those who haven't yet been introduced to KMS matrix and its inverse here is a reference . – Jean Marie Feb 21 '23 at 12:48