
Let $A=(1-a)I_n + a J_n$. Find the values of $a$ for which the matrix is positive definite (p.d.).
Note: $I_n$ is the identity matrix and $J_n$ is the all-ones matrix.

I know that $A$ is p.d. iff all of its eigenvalues satisfy $\lambda_i > 0$, so I need to choose $a$ so that $\lambda_i > 0$ for every $i$.

First I tried to use the definition to find the eigenvalues, $\det(A-\lambda I_n)=0$, but it gets really complicated.

Any help would be appreciated.

Update: Since the question is from a statistics book (A First Course in Linear Model Theory by Nalini Ravishanker and Dipak K. Dey, Chapter 2), I have also posted the question here.

  • See https://math.stackexchange.com/questions/2853981/find-the-eigenvalues-and-their-multiplicities-of-a-special-matrix?rq=1 and other related questions for how to compute the eigenvalues of such a matrix. What condition must they satisfy for $A$ to be p.d.? – amd Feb 10 '19 at 00:11
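
For reference, here is a sketch of the eigenvalue route the comment points to, assuming the standard spectrum of $J_n$ (eigenvalue $n$ with multiplicity $1$, eigenvalue $0$ with multiplicity $n-1$):

$$ J_n v = n v \ \text{ for } v = (1, \ldots, 1)^T, \qquad J_n w = 0 \ \text{ for } w \perp v, $$

so

$$ A v = (1 - a + na)\, v = \bigl(1 + (n-1)a\bigr) v, \qquad A w = (1-a)\, w. $$

Hence the eigenvalues of $A$ are $1 + (n-1)a$ (multiplicity $1$) and $1 - a$ (multiplicity $n-1$), and $A$ is p.d. exactly when $-\tfrac{1}{n-1} < a < 1$ (for $n \ge 2$).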

1 Answer


You can use Sylvester's Criterion.

Let $A_n = (1-a)I_n + aJ_n$. Then $A_n$ is positive definite iff all the leading principal minors $\det A_1, \det A_2, \ldots, \det A_n$ are positive.

You can show by induction that

$$ \det A_{n+1} = \det A_n \left(1 - \frac{na^2} {(n-1)a + 1} \right) $$
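
This recurrence is consistent with the closed form $\det A_n = (1-a)^{n-1}\bigl(1 + (n-1)a\bigr)$ (a side note, easy to verify by the same induction), since

$$ (1-a)^{n}\bigl(1 + na\bigr) = (1-a)^{n-1}\bigl(1 + (n-1)a\bigr) \left(1 - \frac{na^2}{(n-1)a + 1}\right). $$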

Given that $\det A_n$ is positive, to show that $\det A_{n+1}$ is positive too, you need to check when $1 > \frac {na^2} {(n-1)a + 1}$.
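
To spell out that last step (assuming the denominator $(n-1)a + 1$ is positive, which the induction guarantees once the earlier minors are positive and $a < 1$):

$$ \frac{na^2}{(n-1)a + 1} < 1 \iff na^2 - (n-1)a - 1 < 0 \iff (na + 1)(a - 1) < 0 \iff -\tfrac{1}{n} < a < 1. $$

Requiring this at every step $k = 1, \ldots, n-1$ (the base case $\det A_1 = 1 > 0$ holds for any $a$) gives that $A_n$ is positive definite exactly when $-\tfrac{1}{n-1} < a < 1$, which agrees with the eigenvalue condition above.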

enedil