Let $A=(1-a)I_n + a J_n$. Find the values of $a$ for which $A$ is positive definite (p.d.).
Note: $I_n$ is the $n \times n$ identity matrix and $J_n$ is the $n \times n$ all-ones matrix.
I know that $A$ is p.d. iff all of its eigenvalues satisfy $\lambda_i > 0$, so I need to choose $a$ so that every $\lambda_i > 0$.
First I tried to use the characteristic equation to find the eigenvalues, $\det(A-\lambda I_n)=0$, but the expansion gets complicated quickly.
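For instance, just to illustrate what I ran into, already in the $n=3$ case the computation looks like

$$
A-\lambda I_3=\begin{pmatrix}1-\lambda & a & a\\ a & 1-\lambda & a\\ a & a & 1-\lambda\end{pmatrix},
\qquad
\det(A-\lambda I_3)=(1-\lambda)^3-3a^2(1-\lambda)+2a^3 ,
$$

and solving this by hand is already messy; I don't see how it generalizes to arbitrary $n$.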
Any help would be appreciated.
Update: Since the question is from a statistics book (A First Course in Linear Model Theory by Nalini Ravishanker and Dipak K. Dey, Chapter 2), I have also posted the question here.