I'm taking a Graph Theory course, and one of the assignment questions asks for the determinant of the matrix $J_n-(1+\lambda)I_n$, where $J_n = (1)_{n \times n}$ is the all-ones matrix and $I_n$ is the identity matrix of order $n$.
I have seen this problem before in a Linear Algebra course and solved it there by computing the eigenvalues of the given matrix. However, since the final goal of this problem is to find the eigenvalues of the complete graph $K_n$ using this determinant, it seems better to use a different method.
I tried induction on the size of the matrix, but it does not seem to work well. I would be glad if someone could give me a small hint about this question.
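For what it's worth, here is a quick symbolic sanity check I ran for small $n$ (assuming SymPy is available); it just prints the factored determinant so the pattern is visible, without using the eigenvalue argument:

```python
import sympy as sp

lam = sp.symbols('lambda')

for n in range(1, 6):
    J = sp.ones(n, n)                    # the all-ones matrix J_n
    M = J - (1 + lam) * sp.eye(n)        # J_n - (1 + lambda) I_n
    print(n, sp.factor(M.det()))         # factored determinant for this n
```

The factored output for $n = 1, \dots, 5$ suggests what the closed form should look like, which is the formula I am hoping to derive without eigenvalues.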