I've recently been playing with some $(n+k)\times (n+k)$ block matrices: \begin{equation*} \mathbf{X} = \begin{pmatrix} \mathbf{0}_{n\times n} & \mathbf{P}_{n\times k} \\ {\mathbf{P}}_{k\times n}^T & \mathbf{0}_{k\times k} \end{pmatrix} \end{equation*} where the columns of $\mathbf{P}_{n\times k}$ are $k$ orthonormal vectors in $\mathbb{R}^n$, so that $\mathbf{P}^T\mathbf{P} = \mathbf{I}_{k\times k}$. After experimenting in Matlab, I conjectured that $\mathbf{X}$ has a very special set of eigenvalues: \begin{equation*} \text{Eigenvalues of $\mathbf X$} = \text{$k$ copies of $-1$, $k$ copies of $+1$, and $n-k$ copies of $0$.} \end{equation*} There may be many ways to prove this, but I tried to compute the characteristic polynomial $p_{\mathbf{X}}(\lambda)$ of $\mathbf{X}$
\begin{equation*} p_{\mathbf{X}}(\lambda) = \det(\lambda\mathbf{I} - \mathbf{X}) = \det \begin{pmatrix} \lambda\mathbf{I}_n & -\mathbf{P} \\ -\mathbf{P}^T & \lambda\mathbf{I}_k \end{pmatrix} \end{equation*}
to show that $p_{\mathbf{X}}(\lambda) = (\lambda^2-1)^k\lambda^{n-k}$, which would prove my conjecture because the roots of the characteristic polynomial $p_{\mathbf{X}}(\lambda)$, counted with multiplicity, are exactly the eigenvalues of $\mathbf{X}$.
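For reference, my Matlab experiment can be reproduced in Python/NumPy along the following lines (the sizes $n=5$, $k=3$ are an arbitrary choice for illustration):

```python
import numpy as np

# Hypothetical sizes for the experiment.
n, k = 5, 3

# Build P with k orthonormal columns in R^n via a QR decomposition
# of a random n x k Gaussian matrix.
rng = np.random.default_rng(0)
P, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Assemble the (n+k) x (n+k) block matrix X.
X = np.block([
    [np.zeros((n, n)), P],
    [P.T, np.zeros((k, k))],
])

# X is symmetric, so its eigenvalues are real; eigvalsh exploits this.
eigvals = np.sort(np.linalg.eigvalsh(X))
print(np.round(eigvals, 10))  # k copies of -1, n-k zeros, k copies of +1
```

Running this for various random $\mathbf{P}$ always gives the eigenvalue pattern conjectured above.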
Maybe I can exploit the Schur determinant identity, which states that \begin{equation*} \det \begin{pmatrix} \mathbf{A} & \mathbf{B} \\ \mathbf{C} & \mathbf{D} \end{pmatrix} = \det\mathbf{A}\cdot \det\left( \mathbf{D} - \mathbf{C}\mathbf{A}^{-1}\mathbf{B} \right) \end{equation*} assuming $\mathbf A$ is invertible. If I may assume $\lambda \neq 0$, then the characteristic polynomial follows easily from this identity, using $\mathbf{P}^T(\lambda\mathbf{I}_n)^{-1}\mathbf{P} = \frac{1}{\lambda}\mathbf{P}^T\mathbf{P} = \frac{1}{\lambda}\mathbf{I}_k$: \begin{align*} \det \begin{pmatrix} \lambda\mathbf{I}_n & -\mathbf{P} \\ -\mathbf{P}^T & \lambda\mathbf{I}_k \end{pmatrix} &= \det(\lambda\mathbf{I}_n)\cdot \det\left( \lambda\mathbf{I}_k - \mathbf{P}^T(\lambda\mathbf{I}_n)^{-1}\mathbf{P} \right) \\ &= \det(\lambda\mathbf{I}_n)\cdot\det \left[\left(\lambda - \frac{1}{\lambda}\right)\mathbf{I}_k\right] \\ &= \lambda^n\left(\lambda - \frac{1}{\lambda}\right)^k \\ &= \lambda^{n-k}(\lambda^2 - 1)^k \end{align*}
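As a numerical sanity check (not a proof), the resulting formula $\det(\lambda\mathbf{I} - \mathbf{X}) = \lambda^{n-k}(\lambda^2-1)^k$ can be compared against a direct determinant computation at a few sample values of $\lambda$, including $\lambda = 0$; the sizes $n=5$, $k=3$ are again an arbitrary choice:

```python
import numpy as np

# Arbitrary sizes, with P having orthonormal columns as before.
n, k = 5, 3
rng = np.random.default_rng(1)
P, _ = np.linalg.qr(rng.standard_normal((n, k)))
X = np.block([
    [np.zeros((n, n)), P],
    [P.T, np.zeros((k, k))],
])

# Compare det(lambda*I - X) with lambda^(n-k) * (lambda^2 - 1)^k
# at a few sample values of lambda, including lambda = 0.
for lam in [0.0, 0.5, -0.5, 2.0]:
    lhs = np.linalg.det(lam * np.eye(n + k) - X)
    rhs = lam ** (n - k) * (lam ** 2 - 1) ** k
    print(lam, np.isclose(lhs, rhs))
```

Of course, agreement at sample points only supports the conjecture; it does not settle the $\lambda = 0$ case of the derivation.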
This seems to be the right answer, but the derivation only holds when $\lambda\neq 0$. How can I show that the formula also holds when $\lambda = 0$? I could try a cofactor expansion of the determinant, but that does not look easy for this problem.
I have seen people write in their papers that "the invertible matrices are dense in the space of all matrices, and we can conclude that the general case holds as well by a continuity argument," but I cannot truly understand what they mean. Can I just repeat what they said in my proof?
This is just an additional question, but where can I find an explanation of such a continuity argument?