A simpler way is to argue directly from the definition. It is easy to show that if $\lambda_1$ is an eigenvalue of the upper diagonal block $A_{1,1}$, with eigenvector $p_1$ (of size $n_1$), then it's also an eigenvalue of the full matrix, with the same eigenvector padded with zeros.
$A_{1,1} \; p_1 = \lambda_1 p_1$ with $p_1 \ne 0 $
So
$$ \left( \begin{matrix} A_{1,1}&A_{1,2} \\ 0 &A_{2,2} \end{matrix} \right)
\left( \begin{matrix} p_1 \\ 0 \end{matrix} \right) =
\left( \begin{matrix} A_{1,1} \; p_1 \\ 0 \end{matrix} \right) =
\left( \begin{matrix} \lambda_1 p_1 \\ 0 \end{matrix} \right) =
\lambda_1 \left( \begin{matrix} p_1 \\ 0 \end{matrix} \right) $$
Hence if $\lambda$ is an eigenvalue of $A_{1,1}$ then it's also an eigenvalue of $A$; this accounts for $n_1$ eigenvalues (counting multiplicity). Note that the same zero-padding argument does not work for the lower diagonal block $A_{2,2}$ (it would if the matrix were block diagonal), because the off-diagonal block $A_{1,2}$ gets in the way; that case is handled below.
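As a quick numerical sanity check (a minimal sketch with numpy; the block sizes $n_1 = 2$, $n_2 = 3$ and the random blocks are arbitrary choices for illustration), padding an eigenvector of $A_{1,1}$ with zeros indeed gives an eigenvector of the full block upper triangular matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 2, 3                              # arbitrary block sizes for illustration

A11 = rng.standard_normal((n1, n1))
A12 = rng.standard_normal((n1, n2))
A22 = rng.standard_normal((n2, n2))
A = np.block([[A11, A12],
              [np.zeros((n2, n1)), A22]])  # block upper triangular matrix

lam1, P1 = np.linalg.eig(A11)
p = np.concatenate([P1[:, 0], np.zeros(n2)])  # eigenvector of A11, padded with zeros

print(np.allclose(A @ p, lam1[0] * p))     # True: A p = lambda_1 p
```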
Suppose now that $\lambda_2$ is an eigenvalue of $A_{2,2}$ with eigenvector $p_2$ (of size $n_2$).
If $\lambda_2$ is also an eigenvalue of $A_{1,1}$, we have already proved that it's an eigenvalue of $A$. So let's assume it's not an eigenvalue of $A_{1,1}$; hence $|A_{1,1} - \lambda_2 I|\ne 0$ and $A_{1,1} - \lambda_2 I$ is invertible. Now
$$\left( \begin{matrix} A_{1,1}&A_{1,2} \\ 0 &A_{2,2} \end{matrix} \right)
\left( \begin{matrix} x \\ p_2 \end{matrix} \right) =
\left( \begin{matrix} A_{1,1} x + A_{1,2} p_2 \\ \lambda_2 p_2 \end{matrix} \right)
$$
We can make $A_{1,1} x + A_{1,2} p_2 = \lambda_2 x$ by choosing $x = - (A_{1,1} - \lambda_2 I)^{-1} A_{1,2} \; p_2$; and since $p_2 \ne 0$, the vector $(x, p_2)$ is nonzero, so we have found an eigenvector of $A$ with eigenvalue $\lambda_2$.
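Continuing the sketch above (reusing the same illustrative matrices `A`, `A11`, `A12`, `A22`), the construction of $x$ can be checked numerically; it assumes the chosen eigenvalue of $A_{2,2}$ is not also an eigenvalue of $A_{1,1}$, which holds generically for random blocks:

```python
lam2, P2 = np.linalg.eig(A22)
lam, p2 = lam2[0], P2[:, 0]        # an eigenvalue/eigenvector pair of A22

# x = -(A11 - lambda I)^{-1} A12 p2, as in the construction above
# (assumes lam is not an eigenvalue of A11, so the system is solvable)
x = -np.linalg.solve(A11 - lam * np.eye(n1), A12 @ p2)
v = np.concatenate([x, p2])

print(np.allclose(A @ v, lam * v))  # True: (x, p2) is an eigenvector of A
```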
In this way we have shown that if $\lambda$ is an eigenvalue of $A_{1,1}$ or of $A_{2,2}$, then it's an eigenvalue of $A$.
To complete the proof, one should show the converse: if $\lambda$ is an eigenvalue of $A$, then it's an eigenvalue of $A_{1,1}$ or of $A_{2,2}$. But that's easy: partition a corresponding eigenvector of $A$ as $(x_1, x_2)$, so that
$$\left( \begin{matrix} A_{1,1}&A_{1,2} \\ 0 &A_{2,2} \end{matrix} \right)
\left( \begin{matrix} x_1 \\ x_2 \end{matrix} \right) =
\left( \begin{matrix} A_{1,1} \; x_1 + A_{1,2} \; x_2 \\ A_{2,2} \; x_2 \end{matrix} \right)
= \left( \begin{matrix} \lambda \; x_1 \\ \lambda \; x_2 \end{matrix} \right)
$$
Now, either $x_2 = 0$ or not. If $x_2 \ne 0$, the bottom block gives $A_{2,2} \; x_2 = \lambda \; x_2$, so $\lambda$ is an eigenvalue of $A_{2,2}$. If $x_2 = 0$, then $x_1 \ne 0$ and the top block gives $A_{1,1} \; x_1 = \lambda \; x_1$, so $\lambda$ is an eigenvalue of $A_{1,1}$.
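Putting it all together: the spectrum of $A$ is exactly the union, with multiplicities, of the spectra of $A_{1,1}$ and $A_{2,2}$. A short numerical check of this conclusion, again reusing the illustrative matrices above (the sorted comparison assumes the eigenvalues are distinct, which is generically true for random blocks):

```python
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_blocks = np.sort_complex(np.concatenate([np.linalg.eigvals(A11),
                                             np.linalg.eigvals(A22)]))
print(np.allclose(eig_A, eig_blocks))  # True: spectra agree, multiplicities included
```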