Given the $2 \times 2$ matrix $$\boldsymbol{A}=\begin{pmatrix}4 & -2\\ 1 & 1\end{pmatrix}$$
I am aware that the eigenvalues of a matrix can be found by solving the usual characteristic equation $p(\lambda)=|\boldsymbol{A}-\lambda I|=0$. However, it is also known that the eigenvalues can be found from the trace and determinant of the matrix $\boldsymbol{A}$, since: $$\lambda_1+\lambda_2=\operatorname{Tr}(\boldsymbol{A})$$ $$\lambda_1\lambda_2=|\boldsymbol{A}|$$
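For reference, here is where these relations come from in the $2 \times 2$ case: expanding the characteristic polynomial of a general matrix $\begin{pmatrix}a & b\\ c & d\end{pmatrix}$ gives $$p(\lambda)=\begin{vmatrix}a-\lambda & b\\ c & d-\lambda\end{vmatrix}=\lambda^2-(a+d)\lambda+(ad-bc)=\lambda^2-\operatorname{Tr}(\boldsymbol{A})\,\lambda+|\boldsymbol{A}|,$$ so the coefficients of $p(\lambda)$ are exactly the trace and the determinant.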
Using the above to solve for the eigenvalues of $\boldsymbol{A}$:
$$\lambda_1+\lambda_2=5$$ $$\lambda_1\lambda_2=6$$ Now: $$\lambda_1=5-\lambda_2$$ Substituting this into the product equation to solve for $\lambda_2$ yields: $$\lambda_2\left(5-\lambda_2\right)=6$$ $$\lambda_2^2-5\lambda_2+6=0$$ Solving for $\lambda_2$ gives two solutions: $$\lambda_{2A}=3$$ $$\lambda_{2B}=2$$ Using both $\lambda_{2A}$ and $\lambda_{2B}$ to solve for $\lambda_1$: $$\lambda_{1A}=2$$ $$\lambda_{1B}=3$$ After selecting distinct values for the eigenvalues, I have $\lambda_1=2,\ \lambda_2=3$.
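(As a quick numerical sanity check, here is a minimal sketch assuming NumPy is available; the variable names are my own. It only confirms that the direct eigenvalue computation and the trace/determinant approach agree for this particular $\boldsymbol{A}$.)

```python
import numpy as np

# The matrix A from the question
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])

# Eigenvalues computed directly from A
direct = np.linalg.eigvals(A)

# Eigenvalues as roots of x^2 - Tr(A) x + |A| = 0
tr = np.trace(A)                 # 5.0
det = np.linalg.det(A)           # approximately 6.0
from_trace_det = np.roots([1.0, -tr, det])

print(sorted(direct))            # approximately [2.0, 3.0]
print(sorted(from_trace_det))    # approximately [2.0, 3.0]
```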
My question is: why do the solutions for $\lambda_2$, namely $\lambda_{2A}$ and $\lambda_{2B}$, already provide the complete set of distinct eigenvalues of $\boldsymbol{A}$? The same can be asked of the solutions for $\lambda_1$: $\lambda_{1A}$ and $\lambda_{1B}$.
Also, is there a theorem on the uniqueness of eigenvalues similar to the following:
Given an $n \times n$ matrix $\boldsymbol{A}$, if $\boldsymbol{S}_1$ is a complete set of $n$ distinct eigenvalues of $\boldsymbol{A}$, and $\boldsymbol{S}_2$ is also a complete set of $n$ distinct eigenvalues of $\boldsymbol{A}$, then $\boldsymbol{S}_1=\boldsymbol{S}_2$?